WorldWideScience

Sample records for flow analysis tool

  1. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    Science.gov (United States)

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
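
    Not part of the record, but as a minimal sketch of the calibration step described above: PEST adjusts the depth-distributed hydraulic conductivities so as to minimize a sum of squared, weighted differences between simulated and measured flows and drawdowns. A Python illustration (all names and values are hypothetical, not from AnalyzeHOLE):

```python
import numpy as np

def pest_objective(observed, simulated, weights):
    """Weighted least-squares objective, phi = sum_i (w_i * (obs_i - sim_i))**2.
    PEST drives phi down by adjusting the depth-dependent hydraulic
    conductivities of the axisymmetric MODFLOW model."""
    residuals = np.asarray(weights) * (np.asarray(observed) - np.asarray(simulated))
    return float(np.sum(residuals ** 2))

# Hypothetical flowmeter observations at four depths and one trial simulation
obs = [12.0, 9.5, 4.1, 0.8]   # measured borehole flows (L/min)
sim = [11.2, 10.1, 4.4, 0.5]  # flows simulated with the current parameter set
print(pest_objective(obs, sim, weights=[1.0, 1.0, 1.0, 1.0]))
```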

  2. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically

  3. SiLK: A Tool Suite for Unsampled Network Flow Analysis at Scale

    Science.gov (United States)

    2014-06-01

    SiLK: A Tool Suite for Unsampled Network Flow Analysis at Scale. Mark Thomas, Leigh Metcalf, Jonathan Spring, Paul Krystosek, Katherine Prevost. ...make the problem manageable, but sampling unacceptably reduces the fidelity of analytic conclusions. In this paper we discuss SiLK, a tool suite...created to analyze this high-volume data source without sampling. SiLK implementation and architectural design are optimized to manage this Big Data

  4. Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.

    Science.gov (United States)

    Carlson, David H.

    1986-01-01

    This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…

  6. Flow analysis techniques as effective tools for the improved environmental analysis of organic compounds expressed as total indices.

    Science.gov (United States)

    Maya, Fernando; Estela, José Manuel; Cerdà, Víctor

    2010-04-15

    The scope of this work is an overview of the current state of the art in flow analysis techniques applied to the environmental determination of organic compounds expressed as total indices. Flow analysis techniques are proposed as effective tools for quickly obtaining preliminary chemical information about the occurrence of organic compounds in the environment prior to the use of more complex, time-consuming and expensive instrumental techniques. Recently improved flow-based methodologies for the determination of chemical oxygen demand, halogenated organic compounds and phenols are presented and discussed in detail. The aim of the present work is to demonstrate the value of flow-based techniques as vanguard tools for the determination of organic compounds in environmental water samples.

  7. Combining LCT tools for the optimization of an industrial process: material and energy flow analysis and best available techniques.

    Science.gov (United States)

    Rodríguez, M T Torres; Andrade, L Cristóbal; Bugallo, P M Bello; Long, J J Casares

    2011-09-15

    Life cycle thinking (LCT) is one of the philosophies that has recently appeared in the context of sustainable development. Some already existing tools and methods, as well as some recently emerged ones, which seek to understand, interpret and design the life of a product, can be included in the scope of the LCT philosophy. That is the case of material and energy flow analysis (MEFA), a tool derived from the industrial metabolism concept. This paper proposes a methodology combining MEFA with another technique derived from sustainable development which also fits the LCT philosophy, the analysis of best available techniques (BAT). This methodology, applied to an industrial process, seeks to identify so-called improvable flows by MEFA, so that appropriate candidate BAT can be selected by BAT analysis. Material and energy inputs, outputs and internal flows are quantified, and sustainable solutions are provided on the basis of industrial metabolism. The methodology has been applied to an exemplary roof tile manufacturing plant for validation. Fourteen improvable flows were identified and seven candidate BAT were proposed to reduce these flows. The proposed methodology provides a way to detect improvable material or energy flows in a process and to select the most sustainable options to enhance them. Solutions are proposed for the detected improvable flows, taking into account their effectiveness in improving such flows.

  8. Python tools for rapid development, calibration, and analysis of generalized groundwater-flow models

    Science.gov (United States)

    Starn, J. J.; Belitz, K.

    2014-12-01

    National-scale water-quality data sets for the United States have been available for several decades; however, groundwater models to interpret these data are available for only a small percentage of the country. Generalized models may be adequate to explain and project groundwater-quality trends at the national scale by using regional-scale models (defined as watersheds at or between the HUC-6 and HUC-8 levels). Coast-to-coast data such as the National Hydrologic Dataset Plus (NHD+) make it possible to extract the basic building blocks for a model anywhere in the country. IPython notebooks have been developed to automate the creation of generalized groundwater-flow models from the NHD+. The notebook format allows rapid testing of methods for model creation, calibration, and analysis. Capabilities within the Python ecosystem greatly speed up the development and testing of algorithms. GeoPandas is used for very efficient geospatial processing. Raster processing includes the Geospatial Data Abstraction Library and image processing tools. Model creation is made possible through Flopy, a versatile input and output writer for several MODFLOW-based flow and transport model codes. Interpolation, integration, and map plotting included in the standard Python tool stack also are used, making the notebook a comprehensive platform within which to build and evaluate general models. Models with alternative boundary conditions, number of layers, and cell spacing can be tested against one another and evaluated by using water-quality data. Novel calibration criteria were developed by comparing modeled heads to land-surface and surface-water elevations. Information, such as predicted age distributions, can be extracted from general models and tested for its ability to explain water-quality trends. Groundwater ages then can be correlated with horizontal and vertical hydrologic position, a relation that can be used for statistical assessment of likely groundwater-quality conditions.
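
    As a hedged sketch of the Flopy model-creation step described above (not the actual notebooks; the grid, boundary conditions, and parameter values here are invented, whereas the real workflow derives them from NHD+), a minimal MODFLOW model can be assembled and run in a few lines:

```python
import numpy as np
import flopy

nrow, ncol = 50, 50
ibound = np.ones((1, nrow, ncol), dtype=int)
ibound[:, :, 0] = -1  # constant-head cells standing in for a stream boundary

m = flopy.modflow.Modflow(modelname="general", exe_name="mf2005")
flopy.modflow.ModflowDis(m, nlay=1, nrow=nrow, ncol=ncol,
                         delr=100.0, delc=100.0, top=10.0, botm=-90.0)
flopy.modflow.ModflowBas(m, ibound=ibound, strt=10.0)
flopy.modflow.ModflowLpf(m, hk=10.0)     # horizontal hydraulic conductivity (m/d)
flopy.modflow.ModflowRch(m, rech=5e-4)   # areal recharge (m/d)
flopy.modflow.ModflowPcg(m)              # solver
flopy.modflow.ModflowOc(m)               # output control
m.write_input()
m.run_model(silent=True)                 # requires a MODFLOW-2005 executable
```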

  9. Material Flow Analysis as a Tool to improve Waste Management Systems: The Case of Austria.

    Science.gov (United States)

    Allesch, Astrid; Brunner, Paul H

    2017-01-03

    This paper demonstrates the power of material flow analysis (MFA) for designing waste management (WM) systems and for supporting decisions with regard to given environmental and resource goals. Based on a comprehensive case study of a nationwide WM system, advantages and drawbacks of a mass balance approach are discussed. Using the software STAN, a material flow system comprising all relevant inputs, stocks and outputs of wastes, products, residues, and emissions is established and quantified. Material balances on the level of goods and selected substances (C, Cd, Cr, Cu, Fe, Hg, N, Ni, P, Pb, Zn) are developed to characterize this WM system. The MFA results serve well as a base for further assessments. Based on given goals, stakeholders engaged in this study selected the following seven criteria for evaluating their WM system: (i) waste input into the system, (ii) export of waste, (iii) gaseous emissions from waste treatment plants, (iv) long-term gaseous and liquid emissions from landfills, (v) waste being recycled, (vi) waste for energy recovery, and (vii) total waste landfilled. By scenario analysis, strengths and weaknesses of different measures were identified. The results reveal the benefits of a mass balance approach due to redundancy, data consistency, and transparency for optimization, design, and decision making in WM.
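
    The core of the mass-balance approach is that, for every process in the system, inputs must equal outputs plus the change in stock, on the level of goods and of each substance alike. A minimal sketch of that consistency check (numbers invented, not Austrian data):

```python
def balance_residual(inputs, outputs, stock_change):
    """Mass-balance residual of one process: inputs - outputs - d(stock).
    In a consistent MFA, as enforced by tools such as STAN, the residual
    is ~0 for goods and for each substance (C, Cd, Cu, ...)."""
    return sum(inputs) - sum(outputs) - stock_change

# Hypothetical landfill process, kilotonnes per year
residual = balance_residual(inputs=[1200.0],        # waste landfilled
                            outputs=[30.0, 15.0],   # leachate, landfill gas
                            stock_change=1155.0)    # accumulation in the landfill
assert abs(residual) < 1e-9
```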

  10. Flow injection analysis as a tool for enhancing oceanographic nutrient measurements--a review.

    Science.gov (United States)

    Worsfold, Paul J; Clough, Robert; Lohan, Maeve C; Monbet, Philippe; Ellis, Peter S; Quétel, Christophe R; Floor, Geerke H; McKelvie, Ian D

    2013-11-25

    Macronutrient elements (C, N and P) and micronutrient elements (Fe, Co, Cu, Zn and Mn) are widely measured in their various physico-chemical forms in open ocean, shelf sea, coastal and estuarine waters. These measurements help to elucidate the biogeochemical cycling of these elements in marine waters and highlight the ecological and socio-economic importance of the oceans. Due to the dynamic nature of marine waters in terms of chemical, biological and physical processes, it is advantageous to make these measurements in situ and in this regard flow injection analysis (FIA) provides a suitable shipboard platform. This review, therefore, discusses the role of FIA in the determination of macro- and micro-nutrient elements, with an emphasis on manifold design and detection strategies for the reliable shipboard determination of specific nutrient species. The application of various FIA manifolds to oceanographic nutrient determinations is discussed, with an emphasis on sensitivity, selectivity, high throughput analysis and suitability for underway analysis and depth profiles. Strategies for enhancing sensitivity and minimizing matrix effects, e.g. refractive index (schlieren) effects and the important role of uncertainty budgets in underpinning method validation and data quality are discussed in some detail.

  11. A Model for Air Flow in Ventilated Cavities Implemented in a Tool for Whole-Building Hygrothermal Analysis

    DEFF Research Database (Denmark)

    Grau, Karl; Rode, Carsten

    2006-01-01

    A model for calculating air flows in ventilated cavities has been implemented in the whole-building hygrothermal simulation tool BSim. The tool is able to predict indoor humidity conditions using a transient model for the moisture conditions in the building envelope.

  12. Analysis of the low-flow characteristics of the FARE tool and improvement of its flow characteristics through design optimization

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Han Hee [KAERI, Daejeon (Korea, Republic of); Lee, Sun Ki [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    2005-07-01

    If a FARE tool is used in the reactor for nuclear fuel exchange, low-flow signals are sensed, so performance test equipment for the FARE tool was developed to simulate nuclear plant conditions. The purpose of this study is to improve the design of the FARE tool for use in Korean nuclear plants. In particular, the advanced FARE tool showed better flow characteristics than the existing FARE tool.

  13. High Performance Flow Analysis and Control Tools for Aerial Vehicles Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the project is to develop an open architecture, computer aided analysis and control design toolbox for distributed parameter systems, in particular,...

  14. Design of a Domain-Specific Language for Material Flow Analysis using Microsoft DSL tools: An Experience Paper

    DEFF Research Database (Denmark)

    Zarrin, Bahram; Baumeister, Hubert

    2014-01-01

    Material Flow Analysis (MFA) is the procedure of measuring and assessing the mass flows of matter (solid waste, water, food...) and substances (carbon, phosphorus ...) within a process or a system over a period of time. In this paper we propose a Domain-Specific Language (DSL) to model MFA in a ...

  15. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  16. Flow Injection/Sequential Injection Analysis Systems: Potential Use as Tools for Rapid Liver Diseases Biomarker Study

    Directory of Open Access Journals (Sweden)

    Supaporn Kradtap Hartwell

    2012-01-01

    Flow injection/sequential injection analysis (FIA/SIA) systems are suitable for carrying out automatic wet chemical/biochemical reactions with reduced volume and time consumption. Various parts of the system, such as the pump, valve, and reactor, may be built or adapted from available materials. Therefore the systems can be lower in cost than other instrumentation-based analysis systems. Their applications for the determination of biomarkers for liver diseases have been demonstrated in various formats of operation, but only a few and limited types of biomarkers have been used as model analytes. This paper summarizes these applications for different types of reactions as a guide for using flow-based systems in more biomarker and/or multibiomarker studies.

  17. Material flow analysis as a tool for sustainable sanitation planning in developing countries: case study of Arba Minch, Ethiopia.

    Science.gov (United States)

    Meinzinger, F; Kröger, K; Otterpohl, R

    2009-01-01

    Material Flow Analysis is a method that can be used to assess sanitation systems with regard to their environmental impacts. Modelling water and nutrient flows of the urban water, wastewater and waste system can highlight risks of environmental pollution and can help evaluate the potential for linking sanitation with resource recovery and agricultural production. This study presents the results of an analysis of nitrogen and phosphorus flows in Arba Minch town in southern Ethiopia. The current situation is modelled and possible scenarios for upgrading the town's sanitation system are assessed. Two different scenarios for nutrient recovery are analysed. Scenario one includes co-composting municipal organic waste with faecal sludge from pit latrines and septic tanks as well as the use of compost in agriculture. The second scenario, based on urine-diversion toilets, includes application of urine as fertiliser and composting of faecal matter. In order to allow for variations in the rate of adoption, the model can simulate varying degrees of technology implementation. Thus, the impact of a step-wise or successive approach can be illustrated. The results show that significant amounts of plant nutrients can be provided by both options, co-composting and urine diversion.

  18. Culvert Analysis Program Graphical User Interface 1.0--A preprocessing and postprocessing tool for estimating flow through culverts

    Science.gov (United States)

    Bradley, D. Nathan

    2013-01-01

    The peak discharge of a flood can be estimated from the elevation of high-water marks near the inlet and outlet of a culvert after the flood has occurred. This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks on trees or buildings. When combined with the cross-sectional geometry of the channel upstream from the culvert and the culvert size, shape, roughness, and orientation, the high-water marks define a water-surface profile that can be used to estimate the peak discharge by using the methods described by Bodhaine (1968). This type of measurement is in contrast to a “direct” measurement of discharge made during the flood where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a streamgage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the streamgage, resulting in more accurate computation of high flows. The Culvert Analysis Program (CAP) (Fulford, 1998) is a command-line program written in Fortran for computing peak discharges and culvert rating surfaces or curves. CAP reads input data from a formatted text file and prints results to another formatted text file. Preparing and correctly formatting the input file may be time-consuming and prone to errors. This document describes the CAP graphical user interface (GUI)—a modern, cross-platform, menu-driven application that prepares the CAP input file, executes the program, and helps the user interpret the output.
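
    As a much-simplified illustration of the underlying idea (not CAP's algorithm: Bodhaine (1968) distinguishes six culvert flow types, each with its own discharge equation), a culvert flowing full under a known head can be approximated with an orifice-type formula; all values below are hypothetical:

```python
import math

def orifice_discharge(area_m2, head_m, coeff=0.6, g=9.81):
    """Crude orifice-type estimate Q = C * A * sqrt(2 g h). Only an
    illustration of indirect discharge estimation from high-water marks;
    CAP applies the full Bodhaine flow-type classification instead."""
    return coeff * area_m2 * math.sqrt(2.0 * g * head_m)

# Hypothetical 1.2-m-diameter culvert with 2.0 m of head between
# the upstream and downstream high-water marks
area = math.pi * 0.6 ** 2
print(f"Q ~ {orifice_discharge(area, 2.0):.1f} m^3/s")
```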

  19. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
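
    A hedged sketch of the workflow the record describes: build a baseline model from components, apply energy conservation measures (ECMs) to produce optimized variants, and compare. The component structure and savings factors below are invented placeholders, not the patented system:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class BuildingModel:
    lighting_kwh: float
    hvac_kwh: float

    @property
    def total_kwh(self) -> float:
        return self.lighting_kwh + self.hvac_kwh

def apply_ecm(model: BuildingModel, ecm: str) -> BuildingModel:
    """Apply one energy conservation measure (factors are illustrative)."""
    if ecm == "led_retrofit":
        return replace(model, lighting_kwh=model.lighting_kwh * 0.5)
    if ecm == "hvac_tuneup":
        return replace(model, hvac_kwh=model.hvac_kwh * 0.85)
    return model

baseline = BuildingModel(lighting_kwh=40_000, hvac_kwh=110_000)
for ecm in ("led_retrofit", "hvac_tuneup"):
    optimized = apply_ecm(baseline, ecm)
    print(f"{ecm}: saves ~{baseline.total_kwh - optimized.total_kwh:,.0f} kWh/yr")
```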

  20. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
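
    The reports above are all views over a fault-to-test dependency structure. As a hedged sketch (not the TEAMS Designer or ETA Tool file formats), detectability and failure-mode isolation can be read off a boolean dependency matrix:

```python
import numpy as np

# Rows = failure modes, columns = tests; D[i, j] = 1 if test j detects mode i.
# Toy matrix; the real one comes from the TEAMS Designer testability output.
D = np.array([[1, 0, 1],
              [1, 0, 0],
              [0, 1, 0]], dtype=int)

detectable = D.any(axis=1)  # detectability: each mode seen by at least one test
print("undetected modes:", np.where(~detectable)[0].tolist())

# Isolation: modes that trip identical test sets cannot be discriminated.
groups = {}
for mode, signature in enumerate(map(tuple, D)):
    groups.setdefault(signature, []).append(mode)
print("ambiguity groups:", [g for g in groups.values() if len(g) > 1])
```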

  1. Contamination Analysis Tools

    Science.gov (United States)

    Brieda, Lubos

    2015-01-01

    This talk presents three different tools developed recently for contamination analysis: (1) HTML QCM analyzer: runs in a web browser and allows for data analysis of QCM log files; (2) Java RGA extractor: can load multiple SRS .ana files and extract pressure vs. time data; (3) C++ contamination simulation code: a 3D particle-tracing code for modeling the transport of dust particulates and molecules. It uses residence time to determine whether molecules stick. Particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.
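
    As a hedged sketch of the residence-time sticking rule mentioned for the C++ code (the constants and the desorption model below are generic placeholders, not the talk's actual values), a molecule can be treated as stuck when it fails to desorb within a timestep:

```python
import math
import random

def residence_time(temp_k, e_act=5.0e4, tau0=1e-13, r_gas=8.314):
    """Surface residence time from the Frenkel relation, tau = tau0*exp(Ea/RT).
    Activation energy (J/mol) and tau0 (s) are illustrative placeholders."""
    return tau0 * math.exp(e_act / (r_gas * temp_k))

def sticks(temp_k, dt):
    """Stuck if the molecule does not desorb within timestep dt (s)."""
    p_desorb = 1.0 - math.exp(-dt / residence_time(temp_k))
    return random.random() > p_desorb

print(sticks(temp_k=150.0, dt=1e-3))  # cold surface: almost always sticks
```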

  2. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool. It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  4. Graphical Multiprocessing Analysis Tool (GMAT)

    Energy Technology Data Exchange (ETDEWEB)

    Seager, M.K.; Campbell, S.; Sikora, S.; Strout, R.; Zosel, M.

    1988-03-01

    The design and debugging of parallel programs is a difficult task due to the complex synchronization and data scoping issues involved. To aid the programmer in parallel code development we have developed two methodologies for the graphical display of the execution of parallel codes. The Graphical Multiprocessing Analysis Tools (GMAT) consist of stategraph, which represents an inheritance tree of task states, and timeline, which represents tasks as a flowing sequence of events. Information about the code can be displayed as the application runs (dynamic mode) or played back with time under user control (static mode). This document discusses the design and user interface issues involved in developing the parallel application display GMAT family. Also, we present an introductory user's guide for both tools. 4 figs.

  5. COMPUTER MODELING IN DEFORM-3D FOR ANALYSIS OF PLASTIC FLOW IN HIGH-SPEED HOT EXTRUSION OF BIMETALLIC FORMATIVE PARTS OF DIE TOOLING

    Directory of Open Access Journals (Sweden)

    I. V. Kachanov

    2015-01-01

    The modern development of industrial production is closely connected with the use of science-based and high technologies to ensure the competitiveness of manufactured products on the world market. There is also strong pressure around energy and resource saving, a problem which can be addressed by introducing new technological processes and creating new materials that increase productivity through automation and improved tool life. Development and implementation of such technologies are often time-consuming processes connected with complex calculations and experimental investigations. Simulation modelling of materials processing using modern software products serves as an alternative to experimental and theoretical methods of research. The aim of this paper is to compare experimental results obtained for bimetallic samples of a forming tool produced by high-speed hot extrusion with the results of computer simulation using the DEFORM-3D package and the finite element method. Comparative analysis of the plastic flow of real and model samples has shown that the obtained models provide a high-quality and reliable picture of plastic flow during high-speed hot extrusion. Modelling in DEFORM-3D makes it possible to eliminate complex calculations and significantly reduce the number of experimental studies when developing new technological processes.

  6. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-31

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
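
    In essence (a hedged reading of BAL-003-1, omitting the standard's exact value-A/value-B averaging windows), the frequency response measure is the change in net actual interchange divided by ten times the frequency deviation, reported in MW/0.1 Hz:

```python
def frm_mw_per_0p1hz(delta_interchange_mw, delta_freq_hz):
    """Event frequency response in the spirit of the NERC BAL-003-1 FRM:
    FRM = delta(net actual interchange) / (10 * delta_f). A responsive
    balancing authority yields a negative value, since it exports more
    (or imports less) as frequency drops."""
    return delta_interchange_mw / (10.0 * delta_freq_hz)

# Hypothetical under-frequency event: 60.00 -> 59.95 Hz while the BA's
# net interchange rises by 120 MW as governors respond.
print(frm_mw_per_0p1hz(120.0, 59.95 - 60.00))  # -> -240.0 MW per 0.1 Hz
```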

  8. Software Information Base(SIB)and Its Integration with Data Flow Diagram(DFD)Tool

    Institute of Scientific and Technical Information of China (English)

    董士海

    1989-01-01

    The software information base is the main technique for the integration of a software engineering environment. The data flow diagram tool is an important software tool supporting the software requirements analysis phase. This article first introduces the functions and structures of a Software Information Base (SIB) and a data flow diagram tool. The E-R data model of the SIB and its integration with the data flow diagram tool are then described in detail.

  9. The Cash Flow as Financial Management Tool For Small Businesses

    Directory of Open Access Journals (Sweden)

    Osmar Siena

    2015-06-01

    This study is situated in the field of financial management, with research into financial control factors in small businesses. Its main objective is to analyze cash flow as a tool for financial management; its specific objectives are to describe the use of the cash flow tool, to analyze the feasibility of implementing cash flow as an instrument of financial management, and to suggest proposals for deploying cash flow as a financial management system. To meet these objectives, the research follows a qualitative methodology and uses on-site visits, interviews and questionnaires to collect data. Descriptive analysis is used to confront the theoretical basis with the data obtained from the research. The analysis yielded the following results: a description of the business processes researched; identification of the needs and forms of control currently used; and presentation of improvement measures to address the non-conformities identified. The study contributes to academic work by analyzing the real situation of the company, and also serves as a recommendation to companies facing similar difficulties in financial management.
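
    A minimal sketch of the kind of cash-flow projection the article advocates (all figures invented): monthly inflows and outflows, net flow, and a running balance that flags months at risk of a shortfall.

```python
from itertools import accumulate

inflows  = [18_000, 15_500, 21_000, 17_250]   # projected monthly receipts
outflows = [16_400, 17_800, 18_900, 16_000]   # projected monthly payments
net = [i - o for i, o in zip(inflows, outflows)]
balances = list(accumulate(net, initial=5_000))[1:]  # 5,000 opening balance

for month, (n, b) in enumerate(zip(net, balances), start=1):
    flag = "  <- shortfall risk" if b < 0 else ""
    print(f"month {month}: net {n:+,} balance {b:,}{flag}")
```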

  10. Signal flow analysis

    CERN Document Server

    Abrahams, J R; Hiller, N

    1965-01-01

    Signal Flow Analysis provides information pertinent to the fundamental aspects of signal flow analysis. This book discusses the basic theory of signal flow graphs and shows their relation to the usual algebraic equations. Organized into seven chapters, this book begins with an overview of the properties of a flow graph. The text then demonstrates how flow graphs can be applied to a wide range of electrical circuits that do not involve amplification. Other chapters deal with the parameters as well as circuit applications of transistors. This book also discusses the variety of circuits using ther
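
    The relation between a flow graph and its algebraic equations is captured by Mason's gain formula, T = (sum over forward paths k of P_k * Delta_k) / Delta. For the canonical single-loop feedback graph (one forward path of gain G, one loop of gain GH) it reduces to T = G/(1 - GH). A quick symbolic check, independent of the book's own examples:

```python
import sympy as sp

G, H = sp.symbols("G H")

delta = 1 - G * H      # Delta = 1 - (sum of loop gains) for a single loop
paths = [(G, 1)]       # (path gain P_k, cofactor Delta_k); loop touches the path
T = sum(p * d for p, d in paths) / delta
print(sp.simplify(T))  # G/(1 - G*H); sympy may print it as -G/(G*H - 1)
```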

  11. INFLUENCE OF TOOL WEAR ON MATERIAL FLOW

    Directory of Open Access Journals (Sweden)

    Vladimíra Schindlerová

    2017-03-01

    Cold bulk forming is a technology commonly used in many industrial enterprises. Even though high demands are nowadays placed on labour productivity, quality and production costs, findings from practice suggest that insufficient attention is paid to the issue of tool management, and the theoretical background and knowledge in this area are not treated in a sufficiently detailed and comprehensive way. This paper deals with the prediction of surface wear of forming tools and their subsequent renewal. The research on selected materials focused on the course of their straining in the contact between the blank and the tool in the process of cold bulk forming. The experiments were based on a simple performance of the conventional upsetting test. On the basis of an analysis of the results, the mechanism of tool wear by abrasion was determined, and its impact on the service life of the tool, as well as the possibility of influencing the quality of final parts, was evaluated.

  12. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animation without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. These are globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data beginning from February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, mesoscale convective systems, etc. Basic functions include selection of area of

  13. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  14. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072 - “Suite of open-source applications and models for advanced synchrophasor analysis”) and it is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
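
    The record does not specify OBAT's algorithms, but one standard way to extract a mode's frequency and damping from a measured ringdown is to fit a damped sinusoid; a sketch using synthetic PMU-like data:

```python
import numpy as np
from scipy.optimize import curve_fit

def ringdown(t, a, sigma, f, phi):
    """Single oscillatory mode: a * exp(sigma*t) * cos(2*pi*f*t + phi)."""
    return a * np.exp(sigma * t) * np.cos(2 * np.pi * f * t + phi)

# Synthetic signal: 0.8 Hz inter-area mode with a 5% damping ratio plus noise
t = np.linspace(0.0, 20.0, 600)
sigma_true = -0.05 * 2 * np.pi * 0.8 / np.sqrt(1 - 0.05 ** 2)
y = ringdown(t, 1.0, sigma_true, 0.8, 0.3) + 0.02 * np.random.randn(t.size)

(a, sigma, f, phi), _ = curve_fit(ringdown, t, y, p0=[1.0, -0.1, 0.9, 0.0])
zeta = -sigma / np.sqrt(sigma ** 2 + (2 * np.pi * f) ** 2)  # damping ratio
print(f"estimated mode: {f:.2f} Hz, damping {100 * zeta:.1f} %")
```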

  15. Udder Hygiene Analysis tool

    OpenAIRE

    2013-01-01

    In this report, the pilot of UHC is described. The main objective of the pilot is to make farmers more aware of how to improve udder health in dairy herds. This is achieved by changing management aspects related to hygiene. The report first provides general information about antibiotics and the processes that influence udder health. Second, six subjects related to udder health are described. Third, the tools (checklists and roadmap) are shown, and fourth, the advice written by UH...

  16. FAMUS (Flow Assurance by Management of Uncertainty and Simulation): a new tool for integrating flow assurance effects in traditional RAM (Reliability, Availability and Maintainability) analysis applied on a Norwegian Offshore System

    Energy Technology Data Exchange (ETDEWEB)

    Eisinger, Siegfried; Isaksen, Stefan; Grande, Oystein [Det Norske Veritas (DNV), Oslo (Norway); Chame, Luciana [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil)

    2008-07-01

    Traditional RAM (Reliability, Availability and Maintainability) models fall short of taking flow assurance effects into account. In many Oil and Gas production systems, flow assurance issues like hydrate formation, wax deposition or particle erosion may cause a substantial amount of production upsets. Flow Assurance issues are complex and hard to quantify in a production forecast. However, without taking them into account the RAM model generally overestimates the predicted system production. This paper demonstrates the FAMUS concept, which is a method and a tool for integrating RAM and Flow Assurance into one model, providing a better foundation for decision support. FAMUS therefore utilises both discrete event and thermo-hydraulic simulation. The method is currently applied as a decision support tool in an early phase of the development of an offshore oil field on the Norwegian continental shelf. (author)

  17. Social Data Analysis Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi; Hardt, Daniel

    2014-01-01

    As governments, citizens and organizations have moved online there is an increasing need for academic enquiry to adapt to this new context for communication and political action. This adaptation is crucially dependent on researchers being equipped with the necessary methodological tools to extract, analyze and visualize patterns of web activity. This volume profiles the latest techniques being employed by social scientists to collect and interpret data from some of the most popular social media applications, the political parties' own online activist spaces, and the wider system of hyperlinks that structure the inter-connections between these sites. Including contributions from a range of academic disciplines including Political Science, Media and Communication Studies, Economics, and Computer Science, this study showcases a new methodological approach that has been expressly designed to capture...

  18. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  19. iFlow: A Graphical User Interface for Flow Cytometry Tools in Bioconductor

    Directory of Open Access Journals (Sweden)

    Kyongryun Lee

    2009-01-01

    Flow cytometry (FCM) has become an important analysis technology in health care and medical research, but the large volume of data produced by modern high-throughput experiments has presented significant new challenges for computational analysis tools. The development of an FCM software suite in Bioconductor represents one approach to overcoming these challenges. In the spirit of the R programming language, these tools are predominantly console-driven, allowing for programmatic access and rapid development of novel algorithms. Using this software requires a solid understanding of programming concepts and of the R language. However, some of these tools, in particular the statistical graphics and novel analytical methods, are also useful for non-programmers. To this end, we have developed an open source, extensible graphical user interface (GUI), iFlow, which sits on top of the Bioconductor backbone, enabling basic analyses by means of convenient graphical menus and wizards. We envision iFlow being easily extensible in order to quickly integrate novel methodological developments.

  20. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  1. Sandia PUF Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2014-06-11

    This program is a graphical user interface for measuring and performing interactive analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.
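
    As a brief sketch of the metrics mentioned above (synthetic bit-vectors, not data from the tool): intra-chip Hamming distance measures noise across repeated measurements of one PUF, while inter-chip distance measures uniqueness between different PUFs.

```python
import numpy as np

def hamming(a, b):
    """Number of differing bits between two PUF response bit-vectors."""
    return int(np.count_nonzero(a != b))

rng = np.random.default_rng(seed=0)
chips = rng.integers(0, 2, size=(4, 256))         # four hypothetical 256-bit PUFs
remeasured = chips[0] ^ (rng.random(256) < 0.03)  # repeat measurement, ~3% bit noise

print("intra-chip (noise):", hamming(chips[0], remeasured))      # small
print("inter-chip:", [hamming(chips[0], c) for c in chips[1:]])  # near 128
```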

  2. Animation tools for interactive flow visualization

    Energy Technology Data Exchange (ETDEWEB)

    Sethian, J.A. (Lawrence Berkeley Lab., CA (USA)); Salem, J.B.

    1989-01-01

    The authors have built a new graphics environment for essentially real-time visualization of the results of numerical simulations of fluid mechanics. Starting from a precomputed discrete set of time-dependent flow quantities, such as velocity and density, the user may interactively examine the data on a frame buffer using animated flow visualization diagnostics that mimic those in the experimental laboratory. The graphics environment accepts data written in a general format and can handle a wide variety of flow geometries. Images are updated at 9 frames/s, providing an effective way to study the solution, analyze fluid flow mechanisms, and compare numerical results with experiment. The authors have used this environment to analyze data produced from numerical simulations of viscous, incompressible, laminar, and turbulent flow. They have studied two-dimensional flow over a backward-facing step and in a closed square computed using Chorin's Random Vortex Method. The graphics environment can be used to isolate and identify a variety of physical flow phenomena, such as eddy formation and merger, propagation and decay, horseshoe vortices, mixing and intertwining of fluid structures, and pairing of counteroscillatory vortical structures.

  3. Rule-Based Multidisciplinary Tool for Unsteady Reacting Real-Fluid Flows Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A design and analysis computational tool is proposed for simulating unsteady reacting flows in combustor devices used in reusable launch vehicles. Key aspects...

  4. Performance Analysis using CPN Tools

    DEFF Research Database (Denmark)

    Wells, Lisa Marie

    2006-01-01

    This paper provides an overview of new facilities for performance analysis using Coloured Petri Nets and the tool CPN Tools. Coloured Petri Nets is a formal modeling language that is well suited for modeling and analyzing large and complex systems. The new facilities include support for collecting...... data during simulations, for generating different kinds of performance-related output, and for running multiple simulation replications. A simple example of a network protocol is used to illustrate the flexibility of the new facilities....

  5. Flow Injection Analysis

    DEFF Research Database (Denmark)

    Hansen, Elo Harald

    2004-01-01

    This chapter provides an introduction to automated chemical analysis, which essentially can be divided into two groups: batch assays, where the solution is stationary while the container is moved through a number of stations where various unit operations performed; and continuous-flow procedures,......, but it permits thr execution of novel and unique analytical procedures which are difficult or even impossible by conventional means. The performance and applicability of FIA, SI and LOV are illustrated by a series of practical examples.......This chapter provides an introduction to automated chemical analysis, which essentially can be divided into two groups: batch assays, where the solution is stationary while the container is moved through a number of stations where various unit operations performed; and continuous-flow procedures......, where the system is stationary while the solution moves through a set of conduits in which all required manipulations are performed. Emphasis is placed on flow injection analysis (FIA) and its further developments, that is, sequential injection analysis (SIA) and the Lab-on-Valve (LOV) approach. Since...

  6. Projectile Base Flow Analysis

    Science.gov (United States)

    2007-11-02

    [Report documentation page fragment; only the metadata is recoverable: performing organization DCW Industries, Inc., La Cañada, CA; report number DCW-38-R-05; sponsoring agency U.S. Army Research Office. Cited works: Wilcox, D. C., Turbulence Modeling for CFD, Second Edition, DCW Industries, Inc.; Wilcox, D. C. (2001), “Projectile Base Flow Analysis,” DCW.]

  7. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas,

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  8. Changes in the Metastability of the Midlatitude Southern Hemisphere Circulation and the Utility of Nonstationary Cluster Analysis and Split-Flow Blocking Indices as Diagnostic Tools

    Science.gov (United States)

    O'Kane, Terence; Risbey, James; Franzke, Christian; Horenko, Illia; Monselesan, Didier

    2013-04-01

    The authors examine changes in the metastability of the Southern Hemisphere 500-hPa circulation using both cluster analysis techniques and split-flow blocking indices. The cluster methodology is a purely data-driven approach for parameterization whereby a multiscale approximation to nonstationary dynamical processes is achieved through optimal sequences of locally stationary fast vector autoregressive factor (VARX) processes and some slow (or persistent) hidden process switching between them. Comparison is made with blocking indices commonly used in weather forecasting and climate analysis to identify dynamically relevant metastable regimes in the 500-hPa circulation in both reanalysis and Atmospheric Model Intercomparison Project (AMIP) model datasets. The analysis characterizes the metastable regime in both reanalysis and model datasets prior to 1978 as positive and negative phases of a hemispheric midlatitude blocking state with the southern annular mode (SAM) associated with a transition state. Post-1978, the SAM emerges as a true metastable state replacing the negative phase of the hemispheric blocking pattern. The hidden state frequency of occurrences exhibits strong trends. The blocking pattern dominates in the early 1980s then gradually decreases. There is a corresponding increase in the SAM frequency of occurrence. This trend is largely evident in the reanalysis summer and spring but was not evident in the AMIP dataset. Further comparison with the split-flow blocking indices reveals a superficial correspondence between the cluster hidden state frequency of occurrences and split-flow indices. Examination of composite states shows that the blocking indices capture splitting of the zonal flow whereas the cluster composites reflect coherent block formation. Differences in blocking climatologies from the respective methods are discussed.
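
    For readers unfamiliar with split-flow indices of the Tibaldi-Molteni type: they test, longitude by longitude, whether the usual poleward decrease of 500-hPa height reverses on the equatorward flank of a candidate block while a strong westerly gradient persists on its poleward flank. A hedged Southern Hemisphere adaptation (the reference latitudes and threshold are illustrative; the paper's exact definition may differ):

```python
import numpy as np

def split_flow_blocked(z500, lats, lon_idx,
                       phi_n=-35.0, phi_0=-45.0, phi_s=-55.0):
    """Tibaldi-Molteni-style split-flow test at one longitude for the SH.
    z500: 2-D 500-hPa geopotential height array (lat x lon), in metres."""
    def z(phi):
        return z500[int(np.argmin(np.abs(lats - phi))), lon_idx]
    reversed_flank = z(phi_0) > z(phi_n)  # heights rise poleward: easterlies
    poleward_grad = (z(phi_s) - z(phi_0)) / abs(phi_s - phi_0)  # m per degree
    return reversed_flank and poleward_grad < -10.0  # strong westerlies poleward

# Synthetic demo: zonal field with heights falling poleward, plus a ridge at 45S
lats = np.arange(-90.0, 1.0, 1.0)
z500 = np.tile(5800.0 + 8.0 * lats, (10, 1)).T   # lat x lon
z500[np.abs(lats + 45.0) <= 5.0, 3] += 150.0     # block at longitude index 3
print(split_flow_blocked(z500, lats, lon_idx=3))  # -> True
```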

  9. Modeling Tools Predict Flow in Fluid Dynamics

    Science.gov (United States)

    2010-01-01

    "Because rocket engines operate under extreme temperature and pressure, they present a unique challenge to designers who must test and simulate the technology. To this end, CRAFT Tech Inc., of Pipersville, Pennsylvania, won Small Business Innovation Research (SBIR) contracts from Marshall Space Flight Center to develop software to simulate cryogenic fluid flows and related phenomena. CRAFT Tech enhanced its CRUNCH CFD (computational fluid dynamics) software to simulate phenomena in various liquid propulsion components and systems. Today, both government and industry clients in the aerospace, utilities, and petrochemical industries use the software for analyzing existing systems as well as designing new ones."

  10. Physics Analysis Tools Workshop Report

    CERN Document Server

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following grounds: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  11. Monitoring nutrient flows and economic performance in African farming systems (NUTMON); II. tool development

    NARCIS (Netherlands)

    Bosch, van den H.; Jager, de A.; Vlaming, J.

    1998-01-01

    Farm-NUTMON is a research tool that integrates the assessment of stocks and flows of the macro-nutrients nitrogen, phosphorus and potassium on the one hand and economic farm analysis on the other. The tool is applicable at both the farm and the activity level. It includes a structured questionnaire,

  13. Influence of FSW pin tool geometry on plastic flow of AA7075 T651

    Science.gov (United States)

    Lertora, Enrico; Mandolfino, Chiara; Gambaro, Carla

    2016-10-01

    In this paper the behaviour of the plastic flow during Friction Stir Welding of AA7075 T651 plates, welded with differently shaped tools, is investigated. In particular, the influence of the shape of three tools was studied using copper strips placed along the welds as markers. After welding, radiography and metallurgical analysis were used to investigate the marker movement and its fragmentation.

  14. Three-Phase Unbalanced Load Flow Tool for Distribution Networks

    DEFF Research Database (Denmark)

    Demirok, Erhan; Kjær, Søren Bækhøj; Sera, Dezso;

    2012-01-01

    This work develops a three-phase unbalanced load flow tool tailored for radial distribution networks, based on Matlab®. The tool can be used to assess steady-state voltage variations, thermal limits of grid components and power losses in radial MV-LV networks with photovoltaic (PV) generators where most of the systems are single phase. New ancillary services, such as static reactive power support by PV inverters, can also be merged with the load flow solution tool, so that the impact of various reactive power control strategies on steady-state grid operation can be simply investigated. Performance of the load flow solution tool, in the sense of resulting bus voltage magnitudes, is compared with and validated against the IEEE 13-bus test feeder.
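    The record does not spell out the solution method, but load flow on radial feeders of this kind is commonly solved with a backward/forward sweep rather than a Newton-Raphson formulation. The sketch below is a minimal single-phase illustration of that common approach on an assumed 3-bus feeder; the network data, and the choice of algorithm itself, are assumptions rather than details taken from the tool.

        import numpy as np

        # Assumed radial feeder (per-unit): slack bus 0 -> bus 1 -> bus 2.
        line_z = [0.01 + 0.02j, 0.015 + 0.03j]                  # branch impedances
        s_load = np.array([0.0, 0.05 + 0.02j, 0.04 + 0.015j])   # bus power demands

        v = np.ones(3, dtype=complex)  # flat start
        for _ in range(50):
            # Backward sweep: accumulate load currents from the feeder end.
            i_load = np.conj(s_load / v)
            i_branch = [i_load[1] + i_load[2], i_load[2]]
            # Forward sweep: update voltages outward from the slack bus.
            v_new = v.copy()
            v_new[1] = v[0] - line_z[0] * i_branch[0]
            v_new[2] = v_new[1] - line_z[1] * i_branch[1]
            converged = np.max(np.abs(v_new - v)) < 1e-10
            v = v_new
            if converged:
                break

        print(np.round(np.abs(v), 5))  # steady-state bus voltage magnitudes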

  15. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    Energy Technology Data Exchange (ETDEWEB)

    Cem Sarica; Holden Zhang

    2006-05-31

    The development of oil and gas fields in deep waters (5000 ft and more) will become more common in the future. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique, whether at topside, seabed or bottom-hole, to know inlet conditions such as flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. Therefore, the development of a new generation of multiphase flow predictive tools is needed. The overall objective of the proposed study is to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). In the current multiphase modeling approach, the prediction of flow patterns and of flow behavior (pressure gradient and phase fractions) are modeled separately. Thus, different models based on different physics are employed, causing inaccuracies and discontinuities. Moreover, oil and water are treated as a pseudo single phase, ignoring the distinct characteristics of both oil and water, often resulting in inaccurate designs that lead to operational problems. In this study, a new model is being developed through a theoretical and experimental study employing a revolutionary approach. The

  16. A new imaging tool for realtime measurement of flow velocity in intracranial aneurysms

    Directory of Open Access Journals (Sweden)

    Athanasios K. Petridis

    2017-08-01

    Full Text Available With modern imaging modalities of the brain, a significant number of unruptured aneurysms are detected. However, not every aneurysm is prone to rupture. Because treatment morbidity is about 10%, it is crucial to identify unstable aneurysms for which treatment should be discussed. Recently, new imaging tools that allow analysis of flow dynamics and wall stability have become available. It seems that they might provide additional data for better risk profiling. In this study we present a new imaging tool for analysis of flow dynamics, which calculates fluid velocity in an aneurysm (Philips Electronics N.V.). It may identify regions with high flow and calculate flow reduction after stenting of aneurysms. Contrast is injected at a stable injection speed of 2 mL/sec for 3 sec. Two clinical cases are illustrated. Velocity in aneurysms and areas of instability can be identified and calculated during angiography in real time. After stenting and flow diverter deployment, flow in the internal carotid aneurysm was reduced by 60%, and there was a reduction of about 65% in the posterior cerebral artery in the second case we report. The dynamic flow software calculates the flow profile in the aneurysm immediately after contrast injection. It is a real-time, patient-specific tool taking into account systole, diastole and the flexibility of the vasculature. These factors are an improvement over current models of computational flow dynamics. We think it is a highly efficient, user-friendly tool. Further clinical studies are under way.

  17. A New Imaging Tool for Realtime Measurement of Flow Velocity in Intracranial Aneurysms.

    Science.gov (United States)

    Petridis, Athanasios K; Kaschner, Marius; Cornelius, Jan F; Kamp, Marcel A; Tortora, Angelo; Steiger, Hans-Jakob; Turowski, Bernd

    2017-06-07

    With modern imaging modalities of the brain, a significant number of unruptured aneurysms are detected. However, not every aneurysm is prone to rupture. Because treatment morbidity is about 10%, it is crucial to identify unstable aneurysms for which treatment should be discussed. Recently, new imaging tools that allow analysis of flow dynamics and wall stability have become available. It seems that they might provide additional data for better risk profiling. In this study we present a new imaging tool for analysis of flow dynamics, which calculates fluid velocity in an aneurysm (Philips Electronics N.V.). It may identify regions with high flow and calculate flow reduction after stenting of aneurysms. Contrast is injected at a stable injection speed of 2 mL/sec for 3 sec. Two clinical cases are illustrated. Velocity in aneurysms and areas of instability can be identified and calculated during angiography in real time. After stenting and flow diverter deployment, flow in the internal carotid aneurysm was reduced by 60%, and there was a reduction of about 65% in the posterior cerebral artery in the second case we report. The dynamic flow software calculates the flow profile in the aneurysm immediately after contrast injection. It is a real-time, patient-specific tool taking into account systole, diastole and the flexibility of the vasculature. These factors are an improvement over current models of computational flow dynamics. We think it is a highly efficient, user-friendly tool. Further clinical studies are under way.

  18. Dynamic Hurricane Data Analysis Tool

    Science.gov (United States)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different instrument-observed atmospheric and oceanic data sets. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter: the mean, standard deviation, median, minimum, and maximum. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models and identify what types of measurements the next generation of instruments will need to collect.

  19. Laminar Flow Analysis

    Science.gov (United States)

    Rogers, David F.

    1992-10-01

    The major thrust of this book is to present a technique of analysis that aids the formulation, understanding, and solution of problems of viscous flow. The intent is to avoid providing a "canned" program to solve a problem, offering instead a way to recognize the underlying physical, mathematical, and modeling concepts inherent in the solutions. The reader must first choose a mathematical model and derive governing equations based on realistic assumptions, or become aware of the limitations and assumptions associated with existing models. An appropriate solution technique is then selected. The solution technique may be either analytical or numerical. Computer-aided analysis algorithms supplement the classical analyses. The book begins by deriving the Navier-Stokes equation for a viscous compressible variable property fluid. The second chapter considers exact solutions of the incompressible hydrodynamic boundary layer equations solved with and without mass transfer at the wall. Forced convection, free convection, and the compressible laminar boundary layer are discussed in the remaining chapters. The text unifies the various topics by tracing a logical progression from simple to complex governing differential equations and boundary conditions. Numerical, parametric, and directed analysis problems are included at the end of each chapter.
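    The book's starting point, the Navier-Stokes equations, can be stated for reference in their incompressible constant-property form (the book itself derives the more general compressible, variable-property case):

        \rho\left(\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}\right) = -\nabla p + \mu\,\nabla^{2}\mathbf{u} + \rho\,\mathbf{g}, \qquad \nabla\cdot\mathbf{u} = 0

    Here $\mathbf{u}$ is the velocity field, $p$ the pressure, $\mu$ the dynamic viscosity and $\rho\,\mathbf{g}$ the body force.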

  20. Analysis of hydrogeological flow responses in Olkiluoto

    Energy Technology Data Exchange (ETDEWEB)

    Ahokas, H.; Rouhiainen, P.; Komulainen, J.; Poellaenen, J. [Poeyry Finland Oy, Vantaa (Finland)

    2014-04-15

    As part of the programme for the final disposal of spent nuclear fuel, an analysis of the flow responses caused by ONKALO leakages or other activities on the site has been compiled. Leakages into ONKALO or other activities, such as pumping in connection with groundwater sampling, cause changes in flow conditions in adjacent drillholes. Flows in open drillholes have been measured with the PFL tool (PFL-DIFF), several times in some holes, as part of the Olkiluoto Monitoring Programme (OMO) or in conjunction with interference test campaigns carried out in Olkiluoto. The main objective of the study is to analyse differences detected between flow measurements without pumping. PFL measurements were started in 1997 and all the holes have been measured. In total, measurements have been repeated in 32 holes, which enables a study of possible changes. The development of interpretation methods to detect and quantify flow changes was an important part of this work. The determination of the exact flow response is a challenging task. Flows are also affected by seasonal effects, which complicate an unambiguous analysis of the observed parameters. Overlapping activities (sinks) behind flow changes make the analysis difficult. In addition, the role of other open holes close to the observation hole can be significant. They may cause flow responses which would not have been detected without their existence. Nevertheless, unambiguous flow responses caused by the pumping of a drillhole or leaking tunnels have been detected at scales from ca. 10 m to over 1 km. (orig.)

  1. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    Energy Technology Data Exchange (ETDEWEB)

    Tulsa Fluid Flow

    2008-08-31

    The development of fields in deep waters (5000 ft and more) is a common occurrence. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique, whether employed at topside, seabed or bottom-hole, to know inlet conditions such as the flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. The overall objective was to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict the flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). The project was conducted in two periods. In Period 1 (four years), gas-oil-water flow in pipes was investigated to understand the fundamental physical mechanisms describing the interaction between the gas-oil-water phases under flowing conditions, and a unified model was developed utilizing a novel modeling approach. A gas-oil-water pipe flow database including field and laboratory data was formed in Period 2 (one year). The database was utilized in model performance demonstration. Period 1 primarily consisted of the development of a unified model and software to predict the gas-oil-water flow, and experimental studies of the gas-oil-water project, including flow behavior description and

  2. Shot Planning and Analysis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high overall operational efficiency of the National Ignition Facility (NIF) by combining near- and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project comprises two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface for selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

  3. Flow Analysis: A Novel Approach For Classification.

    Science.gov (United States)

    Vakh, Christina; Falkova, Marina; Timofeeva, Irina; Moskvin, Alexey; Moskvin, Leonid; Bulatov, Andrey

    2016-09-01

    We suggest a novel approach for the classification of flow analysis methods according to the conditions under which the mass transfer processes and chemical reactions take place in the flow mode: dispersion-convection flow methods and forced-convection flow methods. The first group includes continuous flow analysis, flow injection analysis, all injection analysis, sequential injection analysis, sequential injection chromatography, cross injection analysis, multi-commutated flow analysis, multi-syringe flow injection analysis, multi-pumping flow systems, loop flow analysis, and simultaneous injection effective mixing flow analysis. The second group includes segmented flow analysis, zone fluidics, flow batch analysis, sequential injection analysis with a mixing chamber, stepwise injection analysis, and multi-commutated stepwise injection analysis. The proposed classification allows a large number of flow analysis methods to be systematized. Recent developments and applications of dispersion-convection flow methods and forced-convection flow methods are presented.

  4. Cluster Flow: A user-friendly bioinformatics workflow tool [version 1; referees: 3 approved

    Directory of Open Access Journals (Sweden)

    Philip Ewels

    2016-12-01

    Full Text Available Pipeline tools are becoming increasingly important within the field of bioinformatics. Using a pipeline manager to manage and run workflows comprised of multiple tools reduces workload and makes analysis results more reproducible. Existing tools require significant work to install and get running, typically needing pipeline scripts to be written from scratch before running any analysis. We present Cluster Flow, a simple and flexible bioinformatics pipeline tool designed to be quick and easy to install. Cluster Flow comes with 40 modules for common NGS processing steps, ready to work out of the box. Pipelines are assembled using these modules with a simple syntax that can be easily modified as required. Core helper functions automate many common NGS procedures, making running pipelines simple. Cluster Flow is available under a GNU GPLv3 license on GitHub. Documentation, examples and an online demo are available at http://clusterflow.io.

  5. Computation of Internal Fluid Flows in Channels Using the CFD Software Tool FlowVision

    CERN Document Server

    Kochevsky, A N

    2004-01-01

    The article describes the CFD software tool FlowVision (OOO "Tesis", Moscow). The model equations used for this research are the set of Reynolds and continuity equations and the equations of the standard k-ε turbulence model. The aim of the paper was to test FlowVision by comparing the computational results for a number of simple internal channel fluid flows with known experimental data. The test cases are non-swirling and swirling flows in pipes and diffusers, and flows in stationary and rotating bends. Satisfactory correspondence of results was obtained both for flow patterns and the respective quantitative values.
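    For reference, the standard k-ε model named in this record closes the Reynolds-averaged equations with two transport equations. In the usual Launder-Spalding textbook form (the record does not give FlowVision's exact formulation):

        \frac{\partial(\rho k)}{\partial t} + \nabla\cdot(\rho k\,\mathbf{u}) = \nabla\cdot\!\left[\left(\mu + \frac{\mu_t}{\sigma_k}\right)\nabla k\right] + P_k - \rho\varepsilon

        \frac{\partial(\rho\varepsilon)}{\partial t} + \nabla\cdot(\rho\varepsilon\,\mathbf{u}) = \nabla\cdot\!\left[\left(\mu + \frac{\mu_t}{\sigma_\varepsilon}\right)\nabla\varepsilon\right] + \frac{\varepsilon}{k}\left(C_{1\varepsilon}P_k - C_{2\varepsilon}\rho\varepsilon\right)

        \mu_t = \rho\,C_\mu\,\frac{k^2}{\varepsilon}, \qquad C_\mu = 0.09,\; C_{1\varepsilon} = 1.44,\; C_{2\varepsilon} = 1.92,\; \sigma_k = 1.0,\; \sigma_\varepsilon = 1.3

    Here $P_k$ is the turbulence production term and $\mu_t$ the eddy viscosity.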

  6. OVERSMART Reporting Tool for Flow Computations Over Large Grid Systems

    Science.gov (United States)

    Kao, David L.; Chan, William M.

    2012-01-01

    Structured grid solvers such as NASA's OVERFLOW compressible Navier-Stokes flow solver can generate large data files that contain convergence histories for flow equation residuals, turbulence model equation residuals, component forces and moments, and component relative motion dynamics variables. Most of today's large-scale problems can extend to hundreds of grids and over 100 million grid points. However, due to the lack of efficient tools, only a small fraction of the information contained in these files is analyzed. OVERSMART (OVERFLOW Solution Monitoring And Reporting Tool) provides a comprehensive report of solution convergence of flow computations over large, complex grid systems. It produces a one-page executive summary of the behavior of flow equation residuals, turbulence model equation residuals, and component forces and moments. Under the automatic option, a matrix of commonly viewed plots such as residual histograms, composite residuals, sub-iteration bar graphs, and component forces and moments is automatically generated. Specific plots required by the user can also be prescribed via a command file or a graphical user interface. Output is directed to the user's computer screen and/or to an html file for archival purposes. The current implementation has been targeted for the OVERFLOW flow solver, which is used to obtain a flow solution on structured overset grids. The OVERSMART framework allows easy extension to other flow solvers.

  7. General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project.

  8. Tools for THOR: Wave analysis

    Science.gov (United States)

    Narita, Yasuhito; Haaland, Stein; Vaivads, Andris

    2017-04-01

    The THOR mission goal is to reveal particle acceleration and heating mechanisms in turbulent space and astrophysical plasmas. Understanding the properties of waves and turbulent fluctuations plays a key role in revealing the acceleration and heating processes. An extensive set of field and particle experiments has been developed and mounted on board the spacecraft. Correspondingly, many of the data analysis methods are being prepared, some inherited from past and current spacecraft missions and others developed as new analysis methods to maximize the scientific potential of the THOR mission. It is worth noting that the THOR mission performs not only single-point measurements but also multi-point measurements using an interferometric probe technique. We offer a set of analysis tools designed for the THOR mission: energy spectra, compressibility, ellipticity, wavevector direction, phase speed, Poynting vector, helicity quantities, wave distribution function, higher-order statistics, wave-particle resonance parameter, and detection of pitch angle scattering. The emphasis is on the use of both the field data (electric and magnetic fields) and the particle data.

  9. Subcubic Control Flow Analysis Algorithms

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Van Horn, David

    We give the first direct subcubic algorithm for performing control flow analysis of higher-order functional programs. Despite the long-held belief that inclusion-based flow analysis could not surpass the "cubic bottleneck," we apply known set compression techniques to obtain an algorithm that runs in time O(n^3/log n) on a unit-cost random-access memory model machine. Moreover, we refine the initial flow analysis into two more precise analyses incorporating notions of reachability. We give subcubic algorithms for these more precise analyses and relate them to an existing analysis from...

  10. Waste flow analysis and life cycle assessment of integrated waste management systems as planning tools: Application to optimise the system of the City of Bologna.

    Science.gov (United States)

    Tunesi, Simonetta; Baroni, Sergio; Boarini, Sandro

    2016-09-01

    The results of this case study are used to argue that waste management planning should follow a detailed process, adequately confronting the complexity of the waste management problems and the specificity of each urban area and of regional/national situations. To support the development or completion of integrated waste management systems, this article proposes a planning method based on: (1) the detailed analysis of waste flows and (2) the application of a life cycle assessment to compare alternative scenarios and optimise solutions. The evolution of the City of Bologna waste management system is used to show how this approach can be applied to assess which elements improve environmental performance. The assessment of the contribution of each waste management phase in the Bologna integrated waste management system has proven that the changes applied from 2013 to 2017 result in a significant improvement of the environmental performance, mainly as a consequence of the optimised integration between materials and energy recovery: Global Warming Potential at 100 years (GWP100) diminishes from 21,949 to -11,169 t CO2-eq y(-1) and abiotic resources depletion from -403 to -520 t antimony-eq y(-1). This study analyses the collection phase in great detail. Outcomes provide specific operational recommendations to policy makers, showing: (a) the relevance of the choice of the materials forming the bags for 'door to door' collection (for non-recycled low-density polyethylene bags, 22 kg CO2-eq (tonne of waste)(-1)); (b) the relatively low environmental impacts associated with underground tanks (3.9 kg CO2-eq (tonne of waste)(-1)); (c) the relatively low impact of big street containers with respect to plastic bags (2.6 kg CO2-eq (tonne of waste)(-1)).

  11. Information Flow Analysis for VHDL

    DEFF Research Database (Denmark)

    Tolstrup, Terkel Kristian; Nielson, Flemming; Nielson, Hanne Riis

    2005-01-01

    We describe a fragment of the hardware description language VHDL that is suitable for implementing the Advanced Encryption Standard algorithm. We then define an Information Flow analysis as required by the international standard Common Criteria. The goal of the analysis is to identify the entire information flow through the VHDL program. The result of the analysis is presented as a non-transitive directed graph that connects those nodes (representing either variables or signals) where an information flow might occur. We compare our approach to that of Kemmerer and conclude that our approach yields...

  12. Flow Injection Analysis

    DEFF Research Database (Denmark)

    Hansen, Elo Harald

    1998-01-01

    Learning objectives:
    * To provide an introduction to automated assays
    * To describe the basic principles of FIA
    * To demonstrate the capabilities of FIA in relation to batch assays and conventional continuous flow systems
    * To show that FIA allows one to augment existing analytical techniques
    * To sh...

  13. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in house. The tools covered in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of such testing are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  14. Design and analysis tool validation

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.

    1981-07-01

    The Solar Energy Research Institute (SERI) is developing a procedure for the validation of Building Energy Analysis Simulation Codes (BEAS). These codes are being used increasingly in the building design process, both directly and as the basis for simplified design tools and guidelines. The importance of the validity of the BEAS in predicting building energy performance is obvious when one considers the money and energy that could be wasted by energy-inefficient designs. However, to date, little or no systematic effort has been made to ensure the validity of the various BEAS. The validation work at SERI consists of three distinct parts: Comparative Study, Analytical Verification, and Empirical Validation. The procedures have been developed for the first two parts and have been implemented on a sampling of the major BEAS; results have shown major problems in one of the BEAS tested. Furthermore, when one building design was run using several of the BEAS, large differences were found in the predicted annual cooling and heating loads. The empirical validation procedure has been developed, and five two-zone test cells have been constructed for validation; a summer validation run will take place as soon as the data acquisition system is completed. Additionally, a test validation exercise is now in progress using the low-cal house to fine-tune the empirical validation procedure and better define monitoring data requirements.

  15. Water flow algorithm decision support tool for travelling salesman problem

    Science.gov (United States)

    Kamarudin, Anis Aklima; Othman, Zulaiha Ali; Sarim, Hafiz Mohd

    2016-08-01

    This paper discusses the role of a Decision Support Tool (DST) for the Travelling Salesman Problem (TSP) in helping researchers working in the same area obtain better results from a proposed algorithm. A study has been conducted using the Rapid Application Development (RAD) model as a methodology, which includes requirement planning, user design, construction and cutover. The Water Flow Algorithm (WFA) with an improved initialization technique is used as the proposed algorithm in this study for evaluating effectiveness against TSP cases. The DST evaluation consisted of usability testing covering system use, quality of information, quality of interface and overall satisfaction. The evaluation determines whether the tool can assist users in making decisions to solve TSP problems with the proposed algorithm. Statistical results show the ability of this tool in terms of helping researchers to conduct experiments on the WFA with improved TSP initialization.

  16. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on their fate in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interactions of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity from grey-scale binning of the SEM image. A backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity
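    The Newton-Raphson step mentioned for the balance calculations is a standard root-finding iteration. As a generic illustration (a toy two-equation system, not the EERC spreadsheet code), solving f(x) = 0 proceeds by repeatedly solving the linearized system J(x) dx = -f(x):

        import numpy as np

        def newton_raphson(f, jac, x0, tol=1e-10, max_iter=50):
            """Solve the nonlinear system f(x) = 0 by Newton-Raphson iteration."""
            x = np.asarray(x0, dtype=float)
            for _ in range(max_iter):
                step = np.linalg.solve(jac(x), -f(x))
                x = x + step
                if np.linalg.norm(step) < tol:
                    return x
            raise RuntimeError("Newton-Raphson did not converge")

        # Toy balance-style system: x0 + x1 = 3 (mass), x0 * x1 = 2 (equilibrium).
        f = lambda x: np.array([x[0] + x[1] - 3.0, x[0] * x[1] - 2.0])
        jac = lambda x: np.array([[1.0, 1.0], [x[1], x[0]]])
        print(newton_raphson(f, jac, [0.5, 2.5]))  # -> approximately [1. 2.]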

  17. LFSTAT - Low-Flow Analysis in R

    Science.gov (United States)

    Koffler, Daniel; Laaha, Gregor

    2013-04-01

    The calculation of characteristic stream flow during dry conditions is a basic requirement for many problems in hydrology, ecohydrology and water resources management. As opposed to floods, a number of different indices are used to characterise low flows and streamflow droughts. Although these indices and methods of calculation have been well documented in the WMO Manual on Low-flow Estimation and Prediction [1], comprehensive software enabling a fast and standardized calculation of low-flow statistics was missing. We present the new software package lfstat to fill this obvious gap. Our software package is based on the statistical open source software R and expands it to analyse daily stream flow data records focusing on low flows. As command-line based programs are not everyone's preference, we also offer a plug-in for the R Commander, an easy-to-use graphical user interface (GUI) for R based on tcl/tk. The functionality of lfstat includes estimation methods for low-flow indices, extreme value statistics, deficit characteristics, and additional graphical methods to control the computation of complex indices and to illustrate the data. Besides the basic low-flow indices, the baseflow index and recession constants can be computed. For extreme value statistics, state-of-the-art methods for L-moment based local and regional frequency analysis (RFA) are available. The tools for deficit characteristics include various pooling and threshold selection methods to support the calculation of drought duration and deficit indices. The most common graphics for low-flow analysis are available, and the plots can be modified according to the user's preferences. Graphics include hydrographs for different periods, flexible streamflow deficit plots, baseflow visualisation, recession diagnostics, flow duration curves as well as double mass curves, and many more. From a technical point of view, the package uses an S3 class called lfobj (low-flow objects). This
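    lfstat itself is an R package; as a language-neutral illustration of two of the indices it computes, the sketch below derives Q95 (the flow exceeded 95% of the time) and a baseflow index from a daily series. The smoothed-minima procedure is a generic textbook (UKIH-style) version with synthetic data, not lfstat's implementation.

        import numpy as np

        def q95(flows):
            """Flow exceeded 95% of the time: the 5th percentile of daily flows."""
            return np.percentile(flows, 5)

        def baseflow_index(flows, block=5):
            """Baseflow index via smoothed minima: block minima are scanned for
            turning points, which are interpolated into a baseflow hydrograph."""
            n = len(flows) // block
            mins = np.array([flows[i*block:(i+1)*block].min() for i in range(n)])
            centers = np.arange(n) * block + block // 2
            tp = [i for i in range(1, n - 1)
                  if 0.9 * mins[i] <= min(mins[i-1], mins[i+1])]  # turning points
            base = np.interp(np.arange(len(flows)), centers[tp], mins[tp])
            base = np.minimum(base, flows)  # baseflow cannot exceed total flow
            return base.sum() / flows.sum()

        # Synthetic two-year daily series: seasonal baseflow plus storm spikes.
        rng = np.random.default_rng(1)
        t = np.arange(730)
        flows = 5.0 + 2.0 * np.sin(2 * np.pi * t / 365) \
                + 20.0 * rng.exponential(1.0, t.size) * (rng.random(t.size) < 0.1)
        print(f"Q95 = {q95(flows):.2f}, BFI = {baseflow_index(flows):.2f}")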

  18. Plankton Analysis by Automated Submersible Imaging Flow Cytometry: Transforming a Specialized Research Instrument into a Broadly Accessible Tool and Extending its Target Size Range

    Science.gov (United States)

    2012-09-30

    the particles suspended in seawater is crucial to an understanding of the biology, optics, and geochemistry of the oceans. The composition and size...been interested in marine applications of flow cytometry, and had sold several slightly-modified instruments (called Influx Marina) to oceanographic...our WHOI Biology Department colleague Don Anderson, who was funded by NSF to purchase several Environmental Sample Processors (ESP), to be

  19. Green chemistry and the evolution of flow analysis. A review.

    Science.gov (United States)

    Melchert, Wanessa R; Reis, Boaventura F; Rocha, Fábio R P

    2012-02-10

    Flow analysis has achieved its majority as a well-established tool to solve analytical problems. The evolution of flow-based approaches has been analyzed from diverse points of view, including historical aspects, the commutation concept and the impact on analytical methodologies. In this overview, the evolution of flow analysis towards green analytical chemistry is demonstrated by comparing classical procedures implemented with different flow approaches. The potential to minimize reagent consumption and waste generation, and the ability to implement processes that are unreliable in batch mode and to replace toxic chemicals, are also emphasized. Successful applications of greener approaches in flow analysis are discussed, focusing on the last 10 years.

  20. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    Directory of Open Access Journals (Sweden)

    Helen V. Hsieh

    2017-05-01

    Full Text Available Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor's office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter and outline analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness.

  1. STRING 3: An Advanced Groundwater Flow Visualization Tool

    Science.gov (United States)

    Schröder, Simon; Michel, Isabel; Biedert, Tim; Gräfe, Marius; Seidel, Torsten; König, Christoph

    2016-04-01

    The visualization of 3D groundwater flow is a challenging task. Previous versions of our software STRING [1] solely focused on intuitive visualization of complex flow scenarios for non-professional audiences. STRING, developed by Fraunhofer ITWM (Kaiserslautern, Germany) and delta h Ingenieurgesellschaft mbH (Witten, Germany), provides the necessary means for visualization of both 2D and 3D data on planar and curved surfaces. In this contribution we discuss how to extend this approach to a full 3D tool and its challenges, in continuation of Michel et al. [2]. This elevates STRING from a post-production to an exploration tool for experts. In STRING, moving pathlets provide an intuition of the velocity and direction of both steady-state and transient flows. The visualization concept is based on the Lagrangian view of the flow. To capture every detail of the flow, an advanced method for intelligent, time-dependent seeding is used, building on the Finite Pointset Method (FPM) developed by Fraunhofer ITWM. Lifting our visualization approach from 2D into 3D poses many new challenges. With the implementation of a seeding strategy for 3D, one of the major problems has already been solved (see Schröder et al. [3]). As pathlets only provide an overview of the velocity field, other means are required for the visualization of additional flow properties. We suggest the use of Direct Volume Rendering and isosurfaces for scalar features. In this regard we were able to develop an efficient approach for combining the rendering, through raytracing, of the volume with regular OpenGL geometries. This is achieved through the use of Depth Peeling or A-Buffers for the rendering of transparent geometries. Animation of pathlets requires a strict boundary of the simulation domain. Hence, STRING needs to extract the boundary, even from unstructured data, if it is not provided. In 3D we additionally need a good visualization of the boundary itself. For this the silhouette based on the angle of

  2. General Mission Analysis Tool (GMAT) Mathematical Specifications

    Science.gov (United States)

    Hughes, Steve

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

  3. Computational fluid flow and heat transfer. An engineering tool

    Science.gov (United States)

    Salcudean, Martha

    1991-05-01

    The purpose, method, and potential of computational fluid dynamics (CFD) are discussed. Some examples of CFD and heat transfer applied to engineering problems are described. Simulation of casting in a permanent mold, gallium arsenide crystal growth, and the computation of discharge coefficients in film cooling of turbine blades are briefly described. It is shown that the CFD methods help to improve the understanding of the physics involved. They allow the influence of various parameters on the product or process to be investigated in a relatively inexpensive way. CFD constitutes a predictive tool which allows for product or process optimization. Discretization and solution methods used in the present examples are briefly described. Some limitations of the CFD methods are illustrated. The error introduced by false diffusion is shown for laminar flow around a bluff body. The improvement obtained by a higher-order scheme is discussed. Some difficulties related to turbulence modelling are illustrated for the flow and heat transfer around the same bluff body. Turbulent swirling flow between concentric annuli is also discussed. Problems related to the slow convergence rate, and major improvements obtained through applying multigrid convergence acceleration methods, are shown for two- and three-dimensional opposing jets penetrating into a main flow.

  4. Flow Visualization of Horton Sphere using a CFD Tool

    Directory of Open Access Journals (Sweden)

    Lokesh Paradeshi

    2013-06-01

    Full Text Available Horton spheres are large vessels used in industries such as petrochemical, fertilizer, refinery, chemical and power plants to store high-pressure or liquefied gas and to mix chemicals. Horton spheres are built by a special process. The process of welding induces residual stresses in the material near joints. In order to relieve these stresses, a stress-relieving operation is conducted as per standard. Flow visualization is one of the more important experimental tools for studying fluid flow and heat transfer. The computational work involves study of the flow pattern at a Reynolds number of 5.01 x 10^5. The grid independence study is carried out on a structured mesh and an optimum grid size is decided. Different turbulence models, including k-ε, k-ω and SST, are tried out in an industry-standard code (CFX-10) and are compared with experimental results. A study of the flow inside the sphere is found to be useful in understanding and improving upon the stress-relieving operation of large vessels by optimizing the parameters involved.

  5. Chromosome analysis and sorting using flow cytometry.

    Science.gov (United States)

    Doležel, Jaroslav; Kubaláková, Marie; Cíhalíková, Jarmila; Suchánková, Pavla; Simková, Hana

    2011-01-01

    Chromosome analysis and sorting using flow cytometry (flow cytogenetics) is an attractive tool for fractionating plant genomes to small parts. The reduction of complexity greatly simplifies genetics and genomics in plant species with large genomes. However, as flow cytometry requires liquid suspensions of particles, the lack of suitable protocols for preparation of solutions of intact chromosomes delayed the application of flow cytogenetics in plants. This chapter outlines a high-yielding procedure for preparation of solutions of intact mitotic chromosomes from root tips of young seedlings and for their analysis using flow cytometry and sorting. Root tips accumulated at metaphase are mildly fixed with formaldehyde, and solutions of intact chromosomes are prepared by mechanical homogenization. The advantages of the present approach include the use of seedlings, which are easy to handle, and the karyological stability of root meristems, which can be induced to high degree of metaphase synchrony. Chromosomes isolated according to this protocol have well-preserved morphology, withstand shearing forces during sorting, and their DNA is intact and suitable for a range of applications.

  6. Software Tool Integrating Data Flow Diagrams and Petri Nets

    Science.gov (United States)

    Thronesbery, Carroll; Tavana, Madjid

    2010-01-01

    Data Flow Diagram - Petri Net (DFPN) is a software tool for analyzing other software to be developed. The full name of this program reflects its design, which combines the benefit of data-flow diagrams (which are typically favored by software analysts) with the power and precision of Petri-net models, without requiring specialized Petri-net training. (A Petri net is a particular type of directed graph, a description of which would exceed the scope of this article.) DFPN assists a software analyst in drawing and specifying a data-flow diagram, then translates the diagram into a Petri net, then enables graphical tracing of execution paths through the Petri net for verification, by the end user, of the properties of the software to be developed. In comparison with prior means of verifying the properties of software to be developed, DFPN makes verification by the end user more nearly certain, thereby making it easier to identify and correct misconceptions earlier in the development process, when correction is less expensive. After the verification by the end user, DFPN generates a printable system specification in the form of descriptions of processes and data.
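    To make the translation target concrete, here is a minimal sketch of a Petri net as places, transitions and a token marking, with the standard firing rule used to trace execution paths. The net shown is a made-up two-step example, not DFPN's internal representation, which this record does not specify.

        # Minimal Petri net: a transition may fire when every input place holds
        # a token; firing moves tokens from its input places to its output places.
        transitions = {
            "validate": {"in": ["raw_data"], "out": ["validated"]},
            "store":    {"in": ["validated"], "out": ["database"]},
        }
        marking = {"raw_data": 1, "validated": 0, "database": 0}

        def enabled(t):
            return all(marking[p] >= 1 for p in transitions[t]["in"])

        def fire(t):
            assert enabled(t), f"transition {t!r} is not enabled"
            for p in transitions[t]["in"]:
                marking[p] -= 1
            for p in transitions[t]["out"]:
                marking[p] += 1

        # Trace one execution path through the net.
        for t in ["validate", "store"]:
            fire(t)
            print(t, "->", marking)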

  7. Model Analysis ToolKit

    Energy Technology Data Exchange (ETDEWEB)

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes:
    - define parameters
    - define observations
    - define model (python function)
    - define samplesets (sets of parameter combinations)
    Currently supported functionality includes:
    - forward model runs
    - Latin-Hypercube sampling of parameters
    - multi-dimensional parameter studies
    - parallel execution of parameter samples
    - model calibration using internal Levenberg-Marquardt algorithm
    - model calibration using lmfit package
    - model calibration using levmar package
    - Markov Chain Monte Carlo using pymc package
    MATK facilitates model analysis using:
    - scipy - calibration (scipy.optimize)
    - rpy2 - Python interface to R
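    As a generic illustration of the Latin-Hypercube sampling step listed above (plain numpy, not MATK's own API), each parameter range is cut into N equal-probability strata, one sample is drawn per stratum, and the strata are shuffled independently per parameter:

        import numpy as np

        def latin_hypercube(bounds, n_samples, rng=None):
            """Latin-Hypercube sample; bounds is a list of (low, high) pairs."""
            rng = rng or np.random.default_rng()
            n_pars = len(bounds)
            # One uniform draw inside each of n_samples equal-width strata...
            u = (np.arange(n_samples)[:, None]
                 + rng.random((n_samples, n_pars))) / n_samples
            # ...then shuffle the strata independently for every parameter.
            for j in range(n_pars):
                rng.shuffle(u[:, j])
            lows = np.array([b[0] for b in bounds])
            highs = np.array([b[1] for b in bounds])
            return lows + u * (highs - lows)

        # Ten samples over two parameters, each run through a stand-in model.
        samples = latin_hypercube([(0.0, 1.0), (10.0, 20.0)], 10,
                                  np.random.default_rng(0))
        results = [p[0] * p[1] for p in samples]  # stand-in for a forward run
        print(samples[:3], results[:3])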

  8. SHARAD Radargram Analysis Tool Development in JMARS

    Science.gov (United States)

    Adler, J. B.; Anwar, S.; Dickenshied, S.; Carter, S.

    2016-09-01

    New tools are being developed in JMARS, a free GIS application, for SHARAD radargram viewing and analysis. These capabilities are useful for the polar science community, and for constraining the viability of ice resource deposits for human exploration.

  9. 2010 Solar Market Transformation Analysis and Tools

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2010-04-01

    This document describes the DOE-funded solar market transformation analysis and tools under development in Fiscal Year 2010 so that stakeholders can access available resources and get engaged where interested.

  10. Quick Spacecraft Thermal Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  11. PCCF flow analysis -- DR Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Calkin, J.F.

    1961-04-26

    This report contains an analysis of PCCF tube flow and Panellit pressure relations at DR Reactor. Supply curves are presented at front header pressures from 480 to 600 psig using cold water and the standard 0.236-inch orifice with taper downstream and the pigtail valve (plug or ball) open. Demand curves are presented for slug column lengths of 200 inches to 400 inches using 1.44-inch O.D. solid poison pieces (either Al or Pb-Cd) and cold water with a rear header pressure of 50 psig. Figure 1 is a graph of Panellit pressure vs. flow with the above supply and demand curves and clearly shows the effect of front header pressure and charge length on flow.
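    Supply curves in reports of this kind encode an orifice-type pressure-flow relation. For orientation only (this is the generic sharp-edged orifice formula, not one quoted from the report), the volumetric flow through the 0.236-inch orifice scales as

        Q = C_d\,A\,\sqrt{\frac{2\,\Delta p}{\rho}}

    where $C_d$ is the discharge coefficient, $A$ the orifice area, $\Delta p$ the pressure drop across the orifice and $\rho$ the water density.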

  12. FAAST: Flow-space Assisted Alignment Search Tool

    Directory of Open Access Journals (Sweden)

    Persson Bengt

    2011-07-01

    Full Text Available Abstract Background: High-throughput pyrosequencing (454 sequencing) is the major sequencing platform for producing long-read high-throughput data. While most other sequencing techniques produce reading errors mainly comparable with substitutions, pyrosequencing produces errors mainly comparable with gaps. These errors are less efficiently detected by most conventional alignment programs and may produce inaccurate alignments. Results: We suggest a novel algorithm for calculating the optimal local alignment which utilises flowpeak information in order to improve alignment accuracy. Flowpeak information can be retained from a 454 sequencing run through interpretation of the binary SFF file format. This novel algorithm has been implemented in a program named FAAST (Flow-space Assisted Alignment Search Tool). Conclusions: We present and discuss the results of simulations that show that FAAST, through the use of the novel algorithm, can gain several percentage points of accuracy compared to Smith-Waterman-Gotoh alignments, depending on the 454 data quality. Furthermore, through an efficient multi-thread-aware implementation, FAAST is able to perform these high-quality alignments at high speed. The tool is available at http://www.ifm.liu.se/bioinfo/
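    For context, the Smith-Waterman-Gotoh baseline that FAAST is compared against computes an optimal local alignment by dynamic programming. The sketch below shows the simpler linear-gap Smith-Waterman variant with illustrative scores; FAAST's flowpeak-aware scoring is not reproduced here.

        def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
            """Optimal local alignment score by Smith-Waterman dynamic programming."""
            rows, cols = len(a) + 1, len(b) + 1
            h = [[0] * cols for _ in range(rows)]
            best = 0
            for i in range(1, rows):
                for j in range(1, cols):
                    diag = h[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
                    h[i][j] = max(0,                  # local alignment may restart
                                  diag,               # match or mismatch
                                  h[i-1][j] + gap,    # gap in b
                                  h[i][j-1] + gap)    # gap in a
                    best = max(best, h[i][j])
            return best

        # Pyrosequencing-style reads differ mainly by indels in homopolymer runs.
        print(smith_waterman("GATTTTACA", "GATTTACA"))  # high score despite a gap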

  13. Tools for Basic Statistical Analysis

    Science.gov (United States)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3, ...) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from Two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication, or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curve-fit data to the linear equation y = f(x) and will do an ANOVA to check its significance.
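    The same calculations are straightforward to reproduce outside Excel. As an illustration only (a scipy-based sketch with made-up data, not the spreadsheet code itself), the descriptive statistics and the normal-distribution estimate (the value matching a given cumulative probability for a given mean and standard deviation) look like this:

        import numpy as np
        from scipy import stats

        x = np.array([4.1, 5.0, 5.3, 4.7, 5.8, 4.4])  # user-entered data x(i)

        # Descriptive statistics for the data set.
        print(f"mean={x.mean():.3f}  std={x.std(ddof=1):.3f}  "
              f"median={np.median(x):.3f}")

        # Normal Distribution Estimate: the value at cumulative probability 0.95
        # for a normal distribution with the sample mean and standard deviation.
        value = stats.norm.ppf(0.95, loc=x.mean(), scale=x.std(ddof=1))
        print(f"value at cumulative probability 0.95: {value:.3f}")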

  14. An Automatic Hierarchical Delay Analysis Tool

    Institute of Scientific and Technical Information of China (English)

    Farid Mheir-El-Saadi; Bozena Kaminska

    1994-01-01

    The performance analysis of VLSI integrated circuits (ICs) with flat tools is slow and sometimes even impossible to complete. Some hierarchical tools have been developed to speed up the analysis of these large ICs. However, these hierarchical tools suffer from poor interaction with the CAD database and poorly automated operations. We introduce a general hierarchical framework for performance analysis to solve these problems. The circuit analysis is automatic under the proposed framework. Information that has been automatically abstracted in the hierarchy is kept in database properties along with the topological information. A limited software implementation of the framework, PREDICT, has also been developed to analyze delay performance. Experimental results show that hierarchical analysis CPU time and memory requirements are low if heuristics are used during the abstraction process.

  15. Numerical Analysis for the Air Flow of Cross Flow Fan

    Science.gov (United States)

    Sakai, Hirokazu; Tokushge, Satoshi; Ishikawa, Masatoshi; Ishihara, Takuya

    There are many factors involved in designing a cross flow fan, and the performance of cross flow fans is therefore not yet well understood. We can analyze the transient flow of a cross flow fan using a sliding mesh approach. One of the tasks in using Computational Fluid Dynamics (CFD) is finding a way of modeling heat exchangers together with the cross flow fan; these tasks are very important for design. The paper presents a model of the heat exchangers and the meshing of the fan blades. Next, we focus on the behaviour of the cross flow fan when the geometry of the fan blades is changed.

  16. Surface analysis of stone and bone tools

    Science.gov (United States)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  17. Stochastic Simulation Tool for Aerospace Structural Analysis

    Science.gov (United States)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.
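
    To make the Monte Carlo idea concrete, the sketch below propagates scatter in two hypothetical design inputs through a closed-form stand-in for a finite element response and ranks input influence by correlation with the output. It illustrates the general approach only; it is not MSC.Robust Design, and all values are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(seed=1)
    n = 10_000

    # Hypothetical design inputs with tolerance-driven scatter:
    # panel thickness t (m) and Young's modulus E (Pa).
    t = rng.normal(loc=2.5e-3, scale=0.05e-3, size=n)
    E = rng.normal(loc=70e9, scale=2e9, size=n)

    # Toy response model standing in for a finite element solve:
    # tip deflection of a cantilever strip under a fixed load.
    L, w, P = 0.3, 0.05, 50.0          # length (m), width (m), load (N)
    I = w * t**3 / 12.0                # second moment of area (m^4)
    deflection = P * L**3 / (3.0 * E * I)

    # Rank input influence by correlation with the response.
    for name, x in [("thickness", t), ("modulus", E)]:
        print(f"{name}: corr = {np.corrcoef(x, deflection)[0, 1]:+.2f}")
    print(f"deflection: mean = {deflection.mean():.3e} m, std = {deflection.std():.3e} m")
    ```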

  18. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    Science.gov (United States)

    Hendricks, Eric S.

    2016-01-01

    The Object-Oriented Turbomachinery Analysis Code (OTAC) is a new meanline/streamline turbomachinery modeling tool being developed at NASA GRC. During the development process, a limitation of the code was discovered in relation to the analysis of choked flow in axial turbines. This paper describes the relevant physics for choked flow as well as the changes made to OTAC to enable analysis in this flow regime.
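
    The governing relation for choked flow is the one-dimensional isentropic result that the mass flow through a sonic throat depends only on throat area and the upstream stagnation state: mdot = A p0 sqrt(gamma/(R T0)) (2/(gamma+1))^((gamma+1)/(2(gamma-1))). The sketch below evaluates this textbook formula; it is not code from OTAC, and the example numbers are hypothetical.

    ```python
    import math

    def choked_mass_flow(A, p0, T0, gamma=1.4, R=287.0):
        """Mass flow (kg/s) through a choked throat of area A (m^2), given
        stagnation pressure p0 (Pa) and stagnation temperature T0 (K)."""
        return (A * p0 * math.sqrt(gamma / (R * T0))
                * (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))))

    # Hypothetical turbine vane row: 0.01 m^2 throat at 5 bar, 1200 K
    print(f"{choked_mass_flow(0.01, 5e5, 1200.0):.3f} kg/s")
    ```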

  19. Proof-Carrying Code Based Tool for Secure Information Flow of Assembly Programs

    Directory of Open Access Journals (Sweden)

    Abdulrahman Muthana

    2009-01-01

    Full Text Available Problem statement: How can a host (the code consumer) determine with certainty that a downloaded program received from an untrusted source (the code producer) will maintain the confidentiality of the data it manipulates and that it is safe to install and execute? Approach: The approach adopted for verifying that a downloaded program will not leak confidential data to unauthorized parties was based on the concept of Proof-Carrying Code (PCC). A mobile program (in its assembly form) was analyzed for information flow security based on this concept. The security policy was centered on a type system for analyzing information flows within assembly programs, based on the notion of noninterference. Results: A verification tool for verifying assembly programs for information flow security was built. The tool certifies SPARC assembly programs for secure information flow by statically analyzing the program based on the idea of Proof-Carrying Code (PCC). The tool operates directly on the machine code, requiring only that the inputs and outputs of the code be annotated with security levels, and provides a windowed user interface enabling users to control the verification process. Proofs that an untrusted program does not leak sensitive information are generated and checked on the host machine; if they are valid, then the untrusted program can be installed and executed safely. Conclusion: By basing a proof-carrying code infrastructure on an information-flow type system, a sufficient assurance of protecting the confidential data manipulated by the mobile program can be obtained. This assurance comes from the fact that type systems provide a sufficient guarantee of protecting confidentiality.
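
    As a toy illustration of the kind of static check such an information flow type system performs, the sketch below propagates a two-level security lattice through a made-up mini register language. It handles explicit flows only, targets no real instruction set, and generates no proofs, unlike the SPARC-level tool described above.

    ```python
    # Toy illustration only: a two-level lattice check over an invented
    # register language; real PCC tools verify machine code with proofs.
    HIGH, LOW = "high", "low"

    def check(program, levels, sinks):
        """Reject if any instruction may move HIGH data into a LOW sink.
        program: list of (dst, src1, src2, ...) register moves/ops.
        levels: initial security level per register."""
        lv = dict(levels)
        for dst, *srcs in program:
            # dst picks up the join (max) of its operands' levels
            lv[dst] = HIGH if any(lv.get(s, LOW) == HIGH for s in srcs) else LOW
            if dst in sinks and lv[dst] == HIGH:
                return False, f"flow of secret data into public sink {dst!r}"
        return True, "no illegal explicit flow found"

    prog = [("r1", "secret"), ("r2", "r1", "r0"), ("out", "r2")]
    ok, msg = check(prog, {"secret": HIGH, "r0": LOW}, sinks={"out"})
    print(ok, msg)
    ```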

  20. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
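
    Steps 1 and 2 of the merging procedure amount to estimating the rigid transformation between two cube coordinate systems from shared reference points. A minimal sketch using the standard Kabsch/Procrustes solution is given below; it illustrates the geometry only and is not the NASA software.

    ```python
    import numpy as np

    def rigid_transform(P, Q):
        """Least-squares rotation R and translation t mapping points P -> Q
        (Kabsch/Procrustes), usable for chaining cube coordinate systems.
        P, Q: (n, 3) arrays of corresponding reference points."""
        cP, cQ = P.mean(axis=0), Q.mean(axis=0)
        H = (P - cP).T @ (Q - cQ)
        U, _, Vt = np.linalg.svd(H)
        # correct for a possible reflection so R is a proper rotation
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        return R, cQ - R @ cP

    # Cube-2 reference points expressed in both the cube-1 and cube-2 frames
    P = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1.0]])
    R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1.0]])
    Q = P @ R_true.T + np.array([5.0, 2.0, 0.0])
    R, t = rigid_transform(P, Q)
    print(np.allclose(R, R_true), np.round(t, 6))
    ```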

  1. Built Environment Energy Analysis Tool Overview (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  2. UPIOM: a new tool of MFA and its application to the flow of iron and steel associated with car production.

    Science.gov (United States)

    Nakamura, Shinichiro; Kondo, Yasushi; Matsubae, Kazuyo; Nakajima, Kenichi; Nagasaka, Tetsuya

    2011-02-01

    Identification of the flow of materials and substances associated with a product system provides useful information for Life Cycle Analysis (LCA), and contributes to extending the scope of complementarity between LCA and Materials Flow Analysis/Substances Flow Analysis (MFA/SFA), the two major tools of industrial ecology. This paper proposes a new methodology based on input-output analysis for identifying the physical input-output flow of individual materials that is associated with the production of a unit of given product, the unit physical input-output by materials (UPIOM). While the Sankey diagram has been a standard tool for the visualization of MFA/SFA, with an increase in the complexity of the flows under consideration, which will be the case when economy-wide intersectoral flows of materials are involved, the Sankey diagram may become too complex for effective visualization. An alternative way to visually represent material flows is proposed which makes use of triangulation of the flow matrix based on degrees of fabrication. The proposed methodology is applied to the flow of pig iron and iron and steel scrap that are associated with the production of a passenger car in Japan. Its usefulness to identify a specific MFA pattern from the original IO table is demonstrated.
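
    The quantity model underlying this kind of calculation is the Leontief system x = (I - A)^(-1) f, after which the unit physical flow of material i into sector j is A[i, j] * x[j]. The sketch below uses three invented, aggregated sectors; it is not the paper's Japanese input-output table.

    ```python
    import numpy as np

    # A[i, j]: physical input of good i per unit output of good j (invented)
    A = np.array([
        [0.00, 0.30, 0.00],   # pig iron used by crude steel
        [0.00, 0.00, 0.60],   # crude steel used by car parts
        [0.05, 0.00, 0.00],   # scrap recirculated into pig iron
    ])
    f = np.array([0.0, 0.0, 1.0])   # final demand: one unit of "car"

    # Total outputs needed to deliver f: x = (I - A)^(-1) f
    x = np.linalg.solve(np.eye(3) - A, f)

    # Unit physical input-output table: flow of i into j = A[i, j] * x[j]
    Z = A * x                       # broadcasting scales each column j by x[j]
    print("sector outputs:", np.round(x, 4))
    print("intersectoral flows:\n", np.round(Z, 4))
    ```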

  3. Performance analysis of GYRO: a tool evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Worley, P [Oak Ridge National Laboratory, PO Box 2008, Oak Ridge, TN 37831-6016 (United States); Candy, J [General Atomics, PO Box 85608, San Diego, CA 92186-5608 (United States); Carrington, L [San Diego Supercomputer Center, University of California, San Diego, 9500 Gilman Drive, La Jolla, California 92093-0505 (United States); Huck, K [Computer and Information Science Department, 1202 University of Oregon, Eugene, OR 97403-1202 (United States); Kaiser, T [San Diego Supercomputer Center, University of California, San Diego, 9500 Gilman Drive, La Jolla, California 92093-0505 (United States); Mahinthakumar, G [Department of Civil Engineering, North Carolina State University, Raleigh, NC 27695-7908 (United States); Malony, A [Computer and Information Science Department, 1202 University of Oregon, Eugene, OR 97403-1202 (United States); Moore, S [Innovative Computing Laboratory, University of Tennessee, 1122 Volunteer Blvd., Suite 413, Knoxville, TN 37996-3450 (United States); Reed, D [Renaissance Computing Institute, University of North Carolina at Chapel Hill, CB 7583, Carr Building, Chapel Hill, NC 27599-7583 (United States); Roth, P [Oak Ridge National Laboratory, PO Box 2008, Oak Ridge, TN 37831-6016 (United States); Shan, H [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Shende, S [Computer and Information Science Department, 1202 University of Oregon, Eugene, OR 97403-1202 (United States); Snavely, A [San Diego Supercomputer Center, Univ. of California, San Diego, 9500 Gilman Drive, La Jolla, California 92093-0505 (United States); Sreepathi, S [Dept. of Computer Science, North Carolina State Univ., Raleigh, NC 27695-7908 (United States); Wolf, F [Innovative Computing Lab., Univ. of Tennessee, 1122 Volunteer Blvd., Suite 413, Knoxville, TN 37996-3450 (United States); Zhang, Y [Renaissance Computing Inst., Univ. of North Carolina at Chapel Hill, CB 7583, Carr Building, Chapel Hill, NC 27599-7583 (United States)

    2005-01-01

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wallclock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  4. Performance Analysis of GYRO: A Tool Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  5. A GIS-based Computational Tool for Multidimensional Flow Velocity by Acoustic Doppler Current Profilers

    Science.gov (United States)

    Kim, D.; Winkler, M.; Muste, M.

    2015-06-01

    Acoustic Doppler Current Profilers (ADCPs) provide efficient and reliable flow measurements compared to other tools for characterizing riverine environments. In addition to the originally targeted discharge measurements, ADCPs are increasingly utilized to assess river flow characteristics. The newly developed VMS (Velocity Mapping Software) aims at providing an efficient process for quality assurance, mapping velocity vectors for visualization, and facilitating comparison with physical and numerical model results. VMS was designed to provide efficient and smooth workflows for processing groups of transects. The software allows the user to select a group of files and subsequently to conduct statistical and graphical quality assurance on the files as a group or individually, as appropriate. VMS also enables spatial averaging in the horizontal and vertical planes for ADCP data in single or multiple transects over the same or consecutive cross sections. The analysis results are displayed in numerical and graphical formats.

  6. Analysis and visualization of complex unsteady three-dimensional flows

    Science.gov (United States)

    Van Dalsem, William R.; Buning, Pieter G.; Dougherty, F. Carroll; Smith, Merritt H.

    1989-01-01

    Flow field animation is the natural choice as a tool in the analysis of the numerical simulations of complex unsteady three-dimensional flows. The PLOT4D extension of the widely used PLOT3D code to allow the interactive animation of a broad range of flow variables was developed and is presented. To allow direct comparison with unsteady experimental smoke and dye flow visualization, the code STREAKER was developed to produce time accurate streaklines. Considerations regarding the development of PLOT4D and STREAKER, and example results are presented.

  7. Tools for Life Support Systems Analysis

    Science.gov (United States)

    Lange, K.; Ewert, M.

    An analysis of the optimum level of closure of a life support system is a complex task involving hundreds, if not thousands, of parameters. In the absence of complete data on candidate technologies and a complete definition of the mission architecture and requirements, many assumptions are necessary. Because of the large number of parameters, it is difficult to fully comprehend and compare studies performed by different analysts. The Systems Integration, Modeling, and Analysis (SIMA) Project Element within NASA's Advanced Life Support (ALS) Project has taken measures to improve this situation by issuing documents that define ALS requirements, baseline assumptions, and reference missions. As a further step to capture and retain available knowledge and to facilitate system-level studies, various software tools are being developed. These include a database tool for storing, organizing, and updating technology parameters, modeling tools for evaluating time-average and dynamic system performance, and sizing tools for estimating overall system mass, volume, power, cooling, logistics, and crew time. This presentation describes ongoing work on the development and integration of these tools for life support systems analysis.

  8. General Analysis Tool Box for Controlled Perturbation

    CERN Document Server

    Osbild, Ralf

    2012-01-01

    The implementation of reliable and efficient geometric algorithms is a challenging task. The reason is the following conflict: on the one hand, computing with rounded arithmetic may question the reliability of programs while, on the other hand, computing with exact arithmetic may be too expensive and hence inefficient. One solution is the implementation of controlled perturbation algorithms which combine the speed of floating-point arithmetic with a protection mechanism that nonetheless guarantees reliability. This paper is concerned with the theoretical performance analysis of controlled perturbation algorithms. We address this problem with the presentation of a general analysis tool box. This tool box is separated into independent components which are presented individually with their interfaces. This way, the tool box supports alternative approaches for the derivation of the most crucial bounds. We present three approaches for this task. Furthermore, we have thoroughly reworked the concept of controlled per...

  9. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  10. Accelerator physics analysis with interactive tools

    Energy Technology Data Exchange (ETDEWEB)

    Holt, J.A.; Michelotti, L.

    1993-05-01

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties.

  11. Increasing Machine Tool Productivity with High Pressure Cryogenic Coolant Flow

    Science.gov (United States)

    1992-05-01

    tool change time and cost for the three strategies. Item 4 shows the Tool/Machine Cost Ratios (T/M Ratio) for the various strategies and tool...

  12. Statistical Tools for Forensic Analysis of Toolmarks

    Energy Technology Data Exchange (ETDEWEB)

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  13. Decision Analysis Tools for Volcano Observatories

    Science.gov (United States)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
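
    The elementary building block that a Bayesian Belief Network chains together is the single-evidence update sketched below; the prior and likelihoods are invented for illustration and are not taken from the EXPLORIS project.

    ```python
    # Minimal single-evidence Bayesian update, the building block a BBN
    # chains together over many monitoring strands; all numbers invented.
    def posterior(prior, p_obs_given_h, p_obs_given_not_h):
        """P(H | observation) from Bayes' rule."""
        num = p_obs_given_h * prior
        return num / (num + p_obs_given_not_h * (1.0 - prior))

    p_eruption = 0.02                  # hypothetical prior for eruption this month
    # observation: a volcano-tectonic earthquake swarm is detected
    p = posterior(p_eruption, p_obs_given_h=0.7, p_obs_given_not_h=0.05)
    print(f"P(eruption | VT swarm) = {p:.3f}")
    ```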

  14. Solid rocket booster internal flow analysis by highly accurate adaptive computational methods

    Science.gov (United States)

    Huang, C. Y.; Tworzydlo, W.; Oden, J. T.; Bass, J. M.; Cullen, C.; Vadaketh, S.

    1991-01-01

    The primary objective of this project was to develop an adaptive finite element flow solver for simulating internal flows in the solid rocket booster. Described here is a unique flow simulator code for analyzing highly complex flow phenomena in the solid rocket booster. New methodologies and features incorporated into this analysis tool are described.

  15. Modular Control Flow Analysis for Libraries

    DEFF Research Database (Denmark)

    Probst, Christian W.

    2002-01-01

    One problem in analyzing object oriented languages is that the exact control flow graph is not known statically due to dynamic dispatching. However, this is needed in order to apply the large class of known interprocedural analyses. Control Flow Analysis in the object oriented setting aims at determining the run-time types of variables, thus narrowing down the possibly targeted method implementations. We present a flow sensitive analysis that allows separate handling of libraries and thereby efficient analysis of whole programs.

  16. PIE Nacelle Flow Analysis and TCA Inlet Flow Quality Assessment

    Science.gov (United States)

    Shieh, C. F.; Arslan, Alan; Sundaran, P.; Kim, Suk; Won, Mark J.

    1999-01-01

    This presentation includes three topics: (1) Analysis of isolated boattail drag; (2) Computation of Technology Concept Airplane (TCA)-installed nacelle effects on aerodynamic performance; and (3) Assessment of TCA inlet flow quality.

  17. Web-Oriented Visual Performance Analysis Tool for HPC: THPTiii

    Institute of Scientific and Technical Information of China (English)

    SHI Peizhi; LI Sanli

    2003-01-01

    Improving the low efficiency of most parallel applications with a performance tool is an important issue in high performance computing. A performance tool, which usually collects and analyzes performance data, is an effective way of improving performance. This paper explores both the collection and the analysis of performance data, and proposes two innovations: both types of runtime performance data, concerning system load as well as application behavior, should be collected simultaneously, which requires multiple instrumentation flows and low probing cost; and the performance analysis should be Web-oriented, to exploit the excellent portability and usability brought by the Internet. This paper presents a Web-oriented HPC (High Performance Computing) performance tool which can collect information about both resource utilization, including the utilization ratio of CPU and memory, and program behavior during runtime, including statuses such as sending and computing, and which can visualize the information in the user's browser window with Java applets, in multiple filters and multiple views. Furthermore, this performance tool exposes the data dependencies between components and provides an entry point for task scheduling. With this performance tool, programmers can monitor the runtime state of the application, analyze the relationship between program processes and system load, find the performance bottleneck, and ultimately improve the performance of the application.

  18. FASB 95--a tool for identifying cash flow problems.

    Science.gov (United States)

    Edwards, D E; Hauser, R C

    1989-06-01

    Increasing expenditures and longer collection periods for receivables are symptoms of the cash flow problem that many not-for-profit healthcare organizations face. The FASB 95 provisions for a cash flow statement could help many of these organizations deal with their cash flow crises. The statement allows managers to evaluate their institution's financial performance, providing timely information that may make the difference in the struggle to maintain an adequate cash position.

  19. Go with the flow: an updated tool for detecting molecules.

    OpenAIRE

    Frazer, L

    2000-01-01

    In the early 1970s, researchers at Los Alamos National Laboratory developed the flow cytometer, a device that allows for the identification of unknown cells. In a flow cytometer, a single-cell suspension is passed in a continuous flow through a laser beam, with each cell scattering the light in a characteristic manner. A few years ago, researchers at Los Alamos began another project, refining the capabilities of the flow cytometer so that it could analyze not a single cell but a single molec...

  20. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan

    2012-11-01

    Full Text Available This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to code the content of the sampled chapters of the books inductively. After that the findings from this coding process were combined with principles of text analysis as derived from the literature, specifically focusing on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we forward in the article is for systematic and well theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs130170

  1. Space Debris Reentry Analysis Methods and Tools

    Institute of Scientific and Technical Information of China (English)

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe

    2011-01-01

    The reentry of uncontrolled spacecraft may break them into many pieces of debris at altitudes in the range of 75-85 km. The surviving fragments can pose great hazard and risk to the ground and to people. In recent years, methods and tools for predicting and analyzing debris reentry and for ground risk assessment have been studied and developed at the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the group of the present authors. This paper briefly reviews the current progress on the topic of debris reentry. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and the parameters affecting the survivability of debris. The existing analysis tools can be classified into two categories, i.e. the object-oriented and the spacecraft-oriented methods, the latter being more accurate than the former. Past object-oriented tools included objects of only simple shapes. For more realistic simulation, here we present an object-oriented tool, the debris reentry and ablation prediction system (DRAPS), developed by the present authors, which extends the supported object shapes to 15 types, along with 51 predefined motions and the relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.

  2. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe a formalised variant of this language extended to support the addition of intention preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest are specified using the temporal logic Probabilistic Computation Tree Logic (PCTL) and we employ stochastic model checking, by means of the model checker PRISM, to compute their exact values. We present a simplified example of a distributed stochastic system where we determine a reachability property

  3. Multifractal Analysis for the Teichmueller Flow

    Energy Technology Data Exchange (ETDEWEB)

    Meson, Alejandro M., E-mail: meson@iflysib.unlp.edu.ar; Vericat, Fernando, E-mail: vericat@iflysib.unlp.edu.ar [Instituto de Fisica de Liquidos y Sistemas Biologicos (IFLYSIB) CCT-CONICET, La Plata-UNLP and Grupo de Aplicaciones Matematicas y Estadisticas de la Facultad de Ingenieria (GAMEFI) UNLP (Argentina)

    2012-03-15

    We present a multifractal description for Teichmueller flows. A key ingredient to do this is the Rauzy-Veech-Zorich reduction theory, which allows us to treat the problem in the setting of suspension flows over subshifts. To perform the multifractal analysis we implement a thermodynamic formalism for suspension flows over countable alphabet subshifts, slightly different from that developed by Barreira and Iommi.

  4. RSAT 2015: Regulatory Sequence Analysis Tools.

    Science.gov (United States)

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-07-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/.

  5. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    Full Text Available The polishing process is one of the most critical manufacturing processes during metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation, and time and energy consumption. Two different research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed depending on the polishing parameters, in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive at the front, mounted at the end of a robot. A tool-to-part concept is used, which is useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling industry, aeronautics or automotive. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. Smaller amounts of material can be removed in controlled areas of a three-dimensional workpiece.
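
    The article's fitted removal model is not reproduced in the abstract. A common starting point for such models is Preston's law, dh/dt = k_p * p * v, sketched here with hypothetical parameter values.

    ```python
    # Preston's law as a stand-in for the paper's fitted removal model;
    # the coefficient and operating values below are invented.
    def material_removed(k_p, pressure, velocity, dwell_time):
        """Removed depth (m) for Preston coefficient k_p (m^2/N),
        contact pressure (Pa), relative speed (m/s), dwell time (s)."""
        return k_p * pressure * velocity * dwell_time

    # Hypothetical polishing pass
    depth = material_removed(k_p=1e-13, pressure=2e4, velocity=0.5, dwell_time=30.0)
    print(f"{depth * 1e6:.3f} um removed")
    ```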

  6. Improved Modeling Tools For High Speed Reacting Flows

    Science.gov (United States)

    2006-09-01

    ...putting the tools in place and operating them as a single system on the Beowulf cluster which was purposely built by Blue Blanket LLC (BBLLC) for this... a commercial tool, available from the Program Development Company (PDC). An eight-processor computational cluster was leased from BBLLC... Once this cluster was in place, the off-the-shelf software was installed and tested.

  7. Subchannel analysis with flow blockages

    Science.gov (United States)

    Sabotinov, L.

    1985-05-01

    The steady state single-phase three-dimensional flow in the rod bundle geometry of a nuclear pressurized water reactor was calculated with the PHOENICS 84 program. Flow blockages, which may occur under accident conditions, are simulated. Results show that PHOENICS-84 can be applied to calculation of the three-dimensional fields of velocities in fuel rod bundles containing complete flow blockages in cells. The code can treat recirculation zones.

  8. REDCAT: a residual dipolar coupling analysis tool.

    Science.gov (United States)

    Valafar, Homayoun; Prestegard, James H

    2004-04-01

    Recent advancements in the utilization of residual dipolar couplings (RDCs) as a means of structure validation and elucidation have demonstrated the need for, not only a more user friendly, but also a more powerful RDC analysis tool. In this paper, we introduce a software package named REsidual Dipolar Coupling Analysis Tool (REDCAT) designed to address the above issues. REDCAT is a user-friendly program with its graphical-user-interface developed in Tcl/Tk, which is highly portable. Furthermore, the computational engine behind this GUI is written in C/C++ and its computational performance is therefore excellent. The modular implementation of REDCAT's algorithms, with separation of the computational engine from the graphical engine allows for flexible and easy command line interaction. This feature can be utilized for the design of automated data analysis sessions. Furthermore, this software package is portable to Linux clusters for high throughput applications. In addition to basic utilities to solve for order tensors and back calculate couplings from a given order tensor and proposed structure, a number of improved algorithms have been incorporated. These include the proper sampling of the Null-space (when the system of linear equations is under-determined), more sophisticated filters for invalid order-tensor identification, error analysis for the identification of the problematic measurements and simulation of the effects of dynamic averaging processes.
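
    At its core, order tensor fitting is a linear least-squares solve for the five independent elements of the traceless, symmetric Saupe matrix. The sketch below implements that standard formulation; it is not REDCAT's own code, and the Dmax value and synthetic data are illustrative only.

    ```python
    import numpy as np

    def fit_order_tensor(cosines, rdcs, d_max):
        """Least-squares Saupe order tensor from RDCs. cosines: (n, 3)
        direction cosines of each internuclear vector in the molecular
        frame; rdcs: measured couplings (Hz); d_max: maximum coupling."""
        cx, cy, cz = cosines[:, 0], cosines[:, 1], cosines[:, 2]
        # Unknowns: Syy, Szz, Sxy, Sxz, Syz (Sxx = -Syy - Szz, traceless)
        M = d_max * np.column_stack([
            cy**2 - cx**2, cz**2 - cx**2,
            2 * cx * cy, 2 * cx * cz, 2 * cy * cz,
        ])
        (syy, szz, sxy, sxz, syz), *_ = np.linalg.lstsq(M, rdcs, rcond=None)
        return np.array([[-syy - szz, sxy, sxz],
                         [sxy, syy, syz],
                         [sxz, syz, szz]])

    # Noiseless round trip on synthetic unit vectors (illustrative Dmax)
    rng = np.random.default_rng(0)
    v = rng.normal(size=(20, 3)); v /= np.linalg.norm(v, axis=1, keepdims=True)
    S_true = np.array([[-3e-4, 1e-4, 0], [1e-4, 1e-4, 5e-5], [0, 5e-5, 2e-4]])
    d = 21700.0 * np.einsum('ni,ij,nj->n', v, S_true, v)
    print(np.allclose(fit_order_tensor(v, d, 21700.0), S_true))
    ```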

  9. ANALYSIS AND ACCOUNTING OF TOTAL CASH FLOW

    Directory of Open Access Journals (Sweden)

    MELANIA ELENA MICULEAC

    2012-01-01

    Full Text Available In order to meet the objective of supplying relevant information regarding the liquidity inflows and outflows during a financial exercise, the total cash flow analysis must include the analysis of the cash results from operations, of the payments and receipts related to investment, and of the financing decisions of the last exercise, as well as the analysis of treasury variation (of cash items). The management of total cash flows ensures that the current liquidity flows arising from receipts are correlated with the flows of payments, in order to ensure the continuous payment of mature obligations.

  10. ANALYSIS AND ACCOUNTING OF TOTAL CASH FLOW

    National Research Council Canada - National Science Library

    MELANIA ELENA MICULEAC

    2012-01-01

    In order to reach the objective of supplying some relevant information regarding the liquidity inflows and outflows during a financial exercise, the total cash flow analysis must include the analysis...

  11. Basic Functional Analysis Puzzles of Spectral Flow

    DEFF Research Database (Denmark)

    Booss-Bavnbek, Bernhelm

    2011-01-01

    We explain an array of basic functional analysis puzzles on the way to general spectral flow formulae and indicate a direction of future topological research for dealing with these puzzles.

  12. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence to connect human and computer perceptions of the application of data and scientific techniques while multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR) and NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer-term projection models, as well as plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to ensure there is no redundancy in the development of tools that facilitate scientific advancements and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open-sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).
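
    As an illustration of the simplest analysis in the list, a linear trend fit to a local temperature series might look like the sketch below, with synthetic data standing in for station observations.

    ```python
    import numpy as np

    # Synthetic annual-mean temperatures, not LCAT output or station data
    years = np.arange(1991, 2021)
    temps = (12.0 + 0.03 * (years - years[0])
             + np.random.default_rng(7).normal(0, 0.4, years.size))

    # Least-squares linear trend (deg C per year)
    slope, intercept = np.polyfit(years, temps, deg=1)
    print(f"trend: {slope * 10:.2f} deg C per decade")
    ```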

  13. Web-based pre-Analysis Tools

    CERN Document Server

    Moskalets, Tetiana

    2014-01-01

    The project consists of the initial development of web-based and cloud computing services to allow students and researchers to perform fast and very useful cut-based pre-analysis in a browser, using real data and official Monte-Carlo simulations (MC). Several tools are considered: a ROOT file filter, a JavaScript multivariable cross-filter, a JavaScript ROOT browser and JavaScript scatter-matrix libraries. Preliminary but satisfactory results have been deployed online for testing and future upgrades.

  14. Security constrained optimal power flow by modern optimization tools

    African Journals Online (AJOL)

    Pollination is divided into self and cross pollination. The self ... Flower constancy can be considered as the generation l. 4. ..... The optimum discovered considering the base case is 801.8436, and this value is .... Gaing Z., and Chang R., 2006, Security-constrained optimal power flow by mixed-integer genetic algorithm with.

  15. Debris flow early warning systems in Norway: organization and tools

    Science.gov (United States)

    Kleivane, I.; Colleuille, H.; Haugen, L. E.; Alve Glad, P.; Devoli, G.

    2012-04-01

    In Norway, shallow slides and debris flows occur as a combination of high-intensity precipitation, snowmelt, high groundwater levels and saturated soil. Many events have occurred in the last decades, often associated with flood events, especially in southern Norway, causing significant damage to roads, railway lines, buildings and other infrastructure (e.g. November 2000, August 2003, September 2005, November 2005, May 2008, June and December 2011). Since 1989 the Norwegian Water Resources and Energy Directorate (NVE) has had an operational 24-hour flood forecasting system for the entire country. Since 2009 NVE has also been responsible for assisting regions and municipalities in the prevention of disasters posed by landslides and snow avalanches. Besides assisting the municipalities through the implementation of digital landslide inventories, susceptibility and hazard mapping, areal planning, preparation of guidelines, realization of mitigation measures and helping during emergencies, NVE is developing a regional-scale debris flow warning system that uses hydrological models already available in the flood warning system. It is well known that the application of rainfall thresholds alone is not sufficient to evaluate the hazard for debris flows and shallow slides, since soil moisture conditions play a crucial role in the triggering. Information on simulated soil and groundwater conditions and on water supply (rain and snowmelt) based on weather forecasts has proved to provide useful variables that indicate the potential occurrence of debris flows and shallow slides. Forecasts of runoff and freezing-thawing are also valuable information. The early warning system uses real-time measurements (discharge; groundwater level; soil water content and soil temperature; snow water equivalent; meteorological data) and model simulations (a spatially distributed version of the HBV-model and an adapted version of 1-D soil water and energy balance

  16. Sustainability Tools Inventory - Initial Gaps Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influenc...

  17. Analysis of Cisco Open Network Environment (ONE) OpenFlow Controller Implementation

    Science.gov (United States)

    2014-08-01

    ...or match criteria to ports that are not connected) as well as the ability to install and uninstall a flow from switches without deleting it entirely. ... Port recognition: not explicit / yes. Install/uninstall flows: NA / yes. Set dynamic flows: requires code manipulation / yes. ... OpenFlow Control Channel Setup: the initial handshake between the controller and switch was monitored using packet analysis tools capable of understanding the OpenFlow protocol.

  18. GIS-based hydrogeochemical analysis tools (QUIMET)

    Science.gov (United States)

    Velasco, V.; Tubau, I.; Vázquez-Suñè, E.; Gogu, R.; Gaitanaru, D.; Alcaraz, M.; Serrano-Juan, A.; Fernàndez-Garcia, D.; Garrido, T.; Fraile, J.; Sanchez-Vila, X.

    2014-09-01

    A software platform (QUIMET) was developed to improve the sorting, analysis, calculations, visualizations, and interpretations of hydrogeochemical data in a GIS environment. QUIMET is composed of a geospatial database plus a set of tools specially designed for graphical and statistical analysis of hydrogeochemical data. The geospatial database has been designed to include organic and inorganic chemical records, as well as relevant physical parameters (temperature, Eh, electrical conductivity). The analysis tools cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data. They include, among others, chemical time-series analysis, ionic balance calculations, correlation of chemical parameters, and calculation of various common hydrogeochemical diagrams (Salinity, Schöeller-Berkaloff, Piper, and Stiff). The GIS platform allows the generation of maps of the spatial distribution of parameters and diagrams. Moreover, it allows performing a complete statistical analysis of the data, including descriptive univariate and bivariate statistical analysis, the latter including the generation of correlation matrices and graphics. Finally, QUIMET offers interoperability with other external platforms. The platform is illustrated with a geochemical data set from the city of Badalona, located on the Mediterranean coast in NE Spain.
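
    One of the listed calculations, the ionic balance, reduces to a charge-balance error over the major ions. A sketch with invented concentrations (already converted to meq/L) follows.

    ```python
    # Charge-balance error, a standard QA step in hydrogeochemistry;
    # the concentrations (meq/L) below are invented for illustration.
    cations = {"Ca": 4.2, "Mg": 1.6, "Na": 2.9, "K": 0.1}
    anions = {"HCO3": 4.5, "Cl": 2.8, "SO4": 1.3}

    sum_cat, sum_an = sum(cations.values()), sum(anions.values())
    cbe = 100.0 * (sum_cat - sum_an) / (sum_cat + sum_an)
    print(f"charge-balance error = {cbe:.2f}%  (|CBE| < 5% is commonly accepted)")
    ```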

  19. FlowPing - The New Tool for Throughput and Stress Testing

    Directory of Open Access Journals (Sweden)

    Ondrej Vondrous

    2015-01-01

    Full Text Available This article presents a new tool for network throughput and stress testing. The FlowPing tool is easy to use, and its basic output is very similar to that of the standard Linux ping application. The FlowPing tool is not limited to reachability or round trip time testing but is capable of complex UDP-based throughput stress testing with rich reporting capabilities on both the client and server sides. Our new tool implements features which allow the user to perform tests with variable packet sizes and traffic rates, and all these features can be used in a single test run. This allows the user to apply and develop new methodologies for network throughput and stress testing. With the FlowPing tool, it is easy to perform tests that slowly increase the amount of network traffic and to monitor the behavior of the network when congestion occurs.

  20. Content analysis in information flows

    Energy Technology Data Exchange (ETDEWEB)

    Grusho, Alexander A. [Institute of Informatics Problems of Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, Vavilova str., 44/2, Moscow (Russian Federation); Faculty of Computational Mathematics and Cybernetics, Moscow State University, Moscow (Russian Federation); Grusho, Nick A.; Timonina, Elena E. [Institute of Informatics Problems of Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, Vavilova str., 44/2, Moscow (Russian Federation)

    2016-06-08

    The paper deals with the architecture of a content recognition system. To analyze the problem, a stochastic model of content recognition in information flows was built. We proved that under certain conditions it is possible to correctly solve a part of the problem with probability 1 by viewing a finite section of the information flow. This means that a good architecture consists of two steps: the first step correctly determines certain subsets of contents, while the second step may demand much more time for a true decision.

  1. Implementation of Models for Building Envelope Air Flow Fields in a Whole Building Hygrothermal Simulation Tool

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2009-01-01

    Simulation tools are becoming available which predict the heat and moisture conditions in the indoor environment as well as in the envelope of buildings, and thus it has become possible to consider the important interaction between the different components of buildings and the different physical phenomena which occur. However, there is still room for further development of such tools. This paper will present an attempt to integrate modelling of air flows in building envelopes into a whole building hygrothermal simulation tool. Two kinds of air flows have been considered: 1. Air flow in a ventilated cavity such as in the exterior cladding of building envelopes, i.e. a flow which is parallel to the construction plane. 2. Infiltration/exfiltration of air through the building envelope, i.e. a flow which is perpendicular to the construction plane. The new models make it possible to predict the thermal

  2. Computational analysis of the flow field downstream of flow conditioners

    Energy Technology Data Exchange (ETDEWEB)

    Erdal, Asbjoern

    1997-12-31

    Technological innovations are essential for maintaining the competitiveness of gas companies, and metering technology is one important area. This thesis shows that computational fluid dynamics techniques can be a valuable tool for the examination of several parameters that may affect the performance of a flow conditioner (FC). Previous design methods, such as screen theory, could not provide a fundamental understanding of how a FC works. The thesis shows, among other things, that the flow pattern through a complex geometry, like a 19-hole plate FC, can be simulated with good accuracy by a k-ε turbulence model. The calculations illuminate how variations in pressure drop, overall porosity, grading of porosity across the cross-section and the number of holes affect the performance of FCs. These questions have been studied experimentally by researchers for a long time. Now an understanding of the important mechanisms behind efficient FCs emerges from the predictions. 179 refs., 110 figs., 8 tabs.

  3. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system … of the graph. SaTool performs analysis of the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.

  4. FlowCal: A User-Friendly, Open Source Software Tool for Automatically Converting Flow Cytometry Data from Arbitrary to Calibrated Units.

    Science.gov (United States)

    Castillo-Hair, Sebastian M; Sexton, John T; Landry, Brian P; Olson, Evan J; Igoshin, Oleg A; Tabor, Jeffrey J

    2016-07-15

    Flow cytometry is widely used to measure gene expression and other molecular biological processes with single cell resolution via fluorescent probes. Flow cytometers output data in arbitrary units (a.u.) that vary with the probe, instrument, and settings. Arbitrary units can be converted to the calibrated unit molecules of equivalent fluorophore (MEF) using commercially available calibration particles. However, there is no convenient, nonproprietary tool available to perform this calibration. Consequently, most researchers report data in a.u., limiting interpretation. Here, we report a software tool named FlowCal to overcome current limitations. FlowCal can be run using an intuitive Microsoft Excel interface, or customizable Python scripts. The software accepts Flow Cytometry Standard (FCS) files as inputs and is compatible with different calibration particles, fluorescent probes, and cell types. Additionally, FlowCal automatically gates data, calculates common statistics, and produces publication quality plots. We validate FlowCal by calibrating a.u. measurements of E. coli expressing superfolder GFP (sfGFP) collected at 10 different detector sensitivity (gain) settings to a single MEF value. Additionally, we reduce day-to-day variability in replicate E. coli sfGFP expression measurements due to instrument drift by 33%, and calibrate S. cerevisiae Venus expression data to MEF units. Finally, we demonstrate a simple method for using FlowCal to calibrate fluorescence units across different cytometers. FlowCal should ease the quantitative analysis of flow cytometry data within and across laboratories and facilitate the adoption of standard fluorescence units in synthetic biology and beyond.
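
    Independent of the FlowCal API itself, the underlying bead calibration reduces to fitting a standard curve between measured bead fluorescence (a.u.) and the manufacturer-assigned MEF values, then applying it to cell measurements. The sketch below uses hypothetical bead values.

    ```python
    import numpy as np

    # Not the FlowCal API: a minimal version of the underlying idea.
    # Manufacturer-assigned MEF per bead subpopulation (hypothetical), and
    # the median a.u. measured for each subpopulation on this instrument.
    bead_mef = np.array([792, 2079, 6588, 16471, 47497, 137049], float)
    bead_au = np.array([98.0, 260.0, 830.0, 2100.0, 6000.0, 17500.0])

    # Straight line in log-log space: log10(MEF) = a * log10(a.u.) + b
    a, b = np.polyfit(np.log10(bead_au), np.log10(bead_mef), deg=1)

    def au_to_mef(x):
        """Convert arbitrary-unit fluorescence to MEF via the bead standard."""
        return 10.0 ** (a * np.log10(x) + b)

    print(f"1000 a.u. ~= {au_to_mef(1000.0):.0f} MEF")
    ```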

  5. Robust-mode analysis of hydrodynamic flows

    Science.gov (United States)

    Roy, Sukesh; Gord, James R.; Hua, Jia-Chen; Gunaratne, Gemunu H.

    2017-04-01

    The emergence of techniques to extract high-frequency high-resolution data introduces a new avenue for modal decomposition to assess the underlying dynamics, especially of complex flows. However, this task requires the differentiation of robust, repeatable flow constituents from noise and other irregular features of a flow. Traditional approaches involving low-pass filtering and principal component analysis have shortcomings. The approach outlined here, referred to as robust-mode analysis, is based on Koopman decomposition. Three applications to (a) a counter-rotating cellular flame state, (b) variations in financial markets, and (c) turbulent injector flows are provided.
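
    Koopman decompositions of flow data are commonly computed via dynamic mode decomposition (DMD). The sketch below shows the standard exact-DMD step; it omits the robustness filtering (retaining only modes that recur across repeated realizations) that distinguishes robust-mode analysis.

    ```python
    import numpy as np

    def dmd_modes(X, Y, r=10):
        """Exact dynamic mode decomposition, a finite-dimensional Koopman
        approximation. X = [x_0 .. x_{m-1}], Y = [x_1 .. x_m] are snapshot
        matrices (one state per column); r is the truncation rank."""
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        U, s, V = U[:, :r], s[:r], Vt[:r].conj().T
        Atilde = U.conj().T @ Y @ V / s      # operator projected onto POD basis
        eigvals, W = np.linalg.eig(Atilde)   # mode frequencies/growth rates
        modes = (Y @ V / s) @ W              # exact DMD modes
        return eigvals, modes
    ```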

  6. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    Full Text Available The purpose of this paper is to propose a methodology for setup analysis which can be implemented mainly in small and medium enterprises that are not yet convinced to invest in setup improvement. The methodology was developed after research which identified the problem: companies still have difficulties with long setup times, yet many of them do nothing to decrease these times, since a long setup alone is not a sufficient reason for companies to undertake any actions towards setup time reduction. To encourage companies to implement SMED, it is essential to analyse changeovers in order to discover problems. The proposed methodology can genuinely encourage management to take a decision about SMED implementation, and this was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern a setup analysis in a chosen area of a company, such as a work stand which is a bottleneck with many setups; the goal there is to convince management to begin actions concerning setup improvement. The last three steps relate to a particular setup, where the goal is to reduce the setup time and the risk of problems which can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis and FMEA, among others, were used.

  7. PROMOTION OF PRODUCTS AND ANALYSIS OF MARKET OF POWER TOOLS

    Directory of Open Access Journals (Sweden)

    Sergey S. Rakhmanov

    2014-01-01

    Full Text Available The article describes the general situation of the power tool market, both in Russia and worldwide. It provides a comparative analysis of competitors, an analysis of the power tool market structure, and an assessment of the competitiveness of some major product lines. It also analyzes the promotion methods used by companies selling tools, with a competitive analysis of the product range of Bosch, the leader in its segment of the power tools available on the Russian market.

  8. Programming heterogeneous MPSoCs tool flows to close the software productivity gap

    CERN Document Server

    Castrillón Mazo, Jerónimo

    2014-01-01

    This book provides embedded software developers with techniques for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs), capable of executing multiple applications simultaneously. It describes a set of algorithms and methodologies to narrow the software productivity gap, as well as an in-depth description of the underlying problems and challenges of today’s programming practices. The authors present four different tool flows: a parallelism extraction flow for applications written using the C programming language, a mapping and scheduling flow for parallel applications, a special mapping flow for baseband applications in the context of Software Defined Radio (SDR), and a final flow for analyzing multiple applications at design time. The tool flows are evaluated on Virtual Platforms (VPs), which mimic different characteristics of state-of-the-art heterogeneous MPSoCs.   • Provides a novel set of algorithms and methodologies for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs)...

  9. IHT: Tools for Computing Insolation Absorption by Particle Laden Flows

    Energy Technology Data Exchange (ETDEWEB)

    Grout, R. W.

    2013-10-01

    This report describes IHT, a toolkit for computing radiative heat exchange between particles. Well suited for insolation absorption computations, it also has potential applications in combustion (sooting flames), biomass gasification, and similar processes. The algorithm is based on the 'Photon Monte Carlo' approach and implemented in a library that can be interfaced with a variety of computational fluid dynamics codes to analyze radiative heat transfer in particle-laden flows. The emphasis in this report is on the data structures and organization of IHT for developers seeking to use the IHT toolkit to add Photon Monte Carlo capabilities to their own codes.
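
    The report focuses on data structures rather than the method, but the underlying 'Photon Monte Carlo' idea can be sketched briefly: launch ray bundles in random directions and tally which particles absorb them. The toy version below assumes spherical particles and a single point source, which is not IHT's actual geometry or implementation.

        # Toy Photon Monte Carlo tally: which spherical particles absorb rays
        # emitted isotropically from a point source at the origin. Illustrative only.
        import numpy as np

        rng = np.random.default_rng(1)
        centers = rng.uniform(-1.0, 1.0, size=(50, 3))  # hypothetical particle centers
        radius = 0.05                                   # common particle radius
        n_rays = 100_000
        hits = np.zeros(len(centers), dtype=int)

        for _ in range(n_rays):
            d = rng.standard_normal(3)
            d /= np.linalg.norm(d)                      # isotropic direction
            t = centers @ d                             # closest-approach parameter per particle
            closest = centers - np.outer(t, d)          # perpendicular offset from the ray
            dist = np.linalg.norm(closest, axis=1)
            cand = np.where((t > 0) & (dist < radius))[0]
            if cand.size:                               # absorb at the first particle hit
                hits[cand[np.argmin(t[cand])]] += 1

        print(hits / n_rays)                            # absorbed fraction per particle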

  10. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer-controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steelmaking process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers Association) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium-treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  11. Flow Analysis for the Falkner–Skan Wedge Flow

    DEFF Research Database (Denmark)

    Bararnia, H; Haghparast, N; Miansari, M

    2012-01-01

    In this article an analytical technique, namely the homotopy analysis method (HAM), is applied to solve the momentum and energy equations in the case of a two-dimensional incompressible flow passing over a wedge. The trial and error method and Padé approximation strategies have been used to obtain the constant coefficients in the approximated solution. The effects of the polynomial terms of HAM are considered and the accuracy of the results is shown, which increases with the increasing polynomial terms of HAM. Analytical results for the dimensionless velocity and temperature profiles of the wedge flow...
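
    For reference (the equations are not quoted in the abstract), the Falkner-Skan wedge flow that HAM is applied to is classically governed, in one common normalization, by the similarity momentum and energy equations

        f''' + f\,f'' + \beta\left(1 - f'^{2}\right) = 0, \qquad f(0) = f'(0) = 0, \quad f'(\infty) = 1,
        \theta'' + \Pr\, f\,\theta' = 0, \qquad \theta(0) = 1, \quad \theta(\infty) = 0,

    where beta is the wedge (Hartree) parameter and Pr the Prandtl number; the paper's exact normalization may differ.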

  12. Use of software tools for calculating flow accelerated corrosion of nuclear power plant equipment and pipelines

    Science.gov (United States)

    Naftal', M. M.; Baranenko, V. I.; Gulina, O. M.

    2014-06-01

    The results obtained from calculations of flow accelerated corrosion of equipment and pipelines operating at nuclear power plants constructed on the basis of PWR, VVER, and RBMK reactors carried out using the EKI-02 and EKI-03 software tools are presented. It is shown that the calculation error does not exceed its value indicated in the qualification certificates for these software tools. It is pointed out that calculations aimed at predicting the service life of pipelines and efficient surveillance of flow accelerated corrosion wear are hardly possible without using the above-mentioned software tools.

  13. FLOW STRESS MODEL FOR HARD MACHINING OF AISI H13 WORK TOOL STEEL

    Institute of Scientific and Technical Information of China (English)

    H. Yan; J. Hua; R. Shivpuri

    2005-01-01

    An approach is presented to characterize the stress response of the workpiece in hard machining, accounting for the effects of initial workpiece hardness, temperature, strain, and strain rate on flow stress. AISI H13 work tool steel was chosen to verify this methodology. The proposed flow stress model demonstrates good agreement with data collected from published experiments. Therefore, the proposed model can be used to predict the corresponding flow stress-strain response of AISI H13 work tool steel with variation of the initial workpiece hardness in hard machining.
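
    The abstract gives no equation; purely as a hedged illustration of the class of model described, hardness-dependent flow stress laws for hard machining are often built on a Johnson-Cook-type form in which the initial hardness enters the yield term, e.g.

        \sigma = \bigl(A(\mathrm{HRC}) + B\,\varepsilon^{n}\bigr)\left(1 + C \ln\frac{\dot{\varepsilon}}{\dot{\varepsilon}_{0}}\right)\left(1 - \left(\frac{T - T_{r}}{T_{m} - T_{r}}\right)^{m}\right),

    where A(HRC) denotes a hypothetical dependence of the yield stress on initial workpiece hardness; the paper's actual model may take a different form.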

  14. Analysis of Cortical Flow Models In Vivo

    Science.gov (United States)

    Benink, Hélène A.; Mandato, Craig A.; Bement, William M.

    2000-01-01

    Cortical flow, the directed movement of cortical F-actin and cortical organelles, is a basic cellular motility process. Microtubules are thought to somehow direct cortical flow, but whether they do so by stimulating or inhibiting contraction of the cortical actin cytoskeleton is the subject of debate. Treatment of Xenopus oocytes with phorbol 12-myristate 13-acetate (PMA) triggers cortical flow toward the animal pole of the oocyte; this flow is suppressed by microtubules. To determine how this suppression occurs and whether it can control the direction of cortical flow, oocytes were subjected to localized manipulation of either the contractile stimulus (PMA) or microtubules. Localized PMA application resulted in redirection of cortical flow toward the site of application, as judged by movement of cortical pigment granules, cortical F-actin, and cortical myosin-2A. Such redirected flow was accelerated by microtubule depolymerization, showing that the suppression of cortical flow by microtubules is independent of the direction of flow. Direct observation of cortical F-actin by time-lapse confocal analysis in combination with photobleaching showed that cortical flow is driven by contraction of the cortical F-actin network and that microtubules suppress this contraction. The oocyte germinal vesicle serves as a microtubule organizing center in Xenopus oocytes; experimental displacement of the germinal vesicle toward the animal pole resulted in localized flow away from the animal pole. The results show that 1) cortical flow is directed toward areas of localized contraction of the cortical F-actin cytoskeleton; 2) microtubules suppress cortical flow by inhibiting contraction of the cortical F-actin cytoskeleton; and 3) localized, microtubule-dependent suppression of actomyosin-based contraction can control the direction of cortical flow. We discuss these findings in light of current models of cortical flow. PMID:10930453

  15. Flow Injection Analysis in Industrial Biotechnology

    DEFF Research Database (Denmark)

    Hansen, Elo Harald; Miró, Manuel

    2009-01-01

    Flow injection analysis (FIA) is an analytical chemical continuous-flow (CF) method which in contrast to traditional CF-procedures does not rely on complete physical mixing (homogenisation) of the sample and the reagent(s) or on attaining chemical equilibria of the chemical reactions involved. Ex...

  16. Using the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Conway, Darrel J.; Parker, Joel

    2017-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT). These slides will be used to accompany the demonstration. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. This talk combines existing presentations and material: the system user guide and technical documentation; a GMAT basics overview; and technical presentations from the TESS project on its application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The TESS slides are a streamlined version of the CDR package provided by the project, with SBU and ITAR data removed by the TESS project. Slides for navigation and optimal control are borrowed from system documentation and training material.

  17. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
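
    A minimal sketch of the core computation, under the assumption that the attack graph has already been generated: represent attack states as nodes, single attacker actions as weighted edges, and find low-cost attack paths with a shortest-path search. The states and weights below are hypothetical, and the patent's epsilon-optimal path enumeration is more elaborate than plain Dijkstra.

        # Cheapest attack path over a small hand-made attack graph; hypothetical
        # states/effort weights, using Dijkstra via heapq. Illustrative sketch only.
        import heapq

        attack_graph = {  # state -> [(next_state, attacker_effort), ...]
            "outside": [("dmz_shell", 3.0), ("phished_user", 1.5)],
            "dmz_shell": [("internal_host", 2.0)],
            "phished_user": [("internal_host", 1.0)],
            "internal_host": [("domain_admin", 4.0)],
            "domain_admin": [],
        }

        def cheapest_path(graph, start, goal):
            pq, seen = [(0.0, start, [start])], set()
            while pq:
                cost, node, path = heapq.heappop(pq)
                if node == goal:
                    return cost, path
                if node in seen:
                    continue
                seen.add(node)
                for nxt, w in graph[node]:
                    if nxt not in seen:
                        heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
            return float("inf"), []

        print(cheapest_path(attack_graph, "outside", "domain_admin"))
        # -> (6.5, ['outside', 'phished_user', 'internal_host', 'domain_admin'])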

  18. Ultrasonic vibrating system design and tool analysis

    Institute of Scientific and Technical Information of China (English)

    Kei-Lin KUO

    2009-01-01

    The applications of ultrasonic vibrations for material removal processes exist predominantly in the area of vertical processing of hard and brittle materials, because vertically vibrating oscillators deliver the greatest direct penetration, allowing grains to remove material from workpieces. For milling processes, however, vertical vibrating power has to be transformed into lateral (horizontal) vibration to produce the required horizontal cutting force. The objective of this study is to make use of ultrasonic lateral transformation theory to optimize processing efficiency, through the use of the finite element method for the design and analysis of the milling tool. In addition, changes can be made to the existing vibrating system to generate the best performance under consistent conditions, namely, using the same piezoelectric ceramics.

  19. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics, such as: single and multiple point cutting processes; grinding; components accuracy and metrology; shear stress in cutting; cutting temperature and analysis; chatter. They also address non-traditional machining, such as: electrical discharge machining; electrochemical machining; laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  20. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  1. Timeline analysis tools for law enforcement

    Science.gov (United States)

    Mucks, John

    1997-02-01

    The timeline analysis system (TAS) was developed by Rome Laboratory to assist intelligence analysts with the comprehension of large amounts of information. Under the TAS program, data visualization, manipulation, and reasoning tools were developed in close coordination with end users. The initial TAS prototype was developed for foreign command and control analysts at Space Command in Colorado Springs and was fielded there in 1989. The TAS prototype replaced manual paper timeline maintenance and analysis techniques and has become an integral part of Space Command's information infrastructure. TAS was designed to be domain independent and has been tailored and proliferated to a number of other users. The TAS program continues to evolve because of strong user support. User-funded enhancements and Rome Lab-funded technology upgrades have significantly enhanced TAS over the years and will continue to do so for the foreseeable future. TAS was recently provided to the New York State Police (NYSP) for evaluation using actual case data. Timeline analysis, it turns out, is a popular methodology used in law enforcement. The evaluation has led to a more comprehensive application and evaluation project sponsored by the National Institute of Justice (NIJ). This paper describes the capabilities of TAS, the results of the initial NYSP evaluation, and the plan for a more comprehensive NYSP evaluation.

  2. Extension of a System Level Tool for Component Level Analysis

    Science.gov (United States)

    Majumdar, Alok; Schallhorn, Paul

    2002-01-01

    This paper presents an extension of a numerical algorithm in a network flow analysis code to perform multi-dimensional flow calculations. The one-dimensional momentum equation in the network flow analysis code has been extended to include momentum transport due to shear stress and the transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow, and shear-driven flow in a rectangular cavity) are presented as benchmarks for the verification of the numerical scheme.
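
    The closure is named but not written out in the abstract; in Prandtl's mixing length hypothesis the turbulent shear stress added to the one-dimensional momentum equation takes the standard form

        \tau_{t} = \rho\, l_{m}^{2} \left|\frac{\partial u}{\partial y}\right| \frac{\partial u}{\partial y},

    with mixing length l_m. The Couette benchmark then has the simple laminar solution u(y) = U y/h against which the extended scheme can be checked.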

  3. Analysis of Interregional Commodity Flows

    Directory of Open Access Journals (Sweden)

    Wirach Hirun

    2010-01-01

    Full Text Available Problem statement: The Commodity Flow Survey (CFS) was launched to collect comprehensive freight flow data throughout the kingdom of Thailand. The survey's database is the most complete collection of commodity flow data in Thailand. The need to reveal interregional freight characteristics using available data from the CFS led to the objectives of this research. Approach: An origin-destination matrix at the province level was calibrated using a flexible Box-Cox functional form. Maximum likelihood and the backward method were used for calibration, and Root Mean Square Error (RMSE) and Mean Relative Error (MRE) were used to verify the model's performance. Independent variables were classified into three groups: origin variables, destination variables, and geographic variables. The origin variables represent the behavior of trips as generated at the place of origin; some consumption occurs at the origin. The employment and average plant size variables were selected for potential productivity, while personal income per capita and total population were included to explain consumption behavior at the origin. Personal income per capita and total population were selected as destination variables, acting as proxies for final demand at the destination. The third category, distance, is the most conventional friction variable among geographic variables. Results: The calibrated model revealed that origin income, origin average plant size, and origin population performed poorly, so these variables were eliminated. The best developed model included four strongly significant variables at the 5% level: origin employment, destination population, destination income per capita, and distance. Conclusion: The results showed that the selected variables and the Box-Cox functional form were successful in explaining the behavior of interregional freight transportation in Thailand. The developed model was the first interregional freight transportation model to be
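
    The functional form itself is not reproduced in the abstract. In a generic Box-Cox specification of such an origin-destination model (a hedged illustration, not the paper's fitted equation), each variable is transformed as

        x^{(\lambda)} = \begin{cases} \dfrac{x^{\lambda} - 1}{\lambda}, & \lambda \neq 0, \\ \ln x, & \lambda = 0, \end{cases}

    so that a model with the four retained variables reads

        T_{ij}^{(\lambda_{0})} = \beta_{0} + \beta_{1} E_{i}^{(\lambda_{1})} + \beta_{2} P_{j}^{(\lambda_{2})} + \beta_{3} Y_{j}^{(\lambda_{3})} + \beta_{4} d_{ij}^{(\lambda_{4})},

    with E_i the origin employment, P_j the destination population, Y_j the destination income per capita, and d_ij the distance; lambda -> 0 recovers the familiar log-linear gravity model.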

  4. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy use and greenhouse gas (GHG) emissions. The documentation also provides guidance on how to apply the tool.

  5. Systematic Evaluation of Uncertainty in Material Flow Analysis

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2014-01-01

    Material flow analysis (MFA) is a tool to investigate material flows and stocks in defined systems as a basis for resource management or environmental pollution control. Because of the diverse nature of sources and the varying quality and availability of data, MFA results are inherently uncertain......) and exploratory MFA (identification of critical parameters and system behavior). Whereas mathematically simpler concepts focusing on data uncertainty characterization are appropriate for descriptive MFAs, statistical approaches enabling more-rigorous evaluation of uncertainty and model sensitivity are needed...

  6. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab®-based implementation is presented, and special features motivated by industrial users are introduced. Salient features of the tool are presented, including the ability to specify the behavior of a comple...

  7. LFSTAT - An R-Package for Low-Flow Analysis

    Science.gov (United States)

    Koffler, D.; Laaha, G.

    2012-04-01

    When analysing daily streamflow data with a focus on low flows and droughts, the state of the art is well documented in the Manual on Low-Flow Estimation and Prediction [1] published by the WMO. While it is clear what has to be done, it is not so clear how to perform the analysis and make the calculation as reproducible as possible. Our software solution extends the high-performing open source statistical software package R to analyse daily streamflow data with a focus on low flows. As command-line based programs are not everyone's preference, we also offer a plug-in for the R Commander, an easy-to-use graphical user interface (GUI) for analysing data in R. Functionality includes estimation of the most important low-flow indices. Besides the standard flow indices, the baseflow index (BFI) and recession constants can also be computed. The main applications of L-moment-based extreme value analysis and regional frequency analysis (RFA) are available. Calculation of streamflow deficits is another important feature. The most common graphics are prepared and can easily be modified according to the user's preferences. Graphics include hydrographs for different periods, flexible streamflow deficit plots, baseflow visualisation, flow duration curves, and double mass curves, just to name a few. The package uses an S3 class called lfobj (low-flow objects). Once these objects are created, the analysis can be performed by mouse click, and a script can be saved to make the analysis easily reproducible. At the moment we offer implementations of all major methods proposed in the WMO Manual on Low-Flow Estimation and Prediction. Future plans include, e.g., report export to ODT files using odfWeave. We hope to offer a tool that eases and structures the analysis of streamflow data focusing on low flows and makes the analysis transparent and communicable. The package is designed for hydrological research and water management practice, but can also be used to teach students the first steps in low-flow hydrology.
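
    As a language-agnostic illustration of the simplest indices the package covers (this is not LFSTAT's R code, and the smoothed-minima recipe below is a simplified variant of the standard procedure), Q95 and a baseflow index can be computed from a daily series as follows; the flow series is synthetic.

        # Rough illustration of two low-flow indices (Q95 and a smoothed-minima
        # BFI); this is NOT LFSTAT's implementation, just the general idea.
        import numpy as np

        rng = np.random.default_rng(42)
        q = np.exp(rng.normal(1.0, 0.8, 3650))       # synthetic daily flows (m^3/s)

        # Q95: the flow exceeded 95% of the time.
        q95 = np.quantile(q, 0.05)

        # Smoothed-minima baseflow separation (simplified): take 5-day block
        # minima, keep "turning points" where a minimum is smaller than 0.9x
        # its neighbours, interpolate between them, and cap at the total flow.
        mins = q[: len(q) // 5 * 5].reshape(-1, 5).min(axis=1)
        idx = np.arange(2, len(mins) - 2)
        turn = idx[(0.9 * mins[idx] < mins[idx - 1]) & (0.9 * mins[idx] < mins[idx + 1])]
        days = turn * 5 + 2                           # approximate day of each minimum
        baseflow = np.interp(np.arange(len(q)), days, mins[turn])
        baseflow = np.minimum(baseflow, q)

        bfi = baseflow.sum() / q.sum()                # baseflow index
        print(f"Q95 = {q95:.2f}, BFI = {bfi:.2f}")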

  8. Riparian Cottonwood Ecosystems and Regulated Flows in Kootenai and Yakima Sub-Basins : Volume III (Overview and Tools).

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, Bob; Braatne, Jeffrey H.

    2001-10-01

    Riparian vegetation, and especially cottonwood and willow plant communities, depends on normative flows, and especially the spring freshet, to provide conditions for recruitment. These plant communities therefore share much in common with a range of fish species that require natural flow conditions to stimulate reproduction. We applied tools and techniques developed in other areas to assess riparian vegetation in two very different sub-basins within the Columbia Basin. Our objectives were to: document the historic impact of human activity on alluvial floodplain areas in both sub-basins; provide an analysis of the impacts of flow regulation on riparian vegetation in two systems with very different flow regulation systems; demonstrate that altered spring flows will, in fact, result in recruitment to cottonwood stands, given other land-use impacts on each river and the limitations imposed by other flow requirements; and assess the applicability of remote sensing tools for documenting the distribution and health of cottonwood stands and riparian vegetation that can be used in other sub-basins.

  9. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.

  10. Analysis of flow instabilities in forced-convection steam generator

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Because of the practical importance of two-phase instabilities, substantial efforts have been made to date to understand the physical phenomena governing such instabilities and to develop computational tools to model the dynamics. The purpose of this study is to present a numerical model for the analysis of flow-induced instabilities in a forced-convection steam generator. The model is based on the assumption of homogeneous two-phase flow and thermodynamic equilibrium of the phases. The thermal capacity of the heater wall has been included in the analysis. The model is used to analyze the flow instabilities in the steam generator and to study the effects of system pressure, mass flux, inlet temperature, inlet/outlet restriction, gap size, the ratio do/di, and the ratio qi/qo on the system behavior.
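
    The modeling assumptions can be made concrete with the standard relations of the homogeneous equilibrium model (supplied here for reference, not quoted from the paper): both phases share one velocity, and the mixture density follows from the void fraction alpha or the flow quality x as

        \rho_{m} = \alpha\,\rho_{g} + (1 - \alpha)\,\rho_{l}, \qquad \frac{1}{\rho_{m}} = \frac{x}{\rho_{g}} + \frac{1 - x}{\rho_{l}}.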

  11. Parallel Enhancements of the General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  12. Effect of Pin Tool Shape on Metal Flow During Friction Stir Welding

    Science.gov (United States)

    McClure, J. C.; Coronado, E.; Aloor, S.; Nowak, B.; Murr, L. M.; Nunes, Arthur C., Jr.; Munafo, Paul M. (Technical Monitor)

    2002-01-01

    It has been shown that metal moves behind the rotating Friction Stir Pin Tool in two separate currents or streams. One current, mostly on the advancing side, enters a zone of material that rotates with the pin tool for one or more revolutions and eventually is abandoned behind the pin tool in crescent-shaped pieces. The other current, largely on the retreating side of the pin tool, is moved by a wiping process to the back of the pin tool and fills in between the crescent-shaped pieces shed by the rotational zone. This process was studied by using a faying-surface copper trace to clarify the metal flow. Welds were made with pin tools having various thread pitches. Decreasing the thread pitch causes the large-scale top-to-bottom flow to break up into multiple vortices along the pin, and an unthreaded pin tool provides insufficient vertical motion for there to be a stable rotational zone; flow of material via the rotational zone is then not possible, leading to porosity on the advancing side of the weld.

  13. Detecting Human Hydrologic Alteration from Diversion Hydropower Requires Universal Flow Prediction Tools: A Proposed Framework for Flow Prediction in Poorly-gauged, Regulated Rivers

    Science.gov (United States)

    Kibler, K. M.; Alipour, M.

    2016-12-01

    Achieving the universal energy access Sustainable Development Goal will require great investment in renewable energy infrastructure in the developing world. Much growth in the renewable sector will come from new hydropower projects, including small and diversion hydropower in remote and mountainous regions. Yet human impacts to hydrological systems from diversion hydropower are poorly described. Diversion hydropower is often implemented in ungauged rivers, so detection of impact requires flow analysis tools suited to prediction in poorly-gauged and human-altered catchments. We conduct a comprehensive analysis of hydrologic alteration in 32 rivers developed with diversion hydropower in southwestern China. As flow data are sparse, we devise an approach for estimating streamflow during pre- and post-development periods, drawing upon a decade of research into prediction in ungauged basins. We apply a rainfall-runoff model, parameterized and forced exclusively with global-scale data, in hydrologically similar gauged and ungauged catchments. Uncertain "soft" data are incorporated through fuzzy numbers and confidence-based weighting, and a multi-criteria objective function is applied to evaluate model performance. Testing indicates that the proposed framework returns superior performance (NSE = 0.77) compared to models parameterized by rote calibration (NSE = 0.62). Confident that the models are providing "the right answer for the right reasons", our analysis of hydrologic alteration based on simulated flows indicates statistically significant hydrologic effects of diversion hydropower across many rivers. Mean annual flows, 7-day minimum flows, and 7-day maximum flows decreased. The frequency and duration of flows exceeding Q25 decreased, while the duration of flows sustained below Q75 increased substantially. Hydrograph rise and fall rates and flow constancy increased. The proposed methodology may be applied to improve diversion hydropower design in data-limited regions.
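
    The skill score quoted (NSE) is the standard Nash-Sutcliffe efficiency; for reference, it can be computed as below, where the observed and simulated series are placeholders.

        # Nash-Sutcliffe efficiency (NSE), the skill score quoted in the abstract.
        import numpy as np

        def nse(observed, simulated):
            observed, simulated = np.asarray(observed), np.asarray(simulated)
            return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
                (observed - observed.mean()) ** 2
            )

        # Placeholder series; NSE = 1 is a perfect fit, NSE <= 0 is no better
        # than predicting the observed mean.
        obs = [3.2, 4.1, 10.5, 7.3, 5.0]
        sim = [3.0, 4.6, 9.8, 7.9, 4.7]
        print(round(nse(obs, sim), 2))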

  14. AutoGate: automating analysis of flow cytometry data.

    Science.gov (United States)

    Meehan, Stephen; Walther, Guenther; Moore, Wayne; Orlova, Darya; Meehan, Connor; Parks, David; Ghosn, Eliver; Philips, Megan; Mitsunaga, Erin; Waters, Jeffrey; Kantor, Aaron; Okamura, Ross; Owumi, Solomon; Yang, Yang; Herzenberg, Leonard A; Herzenberg, Leonore A

    2014-05-01

    Nowadays, one can hardly imagine biology and medicine without flow cytometry to measure CD4 T cell counts in HIV, follow bone marrow transplant patients, characterize leukemias, etc. Similarly, without flow cytometry, there would be a bleak future for stem cell deployment, HIV drug development and full characterization of the cells and cell interactions in the immune system. But while flow instruments have improved markedly, the development of automated tools for processing and analyzing flow data has lagged sorely behind. To address this deficit, we have developed automated flow analysis software technology, provisionally named AutoComp and AutoGate. AutoComp acquires sample and reagent labels from users or flow data files, and uses this information to complete the flow data compensation task. AutoGate replaces the manual subsetting capabilities provided by current analysis packages with newly defined statistical algorithms that automatically and accurately detect, display and delineate subsets in well-labeled and well-recognized formats (histograms, contour and dot plots). Users guide analyses by successively specifying axes (flow parameters) for data subset displays and selecting statistically defined subsets to be used for the next analysis round. Ultimately, this process generates analysis "trees" that can be applied to automatically guide analyses for similar samples. The first AutoComp/AutoGate version is currently in the hands of a small group of users at Stanford, Emory and NIH. When this "early adopter" phase is complete, the authors expect to distribute the software free of charge to .edu, .org and .gov users.

  15. Development of Integrated Protein Analysis Tool

    Directory of Open Access Journals (Sweden)

    Poorna Satyanarayana Boyidi,

    2010-05-01

    Full Text Available We present an “Integrated Protein Analysis Tool(IPAT” that is able to perform the following tasks in segregating and annotating genomic data: Protein Editor enables the entry of nucleotide/ aminoacid sequences Utilities :IPAT enables to conversion of given nucleotide sequence to equivalent amino acid sequence: Secondary Structure Prediction is possible using three algorithms (GOR-I Gibrat Method and DPM (Double Prediction Method with graphical display. Profiles and properties: allow calculating eight physico-chemical profiles and properties, viz Hydrophobicity, Hydrophilicity, Antigenicity, Transmembranous regions , Solvent Accessibility, Molecular Weight, Absorption factor and Amino Acid Content. IPAT has a provision for viewing Helical-Wheel Projection of a selected region of a given protein sequence and 2D representation of alphacarbon IPAT was developed using the UML (Unified Modeling Language for modeling the project elements, coded in Java, and subjected to unit testing, path testing, and integration testing.This project mainly concentrates on Butyrylcholinesterase to predict secondary structure and its physicochemical profiles, properties.

  16. A biological tool to assess flow connectivity in reference temporary streams from the Mediterranean Basin

    Energy Technology Data Exchange (ETDEWEB)

    Cid, N., E-mail: ncid@ub.edu [Grup de Recerca “Freshwater Ecology and Management (FEM)”, Departament d' Ecologia, Universitat de Barcelona, Catalonia (Spain); Verkaik, I. [Grup de Recerca “Freshwater Ecology and Management (FEM)”, Departament d' Ecologia, Universitat de Barcelona, Catalonia (Spain); García-Roger, E.M. [Grup de Recerca “Freshwater Ecology and Management (FEM)”, Departament d' Ecologia, Universitat de Barcelona, Catalonia (Spain); Institut Cavanilles de Biodiversitat i Biologia Evolutiva, Universitat de València (Spain); Rieradevall, M.; Bonada, N. [Grup de Recerca “Freshwater Ecology and Management (FEM)”, Departament d' Ecologia, Universitat de Barcelona, Catalonia (Spain); Sánchez-Montoya, M.M. [Department of Ecology and Hydrology, Regional Campus of International Excellence “Campus Mare Nostrum”—University of Murcia (Spain); Leibniz-Institute of Freshwater Ecology and Inland Fisheries (IGB), Berlin (Germany); Gómez, R.; Suárez, M.L.; Vidal-Abarca, M.R. [Department of Ecology and Hydrology, Regional Campus of International Excellence “Campus Mare Nostrum”—University of Murcia (Spain); Demartini, D.; Buffagni, A.; Erba, S. [Instituto di Ricerca Sulle Acque (CNR-IRSA) (Italy); Karaouzas, I.; Skoulikidis, N. [Hellenic Center for Marine Research (HCMR) (Greece); Prat, N. [Grup de Recerca “Freshwater Ecology and Management (FEM)”, Departament d' Ecologia, Universitat de Barcelona, Catalonia (Spain)

    2016-01-01

    Many streams in the Mediterranean Basin have temporary flow regimes. While the timing of seasonal drought is predictable, they undergo strong inter-annual variability in flow intensity. This high hydrological variability and the associated ecological responses challenge the ecological status assessment of temporary streams, particularly when setting reference conditions. This study examined the effects of flow connectivity on aquatic macroinvertebrates in seven reference temporary streams across the Mediterranean Basin where hydrological variability and flow conditions are well studied. We tested for the effect of flow cessation on two streamflow indices and on community composition, and, by performing random forest and classification tree analyses, we identified important biological predictors for classifying the aquatic state as either flowing or disconnected pools. Flow cessation was critical for one of the streamflow indices studied and for community composition. Macroinvertebrate families found to be important for classifying the aquatic state were Hydrophilidae, Simuliidae, Hydropsychidae, Planorbiidae, Heptageniidae and Gerridae. For biological traits, trait categories associated with feeding habits, food, locomotion and substrate relation were the most important and provided more accurate predictions compared to taxonomy. A combination of selected metrics and associated thresholds based on the most important biological predictors (i.e. the Bio-AS Tool) was proposed in order to assess the aquatic state in reference temporary streams, especially in the absence of hydrological data. Although further development is needed, the tool can be of particular interest for monitoring, restoration, and conservation purposes, representing an important step towards an adequate management of temporary rivers not only in the Mediterranean Basin but also in other regions vulnerable to the effects of climate change. - Highlights: • The effect of flow connectivity on macroinvertebrate
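
    As a hedged sketch of the classification step described (flowing versus disconnected pools, predicted from macroinvertebrate metrics), a random forest analysis could be set up as follows; the feature matrix and labels are synthetic stand-ins for the family and trait metrics the study used.

        # Sketch of classifying aquatic state (flowing vs. disconnected pools)
        # from invertebrate metrics with a random forest; data are synthetic.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(7)
        # Columns are stand-ins for e.g. Hydropsychidae, Simuliidae, Gerridae
        # abundances and trait metrics; rows are sampling occasions.
        X = rng.poisson(5, size=(120, 6)).astype(float)
        y = (X[:, 0] + X[:, 1] > X[:, 5] + 6).astype(int)  # 1 = flowing (synthetic rule)

        clf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
        clf.fit(X, y)

        # Out-of-bag accuracy and per-metric importances, analogous to ranking
        # the biological predictors of the aquatic state.
        print(clf.oob_score_)
        print(clf.feature_importances_)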

  17. Model-based calculating tool for pollen-mediated gene flow frequencies in plants.

    Science.gov (United States)

    Lei, Wang; Bao-Rong, Lu

    2016-12-30

    The potential socio-economic and environmental impacts caused by transgene flow from genetically engineered (GE) crops have stimulated worldwide biosafety concerns. Determining the transgene flow frequencies that result from pollination is the first critical step for assessing such impacts, in addition to the determination of transgene expression and fitness in crop-wild hybrid descendants. Two methods are commonly used to estimate pollen-mediated gene flow (PMGF) frequencies: field experiments and mathematical modeling. Field experiments can provide relatively accurate results but are time- and resource-consuming. Modeling offers an effective complement to experimental PMGF assessment. However, many published models describe PMGF by mathematical equations and are not easy to use in practice. To increase the application of PMGF modeling for the estimation of transgene flow, we established a tool to calculate PMGF frequencies based on a quasi-mechanistic PMGF model for wind-pollinated species. This tool includes a calculating program displayed through an easy-to-operate interface. PMGF frequencies of different plant species can be quickly calculated under different environmental conditions by including a number of biological and wind speed parameters that can be measured in the field/laboratory or obtained from published data. The tool is freely available in the public domain (http://ecology.fudan.edu.cn/userfiles/cn/files/Tool_Manual.zip). Case studies including rice, wheat, and maize demonstrated good agreement between frequencies calculated with this tool and published PMGF data. This PMGF calculating tool will provide useful information for assessing and monitoring the socio-economic and environmental impacts caused by transgene flow from GE crops. The tool can also be applied to determine the isolation distances between GE and non-GE crops in a coexistence agro-ecosystem, and to ensure the purity of certified seeds by setting proper isolation distances
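
    The quasi-mechanistic model itself is not reproduced in the abstract. Purely as a hedged illustration of the kind of relationship such calculators evaluate, PMGF frequency is often described as a distance-decay function of the source, e.g.

        f(x) = f_{0} \exp\left(-\frac{x}{\delta}\right),

    where x is the distance from the pollen source, f_0 a near-source frequency, and delta a dispersal scale set by wind speed and biological parameters; the published model will differ in form and parameterization.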

  18. Influence of the Tool Shoulder Contact Conditions on the Material Flow During Friction Stir Welding

    Science.gov (United States)

    Doude, Haley R.; Schneider, Judy A.; Nunes, Arthur C.

    2014-09-01

    Friction stir welding (FSW) is a solid-state joining process of special interest for joining alloys that are traditionally difficult to fusion weld. In order to optimize the process, various numerical modeling approaches have been pursued. Of importance to furthering modeling efforts is a better understanding of the contact conditions between the workpiece and the weld tool. Both theoretical and experimental studies indicate that the contact conditions between the workpiece and weld tool are uncertain and may vary during the FSW process. To provide insight into the contact conditions, this study characterizes the material flow in the FSW nugget by embedding a lead (Pb) wire that melted at the FSW temperature of aluminum alloy 2195. The Pb trace provided evidence of changes in material flow characteristics, which were attributed to temperature-driven changes in the contact conditions between the weld tool and workpiece as the tool traveled the length of the weld seam.

  19. Control Flow Analysis for BioAmbients

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nielson, Hanne Riis; Priami, C.

    2007-01-01

    This paper presents a static analysis for investigating properties of biological systems specified in BioAmbients. We exploit the control flow analysis to decode the bindings of variables induced by communications and to build a relation of the ambients that can interact with each other. We...

  20. The Three Generations of Flow Injection Analysis

    DEFF Research Database (Denmark)

    Hansen, Elo Harald; Wang, Jianhua

    2004-01-01

    The characteristics of the three generations of flow injection analysis, that is, FIA, sequential injection analysis (SIA), and bead injection-lab-on-valve (BI-LOV), are briefly outlined, their individual advantages and shortcomings are discussed, and selected practical applications are presented....

  1. Developing a tool for assessing competency in root cause analysis.

    Science.gov (United States)

    Gupta, Priyanka; Varkey, Prathibha

    2009-01-01

    Root cause analysis (RCA) is a tool for identifying the key cause(s) contributing to a sentinel event or near miss. Although training in RCA is gaining popularity in medical education, there is no published literature on valid or reliable methods for assessing competency in the same. A tool for assessing competency in RCA was pilot tested as part of an eight-station Objective Structured Clinical Examination that was conducted at the completion of a three-week quality improvement (QI) curriculum for the Mayo Clinic Preventive Medicine and Endocrinology fellowship programs. As part of the curriculum, fellows completed a QI project to enhance physician communication of the diagnosis and treatment plan at the end of a patient visit. They had a didactic session on RCA, followed by process mapping of the information flow at the project clinic, after which fellows conducted an actual RCA using the Ishikawa fishbone diagram. For the RCA competency assessment, fellows performed an RCA regarding a scenario describing an adverse medication event and provided possible solutions to prevent such errors in the future. All faculty strongly agreed or agreed that they were able to accurately assess competency in RCA using the tool. Interrater reliability for the global competency rating and checklist scoring were 0.96 and 0.85, respectively. Internal consistency (Cronbach's alpha) was 0.76. Six of eight of the fellows found the difficulty level of the test to be optimal. Assessment methods must accompany education programs to ensure that graduates are competent in QI methodologies and are able to apply them effectively in the workplace. The RCA assessment tool was found to be a valid, reliable, feasible, and acceptable method for assessing competency in RCA. Further research is needed to examine its predictive validity and generalizability.

  2. Imaging flow cytometry for phytoplankton analysis.

    Science.gov (United States)

    Dashkova, Veronika; Malashenkov, Dmitry; Poulton, Nicole; Vorobjev, Ivan; Barteneva, Natasha S

    2017-01-01

    This review highlights the concepts and instrumentation of imaging flow cytometry technology, and in particular its use for phytoplankton analysis. Imaging flow cytometry, a hybrid technology combining the speed and statistical capabilities of flow cytometry with the imaging features of microscopy, is rapidly advancing as a cell imaging platform that overcomes many of the limitations of current techniques and has contributed significantly to the advancement of phytoplankton analysis in recent years. This review presents the various instruments relevant to the field and currently used for assessing the composition and abundance of complex phytoplankton communities, size structure determination, biovolume estimation, detection of harmful algal bloom species, evaluation of viability and metabolic activity, and other applications. We also present our data on viability and metabolic assessment of Aphanizomenon sp. cyanobacteria using the ImageStream X Mark II imaging cytometer. Herein, we highlight the immense potential of imaging flow cytometry for microalgal research, but also discuss limitations and future developments.

  3. Recent advances in flow injection analysis.

    Science.gov (United States)

    Trojanowicz, Marek; Kołacińska, Kamila

    2016-04-07

    The dynamic development of methodologies for analytical flow injection measurements over the four decades since their invention has reinforced the solid position of flow analysis in the arsenal of techniques and instrumentation of contemporary chemical analysis. With the number of published scientific papers exceeding 20,000, and advanced instrumentation available for environmental, food, and pharmaceutical analysis, flow analysis is well established as an extremely vital field of modern flow chemistry, which is developing simultaneously with methods of chemical synthesis carried out under flow conditions. This review is based on almost 300 original papers published mostly in the last decade, with special emphasis on novel achievements from the most recent 2-3 years, in order to indicate current development trends of this methodology. Besides the evolution of the design of whole measuring systems, and especially new applications of various detection methods, the review also discusses several implications of progress in nanotechnology and the miniaturization of measuring systems for applications in different fields of modern chemical analysis.

  4. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  5. Least Squares Shadowing for Sensitivity Analysis of Turbulent Fluid Flows

    CERN Document Server

    Blonigan, Patrick; Wang, Qiqi

    2014-01-01

    Computational methods for sensitivity analysis are invaluable tools for aerodynamics research and engineering design. However, traditional sensitivity analysis methods break down when applied to long-time-averaged quantities in turbulent fluid flow fields, specifically those obtained using high-fidelity turbulence simulations. This is because of a number of dynamical properties of turbulent and chaotic fluid flows, most importantly the high sensitivity of the initial value problem, popularly known as the "butterfly effect". The recently developed least squares shadowing (LSS) method avoids the issues encountered by traditional sensitivity analysis methods by approximating the "shadow trajectory" in phase space, avoiding the high sensitivity of the initial value problem. The following paper discusses how the least squares problem associated with LSS is solved. Two methods are presented and are demonstrated on a simulation of homogeneous isotropic turbulence and the Kuramoto-Sivashinsky (KS) equation, a 4th-order c...
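
    The truncated final sentence refers to the Kuramoto-Sivashinsky equation; in its standard one-dimensional form (supplied here for reference, not quoted from the paper) it is the 4th-order chaotic PDE

        \frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x} + \frac{\partial^{2} u}{\partial x^{2}} + \frac{\partial^{4} u}{\partial x^{4}} = 0.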

  6. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    Energy Technology Data Exchange (ETDEWEB)

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

    2009-09-01

    The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

  7. Visualization, Selection, and Analysis of Traffic Flows.

    Science.gov (United States)

    Scheepens, Roeland; Hurter, Christophe; van de Wetering, Huub; van Wijk, Jarke J

    2016-01-01

    Visualization of the trajectories of moving objects leads to dense and cluttered images, which hinders exploration and understanding. It also hinders adding additional visual information, such as direction, and makes it difficult to interactively extract traffic flows, i.e., subsets of trajectories. In this paper we present our approach to visualize traffic flows and provide interaction tools to support their exploration. We show an overview of the traffic using a density map. The directions of traffic flows are visualized using a particle system on top of the density map. The user can extract traffic flows using a novel selection widget that allows for the intuitive selection of an area, and filtering on a range of directions and any additional attributes. Using simple, visual set expressions, the user can construct more complicated selections. The dynamic behaviors of selected flows may then be shown in annotation windows in which they can be interactively explored and compared. We validate our approach through use cases where we explore and analyze the temporal behavior of aircraft and vessel trajectories, e.g., landing and takeoff sequences, or the evolution of flight route density. The aircraft use cases have been developed and validated in collaboration with domain experts.

  8. Knowledge base navigator facilitating regional analysis inter-tool communication.

    Energy Technology Data Exchange (ETDEWEB)

    Hampton, Jeffery Wade; Chael, Eric Paul; Hart, Darren M.; Merchant, Bion John; Chown, Matthew N.

    2004-08-01

    To make use of some portions of the National Nuclear Security Administration (NNSA) Knowledge Base (KB) for which no current operational monitoring applications were available, Sandia National Laboratories have developed a set of prototype regional analysis tools (MatSeis, EventID Tool, CodaMag Tool, PhaseMatch Tool, Dendro Tool, Infra Tool, etc.), and we continue to maintain and improve these. Individually, these tools have proven effective in addressing specific monitoring tasks, but collectively their number and variety tend to overwhelm KB users, so we developed another application - the KB Navigator - to launch the tools and facilitate their use for real monitoring tasks. The KB Navigator is a flexible, extensible java application that includes a browser for KB data content, as well as support to launch any of the regional analysis tools. In this paper, we will discuss the latest versions of KB Navigator and the regional analysis tools, with special emphasis on the new overarching inter-tool communication methodology that we have developed to make the KB Navigator and the tools function together seamlessly. We use a peer-to-peer communication model, which allows any tool to communicate with any other. The messages themselves are passed as serialized XML, and the conversion from Java to XML (and vice versa) is done using Java Architecture for XML Binding (JAXB).

  9. Tools for Knowledge Analysis, Synthesis, and Sharing

    Science.gov (United States)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  10. Data flow analysis theory and practice

    CERN Document Server

    Khedker, Uday; Sathe, Bageshri

    2009-01-01

    Data flow analysis is used to discover information for a wide variety of useful applications, ranging from compiler optimizations to software engineering and verification. Modern compilers apply it to produce performance-maximizing code, and software engineers use it to re-engineer or reverse engineer programs and verify the integrity of their programs.  Supplementary Online Materials to Strengthen Understanding Unlike most comparable books, many of which are limited to bit vector frameworks and classical constant propagation, Data Flow Analysis: Theory and Practice offers comprehensive covera

  11. The Fast-Flow Discharge Reactor as an Undergraduate Instructional Tool.

    Science.gov (United States)

    Provencher, G. M.

    1981-01-01

    A fast-flow discharge reactor has been used in an analytical chemistry demonstration of gas phase titration, in inorganic preparative chemistry, and in physical chemistry as a "practice" vacuum line, kinetic reactor, and spectroscopic source as well as an undergraduate research tool. (SK)

  12. Forensic analysis of video steganography tools

    Directory of Open Access Journals (Sweden)

    Thomas Sloan

    2015-05-01

    Steganography is the art and science of concealing information in such a way that only the sender and intended recipient of a message are aware of its presence. Digital steganography has been used in the past on a variety of media including executable files, audio, text, games and, notably, images. Additionally, there is increasing research interest in the use of video as a medium for steganography, due to its pervasive nature and diverse embedding capabilities. In this work, we examine the embedding algorithms and other security characteristics of several video steganography tools. We show that all of them feature basic and severe security weaknesses. This is potentially a very serious threat to the security, privacy and anonymity of their users. It is important to highlight that most steganography users have perfectly legal and ethical reasons to employ it. Some common scenarios would include citizens in oppressive regimes whose freedom of speech is compromised, people trying to avoid massive surveillance or censorship, political activists, whistle blowers, journalists, etc. As a result of our findings, we strongly recommend ceasing any use of these tools, removing any content that may have been hidden, and deleting any carriers stored, exchanged and/or uploaded online. For many of these tools, carrier files will be trivial to detect, potentially compromising any hidden data and the parties involved in the communication. We finish this work by presenting our steganalytic results, which highlight a very poor current state of the art in practical video steganography tools. There is unfortunately a complete lack of secure and publicly available tools, and even commercial tools offer very poor security. We therefore encourage the steganography community to work towards the development of more secure and accessible video steganography tools, and to make them available to the general public. The results presented in this work can also be seen as a useful

  13. Tracking of Laboratory Debris Flow Fronts with Image Analysis

    Science.gov (United States)

    Queiroz de Oliveira, Gustavo; Kulisch, Helmut; Fischer, Jan-Thomas; Scheidl, Christian; Pudasaini, Shiva P.

    2015-04-01

    Image analysis is applied to track the time evolution of rapid debris flow fronts and their velocities in laboratory experiments. These experiments are part of the project avaflow.org, which intends to develop a GIS-based open source computational tool to describe a wide spectrum of rapid geophysical mass flows, including avalanches and real two-phase debris flows down complex natural slopes. The laboratory model consists of a large rectangular channel, 1.4 m wide and 10 m long, with adjustable inclination and other flow configurations. The setup allows investigation of different two-phase material compositions, including large fluid fractions, and its large size enables transfer of the results to large-scale natural events with increased measurement accuracy. The images are captured by a high-speed digital camera, and the fronts are tracked to obtain data in debris flow experiments. A reflectance analysis detects the debris front in every image frame; its presence changes the reflectance at a certain pixel location during the flow. The accuracy of the measurements was improved with a camera calibration procedure: the systematic distortions of the camera lens are modeled in terms of radial and tangential parameters, and the calibration estimates optimal values for these parameters, yielding physically correct, undistorted image pixels. We then map the images onto a physical model geometry by projective photogrammetry, in which the image coordinates are connected with the object space coordinates of the flow. Finally, the physical model geometry is rewritten in the direct linear transformation form, which allows conversion from one coordinate system to another. With this approach, the debris front position can be estimated by combining the reflectance analysis, the calibration and the linear transformation. The consecutive debris front
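
    The calibration step described here follows the standard radial/tangential lens-distortion model. A minimal sketch of that pipeline using OpenCV is shown below; the chessboard pattern size and file names are placeholders, not details from the experiment.

```python
# Sketch of camera calibration with a radial/tangential lens model, as
# described in the abstract, using OpenCV. The chessboard size and file
# names are placeholders, not taken from the original experiment.
import glob
import cv2
import numpy as np

pattern = (9, 6)  # inner corners of the calibration chessboard
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for fname in glob.glob("calib_*.png"):
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Estimate intrinsics plus radial/tangential distortion coefficients.
ret, mtx, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)

# Undistort a frame before mapping pixels to flume coordinates.
frame = cv2.imread("flow_frame.png")
undistorted = cv2.undistort(frame, mtx, dist)
```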

  14. Through flow analysis within axial flow turbomachinery blade rows

    Science.gov (United States)

    Girigoswami, H.

    1986-09-01

    Using Katsanis' Through Flow Code, inviscid flow through an axial flow compressor rotor blade row as well as flow through inlet guide vanes is analyzed, and computed parameters such as the meridional velocity distribution, the axial velocity distribution along radial lines, and the velocity distribution over the blade surfaces are presented.

  15. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
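
    The paper derives its FDTD equations in spherical coordinates, which are not reproduced here; as a generic illustration of the leapfrog update at the heart of any FDTD scheme, the sketch below advances a one-dimensional Cartesian grid with a Gaussian, UWB-like soft source. Grid size and pulse parameters are arbitrary.

```python
# Illustrative 1D Cartesian FDTD (Yee) update loop with a Gaussian,
# UWB-like pulse; this is a generic sketch, not the paper's
# spherical-coordinate formulation. Grid and pulse parameters are arbitrary.
import numpy as np

nz, nt = 400, 1000
ez = np.zeros(nz)        # electric field on integer grid points
hy = np.zeros(nz - 1)    # magnetic field, staggered half a cell
courant = 0.5            # normalized time step (stability requires <= 1)

for n in range(nt):
    # Leapfrog: update H from the curl of E, then E from the curl of H.
    hy += courant * (ez[1:] - ez[:-1])
    ez[1:-1] += courant * (hy[1:] - hy[:-1])
    # Gaussian soft source near the left boundary.
    ez[20] += np.exp(-((n - 60) / 20.0) ** 2)
```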

  16. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri

    2011-01-01

    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order to ensure the finiteness of the protocol state-spaces while still being able to verify interesting protocol properties. The translations for different kinds of communication media have been implemented and successfully tested, among others, on agreement protocols from WS-Business Activity.
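
    As a toy illustration of the modelling idea (not the actual UPPAAL translation), the sketch below composes two communicating state machines over a bounded channel and enumerates the reachable states; bounding the channel is what keeps the state space finite. The protocol tables are invented.

```python
# Toy sketch of the idea behind the tool chain: two state machines
# communicating over a bounded channel, composed and explored for
# reachability. The protocol tables are invented for illustration.
from collections import deque

# (state, action) -> next state; "!m" sends m, "?m" receives m
sender   = {("s0", "!req"): "s1", ("s1", "?ack"): "s0"}
receiver = {("r0", "?req"): "r1", ("r1", "!ack"): "r0"}
BOUND = 1  # channel capacity; keeps the state space finite

def step(machine, state, chan):
    for (src, act), dst in machine.items():
        if src != state:
            continue
        if act.startswith("!") and len(chan) < BOUND:
            yield dst, chan + (act[1:],)          # enqueue a message
        elif act.startswith("?") and chan and chan[0] == act[1:]:
            yield dst, chan[1:]                   # consume a message

seen, queue = set(), deque([("s0", "r0", ())])
while queue:
    s, r, chan = queue.popleft()
    if (s, r, chan) in seen:
        continue
    seen.add((s, r, chan))
    for s2, c2 in step(sender, s, chan):
        queue.append((s2, r, c2))
    for r2, c2 in step(receiver, r, chan):
        queue.append((s, r2, c2))

print("reachable states:", sorted(seen))
```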

  17. Generalized Geophysical Retrieval and Analysis Tool for Planetary Atmospheres Project

    Data.gov (United States)

    National Aeronautics and Space Administration — CPI proposes to develop an innovative, generalized retrieval algorithm and analysis tool (GRANT) that will facilitate analysis of remote sensing data from both...

  18. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  19. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
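
    The authors' model is considerably richer than this, but the flavour of a likelihood ratio test on tool mark comparisons can be sketched as follows: two samples of similarity scores (lab-vs-lab and lab-vs-field) are compared under a common-mean versus separate-means normal model. All scores below are simulated.

```python
# Minimal sketch of a likelihood-ratio comparison between two samples
# of tool mark similarity scores, assuming normality. The scores here
# are simulated; the paper's actual model is more elaborate.
import numpy as np

rng = np.random.default_rng(0)
lab_vs_lab   = rng.normal(0.80, 0.05, size=30)  # known-match comparisons
lab_vs_field = rng.normal(0.76, 0.05, size=10)  # questioned comparisons

def normal_loglik(x, mu, sigma):
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (x - mu) ** 2 / (2 * sigma**2))

pooled = np.concatenate([lab_vs_lab, lab_vs_field])
sigma = pooled.std(ddof=1)

# H0: both samples share one mean; H1: each sample has its own mean.
ll_h0 = normal_loglik(pooled, pooled.mean(), sigma)
ll_h1 = (normal_loglik(lab_vs_lab, lab_vs_lab.mean(), sigma)
         + normal_loglik(lab_vs_field, lab_vs_field.mean(), sigma))

lr = 2 * (ll_h1 - ll_h0)  # compare against a chi-squared(1) threshold
print(f"likelihood ratio statistic: {lr:.2f}")
```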

  20. FlowMax: A Computational Tool for Maximum Likelihood Deconvolution of CFSE Time Courses.

    Directory of Open Access Journals (Sweden)

    Maxim Nikolaievich Shokhirev

    The immune response is a concerted dynamic multi-cellular process. Upon infection, the dynamics of lymphocyte populations are an aggregate of molecular processes that determine the activation, division, and longevity of individual cells. The timing of these single-cell processes is remarkably widely distributed with some cells undergoing their third division while others undergo their first. High cell-to-cell variability and technical noise pose challenges for interpreting popular dye-dilution experiments objectively. It remains an unresolved challenge to avoid under- or over-interpretation of such data when phenotyping gene-targeted mouse models or patient samples. Here we develop and characterize a computational methodology to parameterize a cell population model in the context of noisy dye-dilution data. To enable objective interpretation of model fits, our method estimates fit sensitivity and redundancy by stochastically sampling the solution landscape, calculating parameter sensitivities, and clustering to determine the maximum-likelihood solution ranges. Our methodology accounts for both technical and biological variability by using a cell fluorescence model as an adaptor during population model fitting, resulting in improved fit accuracy without the need for ad hoc objective functions. We have incorporated our methodology into an integrated phenotyping tool, FlowMax, and used it to analyze B cells from two NFκB knockout mice with distinct phenotypes; we not only confirm previously published findings at a fraction of the expended effort and cost, but reveal a novel phenotype of nfkb1/p105/50 in limiting the proliferative capacity of B cells following B-cell receptor stimulation. In addition to complementing experimental work, FlowMax is suitable for high throughput analysis of dye dilution studies within clinical and pharmacological screens with objective and quantitative conclusions.
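
    FlowMax's full pipeline is not reproduced here, but the cell-fluorescence model at its core rests on a simple observation: CFSE intensity roughly halves at each division, so a measured population is approximately a mixture of per-generation peaks. The sketch below generates such a mixture with invented parameters; a fit would run this model in reverse to recover the generation fractions.

```python
# Sketch of the cell-fluorescence idea underlying CFSE deconvolution:
# dye intensity roughly halves at each division, so a population is a
# mixture of log-normal generation peaks. All parameters are invented.
import numpy as np

rng = np.random.default_rng(1)
initial_intensity = 10_000.0
cv = 0.25                                   # cell-to-cell dye variability
gen_fractions = [0.1, 0.2, 0.4, 0.2, 0.1]   # fraction in generations 0..4

cells = []
for gen, frac in enumerate(gen_fractions):
    n = int(5000 * frac)
    mean = initial_intensity / 2 ** gen     # halving per division
    cells.append(rng.lognormal(np.log(mean), cv, size=n))
fluorescence = np.concatenate(cells)

# A fit would recover gen_fractions from the histogram of log-intensity.
hist, edges = np.histogram(np.log10(fluorescence), bins=100)
```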

  1. Information flow analysis of interactome networks.

    Directory of Open Access Journals (Sweden)

    Patrycja Vasilyev Missiuro

    2009-04-01

    Recent studies of cellular networks have revealed modular organizations of genes and proteins. For example, in interactome networks, a module refers to a group of interacting proteins that form molecular complexes and/or biochemical pathways and together mediate a biological process. However, it is still poorly understood how biological information is transmitted between different modules. We have developed information flow analysis, a new computational approach that identifies proteins central to the transmission of biological information throughout the network. In the information flow analysis, we represent an interactome network as an electrical circuit, where interactions are modeled as resistors and proteins as interconnecting junctions. Construing the propagation of biological signals as flow of electrical current, our method calculates an information flow score for every protein. Unlike previous metrics of network centrality such as degree or betweenness that only consider topological features, our approach incorporates confidence scores of protein-protein interactions and automatically considers all possible paths in a network when evaluating the importance of each protein. We apply our method to the interactome networks of Saccharomyces cerevisiae and Caenorhabditis elegans. We find that the likelihood of observing lethality and pleiotropy when a protein is eliminated is positively correlated with the protein's information flow score. Even among proteins of low degree or low betweenness, high information scores serve as a strong predictor of loss-of-function lethality or pleiotropy. The correlation between information flow scores and phenotypes supports our hypothesis that the proteins of high information flow reside in central positions in interactome networks. We also show that the ranks of information flow scores are more consistent than that of betweenness when a large amount of noisy data is added to an interactome. Finally, we
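
    A compact sketch of the circuit analogy: confidence-weighted interactions act as conductances, and injecting unit current between two proteins reduces to solving the graph Laplacian for node potentials. The five-protein network below is invented; a real interactome would substitute its confidence-scored adjacency matrix.

```python
# Sketch of the circuit analogy: interaction confidences act as
# conductances, and currents through each protein are found by solving
# the graph Laplacian. The five-protein toy network is invented.
import numpy as np

# symmetric conductance (confidence) matrix for proteins 0..4
W = np.array([[0, .9, .5,  0,  0],
              [.9, 0, .8, .3,  0],
              [.5, .8, 0, .7, .2],
              [0, .3, .7,  0, .9],
              [0,  0, .2, .9,  0]])
L = np.diag(W.sum(axis=1)) - W  # graph Laplacian

# Inject unit current at protein 0, extract it at protein 4.
b = np.zeros(5)
b[0], b[4] = 1.0, -1.0
v = np.linalg.pinv(L) @ b       # node potentials

# Current through each node: half the absolute current over its edges.
edge_current = W * (v[:, None] - v[None, :])
node_current = 0.5 * np.abs(edge_current).sum(axis=1)
print(np.round(node_current, 3))
```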

  2. Control Volume Analysis, Entropy Balance and the Entropy Production in Flow Systems

    OpenAIRE

    Niven, Robert K.; Noack, Bernd R.

    2014-01-01

    This chapter concerns "control volume analysis", the standard engineering tool for the analysis of flow systems, and its application to entropy balance calculations. Firstly, the principles of control volume analysis are enunciated and applied to flows of conserved quantities (e.g. mass, momentum, energy) through a control volume, giving integral (Reynolds transport theorem) and differential forms of the conservation equations. Several definitions of steady state are discussed. The concept of...
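
    In standard notation (the chapter's own symbols may differ), the integral conservation law obtained from the Reynolds transport theorem for a quantity with specific density φ reads:

```latex
% Integral (Reynolds transport theorem) form of a conservation law for a
% conserved quantity with specific (per-unit-mass) density \varphi:
\frac{\mathrm{d}}{\mathrm{d}t}\int_{CV} \rho\,\varphi\,\mathrm{d}V
  + \oint_{CS} \rho\,\varphi\,(\mathbf{u}\cdot\mathbf{n})\,\mathrm{d}A
  = \dot{S}_{\varphi}
% where CV is the control volume, CS its bounding surface, \mathbf{u} the
% fluid velocity, \mathbf{n} the outward normal, and \dot{S}_{\varphi}
% the net source of the quantity inside the control volume.
```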

  3. Animation of interactive fluid flow visualization tools on a data parallel machine

    Energy Technology Data Exchange (ETDEWEB)

    Sethian, J.A. (California Univ., Berkeley, CA (USA). Dept. of Mathematics); Salem, J.B. (Thinking Machines Corp., Cambridge, MA (USA))

    1989-01-01

    The authors describe a new graphics environment for essentially real-time interactive visualization of computational fluid mechanics. The researcher may interactively examine fluid data on a graphics display using animated flow visualization diagnostics that mimic those in the experimental laboratory. These tools include display of moving color contours for scalar fields, smoke or dye injection of passive particles to identify coherent flow structures, and bubble wire tracers for velocity profiles, as well as three-dimensional interactive rotation, zoom and pan. The system is implemented on a data parallel supercomputer attached to a framebuffer. Since most fluid visualization techniques are highly parallel in nature, this allows rapid animation of fluid motion. The authors demonstrate their interactive graphics fluid flow system by analyzing data generated by numerical simulations of viscous, incompressible, laminar and turbulent flow over a backward-facing step and in a closed cavity. Input parameters are menu-driven, and images are updated at nine frames per second.
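
    The smoke- and dye-injection diagnostics amount to advecting passive particles through the stored velocity field. A minimal sketch with a midpoint (RK2) integrator is shown below; an analytic swirl field stands in for the simulation data used by the authors.

```python
# Sketch of the passive "smoke injection" diagnostic: particles advected
# through a velocity field with a midpoint (RK2) step. An analytic swirl
# field stands in for the simulation data used in the paper.
import numpy as np

def velocity(p):
    # toy solenoidal swirl around the origin
    x, y = p[:, 0], p[:, 1]
    return np.column_stack([-y, x])

rng = np.random.default_rng(2)
particles = rng.uniform(-0.1, 0.1, size=(500, 2)) + [1.0, 0.0]
dt = 0.01

for _ in range(200):
    mid = particles + 0.5 * dt * velocity(particles)
    particles += dt * velocity(mid)   # midpoint rule keeps orbits tight
```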

  4. Star-Shaped Fluid Flow Tool for Use in Making Differential Measurements

    Science.gov (United States)

    England, John Dwight (Inventor); Kelley, Anthony R. (Inventor); Cronise, Raymond J. (Inventor)

    2014-01-01

    A fluid flow tool's plate-like structure has a ring portion defining a flow hole, a support portion extending radially away from the ring portion and adapted to be coupled to conduit wall, and extensions extending radially away from the ring portion such that a periphery of the plate-like structure is defined by the extensions and trough regions between adjacent extensions. One or more ports formed in the ring portion are in fluid communication with the flow hole. A first manifold in the plate-like structure is in fluid communication with each port communicating with the flow hole. One or more ports are formed in the periphery of the plate-like structure. A second manifold in the plate-like structure is in fluid communication with each port formed in the periphery. The first and second manifolds extend through the plate-like structure to terminate and be accessible at the conduit wall.

  5. Rule-Based Multidisciplinary Tool for Unsteady Reacting Real-Fluid Flows Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Loci-STREAM is a CFD-based, multidisciplinary, high-fidelity design and analysis tool resulting from Phase I work whose objectives were: (a) to demonstrate the...

  7. Lean production tools and decision latitude enable conditions for innovative learning in organizations: a multilevel analysis.

    Science.gov (United States)

    Fagerlind Ståhl, Anna-Carin; Gustavsson, Maria; Karlsson, Nadine; Johansson, Gun; Ekberg, Kerstin

    2015-03-01

    The effect of lean production on conditions for learning is debated. This study aimed to investigate how tools inspired by lean production (standardization, resource reduction, visual monitoring, housekeeping, value flow analysis) were associated with an innovative learning climate and with collective dispersion of ideas in organizations, and whether decision latitude contributed to these associations. A questionnaire was sent out to employees in public, private, production and service organizations (n = 4442). Multilevel linear regression analyses were used. Use of lean tools and decision latitude were positively associated with an innovative learning climate and collective dispersion of ideas. A low degree of decision latitude was a modifier in the association to collective dispersion of ideas. Lean tools can enable shared understanding and collective spreading of ideas, needed for the development of work processes, especially when decision latitude is low. Value flow analysis played a pivotal role in the associations.
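
    The abstract does not give the exact model specification; the sketch below shows what a comparable two-level (random-intercept) regression looks like with statsmodels, with the data file and column names as placeholders rather than the study's actual variables.

```python
# Hedged sketch of a comparable multilevel (random-intercept) regression
# with statsmodels; the data frame and column names are placeholders,
# not the study's actual variables or model specification.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey.csv")  # one row per employee, placeholder file

# Innovative learning climate regressed on lean tool use and decision
# latitude, with a random intercept per organization (the second level).
model = smf.mixedlm("learning_climate ~ lean_tools * decision_latitude",
                    data=df, groups=df["organization"])
result = model.fit()
print(result.summary())
```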

  8. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  10. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  11. Navigating freely-available software tools for metabolomics analysis.

    Science.gov (United States)

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas and in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. Our objective is to compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks, e.g. peak picking.

  12. Load flow analysis using decoupled fuzzy load flow under critical ...

    African Journals Online (AJOL)

    ... of power system, reliable fuzzy load flow is developed to overcome the limitations of the ... of power mismatches are taken as two inputs for the fuzzy logic controller ... Programming Based Load Flow Algorithm For Systems Containing Unified ...

  13. OpenFlow Deployment and Concept Analysis

    Directory of Open Access Journals (Sweden)

    Tomas Hegr

    2013-01-01

    Terms such as SDN and OpenFlow (OF) are often used in the research and development of data networks. This paper deals with the analysis of the current state of OpenFlow protocol deployment options, as it is the only real representative protocol that enables the implementation of Software Defined Networking outside the academic world. An insight into the current state of the OpenFlow specification development at various levels is given, and the possible limitations associated with this concept in conjunction with the latest version (1.3) of the specification published by ONF are also presented. In the conclusion, a demonstrative security application is presented, addressing the lack of IPv6 support in real network devices, since most of today's switches and controllers support only OF v1.0.

  14. Analysis of groundwater flow beneath ice sheets

    Energy Technology Data Exchange (ETDEWEB)

    Boulton, G. S.; Zatsepin, S.; Maillot, B. [Univ. of Edinburgh (United Kingdom). Dept. of Geology and Geophysics

    2001-03-01

    The large-scale pattern of subglacial groundwater flow beneath European ice sheets was analysed in a previous report, based on a two-dimensional flowline model. In this report, the analysis is extended to three dimensions by exploring the interactions between groundwater and tunnel flow. A theory is developed which suggests that the large-scale geometry of the hydraulic system beneath an ice sheet is a coupled, self-organising system. In this system, the pressure distribution along tunnels is a function of discharge derived from basal meltwater delivered to tunnels by groundwater flow, while the pressure along tunnels itself sets the base pressure which determines the geometry of catchments and flow towards the tunnel. The large-scale geometry of tunnel distribution is a product of the pattern of basal meltwater production and the transmissive properties of the bed. The tunnel discharge from the ice margin of the glacier, its seasonal fluctuation and the sedimentary characteristics of eskers are largely determined by the discharge of surface meltwater which penetrates to the bed in the terminal zone. The theory explains many of the characteristics of esker systems and can account for tunnel valleys. It is concluded that the large-scale hydraulic regime beneath ice sheets is largely a consequence of groundwater/tunnel flow interactions and is essentially similar to non-glacial hydraulic regimes. Experimental data from an Icelandic glacier, which demonstrate measured relationships between subglacial tunnel flow and groundwater flow during the transition from summer to winter and which support the general conclusions of the theory, are summarised in an appendix.

  15. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed, assuming laminar blood flow and compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume code, coupled with SolidWorks, a modeling package, was employed for preprocessing, simulation and postprocessing of all the models. The analysis consists mainly of a fluid-dynamics calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models (e.g., T-branches and angled geometries) were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected, given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.

  16. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.

  17. A Lexical Analysis Tool with Ambiguity Support

    CERN Document Server

    Quesada, Luis; Cortijo, Francisco J

    2012-01-01

    Lexical ambiguities naturally arise in languages. We present Lamb, a lexical analyzer that produces a lexical analysis graph describing all the possible sequences of tokens that can be found within the input string. Parsers can process such lexical analysis graphs and discard any sequence of tokens that does not produce a valid syntactic sentence, therefore performing, together with Lamb, a context-sensitive lexical analysis in lexically-ambiguous language specifications.

  18. Fully Parallel MHD Stability Analysis Tool

    Science.gov (United States)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2015-11-01

    Progress on the full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria within MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulating MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and of the inverse iteration algorithm implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. The solution of the eigenvalue problem is parallelized by repeating the steps of the present MARS algorithm using parallel libraries and procedures. Results of the MARS parallelization and of the development of a new fixed-boundary equilibrium code adapted for MARS input will be reported. Work is supported by the U.S. DOE SBIR program.

  19. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Science.gov (United States)

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and easily extensible. A Python interface to the core classes is also provided, further facilitating tool development.

  20. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process, which opens up new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner- or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper, the problem of a vague or non-existing link between the digital design tools used by architects and designers and the analysis tools developed by and for engineers is considered. An example of a prototype for a digital conceptual design tool with integrated real time structural analysis is presented and compared with a more common Building Information Modelling (BIM) approach. It is concluded that a digital conceptual design tool with embedded real time structural analysis could...

  1. Tailings dam-break flow - Analysis of sediment transport

    Science.gov (United States)

    Aleixo, Rui; Altinakar, Mustafa

    2015-04-01

    (Abstract truncated in source; only the closing references survive:) Aleixo, R., Ozeren, Y., Altinakar, M. and Wren, D. (2014a) Velocity measurements using particle tracking in tailings dam failure experiments, Proceedings of the 3rd IAHR-Europe Conference, Porto, Portugal; Aleixo, R., Ozeren, Y., Altinakar, M. (2014b) Tailings dam-break analysis by means of a combined PIV-PTV tool, Proceedings of the River Flow Conference, Lausanne, Switzerland.

  2. CFD Analysis in Advance of the NASA Juncture Flow Experiment

    Science.gov (United States)

    Lee, H. C.; Pulliam, T. H.; Neuhart, D. H.; Kegerise, M. A.

    2017-01-01

    NASA through its Transformational Tools and Technologies Project (TTT) under the Advanced Air Vehicle Program, is supporting a substantial effort to investigate the formation and origin of separation bubbles found on wing-body juncture zones. The flow behavior in these regions is highly complex, difficult to measure experimentally, and challenging to model numerically. Multiple wing configurations were designed and evaluated using Computational Fluid Dynamics (CFD), and a series of wind tunnel risk reduction tests were performed to further down-select the candidates for the final experiment. This paper documents the CFD analysis done in conjunction with the 6 percent scale risk reduction experiment performed in NASA Langley's 14- by 22-Foot Subsonic Tunnel. The combined CFD and wind tunnel results ultimately helped the Juncture Flow committee select the wing configurations for the final experiment.

  3. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended to be used as part of an industrial systems design cycle. Structural analysis is a graph-based technique in which principal relations between variables express the system's properties. Measured and controlled quantities in the system are related to variables through functional relations, which need only be stated as names; their explicit composition need not be described to the tool. The user enters a list of these relations that together describe the entirety of the system. The variables and functional relations constitute the system's structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid. In a structure graph, this is seen as the disappearance of one or more nodes...
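
    A toy sketch of the structure-graph idea: functional relations and variables form the two sides of a bipartite graph, and a maximum matching indicates which relation can be used to compute which variable; removing a node models a fault. The relations below are invented, and networkx stands in for SaTool's own machinery.

```python
# Toy sketch of a structure graph: functional relations and variables as
# the two sides of a bipartite graph, with a maximum matching showing
# which relation can compute which variable. Relations are invented.
import networkx as nx

relations = {          # relation name -> variables it constrains
    "f1": ["x1", "x2"],
    "f2": ["x2", "x3"],
    "f3": ["x1", "x3"],
}

G = nx.Graph()
G.add_nodes_from(relations, bipartite=0)
for rel, variables in relations.items():
    for v in variables:
        G.add_edge(rel, v)

matching = nx.bipartite.hopcroft_karp_matching(G, top_nodes=relations)
print({r: matching[r] for r in relations if r in matching})

# Removing a relation models a fault: the node disappears and the
# matching (hence the computability of some variable) may be lost.
```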

  4. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    Science.gov (United States)

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  6. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were...

  7. Healthcare BI: a tool for meaningful analysis.

    Science.gov (United States)

    Rohloff, Rose

    2011-05-01

    Implementing an effective business intelligence (BI) system requires organizationwide preparation and education to allow for meaningful analysis of information. Hospital executives should take steps to ensure that: staff entering data are proficient in how the data are to be used for decision making, and integration is based on clean data from primary sources of entry; managers have the business acumen required for effective data analysis; and decision makers understand how multidimensional BI offers new ways of analysis that represent significant improvements over historical approaches using static reporting.

  8. Flow analysis of the ophthalmic artery

    Energy Technology Data Exchange (ETDEWEB)

    Harada, Kuniaki; Hashimoto, Masato; Bandoh, Michio; Odawara, Yoshihiro; Kamagata, Masaki; Shirase, Ryuji [Sapporo Medical Univ. (Japan). Hospital

    2003-02-01

    The purpose of this study was to analyze the hemodynamics of ophthalmic artery flow using phase contrast MR angiography (PC-MRA). A total of 14 eyes from 10 normal volunteers and a patient with normal tension glaucoma (NTG) were analyzed. The optimal conditions were repetition time (TR)/echo time (TE)/flip angle (FA)/NEX = 40 ms/minimum/90 deg/2, field of view (FOV) = 6 cm, matrix size = 256 x 256. The resistive index (RI) and pulsatility index (PI) values were significantly raised in the patient with NTG when compared to the control group. We therefore believe that PC-MRA may be a useful clinical tool for the assessment of the mechanism of NTG. (author)
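
    The two indices quoted have standard definitions in flow imaging, given here in the usual convention (presumably computed in this study from the PC-MRA velocity waveforms):

```latex
% Standard definitions of the resistive and pulsatility indices:
\mathrm{RI} = \frac{V_{\mathrm{sys}} - V_{\mathrm{dia}}}{V_{\mathrm{sys}}},
\qquad
\mathrm{PI} = \frac{V_{\mathrm{sys}} - V_{\mathrm{dia}}}{V_{\mathrm{mean}}}
% with V_sys the peak systolic velocity, V_dia the end-diastolic velocity,
% and V_mean the time-averaged velocity over the cardiac cycle.
```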

  9. Performance analysis tool (PATO): Development and preliminary validation

    National Research Council Canada - National Science Library

    Fernando Martins; Filipe Clemente; Frutuoso Silva

    2017-01-01

    The Performance Analysis Tool (PATO) software was built with the aim of quickly codifying relationships between players and building the adjacency matrices that can be used to test the network measures...
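
    PATO itself is not shown in the abstract; as a sketch of its core data structure, the example below builds an adjacency matrix of pass counts between players and computes a standard weighted network measure with networkx. The players and pass counts are invented.

```python
# Sketch of the core PATO data structure: an adjacency matrix of pass
# counts between players, analyzed with standard network measures via
# networkx. The pass counts below are invented.
import numpy as np
import networkx as nx

players = ["GK", "DF", "MF", "FW"]
passes = np.array([[0, 5, 2, 0],     # row = passer, column = receiver
                   [3, 0, 7, 1],
                   [1, 4, 0, 6],
                   [0, 1, 3, 0]])

G = nx.from_numpy_array(passes, create_using=nx.DiGraph)
G = nx.relabel_nodes(G, dict(enumerate(players)))

# Weighted out-degree: who channels the most passes.
strength = dict(G.out_degree(weight="weight"))
print(strength)  # e.g. {'GK': 7, 'DF': 11, 'MF': 11, 'FW': 4}
```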

  10. Surface Operations Data Analysis and Adaptation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  11. The environment power system analysis tool development program

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a data base for storing system designs and results of analysis.

  12. Bayesian data analysis tools for atomic physics

    CERN Document Server

    Trassinelli, Martino

    2016-01-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from the basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or not of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We show also how to calculate the probability distribution of the main spectral component without having to determine uniquely the spectrum modeling. For these two studies, we implement the program Nested fit to calculate the different probability distrib...
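
    A minimal sketch of the model-comparison idea described here: the Bayesian evidence of a flat-background model versus a background-plus-satellite-line model, each obtained by integrating the likelihood over a parameter grid. The data and priors are invented, and this brute-force grid stands in for the nested sampling used by the Nested fit program.

```python
# Minimal sketch of Bayesian model comparison via the evidence:
# a flat background vs. background plus a satellite line, each
# likelihood integrated over a parameter grid. Data and priors invented.
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 50)
y = (5.0 + 2.0 * np.exp(-0.5 * ((x - 4.0) / 0.5) ** 2)
     + rng.normal(0, 1, x.size))

def loglik(model):
    return -0.5 * np.sum((y - model) ** 2)  # unit-variance Gaussian noise

# Model 0: constant background b, flat prior on a grid.
bs = np.linspace(0, 10, 200)
ev0 = np.mean([np.exp(loglik(np.full_like(x, b))) for b in bs])

# Model 1: background plus a line of amplitude a at a fixed position.
amps = np.linspace(0, 5, 100)
line = np.exp(-0.5 * ((x - 4.0) / 0.5) ** 2)
ev1 = np.mean([np.exp(loglik(b + a * line)) for b in bs for a in amps])

print("Bayes factor (line vs no line):", ev1 / ev0)
```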

  13. Network Tools for the Analysis of Proteomic Data.

    Science.gov (United States)

    Chisanga, David; Keerthikumar, Shivakumar; Mathivanan, Suresh; Chilamkurti, Naveen

    2017-01-01

    Recent advancements in high-throughput technologies such as mass spectrometry have led to an increase in the rate at which data are generated and accumulated. As a result, standard statistical methods no longer suffice for analyzing such gigantic amounts of data. Network analysis, the evaluation of how nodes relate to one another, has over the years become an integral tool for analyzing high-throughput proteomic data, as it provides a structure that helps reduce the complexity of the underlying data. Computational tools, including pathway databases and network building tools, have therefore been developed to store, analyze, interpret, and learn from proteomics data. These tools enable the visualization of proteins as networks of signaling, regulatory, and biochemical interactions. In this chapter, we provide an overview of networks and network theory fundamentals for the analysis of proteomics data. We further provide an overview of interaction databases and network tools which are frequently used for analyzing proteomics data.

  14. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game. This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  15. Numerical flow analysis of hydro power stations

    Science.gov (United States)

    Ostermann, Lars; Seidel, Christian

    2017-07-01

    For the hydraulic engineering and design of hydro power stations and their hydraulic optimisation, mainly experimental studies of the physical submodel or of the full model at the hydraulics laboratory are carried out. The flow analysis is partially done by means of computational fluid dynamics based on 2D and 3D methods, a useful supplement to the experimental studies. For the optimisation of hydro power stations, fast numerical methods are appropriate to study the influence of a wide field of optimisation parameters and flow states. Among the 2D methods, those based on the shallow water equations are especially suitable for this field of application, since a large body of experience, verified by in-situ measurements, exists owing to the wide use of this method for problems in hydraulic engineering. As necessary, a 3D model may subsequently supplement the optimisation of the hydro power station. The quality of the results of the 2D method for the optimisation of hydro power plants is investigated by comparing the results of the optimisation of the hydraulic dividing pier with the results of the 3D flow analysis.
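
    The 2D methods referred to solve the shallow water equations; in one dimension (standard textbook form, with a Manning friction source term) these read:

```latex
% One-dimensional shallow water equations (standard form; the 2D codes
% referenced solve the two-dimensional analogue):
\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} = 0,
\qquad
\frac{\partial (hu)}{\partial t}
  + \frac{\partial}{\partial x}\!\left(hu^{2} + \tfrac{1}{2}gh^{2}\right)
  = -\,gh\,\frac{\partial z_b}{\partial x}
    - g\,\frac{n^{2}\,u\,|u|}{h^{1/3}}
% with h the flow depth, u the depth-averaged velocity, z_b the bed
% elevation, n Manning's roughness coefficient and g gravity.
```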

  16. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  17. Interactive retinal blood flow analysis of the macular region.

    Science.gov (United States)

    Tian, Jing; Somfai, Gábor Márk; Campagnoli, Thalmon R; Smiddy, William E; Debuc, Delia Cabrera

    2016-03-01

    The study of retinal hemodynamics plays an important role in understanding the onset and progression of diabetic retinopathy. In this work, we developed an interactive retinal analysis tool to quantitatively measure the blood flow velocity (BFV) and blood flow rate (BFR) in the macular region using the Retinal Function Imager (RFI). By employing a high-definition stroboscopic fundus camera, the RFI device is able to assess retinal blood flow characteristics in vivo. However, measurements of BFV with a user-guided vessel segmentation tool may induce significant inter-observer differences, and BFR is not provided by the built-in software; our tool therefore assesses both BFV and BFR in the macular region. Optical coherence tomography data were registered with the RFI image to locate the fovea accurately. The boundaries of the vessels were delineated on a motion-contrast-enhanced image, and BFV was computed by maximizing the cross-correlation of pixel intensities in a ratio video. Furthermore, we were able to calculate the BFR in absolute values (μl/s). Experiments were conducted on 122 vessels from 5 healthy and 5 mild non-proliferative diabetic retinopathy (NPDR) subjects. The Pearson correlation of the vessel diameter measurements between our method and manual labeling on 40 vessels was 0.984. The intraclass correlation (ICC) of BFV between our proposed method and the built-in software was 0.924 and 0.830 for vessels from healthy and NPDR subjects, respectively. The coefficient of variation between repeated sessions was reduced significantly from 22.5% to 15.9% with our proposed method (p<0.001). Copyright © 2015 Elsevier Inc. All rights reserved.
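
    The RFI's internal algorithm is not public; the sketch below illustrates only the stated cross-correlation principle: find the displacement that maximizes the correlation between intensity profiles from consecutive frames, then convert it to a velocity. The profiles and camera parameters are simulated placeholders.

```python
# Sketch of the cross-correlation principle used for BFV: find the
# displacement maximizing the correlation between intensity profiles of
# two consecutive frames. The profiles here are simulated.
import numpy as np

rng = np.random.default_rng(4)
n, true_shift = 200, 7                 # pixels the pattern moves per frame
profile1 = rng.normal(size=n)
profile2 = np.roll(profile1, true_shift) + 0.1 * rng.normal(size=n)

corr = np.correlate(profile2 - profile2.mean(),
                    profile1 - profile1.mean(), mode="full")
shift = corr.argmax() - (n - 1)        # lag of maximum correlation

pixel_size_um, frame_dt_s = 5.0, 0.02  # placeholder camera parameters
velocity = shift * pixel_size_um / frame_dt_s
print(f"estimated shift {shift} px -> {velocity:.0f} um/s")
```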

  18. Theme E: disabilities: analysis models and tools

    OpenAIRE

    Vigouroux, Nadine; Gorce, Philippe; Roby-Brami, Agnès; Rémi-Néris, Olivier

    2013-01-01

    This paper presents the topics and the activity of theme E, "disabilities: analysis models and tools", within the GDR STIC Santé. This group organized a conference and a workshop during the period 2011–2012. The conference focused on technologies for cognitive, sensory and motor impairments, assessment and use studies of assistive technologies, user-centered design methods, and the place of ethics in these research topics. The objective of "bodily integration of ...

  19. Flow status of three transboundary rivers in Northern Greece as a tool for hydro-diplomacy

    Science.gov (United States)

    Hatzigiannakis, Eyaggelos; Hatzispiroglou, Ioannis; Arampatzis, Georgios; Ilia, Andreas; Pantelakis, Dimitrios; Filintas, Agathos; Panagopoulos, Andreas

    2015-04-01

    The aim of this paper is to examine how river flow monitoring can serve as a tool for hydro-diplomacy. Management of transboundary catchments and the demand for common water resources often cause conflicts and tension that threaten the peaceful coexistence of nations. The Water Framework Directive 2000/60/EC sets a base for water management, contributing common approaches, common goals and common principles, as well as providing new definitions and measures for Europe's water resources. In northern Greece, the main renewable water resources are "imported" (over 25% of its water reserves), and for this reason the implementation of continuous flow measurements throughout the year is necessary, even though difficult to achieve. This paper focuses on the three largest transboundary rivers in Northern Greece. The Axios and Strymonas rivers flow across the region of Central Macedonia: the Axios flows from FYROM to Greece, and the Strymonas from Bulgaria to Greece. The Nestos river flows from Bulgaria to Greece; its Greek part is in the region of Eastern Macedonia and Thrace. Significant productive agricultural areas around these rivers are irrigated from them, so they are very important for the local society. Measurements of the river flow velocity and the flow depth have been made at bridges. The frequency of the measurements is roughly monthly, because significant changes in flow depth and discharge are expected. A series of continuous flow measurements was performed during 2013 and 2014 using flowmeters (Valeport and OTT type). The cross-section characteristics and the flow velocities of the segments were measured, and the mean flow velocity and total discharge of the profile were calculated. Measurements are conducted in the framework of the national water resources monitoring network, which is realised in compliance with the Water Framework Directive under the supervision and coordination of the Hellenic Ministry for the
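
    Turning such gauging data into a discharge is classically done with the mid-section (velocity-area) method, sketched below with invented numbers; the abstract does not state which variant the authors used.

```python
# Sketch of the mid-section (velocity-area) method the abstract
# describes: total discharge from per-vertical depths and mean
# velocities. All numbers are invented for illustration.
import numpy as np

station = np.array([0.0, 1.0, 2.5, 4.0, 5.5, 7.0])       # m across river
depth   = np.array([0.0, 0.8, 1.4, 1.6, 1.1, 0.0])       # m per vertical
vmean   = np.array([0.0, 0.35, 0.60, 0.65, 0.40, 0.0])   # m/s per vertical

# Each vertical represents a segment reaching halfway to its neighbours.
width = np.empty_like(station)
width[1:-1] = (station[2:] - station[:-2]) / 2.0
width[0]  = (station[1] - station[0]) / 2.0
width[-1] = (station[-1] - station[-2]) / 2.0

discharge = np.sum(vmean * depth * width)  # Q = sum v_i * d_i * w_i
print(f"total discharge: {discharge:.2f} m^3/s")
```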

  20. Analysis of flow dynamics in right ventricular outflow tract.

    Science.gov (United States)

    Berdajs, Denis A; Mosbahi, Selim; Charbonnier, Dominique; Hullin, Roger; von Segesser, Ludwig K

    2015-07-01

    The mechanism behind early graft failure after right ventricular outflow tract (RVOT) reconstruction is not fully understood. Our aim was to establish a three-dimensional computational fluid dynamics (CFD) model of the RVOT to investigate the hemodynamic conditions that may trigger the development of intimal hyperplasia and arteriosclerosis. Pressure, flow, and diameter at the RVOT, pulmonary artery (PA), bifurcation of the PA, and left and right PAs were measured in 10 normal pigs with a mean weight of 24.8 ± 0.78 kg. Data obtained from the experimental scenario were used for CFD simulation of pressure, flow, and shear stress profiles from the RVOT to the left and right PAs. Using the experimental data, a CFD model was obtained for 2.0 and 2.5-L/min pulsatile inflow profiles. For both inflow profiles, the time- and space-averaged shear stress lies in the low range of 0-6.0 Pa at the pulmonary trunk, its bifurcation, and the openings of both PAs. These low-shear stress areas were accompanied by high-pressure regions of 14.0-20.0 mm Hg (1866-2666 Pa). Flow analysis revealed a turbulent flow at the PA bifurcation and the ostia of both PAs. The identified combination of local low shear stress, high pressure, and turbulent flow corresponds to a well-defined trigger pattern for the development of intimal hyperplasia and arteriosclerosis. As such, this real-time three-dimensional CFD model may in the future serve as a tool for the planning of RVOT reconstruction, its analysis, and prediction of outcome. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  2. Match Analysis an undervalued coaching tool

    CERN Document Server

    Sacripanti, Attilio

    2010-01-01

    From a biomechanical point of view, judo competition is an intriguing complex nonlinear system with many chaotic and fractal aspects. It is also the test bed in which all coaching capabilities and athletes' performances are evaluated and put to the test. Competition is the moment of truth for all the conditioning, preparation and technical work developed before, and it is also the climax from the teaching point of view. Furthermore, it is the most important source of technical assessment. Studying it is essential for coaches, because they can obtain useful information for their coaching. Match analysis can be seen as the master key in all situation sports (dual or team) like judo, to support in a useful way the difficult task of the coach, especially for national or Olympic coaching teams. This paper presents a short summary of the most important methodological achievements in judo match analysis. It also presents, in light of the latest technological improvements, the first systematization toward new fiel...

  3. Analysis of liposomes using asymmetrical flow field-flow fractionation

    DEFF Research Database (Denmark)

    Kuntsche, Judith; Decker, Christiane; Fahr, Alfred

    2012-01-01

    Liposomes composed of dipalmitoylphosphatidylcholine and dipalmitoylphosphatidylglycerol were analyzed by asymmetrical flow field-flow fractionation coupled with multi-angle laser light scattering. In addition to evaluation of fractionation conditions (flow conditions, sample mass, carrier liquid......), radiolabeled drug-loaded liposomes were used to determine the liposome recovery and a potential loss of incorporated drug during fractionation. Neither sample concentration nor the cross-flow gradient distinctly affected the size results but at very low sample concentration (injected mass 5 μg) the fraction...... of larger vesicles was underestimated. Imbalance in the osmolality between the inner and outer aqueous phase resulted in liposome swelling after dilution in hypoosmotic carrier liquids. In contrast, liposome shrinking under hyperosmotic conditions was barely visible. The liposomes themselves eluted...

  4. Analysis of Secondary Flows in Centrifugal Impellers

    Directory of Open Access Journals (Sweden)

    Brun Klaus

    2005-01-01

    Full Text Available Secondary flows are undesirable in centrifugal compressors as they are a direct cause for flow (head) losses, create nonuniform meridional flow profiles, potentially induce flow separation/stall, and contribute to impeller flow slip; that is, secondary flows negatively affect the compressor performance. A model based on the vorticity equation for a rotating system was developed to determine the streamwise vorticity from the normal and binormal vorticity components (which are known from the meridional flow profile). Using the streamwise vorticity results and the small shear-large disturbance flow method, the onset, direction, and magnitude of circulatory secondary flows in a shrouded centrifugal impeller can be predicted. This model is also used to estimate head losses due to secondary flows in a centrifugal flow impeller. The described method can be employed early in the design process to develop impeller flow shapes that intrinsically reduce secondary flows rather than using disruptive elements such as splitter vanes to accomplish this task.
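
    The central quantity of this model, the streamwise vorticity, is the projection of the vorticity vector onto the local flow direction; the sketch below illustrates that projection on hypothetical point values and is not the paper's full model:

    ```python
    # Illustrative sketch (not the paper's model): the streamwise vorticity is
    # the component of the vorticity vector along the local flow direction.
    import numpy as np

    def streamwise_vorticity(omega: np.ndarray, velocity: np.ndarray) -> float:
        """omega, velocity: 3-component vectors at one point in the impeller passage."""
        unit_s = velocity / np.linalg.norm(velocity)  # local streamwise unit vector
        return float(np.dot(omega, unit_s))

    # Hypothetical point values (1/s and m/s)
    print(streamwise_vorticity(np.array([12.0, -3.0, 5.0]), np.array([8.0, 2.0, 0.5])))
    ```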

  5. Serial concept maps: tools for concept analysis.

    Science.gov (United States)

    All, Anita C; Huycke, LaRae I

    2007-05-01

    Nursing theory challenges students to think abstractly and is often a difficult introduction to graduate study. Traditionally, concept analysis is useful in facilitating this abstract thinking. Concept maps are a way to visualize an individual's knowledge about a specific topic. Serial concept maps express the sequential evolution of a student's perceptions of a selected concept. Maps reveal individual differences in learning and perceptions, as well as progress in understanding the concept. Relationships are assessed and suggestions are made during serial mapping, which actively engages the students and faculty in dialogue that leads to increased understanding of the link between nursing theory and practice. Serial concept mapping lends itself well to both online and traditional classroom environments.

  6. NMR spectroscopy: a tool for conformational analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto, E-mail: rittner@iqm.unicamp.br [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Inst. de Quimica. Lab. de Fisico-Quimica Organica; Freitas, Matheus P. [Universidade Federal de Lavras (UFLA), MG (Brazil). Dept. de Qumica

    2011-07-01

    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods like infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds which belong to a given organic function, like ketones, esters, etc. Studies of a single compound, even of special relevance, were excluded, since the main goal of this review is to compare the results for a given function, where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects on the conformational equilibrium. Moreover, the huge amount of data available in the literature in this research field imposed some limitations, which will be detailed in the Introduction, but it may be noted in advance that these limitations mostly concern the period in which these results were published. (author)

  7. SustainPro - A tool for systematic process analysis, generation and evaluation of sustainable design alternatives

    DEFF Research Database (Denmark)

    Carvalho, Ana; Matos, Henrique A.; Gani, Rafiqul

    2013-01-01

    Chemical processes are continuously facing challenges from the demands of the global market related to economics, environment and social issues. This paper presents the development of a software tool (SustainPro) and its application to chemical processes operating in batch or continuous modes...... the user through the necessary steps according to the work-flow of the implemented methodology. At the end, the design alternatives are evaluated using environmental impact assessment tools and safety indices. The extended features of the methodology incorporate Life Cycle Assessment analysis and economic...

  8. Analysis of Secondary Flows in Centrifugal Impellers

    OpenAIRE

    2005-01-01

    Secondary flows are undesirable in centrifugal compressors as they are a direct cause for flow (head) losses, create nonuniform meridional flow profiles, potentially induce flow separation/stall, and contribute to impeller flow slip; that is, secondary flows negatively affect the compressor performance. A model based on the vorticity equation for a rotating system was developed to determine the streamwise vorticity from the normal and binormal vorticity components (which are known from the me...

  9. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    Science.gov (United States)

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  10. Tools for analysis of Dirac structures on Banach spaces

    NARCIS (Netherlands)

    Iftime, Orest V.; Sandovici, Adrian; Golo, Goran

    2005-01-01

    Power-conserving and Dirac structures are known as an approach to mathematical modeling of physical engineering systems. In this paper connections between Dirac structures and well known tools from standard functional analysis are presented. The analysis can be seen as a possible starting framework

  11. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  12. Tool Failure Analysis in High Speed Milling of Titanium Alloys

    Institute of Scientific and Technical Information of China (English)

    ZHAO Xiuxu; MEYER Kevin; HE Rui; YU Cindy; NI Jun

    2006-01-01

    In high-speed milling of titanium alloys, the high rate of tool failure is the main reason for the high manufacturing cost. In this study, fractured tools that were used in a titanium alloy 5-axis milling process have been observed both at the macro scale, using a PG-1000 light microscope, and at the micro scale, using a Scanning Electron Microscope (SEM). These observations indicate that most of these tool fractures are the result of tool chipping. Further analysis of each chipping event has shown that beachmarks emanate from points on the cutting edge. This visual evidence indicates that the cutting edge is failing in fatigue due to cyclical mechanical and/or thermal stresses. Initial analyses explaining some of the outlying conditions for this phenomenon are discussed. Future analysis regarding determining the underlying causes of the fatigue phenomenon is then outlined.

  13. A study of grout flow pattern analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S. Y. [Savannah River National Lab., Aiken, SC (United States); Hyun, S. [Mercer Univ., Macon, GA (United States)

    2013-01-10

    A new disposal unit, designated as Salt Disposal Unit no. 6 (SDU6), is being designed in support of site accelerated closure goals and the salt nuclear waste projections identified in the new Liquid Waste System plan. The unit is a cylindrical disposal vault, 380 ft in diameter and 43 ft in height, with a capacity of about 30 million gallons. The primary objective was to develop a computational model and to evaluate the flow patterns of the grout material in SDU6 as a function of the elevation of the grout discharge port and the slurry rheology. A Bingham plastic model was used to represent the grout flow behavior. A two-phase modeling approach was taken to achieve the objective. This approach assumes that the air-grout interface determines the shape of the accumulation mound. The results of this study were used to develop the design guidelines for the discharge ports of the Saltstone feed materials in the SDU6 facility. The focus areas of the modeling study were to estimate the domain size of the grout materials radially spread on the facility floor under the baseline modeling conditions, to perform a sensitivity analysis with respect to the baseline design and operating conditions, such as the elevation of the discharge port, the discharge pipe diameter, and the grout properties, and to determine the changes in grout density as related to grout drop height. An axisymmetric two-phase modeling method was used for computational efficiency. Based on the nominal design and operating conditions, a transient computational approach was taken to compute flow fields mainly driven by pumping inertia and natural gravity. The detailed solution methodology and analysis results are discussed here.
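
    The Bingham plastic law mentioned above is simple to state: no flow below a yield stress, and stress growing linearly with shear rate above it. A minimal sketch, with placeholder parameter values rather than the grout properties used in the study:

    ```python
    # Minimal sketch of the Bingham plastic law used to represent grout
    # rheology: below the yield stress the material does not flow; above it,
    # stress grows linearly with shear rate. Values are placeholders.
    import numpy as np

    def bingham_stress(shear_rate: np.ndarray, tau_y: float, mu_p: float) -> np.ndarray:
        """Shear stress (Pa) for shear rates (1/s), yield stress tau_y, plastic viscosity mu_p."""
        return tau_y + mu_p * shear_rate

    gamma_dot = np.linspace(0.0, 50.0, 6)   # shear rates, 1/s
    print(bingham_stress(gamma_dot, tau_y=20.0, mu_p=0.1))
    ```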

  15. Single-cell analysis tools for drug discovery and development.

    Science.gov (United States)

    Heath, James R; Ribas, Antoni; Mischel, Paul S

    2016-03-01

    The genetic, functional or compositional heterogeneity of healthy and diseased tissues presents major challenges in drug discovery and development. Such heterogeneity hinders the design of accurate disease models and can confound the interpretation of biomarker levels and of patient responses to specific therapies. The complex nature of virtually all tissues has motivated the development of tools for single-cell genomic, transcriptomic and multiplex proteomic analyses. Here, we review these tools and assess their advantages and limitations. Emerging applications of single cell analysis tools in drug discovery and development, particularly in the field of oncology, are discussed.

  16. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  17. Cluster analysis of multiple planetary flow regimes

    Science.gov (United States)

    Mo, Kingtse; Ghil, Michael

    1988-01-01

    A modified cluster analysis method, developed for the classification of quasi-stationary events into a few planetary flow regimes and for the examination of transitions between these regimes, is described. The method was applied first to a simple deterministic model and then to a 500-mbar data set for the Northern Hemisphere (NH), for which cluster analysis was carried out in the subspace of the first seven empirical orthogonal functions (EOFs). Stationary clusters were found in the low-frequency band of more than 10 days, while transient clusters were found in the band-pass frequency window between 2.5 and 6 days. In the low-frequency band, three pairs of clusters were determined by EOFs 1, 2, and 3, respectively; they exhibited well-known regional features, such as blocking, the Pacific/North American pattern, and wave trains. Both the model and the low-pass data exhibited strong bimodality.
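
    As an illustration of clustering in an EOF subspace, here is a hedged sketch using PCA and k-means on synthetic fields; it is not the authors' modified cluster analysis method, and all shapes and cluster counts are placeholders:

    ```python
    # Illustrative sketch: cluster flow fields in the subspace of the leading
    # seven EOFs, here via PCA followed by k-means on synthetic data.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    fields = rng.normal(size=(1000, 400))   # placeholder: 1000 daily maps, 400 grid points

    eofs = PCA(n_components=7).fit_transform(fields)   # project onto first 7 EOFs
    labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(eofs)
    print(np.bincount(labels))              # population of each candidate regime
    ```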

  18. Analysis Tool Web Services from the EMBL-EBI.

    Science.gov (United States)

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
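
    As a sketch of how such services can be consumed programmatically, the following calls the dbfetch service mentioned above; the database name, entry identifier and format parameters are illustrative assumptions and should be checked against the current service documentation:

    ```python
    # Minimal sketch of a REST call to the EBI dbfetch service described above.
    # The db, id and format values below are illustrative; consult the service
    # documentation for currently supported parameters.
    from urllib.request import urlopen

    url = ("https://www.ebi.ac.uk/Tools/dbfetch/dbfetch"
           "?db=uniprotkb&id=P12345&format=fasta&style=raw")
    with urlopen(url) as response:
        print(response.read().decode("utf-8")[:200])  # first lines of the entry
    ```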

  19. A Semi-Automated Functional Test Data Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Peng; Haves, Philip; Kim, Moosung

    2005-05-01

    The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.
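
    The pass/fail logic such a tool applies can be illustrated with a minimal sketch; the measurement names, expected values and tolerances below are invented, not taken from the tool:

    ```python
    # Hypothetical sketch of the comparison step: measured air-handling-unit
    # performance is checked against expected values within a tolerance.
    def check_point(name: str, measured: float, expected: float, tol: float) -> None:
        delta = measured - expected
        status = "PASS" if abs(delta) <= tol else "FAIL"
        print(f"{name}: measured={measured:.1f} expected={expected:.1f} ({status})")

    check_point("supply air temperature (degC)", 14.8, 13.0, tol=1.0)
    check_point("supply fan power (kW)", 11.2, 10.9, tol=1.5)
    ```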

  20. Mathematical model for analysis of recirculating vertical flow constructed wetlands.

    Science.gov (United States)

    Sklarz, Menachem Y; Gross, Amit; Soares, M Ines M; Yakirevich, Alexander

    2010-03-01

    The recirculating vertical flow constructed wetland (RVFCW) was developed for the treatment of domestic wastewater (DWW). In this system, DWW is applied to a vertical flow bed through which it trickles into a reservoir located beneath the bed. It is then recirculated back to the root zone of the bed. In this study, a compartmental model was developed to simulate the RVFCW. The model, which addresses transport and removal kinetics of total suspended solids, 5-day biological oxygen demand and nitrogen, was fitted to kinetic results obtained from pilot field setups, and a local sensitivity analysis was performed on the model parameters and operational conditions. This analysis showed that after 5 h of treatment, water quality is affected more by stochastic events than by the model parameter values, emphasizing the stability of the RVFCW system under large variations in operational conditions. Effluent quality after 1 h of treatment, when the sensitivity analysis showed the parameter impacts to be largest, was compared to model predictions. The removal rate was found to be dependent on the recirculation rate. The predictions correlated well with experimental observations, leading to the conclusion that the proposed model is a satisfactory tool for studying RVFCWs. Copyright 2009 Elsevier Ltd. All rights reserved.
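
    A much-reduced sketch of the modeling idea, assuming simple first-order removal kinetics integrated over the treatment time (the published compartmental model is considerably richer; the rate constant and concentrations here are invented):

    ```python
    # Illustrative sketch (not the published model): first-order removal of BOD
    # integrated with a simple Euler scheme over the treatment time.
    def simulate_bod(c0: float, k: float, hours: float, dt: float = 0.01) -> float:
        """c0: initial BOD (mg/L); k: first-order removal rate (1/h)."""
        c, t = c0, 0.0
        while t < hours:
            c += -k * c * dt      # removal during one small time step
            t += dt
        return c

    print(f"BOD after 5 h: {simulate_bod(c0=250.0, k=0.6, hours=5.0):.1f} mg/L")
    ```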

  1. Image analysis techniques for the study of turbulent flows

    Directory of Open Access Journals (Sweden)

    Ferrari Simone

    2017-01-01

    Full Text Available In this paper, a brief review of Digital Image Analysis techniques employed in Fluid Mechanics for the study of turbulent flows is given. The focus is particularly on the techniques developed by the research teams the author has worked in, which can be considered relatively "low-cost" techniques. Digital Image Analysis techniques have the advantage, when compared to traditional techniques employing physical point probes, of being non-intrusive and quasi-continuous in space, as every pixel on the camera sensor works as a single probe: consequently, they allow one to obtain two-dimensional or three-dimensional fields of the measured quantity in less time. Traditionally, the disadvantages have been related to the frequency of acquisition, but modern high-speed cameras are typically able to acquire at frequencies from the order of 1 kHz to the order of 1 MHz. Digital Image Analysis techniques can be employed to measure concentration, temperature, position, displacement, velocity, acceleration and pressure fields with similar equipment and setups, and can consequently be considered a flexible and powerful tool for measurements on turbulent flows.
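
    The displacement-measurement idea behind image-based velocimetry can be sketched with window cross-correlation on synthetic frames; this is a generic building block, not a specific technique from the paper:

    ```python
    # Hedged sketch: estimate the displacement of an interrogation window
    # between two frames from the peak of their cross-correlation.
    import numpy as np
    from scipy.signal import correlate2d

    rng = np.random.default_rng(2)
    frame_a = rng.random((32, 32))
    frame_b = np.roll(frame_a, shift=(3, 5), axis=(0, 1))   # known shift: 3 rows, 5 cols

    corr = correlate2d(frame_b - frame_b.mean(), frame_a - frame_a.mean(), mode="same")
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    center = 16                          # zero-lag index for 'same' output of 32x32 inputs
    dy, dx = peak[0] - center, peak[1] - center
    print(f"estimated displacement: {dy} px, {dx} px")   # expect (3, 5)
    ```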

  2. A Software Tool for Integrated Optical Design Analysis

    Science.gov (United States)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between the disciplines become stronger as systems are designed lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications, and it requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  3. Physics analysis tools for beauty physics in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Anastopoulos, C [Physics Department, Aristotle University Of Thessaloniki (Greece); Bouhova-Thacker, E; Catmore, J; Mora, L de [Department of Physics, Lancaster University (United Kingdom); Dallison, S [Particle Physics Department, CCLRC Rutherford Appleton Laboratory (United Kingdom); Derue, F [LPNHE, IN2P3 - CNRS - Universites Paris VI et Paris VII (France); Epp, B; Jussel, P [Institute for Astro- and Particle Physics, University of Innsbruck (Austria); Kaczmarska, A [Institute of Nuclear Physics, Polish Academy of Sciences (Poland); Radziewski, H v; Stahl, T [Department of Physics, University of Siegen (Germany); Reznicek, P [IPNP, Faculty of Mathematics and Physics, Charles University in Prague (Czech Republic)], E-mail: pavel.reznicek@cern.ch

    2008-07-15

    The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of the complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools and, finally, Monte Carlo truth-information association, which is used to validate the software on simulated data and is an important part of the development of the physics analysis tools.

  4. [SIGAPS, a tool for the analysis of scientific publications].

    Science.gov (United States)

    Sillet, Arnauld

    2015-04-01

    The System for the Identification, Management and Analysis of Scientific Publications (SIGAPS) is essential for the funding of teaching hospitals on the basis of scientific publications. The SIGAPS score is based on the analysis of articles indexed in Medline and is calculated by taking into account the position of the author and the ranking of the journal within its disciplinary field. The system also offers tools for the bibliometric analysis of scientific production.

  5. Content Analysis of Survey Feedback Meetings: An Evaluation Tool

    Science.gov (United States)

    1975-05-01

    Content Analysis of Survey Feedback Meetings: An Evaluation Tool. Patricia A. Pecorella, Michigan University. Prepared for: Office of Naval Research. Technical report. Keywords: survey feedback, content analysis, coder reliability, supervisory leadership, consultant roles, problem identification, problem solving.

  6. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Full Text Available Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are widely used and they are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a lightweight, user-friendly software, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points/unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
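
    One of the metrics above, the branching index, can be sketched as skeleton branch points per unit area; the sketch below assumes a precomputed binary vessel skeleton and uses a simple 8-neighbour heuristic, which is an approximation rather than AngioTool's actual implementation:

    ```python
    # Hedged sketch: count skeleton pixels with 3 or more neighbours as branch
    # points, then divide by the image area. A crude heuristic for illustration.
    import numpy as np
    from scipy.ndimage import convolve

    skeleton = np.zeros((64, 64), dtype=int)
    skeleton[32, 10:54] = 1          # toy skeleton: a horizontal vessel...
    skeleton[10:32, 32] = 1          # ...joined by a vertical branch

    kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
    neighbours = convolve(skeleton, kernel, mode="constant")
    branch_points = int(((skeleton == 1) & (neighbours >= 3)).sum())
    print(branch_points / skeleton.size)   # branch points per unit (pixel) area
    ```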

  7. Software Construction and Analysis Tools for Future Space Missions

    Science.gov (United States)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting them: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  8. Tools for T-RFLP data analysis using Excel.

    Science.gov (United States)

    Fredriksson, Nils Johan; Hermansson, Malte; Wilén, Britt-Marie

    2014-11-08

    Terminal restriction fragment length polymorphism (T-RFLP) analysis is a DNA-fingerprinting method that can be used for comparisons of the microbial community composition in a large number of samples. There is no consensus on how T-RFLP data should be treated and analyzed before comparisons between samples are made, and several different approaches have been proposed in the literature. The analysis of T-RFLP data can be cumbersome and time-consuming, and for large datasets manual data analysis is not feasible. The currently available tools for automated T-RFLP analysis, although valuable, offer little flexibility, and few, if any, options regarding what methods to use. To enable comparisons and combinations of different data treatment methods an analysis template and an extensive collection of macros for T-RFLP data analysis using Microsoft Excel were developed. The Tools for T-RFLP data analysis template provides procedures for the analysis of large T-RFLP datasets including application of a noise baseline threshold and setting of the analysis range, normalization and alignment of replicate profiles, generation of consensus profiles, normalization and alignment of consensus profiles and final analysis of the samples including calculation of association coefficients and diversity index. The procedures are designed so that in all analysis steps, from the initial preparation of the data to the final comparison of the samples, there are various different options available. The parameters regarding analysis range, noise baseline, T-RF alignment and generation of consensus profiles are all given by the user and several different methods are available for normalization of the T-RF profiles. In each step, the user can also choose to base the calculations on either peak height data or peak area data. The Tools for T-RFLP data analysis template enables an objective and flexible analysis of large T-RFLP datasets in a widely used spreadsheet application.
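
    Two of the treatment steps described above, noise thresholding and normalization to relative abundance, can be sketched outside of Excel as follows; the peak heights and threshold are placeholders:

    ```python
    # Hypothetical sketch: apply a noise baseline threshold to T-RF peak
    # heights, then normalize each profile to relative abundance.
    import numpy as np

    profiles = np.array([[120.0, 15.0, 480.0, 60.0],
                         [ 90.0,  5.0, 510.0, 45.0]])   # 2 samples x 4 T-RFs

    threshold = 50.0                                    # noise baseline (fluorescence units)
    cleaned = np.where(profiles >= threshold, profiles, 0.0)
    relative = cleaned / cleaned.sum(axis=1, keepdims=True)
    print(relative.round(3))
    ```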

  9. Flow boiling in microgap channels experiment, visualization and analysis

    CERN Document Server

    Alam, Tamanna; Jin, Li-Wen

    2013-01-01

    Flow Boiling in Microgap Channels: Experiment, Visualization and Analysis presents an up-to-date summary of the confined-to-unconfined flow boiling transition criteria, flow boiling heat transfer and pressure drop characteristics, instability characteristics, two-phase flow patterns and flow regime maps, and a parametric study of microgap dimensions. Advantages of flow boiling in microgaps over microchannels are also highlighted. The objective of this Brief is to obtain a better fundamental understanding of the flow boiling processes, compare the performance between microgap and c

  10. A Suite of Tools for ROC Analysis of Spatial Models

    Directory of Open Access Journals (Sweden)

    Hermann Rodrigues

    2013-09-01

    Full Text Available The Receiver Operating Characteristic (ROC) is widely used for assessing the performance of classification algorithms. In GIScience, ROC has been applied to assess models aimed at predicting events, such as land use/cover change (LUCC), species distribution and disease risk. However, GIS software packages offer few statistical tests and guidance tools for ROC analysis and interpretation. This paper presents a suite of GIS tools designed to facilitate ROC curve analysis for GIS users by applying proper statistical tests and analysis procedures. The tools are freely available as models and submodels of the Dinamica EGO freeware. The tools give the ROC curve, the area under the curve (AUC), the partial AUC, the lower and upper AUCs, the confidence interval of the AUC, the density of events in probability bins, and tests to evaluate the difference between the AUCs of two models. We present first the procedures and statistical tests implemented in Dinamica EGO, then the application of the tools to assess LUCC and species distribution models. Finally, we interpret and discuss the ROC-related statistics resulting from various case studies.
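
    For reference, the central statistics discussed above can be computed in a few lines; the sketch below uses scikit-learn on invented observations and predicted probabilities rather than the Dinamica EGO tools:

    ```python
    # Minimal sketch of the ROC curve and AUC on placeholder data: observed
    # events (0/1) and a model's predicted probabilities.
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    observed = np.array([0, 0, 1, 1, 0, 1, 0, 1])
    predicted = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.9, 0.5, 0.7])

    fpr, tpr, _ = roc_curve(observed, predicted)
    print("operating points:", len(fpr))
    print("AUC:", roc_auc_score(observed, predicted))
    ```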

  11. Quantitative transverse flow assessment using OCT speckle decorrelation analysis

    Science.gov (United States)

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Kang, Jin U.

    2013-03-01

    In this study, we demonstrate the use of inter-A-scan speckle decorrelation analysis of optical coherence tomography (OCT) to assess fluid flow. This method allows quantitative measurement of fluid flow in a plane normal to the scanning beam. To validate this method, OCT images were obtained from a microfluidic channel with bovine milk flowing at different speeds. We also imaged a blood vessel in an in vivo animal model and performed speckle analysis to assess blood flow.
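
    The underlying idea can be sketched simply: faster transverse flow decorrelates successive A-scans more strongly, so one minus the correlation coefficient of adjacent A-scans serves as a flow indicator. The signals below are synthetic stand-ins for OCT data:

    ```python
    # Hedged sketch of speckle decorrelation between adjacent A-scans.
    import numpy as np

    rng = np.random.default_rng(3)
    a_scan = rng.random(512)
    noise = 0.3 * rng.random(512)            # stronger mixing would mean faster flow
    a_scan_next = 0.7 * a_scan + noise

    decorrelation = 1.0 - np.corrcoef(a_scan, a_scan_next)[0, 1]
    print(f"speckle decorrelation: {decorrelation:.3f}")
    ```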

  12. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Full Text Available Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. A change in a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of artifacts that form a bridge between software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the "big-picture model". Flow diagram analysis helped to avoid a large number of defects whose repair cost, in extreme cases, could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the "big-picture model" improves control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.

  13. Implementation of erythroid lineage analysis by flow cytometry in diagnostic models for myelodysplastic syndromes

    Science.gov (United States)

    Cremers, Eline M.P.; Westers, Theresia M.; Alhan, Canan; Cali, Claudia; Visser-Wisselaar, Heleen A.; Chitu, Dana A.; van der Velden, Vincent H.J.; te Marvelde, Jeroen G.; Klein, Saskia K.; Muus, Petra; Vellenga, Edo; de Greef, Georgina E.; Legdeur, Marie-Cecile C.J.C.; Wijermans, Pierre W.; Stevens-Kroef, Marian J.P.L.; da Silva-Coelho, Pedro; Jansen, Joop H.; Ossenkoppele, Gert J.; van de Loosdrecht, Arjan A.

    2017-01-01

    Flow cytometric analysis is a recommended tool in the diagnosis of myelodysplastic syndromes. Current flow cytometric approaches evaluate the (im)mature myelo-/monocytic lineage with a median sensitivity and specificity of ~71% and ~93%, respectively. We hypothesized that the addition of erythroid lineage analysis could increase the sensitivity of flow cytometry. To this end, we validated the analysis of erythroid lineage parameters recommended by the International/European LeukemiaNet Working Group for Flow Cytometry in Myelodysplastic Syndromes, and incorporated this evaluation in currently applied flow cytometric models. One hundred and sixty-seven bone marrow aspirates were analyzed: 106 from patients with myelodysplastic syndromes and 61 from cytopenic controls. There was a strong correlation between the presence of erythroid aberrancies assessed by flow cytometry and the diagnosis of myelodysplastic syndromes when validating the previously described erythroid evaluation. Furthermore, addition of erythroid aberrancies to two different flow cytometric models led to an increased sensitivity in detecting myelodysplastic syndromes: from 74% to 86% for the addition to the diagnostic score designed by Ogata and colleagues, and from 69% to 80% for the addition to the integrated flow cytometric score for myelodysplastic syndromes designed by our group. In both models the specificity was unaffected. The high sensitivity and specificity of flow cytometry in the detection of myelodysplastic syndromes illustrates the important value of flow cytometry in a standardized diagnostic approach. The trial is registered at www.trialregister.nl as NTR1825; EudraCT n.: 2008-002195-10 PMID:27658438

  14. Power flow analysis for DC voltage droop controlled DC microgrids

    DEFF Research Database (Denmark)

    Li, Chendan; Chaudhary, Sanjay; Dragicevic, Tomislav

    2014-01-01

    This paper proposes a new algorithm for power flow analysis in droop-controlled DC microgrids. By considering the droop control in the power flow analysis for the DC microgrid, more accurate analysis results can be obtained than with traditional methods. The algorithm verification...... is carried out by comparing the calculation results with detailed time-domain simulation results. With the droop parameters as variables in the power flow analysis, their effects on power sharing and secondary voltage regulation can now be studied analytically, and specialized optimization in the upper-level...... control can also be made accordingly. Case studies on power sharing and secondary voltage regulation are carried out using the proposed power flow analysis....
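
    A hedged sketch of the physics being solved, assuming a two-node network where a droop-controlled source (V = Vref - Rd·I) feeds a constant-power load over a line resistance; this simple fixed-point iteration is illustrative only and is not the paper's algorithm:

    ```python
    # Illustrative two-node droop-controlled DC power flow. Values are placeholders.
    V_REF, R_DROOP, R_LINE, P_LOAD = 400.0, 0.5, 0.2, 5000.0  # V, ohm, ohm, W

    v_load = V_REF                       # initial guess for the load-bus voltage
    for _ in range(50):                  # fixed-point iteration
        i_line = P_LOAD / v_load         # load current at constant power
        v_source = V_REF - R_DROOP * i_line   # droop characteristic of the source
        v_load = v_source - R_LINE * i_line   # line voltage drop
    print(f"source {v_source:.2f} V, load {v_load:.2f} V, current {i_line:.2f} A")
    ```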

  15. A Geographic and Functional Network Flow Analysis Tool

    Science.gov (United States)

    2014-06-01

    ...button is clicked, the plugin starts the loop illustrated in Figure 7. Simply, we turn the QGIS layer attributes into GAMS input, run the selected model... actual network traffic demand. Data content providers like Akamai (Akamai 2014) and Google (Google 2014) place large server farms and data caches near... are now able to run these tests with one mouse click. This proves to be a much easier solution than the tedious task of manually generating input...

  16. User-friendly Tool for Power Flow Analysis and Distributed ...

    African Journals Online (AJOL)

    Akorede

    ...an efficient and reliable manner, with minimal energy loss cost. ... recent renewed global interest in DG since the conventional ... increase in the power frequency beyond the preset upper ...

  17. The Adversarial Route Analysis Tool: A Web Application

    Energy Technology Data Exchange (ETDEWEB)

    Casson, William H. Jr. [Los Alamos National Laboratory

    2012-08-02

    The Adversarial Route Analysis Tool is, in effect, a Google Maps for adversaries. It is a web-based geospatial application, similar to Google Maps, that helps the U.S. government plan operations that predict where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.

  18. Advanced Statistical and Data Analysis Tools for Astrophysics

    Science.gov (United States)

    Kashyap, V.; Scargle, Jeffrey D. (Technical Monitor)

    2001-01-01

    The goal of the project is to obtain, derive, and develop statistical and data analysis tools that would be of use in the analyses of high-resolution, high-sensitivity data that are becoming available with new instruments. This is envisioned as a cross-disciplinary effort with a number of collaborators.

  19. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  20. Interactive exploratory data analysis tool in Alzheimer’s disease

    Directory of Open Access Journals (Sweden)

    Diana Furcila

    2015-04-01

    Thus, MorExAn provides us with the possibility to relate histopathological data with neuropsychological and clinical variables. The aid of this interactive visualization tool gives us the possibility to find unexpected conclusions beyond the insight provided by simple statistical analysis, as well as to improve neuroscientists' productivity.

  1. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  2. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians - and if not, to look at what can be done to make them more available to architects...... and designers....

  3. Selected Tools for Risk Analysis in Logistics Processes

    Science.gov (United States)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through a proper system of risk management. Implementation of a complex approach to risk management allows for the following: evaluation of significant risk groups associated with logistics process implementation; composition of integrated strategies of risk management; and composition of tools for risk analysis in logistics processes.

  4. Tools for analysis of Dirac structures on Hilbert spaces

    NARCIS (Netherlands)

    Golo, G.; Iftime, O.V.; Zwart, Heiko J.; van der Schaft, Arjan

    2004-01-01

    In this paper tools for the analysis of Dirac structures on Hilbert spaces are developed. Some properties are pointed out and two natural representations of Dirac structures on Hilbert spaces are presented. The theory is illustrated on the example of the ideal transmission line.

  5. Assessment of Available Numerical Tools for Dynamic Mooring Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Eskilsson, Claes; Ferri, Francesco

    This report covers a preliminary assessment of available numerical tools to be used in the upcoming full dynamic analysis of the mooring systems assessed in the project _Mooring Solutions for Large Wave Energy Converters_. The assessment tends to cover potential candidate software and subsequently c

  6. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited...

  8. Quantitative analysis of uncertainty from pebble flow in HTR

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Hao, E-mail: haochen.heu@163.com [Fundamental Science on Nuclear Safety and Simulation Technology Laboratory, College of Nuclear Science and Technology, Harbin Engineering University, Harbin (China); Institute of Nuclear and New Energy Technology (INET), Collaborative Innovation Center of Advanced Nuclear Energy Technology, Key Laboratory of Advanced Reactor Engineering and Safety of Ministry of Education, Tsinghua University, Beijing (China); Fu, Li; Jiong, Guo; Ximing, Sun; Lidong, Wang [Institute of Nuclear and New Energy Technology (INET), Collaborative Innovation Center of Advanced Nuclear Energy Technology, Key Laboratory of Advanced Reactor Engineering and Safety of Ministry of Education, Tsinghua University, Beijing (China)

    2015-12-15

    Highlights: • An uncertainty and sensitivity analysis model for pebble flow has been built. • Experiment and random walk theory are used to identify the uncertainty of pebble flow. • Effects of pebble flow on the core parameters are identified by sensitivity analysis. • Uncertainty of core parameters due to pebble flow is quantified for the first time. - Abstract: In a pebble bed HTR, randomness exists in the flow of pebbles along the deterministic average flow lines, which is not possible to simulate with the current reactor design codes for HTR, such as VSOP, due to the limitation of current computer capability. In order to study how the randomness of pebble flow affects the key parameters in HTR, a new pebble flow model was set up and successfully transplanted into the VSOP code. In the new pebble flow model, mixing coefficients were introduced into the fixed flow lines to simulate the randomness of pebble flow. Numerical simulation and pebble flow experiments were performed to determine the mixing coefficients. Sensitivity analysis led to the conclusion that the key parameters of a pebble bed HTR are not sensitive to the randomness in pebble flow. The uncertainty of the maximum power density and power distribution caused by the randomness in pebble flow is very small, especially for the "multi-pass" scheme of fuel circulation adopted in the pebble bed HTR.
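
    The random-walk picture of pebble mixing can be sketched as follows, with each pass perturbing a pebble's radial channel around its deterministic flow line; the channel count, pass count and mixing probability are placeholders, not values from the study:

    ```python
    # Hedged sketch of random-walk mixing: on each pass through the core, a
    # pebble may hop to a neighbouring radial channel with a small probability.
    import numpy as np

    rng = np.random.default_rng(4)
    n_pebbles, n_passes, n_channels = 10_000, 15, 5
    mixing = 0.2                               # probability of hopping per pass

    channel = np.full(n_pebbles, 2)            # all pebbles start on the central flow line
    for _ in range(n_passes):
        step = rng.choice([-1, 0, 1], size=n_pebbles,
                          p=[mixing / 2, 1 - mixing, mixing / 2])
        channel = np.clip(channel + step, 0, n_channels - 1)
    print(np.bincount(channel, minlength=n_channels) / n_pebbles)  # channel occupancy
    ```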

  9. Technical discussions II - Flow cytometric analysis

    NARCIS (Netherlands)

    Cunningham, A; Cid, A; Buma, AGJ

    1996-01-01

    In this paper the potential of flow cytometry as applied to the aquatic life sciences is discussed. The use of flow cytometry for studying the ecotoxicology of phytoplankton was introduced. On the other hand, the new flow cytometer EUROPA was presented. This is a multilaser machine which has been sp

  11. Theoretical analysis of tsunami generation by pyroclastic flows

    Science.gov (United States)

    Watts, P.; Waythomas, C.F.

    2003-01-01

    Pyroclastic flows are a common product of explosive volcanism and have the potential to initiate tsunamis whenever thick, dense flows encounter bodies of water. We evaluate the process of tsunami generation by decomposing the pyroclastic flow into two components: the dense underflow portion, which we term the pyroclastic debris flow, and the plume, which includes the surge and coignimbrite ash cloud parts of the flow. We consider five possible wave generation mechanisms. These mechanisms consist of steam explosion, pyroclastic debris flow, plume pressure, plume shear, and pressure impulse wave generation. Our theoretical analysis of tsunami generation by these mechanisms provides an estimate of tsunami features such as a characteristic wave amplitude and wavelength. We find that in most situations, tsunami generation is dominated by the pyroclastic debris flow component of a pyroclastic flow. This work presents information sufficient to construct tsunami sources for an arbitrary pyroclastic flow interacting with most bodies of water. Copyright 2003 by the American Geophysical Union.

  12. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    Science.gov (United States)

    Clough, D.; Fletcher, S.; Longstaff, A. P.; Willoughby, P.

    2012-05-01

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increase the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high-value parts, for manufacturers to obtain a thermal profile of their machine to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting the practical implementation of condition monitoring of thermal errors. In particular, there is a requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal images, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.
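
    One common way to establish the temperature-to-error link the text calls for is a linear regression from easily measured temperatures to the displacement error recorded during a test cycle; the sketch below uses synthetic data and is an assumption about approach, not the paper's method:

    ```python
    # Hedged sketch: least-squares fit mapping temperature sensor readings to
    # spindle displacement error. All data below are synthetic.
    import numpy as np

    rng = np.random.default_rng(5)
    temps = 20.0 + rng.random((200, 3)) * 15.0          # 3 sensors, 200 samples (degC)
    true_coeffs = np.array([1.8, -0.4, 0.9])            # micrometres per degC (synthetic)
    error = temps @ true_coeffs + rng.normal(0, 0.5, 200)

    X = np.column_stack([temps, np.ones(len(temps))])   # add intercept term
    coeffs, *_ = np.linalg.lstsq(X, error, rcond=None)
    print("fitted sensitivity (um/degC):", coeffs[:3].round(2))
    ```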

  13. Discovery and New Frontiers Project Budget Analysis Tool

    Science.gov (United States)

    Newhouse, Marilyn E.

    2011-01-01

    The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses, compare mission scenarios to the current program budget, and rapidly forecast the program's ability to meet its launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), the mission development and operations profile by phase (percent of total mission cost and duration), the launch vehicle, and the launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real-year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect of changes in future mission or launch vehicle costs, differing development profiles or operational durations of a future mission, or a replan of a current mission on the overall program budget. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently budgeted mission lines) could easily be adapted to other applications.
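
    The core roll-up can be sketched in a few lines: spread each mission's fixed-year cost over its phase years, inflate to real-year dollars, and sum per year across missions. The mission names, costs, profiles and inflation rate below are invented examples:

    ```python
    # Hypothetical sketch of the inflation-and-roll-up step described above.
    BASE_YEAR, INFLATION = 2011, 0.03

    missions = {   # name: (total cost in base-year $M, {year: fraction of total})
        "Mission A": (450.0, {2011: 0.2, 2012: 0.5, 2013: 0.3}),
        "Mission B": (300.0, {2012: 0.4, 2013: 0.4, 2014: 0.2}),
    }

    program = {}
    for total, profile in missions.values():
        for year, fraction in profile.items():
            real_year_cost = total * fraction * (1 + INFLATION) ** (year - BASE_YEAR)
            program[year] = program.get(year, 0.0) + real_year_cost

    for year in sorted(program):
        print(year, f"${program[year]:.1f}M")
    ```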

  14. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented in extension to the latent semantic in...... were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for visualization of search engines results....

  16. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  17. OPR1000 RCP Flow Coastdown Analysis using SPACE Code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong-Hyuk; Kim, Seyun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    The Korean nuclear industry developed a thermal-hydraulic analysis code for the safety analysis of PWRs, named SPACE (Safety and Performance Analysis Code for Nuclear Power Plant). The current loss-of-flow transient analysis of OPR1000 uses the COAST code to calculate the transient RCS (Reactor Coolant System) flow. The COAST code calculates RCS loop flow using pump performance curves and RCP (Reactor Coolant Pump) inertia. In this paper, the SPACE code is used to reproduce the RCS flow rates calculated by the COAST code. A loss-of-flow transient is a transient initiated by a reduction of forced reactor coolant circulation. Typical loss-of-flow transients are complete loss of flow (CLOF) and locked rotor (LR). OPR1000 RCP flow coastdown analysis was performed with SPACE using a simplified nodalization, and a complete loss of flow (4-RCP trip) was analyzed. The results show good agreement with those from the COAST code, a CE code for calculating RCS flow during loss-of-flow transients. Through this study, we confirmed that the SPACE code can be used instead of the COAST code for RCP flow coastdown analysis.
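
    As a rough illustration of what a coastdown calculation involves (not the COAST or SPACE models themselves), the sketch below integrates the pump rotor equation I dw/dt = -T with an assumed quadratic hydraulic torque law and assumed inertia, and treats normalized loop flow as proportional to pump speed.

      # Minimal RCP coastdown sketch; inertia, torque law, and initial speed
      # are illustrative assumptions, not plant data.
      import numpy as np

      I = 3500.0   # rotating inertia [kg m^2] (assumed)
      w0 = 124.0   # initial pump speed [rad/s] (assumed)
      k = 1.4      # hydraulic torque coefficient, T = k * w^2 (assumed law)

      def coastdown(t_end=60.0, dt=0.01):
          """Integrate I dw/dt = -k w^2 with explicit Euler; return t, w/w0."""
          n = int(t_end / dt)
          t = np.linspace(0.0, t_end, n + 1)
          w = np.empty(n + 1)
          w[0] = w0
          for i in range(n):
              w[i + 1] = w[i] - dt * k * w[i] ** 2 / I
          return t, w / w0  # normalized speed ~ normalized loop flow

      t, flow_frac = coastdown()
      print(f"flow fraction after 10 s: {flow_frac[round(10 / 0.01)]:.3f}")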

  18. A Calculus for Control Flow Analysis of Security Protocols

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Nielson, Hanne Riis; Nielson, Flemming

    2004-01-01

    analysis methodology. We pursue an analysis methodology based on control flow analysis in flow logic style and we have previously shown its ability to analyse a variety of security protocols. This paper develops a calculus, LysaNS, that allows for much greater control and clarity in the description...

  19. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model-based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library of common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.
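
    The recommendation mechanism can be pictured as a lookup into a library keyed by the user's pick-list selections. The sketch below is a minimal stand-in with invented entries, not the spaceflight library described in the study.

      # Minimal sketch: index failure modes by (function, interface) and return
      # candidates matching the pick-list choices. Entries are placeholders.
      from collections import defaultdict

      library = defaultdict(list)

      def add_entry(function, interface, mode, cause, effect):
          library[(function, interface)].append((mode, cause, effect))

      add_entry("regulate flow", "fluid line",
                "fails open", "seat contamination", "downstream overpressure")
      add_entry("regulate flow", "fluid line",
                "fails closed", "actuator loss of power", "loss of flow to load")

      def recommend(function, interface):
          """Return library failure modes matching the user's selections."""
          return library.get((function, interface), [])

      for mode, cause, effect in recommend("regulate flow", "fluid line"):
          print(f"mode: {mode:12s} cause: {cause:22s} effect: {effect}")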

  20. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model-based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library of common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  1. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from...... abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components....

  2. Linguistics and cognitive linguistics as tools of pedagogical discourse analysis

    Directory of Open Access Journals (Sweden)

    Kurovskaya Yulia G.

    2016-01-01

    The article discusses the use of linguistics and cognitive linguistics as tools of pedagogical discourse analysis, thus establishing a new branch of pedagogy called pedagogical semiology, which is concerned with students' acquisition of culture encoded in symbols and with the way the sign consciousness formed in the context of learning affects students' world cognition and interpersonal communication. The article introduces a set of tools that enable the teacher to organize the educational process in compliance with the rules of language as a sign system applied to the context of pedagogy and with the formation of the younger generation's language picture of the world.

  3. Analyzing wet weather flow management using state of the art tools.

    Science.gov (United States)

    Parker, Denny S; Merlo, Rion P; Jimenez, Jose A; Wahlberg, Eric J

    2008-01-01

    Optimal secondary clarifier performance is crucial to meeting treatment requirements, especially when treating peak wet weather flows (PWWFs), to prevent high effluent suspended solids (ESS) concentrations and elevated sludge blankets. A state-of-the-art computational fluid dynamics (CFD) model was successfully used as a design and diagnostic tool to optimize performance for municipal wastewater treatment plants subject to significant PWWFs. Two case studies are presented. In Case Study 1, the model was used to determine the number of secondary clarifiers needed to treat future PWWF conditions for a plant under design. In Case Study 2, the model was used to identify modifications, currently being made, that increase the clarifier capacity for handling PWWF.

  4. Abnormal traffic flow data detection based on wavelet analysis

    Directory of Open Access Journals (Sweden)

    Xiao Qian

    2016-01-01

    Because traffic flow data are non-stationary, detecting abnormal data is difficult. This paper proposes a method for detecting abnormal traffic flow data based on wavelet analysis and the least squares method. First, wavelet analysis is used to separate the traffic flow data into high-frequency and low-frequency components; then the least squares method is combined with this decomposition to find abnormal points in the reconstructed signal. Simulation results show that detecting abnormal traffic flow data with wavelet analysis effectively reduces both the misjudgment rate and the false negative rate of the detection results.
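
    A minimal sketch of this two-step idea, using the PyWavelets package with an assumed wavelet, decomposition level, window size, and threshold (the paper's exact variant is not reproduced):

      # Denoise via wavelet approximation, fit local least-squares trends,
      # flag points with large residuals. Parameters are assumptions.
      import numpy as np
      import pywt

      def detect_anomalies(flow, wavelet="db4", level=2, win=24, z_thresh=3.0):
          # separate high- and low-frequency content
          coeffs = pywt.wavedec(flow, wavelet, level=level)
          coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]  # keep approximation only
          smooth = pywt.waverec(coeffs, wavelet)[: len(flow)]
          resid = np.empty_like(flow)
          t = np.arange(len(flow), dtype=float)
          for s in range(0, len(flow), win):  # assumes len(flow) % win == 0
              sl = slice(s, min(s + win, len(flow)))
              a, b = np.polyfit(t[sl], smooth[sl], 1)  # local least-squares trend
              resid[sl] = flow[sl] - (a * t[sl] + b)
          z = (resid - resid.mean()) / resid.std()
          return np.where(np.abs(z) > z_thresh)[0]

      rng = np.random.default_rng(0)
      flow = 100 + 10 * np.sin(np.arange(288) / 12.0) + rng.normal(0, 2, 288)
      flow[150] += 40  # injected anomaly
      print(detect_anomalies(flow))  # expected to include index 150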

  5. SIGNAL FLOW GRAPH ANALYSIS OF MECHANICAL ENGINEERING SYSTEMS

    Science.gov (United States)

    Keywords: control systems, mechanics, structures, thermodynamics, topology, beams (electromagnetic), beams (structural), gas flow, gears, heat exchangers, mathematical analysis, mathematics, mechanical engineering, ramjet engines.

  6. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    System integration is an important element in the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated against the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is required. This paper proposes the Data Extraction, Recording and Analysis Tool (DERAT) for analyzing the integrated performance of the combat system, and presents the functional definition, architecture and effectiveness of the DERAT through test results.

  7. CRAB: the CMS distributed analysis tool development and design

    Energy Technology Data Exchange (ETDEWEB)

    Spiga, D. [University and INFN Perugia (Italy); Lacaprara, S. [INFN Legnaro (Italy); Bacchi, W. [University and INFN Bologna (Italy); Cinquilli, M. [University and INFN Perugia (Italy); Codispoti, G. [University and INFN Bologna (Italy); Corvo, M. [CERN (Switzerland); Dorigo, A. [INFN Padova (Italy); Fanfani, A. [University and INFN Bologna (Italy); Fanzago, F. [CERN (Switzerland); Farina, F. [INFN Milano-Bicocca (Italy); Gutsche, O. [FNAL (United States); Kavka, C. [INFN Trieste (Italy); Merlo, M. [INFN Milano-Bicocca (Italy); Servoli, L. [University and INFN Perugia (Italy)

    2008-03-15

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.

  8. CRAB: the CMS distributed analysis tool development and design

    CERN Document Server

    Spiga, D; Bacchi, W; Cinquilli, M; Codispoti, G; Corvo, M; Dorigo, A; Fanfani, A; Fanzago, F; Farina, F; Gutsche, O; Kavka, C; Merlo, M; Servoli, L

    2008-01-01

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.

  9. FLOW CYTOMETRY AS A MODERN ANALYTICAL TOOL IN BIOLOGY AND MEDICINE

    Directory of Open Access Journals (Sweden)

    S. V. Khaidukov

    2007-01-01

    Flow cytometry is considered a modern technology for fast measurements of the characteristics of cells, their organelles, and the processes occurring within them. It is regarded as an efficient solution in many important areas of cell biology, immunology and cellular engineering. The present article covers the main developments in flow cytometry and their applications in medical and biological practice. The use of modern achievements in fluorescent dyes, progress in laser and computer technologies, and potent software have resulted in wide application of this technique in medical practice. Accordingly, the use of monoclonal antibodies conjugated to different fluorochromes has led to the elaboration of multiparametric analysis and has significantly simplified specialized work aimed at the diagnostics of various immune disorders. New directions in flow cytometry, e.g., flow cytoenzymology, provide wide opportunities for detailed identification of damaged or altered cells and for taking adequate decisions in the treatment of detected pathological changes. The authors suggest that this article could initiate a series of publications concerning the use of this technology and its modern applications in broad laboratory practice.

  10. Computational analysis of high-throughput flow cytometry data

    Science.gov (United States)

    Robinson, J Paul; Rajwa, Bartek; Patsekin, Valery; Davisson, Vincent Jo

    2015-01-01

    Introduction: Flow cytometry has been around for over 40 years, but only recently has the opportunity arisen to move into the high-throughput domain. The technology is now available and is highly competitive with imaging tools under the right conditions. Flow cytometry has, however, been a technology focused on its unique ability to study single cells, and appropriate analytical tools are readily available to handle this traditional role of the technology. Areas covered: Expanding flow cytometry to a high-throughput (HT), high-content technology requires advances in both hardware and analytical tools. The historical perspective of flow cytometry operation, how the field has changed, and what the key changes have been are discussed. The authors provide a background and compelling arguments for moving toward HT flow, where there are many innovative opportunities. With alternative approaches now available for flow cytometry, there will be a considerable number of new applications. These opportunities show strong capability for drug screening and functional studies with cells in suspension. Expert opinion: There is no doubt that HT flow is a rich technology awaiting acceptance by the pharmaceutical community. It can provide a powerful phenotypic analytical toolset that has the capacity to change many current approaches to HT screening. The previous restrictions on the technology, based on its reduced capacity for sample throughput, are no longer a major issue. Overcoming this barrier has transformed a mature technology into one that can focus on systems biology questions not previously considered possible. PMID:22708834

  11. Accuracy Analysis and Calibration of Gantry Hybrid Machine Tool

    Institute of Scientific and Technical Information of China (English)

    唐晓强; 李铁民; 尹文生; 汪劲松

    2003-01-01

    The kinematic accuracy is a key factor in the design of parallel or hybrid machine tools. This analysis improved the accuracy of a 4-DOF (degree of freedom) gantry hybrid machine tool based on a 3-DOF planar parallel manipulator by compensating for various positioning errors. The machine tool architecture was described with the inverse kinematic solution. The control parameter error model was used to analyze the accuracy of the 3-DOF planar parallel manipulator and to develop a kinematic calibration method. The experimental results prove that the calibration method reduces the cutter nose errors from ±0.50 mm to ±0.03 mm for a horizontal movement of 600 mm by compensating for errors in the slider home position, the guide way distance and the extensible strut home position. The calibration method will be useful for similar types of parallel kinematic machines.

  12. Gradual Variation Analysis for Groundwater Flow

    CERN Document Server

    Chen, Li

    2010-01-01

    Groundwater flow in Washington DC greatly influences the surface water quality in urban areas. The current methods of flow estimation, based on Darcy's Law and the groundwater flow equation, can be described by the diffusion equation (the transient flow) and the Laplace equation (the steady-state flow). The Laplace equation is a simplification of the diffusion equation under the condition that the aquifer has a recharging boundary. The practical way of calculation is to use numerical methods to solve these equations. The most popular system is called MODFLOW, which was developed by USGS. MODFLOW is based on the finite-difference method in rectangular Cartesian coordinates. MODFLOW can be viewed as a "quasi 3D" simulation since it only deals with the vertical average (no z-direction derivative). Flow calculations between the 2D horizontal layers use the concept of leakage. In this project, we have established a mathematical model based on gradually varied functions for groundwater data volume reconstruction. T...
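
    As a minimal illustration of the finite-difference approach mentioned above (far simpler than MODFLOW), the following sketch solves the steady-state Laplace equation for hydraulic head on a small rectangular grid with assumed fixed-head and no-flow boundaries:

      # Jacobi iteration for d2h/dx2 + d2h/dy2 = 0; homogeneous aquifer,
      # fixed heads on left/right edges, no-flow top/bottom. Values assumed.
      import numpy as np

      def steady_state_head(nx=50, ny=50, h_left=10.0, h_right=2.0,
                            tol=1e-6, max_iter=20000):
          h = np.full((ny, nx), (h_left + h_right) / 2.0)
          h[:, 0], h[:, -1] = h_left, h_right
          for _ in range(max_iter):
              h_new = h.copy()
              h_new[1:-1, 1:-1] = 0.25 * (h[:-2, 1:-1] + h[2:, 1:-1] +
                                          h[1:-1, :-2] + h[1:-1, 2:])
              # mirror rows to enforce no-flow boundaries top and bottom
              h_new[0, 1:-1], h_new[-1, 1:-1] = h_new[1, 1:-1], h_new[-2, 1:-1]
              if np.max(np.abs(h_new - h)) < tol:
                  return h_new
              h = h_new
          return h

      h = steady_state_head()
      print(h[25, ::10])  # head should fall roughly linearly from left to right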

  13. Toward compressed DMD: spectral analysis of fluid flows using sub-Nyquist-rate PIV data

    CERN Document Server

    Tu, Jonathan H; Kutz, J Nathan; Shang, Jessica K

    2014-01-01

    Dynamic mode decomposition (DMD) is a powerful and increasingly popular tool for performing spectral analysis of fluid flows. However, it requires data that satisfy the Nyquist-Shannon sampling criterion. In many fluid flow experiments, such data are impossible to capture. We propose a new approach that combines ideas from DMD and compressed sensing. Given a vector-valued signal, we take measurements randomly in time (at a sub-Nyquist rate) and project the data onto a low-dimensional subspace. We then use compressed sensing to identify the dominant frequencies in the signal and their corresponding modes. We demonstrate this method using two examples, analyzing both an artificially constructed test dataset and particle image velocimetry data collected from the flow past a cylinder. In each case, our method correctly identifies the characteristic frequencies and oscillatory modes dominating the signal, proving the proposed method to be a capable tool for spectral analysis using sub-Nyquist-rate sampling.
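
    For contrast with the compressed variant proposed in the paper, the following sketch implements standard SVD-based DMD on fully sampled toy data; the compressed-sensing step (random sub-Nyquist sampling and sparse recovery) is omitted:

      # Standard (exact) DMD via the SVD; not the paper's compressed method.
      import numpy as np

      def dmd(X, r=2):
          """X: snapshots as columns. Returns DMD eigenvalues and modes."""
          X1, X2 = X[:, :-1], X[:, 1:]              # snapshot pairs x_k -> x_{k+1}
          U, s, Vh = np.linalg.svd(X1, full_matrices=False)
          U, s, Vh = U[:, :r], s[:r], Vh[:r, :]     # rank-r truncation
          Atilde = U.conj().T @ X2 @ Vh.conj().T / s  # projected linear operator
          eigvals, W = np.linalg.eig(Atilde)
          modes = X2 @ Vh.conj().T / s @ W           # exact DMD modes
          return eigvals, modes

      # toy data: one traveling wave, rank 2, sampled above the Nyquist rate
      x = np.linspace(0, 2 * np.pi, 128)
      t = np.arange(64) * 0.1                        # sampling interval 0.1 s
      X = np.sin(x[:, None] - 2.0 * t[None, :])
      eigvals, _ = dmd(X, r=2)
      print(np.angle(eigvals) / 0.1)  # recovered frequencies [rad/s], expect ~+/-2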

  14. Towards an integrated petrophysical tool for multiphase flow properties of core samples

    Energy Technology Data Exchange (ETDEWEB)

    Lenormand, R. [Institut Francais du Petrole, Rueil Malmaison (France)

    1997-08-01

    This paper describes the first use of an Integrated Petrophysical Tool (IPT) on reservoir rock samples. The IPT simultaneously measures the following petrophysical properties: (1) the complete capillary pressure cycle (primary drainage, spontaneous and forced imbibition, secondary drainage), which yields the wettability of the core via the USBM index; (2) end points and parts of the relative permeability curves; and (3) the formation factor and resistivity index. The IPT is based on the steady-state injection of one fluid through the sample placed in a Hassler cell. The experiment leading to the whole Pc cycle on two reservoir sandstones consists of about 30 steps at various oil or water flow rates; it takes about four weeks and is operated at room conditions. Relative permeabilities are in line with standard steady-state measurements. Capillary pressures are in accordance with standard centrifuge measurements. There is no comparison for the resistivity index, but the results are in agreement with literature data. However, the accurate determination of saturation remains the main difficulty, and some improvements are proposed. In conclusion, the Integrated Petrophysical Tool is as accurate as standard methods and has the advantage of providing the various parameters on the same sample during a single experiment. The IPT is easy to use and can be automated. In addition, it can be operated at reservoir conditions.
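
    The USBM wettability index mentioned above is computed from the areas enclosed by the forced drainage and forced imbibition branches of the capillary pressure cycle, W = log10(A1/A2). A minimal sketch with invented curves (not IPT data):

      # USBM index from capillary pressure areas; curves are placeholders.
      import numpy as np

      def usbm_index(sw_drain, pc_drain, sw_imb, pc_imb):
          """W > 0: water-wet, W < 0: oil-wet, W ~ 0: neutral."""
          a1 = abs(np.trapz(pc_drain, sw_drain))  # area under forced drainage
          a2 = abs(np.trapz(pc_imb, sw_imb))      # area under forced imbibition
          return np.log10(a1 / a2)

      sw = np.linspace(0.25, 0.75, 20)
      pc_drain = 8.0 * (0.80 - sw)   # assumed positive branch [psi]
      pc_imb = -4.0 * (sw - 0.20)    # assumed negative branch [psi]
      print(f"USBM index: {usbm_index(sw, pc_drain, sw, pc_imb):+.2f}")  # ~+0.30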

  15. Analysis of Ducted Propellers in Steady Flow

    Science.gov (United States)

    1986-02-01

    Only fragments survive OCR of this scanned report: a table of contents (1. Background; 2. Viscous Effects in Tip Gap Flows; 3. Lifting Line ...) and partial text noting that, for the development of PSF and BPSF, the reader is referred to the aforementioned publications, and that the existence of a tip-gap boundary layer is certainly due to viscous effects, but the local gap flow will be primarily ...

  16. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match the pre-defined requirements. Further development of computers in recent years has opened the way for implementing new features in the existing tools and for developing new tools for specific applications, like thermodynamic and economic optimization, prediction of the remaining component lifetime, and fault diagnostics, resulting in improvement of the plant's performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts generated by the heat and mass balance programs can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed with tools like condition monitoring systems and artificial neural networks. The increasing number of tools and their various constructions and application areas make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems are briefly reviewed. This thesis also contains programming techniques and calculation methods concerning part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author.

  17. Acetylene Flow Rate as a Crucial Parameter of Vacuum Carburizing Process of Modern Tool Steels

    Directory of Open Access Journals (Sweden)

    Rokicki P.

    2016-12-01

    Carburizing is one of the most popular and widely used thermo-chemical treatment methods for surface modification of tool steels. It is a process based on diffusive carbon enrichment of the surface material and is applied to elements that must present higher hardness and wear resistance while sustaining core ductility. Typical elements submitted to the carburizing process are gears, shafts, pins and bearing elements. In recent years, vacuum carburizing has become more and more popular, especially in the highly advanced treatment procedures used in the aerospace industry. It is a process based on chemical treatment of the surface at lower pressure, providing much higher uniformity of the carburized layer, lower process cost and a much smaller negative impact on the environment compared with conventional carburizing methods, such as gas carburizing in an Endo atmosphere. Unfortunately, the aerospace industry requires a much more detailed description of the phenomena linked to this process, and the literature shows a lack of tests that could confirm fulfilment of all the requirements and deepen understanding of the process itself. In the presented paper, the authors focused their research on the impact of the acetylene flow on the characteristics of the carburized layer. This is one of the most crucial parameters concerning the homogeneity and uniformity of the carburized layer properties. For this reason, a specific process methodology was planned based on different acetylene flow values, and the surface layer of the steel gears was investigated to determine the impact on any possible change in the potential properties of the final product.

  18. Modeling of the flow stress for AISI H13 Tool Steel during Hard Machining Processes

    Science.gov (United States)

    Umbrello, Domenico; Rizzuti, Stefania; Outeiro, José C.; Shivpuri, Rajiv

    2007-04-01

    In general, the flow stress models used in computer simulation of machining processes are a function of effective strain, effective strain rate and the temperature developed during the cutting process. However, these models do not adequately describe the material behavior in hard machining, where material hardness in the range of 45 to 60 HRC is used. Thus, depending on the specific material hardness, different material models must be used in modeling the cutting process. This paper describes the development of hardness-based flow stress and fracture models for the AISI H13 tool steel, which can be applied over the range of material hardness mentioned above. These models were implemented in a non-isothermal viscoplastic numerical model to simulate the machining process for AISI H13 with various hardness values and different cutting regime parameters. Predicted results are validated by comparing them with experimental results found in the literature. They are found to predict reasonably well the cutting forces as well as the change in chip morphology from continuous to segmented chip as the material hardness changes.
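
    A common baseline form for such strain-, strain-rate- and temperature-dependent flow stress is the Johnson-Cook model; the hardness-based models of this paper extend beyond it. The sketch below uses illustrative H13-like constants, not the paper's fitted values:

      # Johnson-Cook flow stress; all constants are illustrative assumptions.
      import math

      def johnson_cook(strain, strain_rate, T,
                       A=900e6, B=500e6, n=0.25, C=0.014, m=1.1,
                       eps0=1.0, T_room=293.0, T_melt=1700.0):
          """Flow stress [Pa] = (A + B*eps^n)(1 + C*ln(epsdot/eps0))(1 - T*^m)."""
          t_star = (T - T_room) / (T_melt - T_room)  # homologous temperature
          return ((A + B * strain ** n)
                  * (1.0 + C * math.log(strain_rate / eps0))
                  * (1.0 - t_star ** m))

      # flow stress at typical hard-machining conditions (assumed values)
      print(f"{johnson_cook(strain=1.2, strain_rate=1e4, T=800.0) / 1e6:.0f} MPa")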

  19. Numerical analysis of cavitation within slanted axial-flow pump

    Institute of Scientific and Technical Information of China (English)

    张睿; 陈红勋

    2013-01-01

    In this paper, the cavitating flow within a slanted axial-flow pump is investigated numerically. The hydraulic and cavitation performance of the slanted axial-flow pump under different operating conditions is estimated. Compared with the experimental hydraulic performance curves, the numerical results show that the filter-based model is better than the standard k-ε model at predicting the hydraulic performance parameters. In the cavitation simulation, comparison with the experimental results shows that the proposed numerical method has good predictive ability. Under different cavitation conditions, the internal cavitating flow fields within the slanted axial-flow pump are investigated. Compared with flow visualization results, the major internal flow features can be effectively captured. In order to explore the origin of the cavitation performance breakdown, the Boundary Vorticity Flux (BVF) is introduced to diagnose the cavitating flow fields. The analysis indicates that the cavitation performance drop is related to the instability of the cavitating flow on the blade suction surface.

  20. Effects of momentum conservation on the analysis of anisotropic flow

    Energy Technology Data Exchange (ETDEWEB)

    Borghini, N.; Dinh, P.M.; Ollitrault, J.-Y.; Poskanzer, A.M.; Voloshin, S.A.

    2002-02-05

    We present a general method for taking into account correlations due to momentum conservation in the analysis of anisotropic flow. Momentum conservation mostly affects the first harmonic in azimuthal distributions, i.e., directed flow. It also modifies higher harmonics, for instance elliptic flow, when they are measured with respect to a first harmonic event plane such as one determined with the standard transverse momentum method. Our method is illustrated by application to NA49 data on pion directed flow.
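
    For orientation, the sketch below shows the standard event-plane estimate of directed flow v1 that such corrections apply to; the momentum-conservation correction itself is not implemented, and the toy event is sampled from an assumed azimuthal distribution:

      # Event-plane estimate of v1 on toy data (no momentum-conservation
      # correction, no event-plane resolution correction).
      import numpy as np

      rng = np.random.default_rng(1)

      def event_plane(phi, n=1):
          """Event-plane angle from the Q-vector for harmonic n."""
          return np.arctan2(np.sin(n * phi).sum(), np.cos(n * phi).sum()) / n

      def vn_obs(phi, psi, n=1):
          """Observed v_n = <cos n(phi - Psi_n)>."""
          return np.cos(n * (phi - psi)).mean()

      # toy event: dN/dphi ~ 1 + 2*v1*cos(phi) with v1 = 0.05, via rejection sampling
      v1_true, size = 0.05, 20000
      phi = rng.uniform(-np.pi, np.pi, 4 * size)
      keep = rng.uniform(0, 1 + 2 * v1_true, phi.size) < 1 + 2 * v1_true * np.cos(phi)
      phi = phi[keep][:size]

      psi1 = event_plane(phi, n=1)
      print(f"v1 (observed) = {vn_obs(phi, psi1, n=1):.3f}")  # near 0.05, biased by autocorrelation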

  1. Procrustes rotation as a diagnostic tool for projection pursuit analysis.

    Science.gov (United States)

    Wentzell, Peter D; Hou, Siyuan; Silva, Carolina Santos; Wicks, Chelsi C; Pimentel, Maria Fernanda

    2015-06-02

    Projection pursuit (PP) is an effective exploratory data analysis tool because it optimizes the projection of high dimensional data using distributional characteristics rather than variance or distance metrics. The recent development of fast and simple PP algorithms based on minimization of kurtosis for clustering data has made this powerful tool more accessible, but under conditions where the sample-to-variable ratio is small, PP fails due to opportunistic overfitting of random correlations to limiting distributional targets. Therefore, some kind of variable compression or data regularization is required in these cases. However, this introduces an additional parameter whose optimization is manually time consuming and subject to bias. The present work describes the use of Procrustes analysis as a diagnostic tool that can be used to evaluate the results of PP analysis in an efficient manner. Through Procrustes rotation, the similarity of different PP projections can be examined in an automated fashion with "Procrustes maps" to establish regions of stable projections as a function of the parameter to be optimized. The application of this diagnostic is demonstrated using principal components analysis to compress FTIR spectra from ink samples of ten different brands of pen, and also in conjunction with regularized PP for soybean disease classification.
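
    The diagnostic can be pictured as a matrix of pairwise Procrustes disparities between score configurations. The sketch below uses scipy's procrustes on random stand-in projections rather than real PP output:

      # Pairwise Procrustes disparities as a "Procrustes map" stand-in.
      import numpy as np
      from scipy.spatial import procrustes

      rng = np.random.default_rng(0)
      scores = [rng.normal(size=(60, 2)) for _ in range(4)]          # stand-in projections
      scores[1] = scores[0] @ np.array([[0.0, -1.0], [1.0, 0.0]])    # pure rotation of #0

      n = len(scores)
      disparity_map = np.zeros((n, n))
      for i in range(n):
          for j in range(n):
              _, _, d = procrustes(scores[i], scores[j])  # 0 = identical up to similarity
              disparity_map[i, j] = d

      print(disparity_map.round(3))  # expect ~0 between projections 0 and 1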

  2. AstroStat - A VO Tool for Statistical Analysis

    CERN Document Server

    Kembhavi, Ajit K; Kale, Tejas; Jagade, Santosh; Vibhute, Ajay; Garg, Prerak; Vaghmare, Kaustubh; Navelkar, Sharmad; Agrawal, Tushar; Nandrekar, Deoyani; Shaikh, Mohasin

    2015-01-01

    AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards, thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu-driven interface. Behind the scenes, all analysis is done using the public domain statistical software R, and the output returned is presented in a neatly formatted form to the user. The available analyses include exploratory tests, visualizations, distribution fitting, correlation and causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features: as a web service that can be run using any standard browser, and as an offline application. AstroStat will provide an easy-to-use interface which can allow for both fetching data and performing powerful statistical analysis on ...

  3. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.

  4. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA); this is an area of constant intraplate seismicity and non-orogenic active tectonics, and it exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area with DEM coverage.
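
    A toy reduction of the underlying idea, flagging profile locations where the local downstream slope greatly exceeds the least-squares trend of the whole profile; the synthetic profile and threshold are assumptions, not the tool's actual algorithm:

      # Knickpoint flagging on a synthetic longitudinal stream profile.
      import numpy as np

      def find_knickpoints(distance, elevation, factor=3.0):
          slope = -np.gradient(elevation, distance)        # local downstream slope
          trend = -np.polyfit(distance, elevation, 1)[0]   # mean profile slope
          return np.where(slope > factor * trend)[0]

      d = np.linspace(0, 10_000, 200)                      # distance downstream [m]
      z = 500 * np.exp(-d / 6000)                          # smooth concave profile [m]
      z[120:] -= 25                                        # 25 m step: a knickpoint
      print(find_knickpoints(d, z))                        # indices adjacent to the step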

  5. TMVA - Tool-kit for Multivariate Data Analysis in ROOT

    Energy Technology Data Exchange (ETDEWEB)

    Therhaag, Jan; Von Toerne, Eckhard [Univ. Bonn, Physikalisches Institut, Nussallee 12, 53115 Bonn (Germany); Hoecker, Andreas; Speckmayer, Peter [European Organization for Nuclear Research - CERN, CH-1211 Geneve 23 (Switzerland); Stelzer, Joerg [Deutsches Elektronen-Synchrotron - DESY, Platanenallee 6, D-15738 Zeuthen (Germany); Voss, Helge [Max-Planck-Institut fuer Kernphysik - MPI, Postfach 10 39 80, Saupfercheckweg 1, DE-69117 Heidelberg (Germany)

    2010-07-01

    Given the ever-increasing complexity of modern HEP data analysis, multivariate analysis techniques have proven an indispensable tool in extracting the most valuable information from the data. TMVA, the Tool-kit for Multivariate Data Analysis, provides a large variety of advanced multivariate analysis techniques for both signal/background classification and regression problems. In TMVA, all methods are embedded in a user-friendly framework capable of handling the pre-processing of the data as well as the evaluation of the results, thus allowing for a simple use of even the most sophisticated multivariate techniques. Convenient assessment and comparison of different analysis techniques enable the user to choose the most efficient approach for any particular data analysis task. TMVA is an integral part of the ROOT data analysis framework and is widely-used in the LHC experiments. In this talk I will review recent developments in TMVA, discuss typical use-cases in HEP and present the performance of our most important multivariate techniques on example data by comparing it to theoretical performance limits. (authors)

  6. Parametric and experimental analysis using a power flow approach

    Science.gov (United States)

    Cuschieri, J. M.

    1990-01-01

    A structural power flow approach for the analysis of structure-borne transmission of vibrations is used to analyze the influence of structural parameters on transmitted power. The parametric analysis is also performed using the Statistical Energy Analysis approach, and the results are compared with those obtained using the power flow approach. The advantages of structural power flow analysis are demonstrated by comparing the types of results that are obtained by the two analytical methods. Also, to demonstrate that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental study of structural power flow is presented. This experimental study presents results for an L-shaped beam for which a solution was already available. Various methods to measure vibrational power flow are compared to study their advantages and disadvantages.

  7. Empirical analysis of heterogeneous traffic flow

    NARCIS (Netherlands)

    Ambarwati, L.; Pel, A.J.; Verhaeghe, R.J.; Van Arem, B.

    2013-01-01

    Traffic flow in many developing countries is strongly mixed, comprising vehicle types such as motorcycles, cars, (mini) buses, and trucks; furthermore, traffic flow typically exhibits free inter-lane exchanges. This phenomenon causes complex vehicle interactions, rendering most existing traffic flo...

  8. ANALYSIS OF TRANSONIC FLOW PAST CUSPED AIRFOILS

    Directory of Open Access Journals (Sweden)

    Jiří Stodůlka

    2015-06-01

    Transonic flow past two cusped airfoils is solved numerically, and the results are analyzed in terms of flow behavior and oblique shock formation. The regions around the sharp trailing edges are studied in detail, and the parameters of the shock waves are computed and compared using the classical shock polar approach and verified by reduction parameters for symmetric configurations.
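
    The classical shock polar mentioned above rests on the theta-beta-M oblique-shock relation. A minimal sketch (perfect gas, assumed gamma = 1.4 and an illustrative Mach number):

      # theta-beta-M relation for oblique shocks; M1 and gamma are assumptions.
      import math

      def deflection_angle(M1, beta, gamma=1.4):
          """Flow deflection angle theta [rad] for shock angle beta [rad]."""
          num = M1**2 * math.sin(beta)**2 - 1.0
          den = M1**2 * (gamma + math.cos(2.0 * beta)) + 2.0
          return math.atan(2.0 / math.tan(beta) * num / den)

      # sweep the shock polar for M1 = 1.4 (transonic trailing-edge regime)
      M1 = 1.4
      for beta_deg in range(46, 90, 10):
          beta = math.radians(beta_deg)
          print(f"beta = {beta_deg:2d} deg -> theta = "
                f"{math.degrees(deflection_angle(M1, beta)):5.2f} deg")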

  9. Effective methods for cash flow analysis.

    Science.gov (United States)

    Sylvestre, J; Urbancic, F R

    1994-07-01

    This article discusses techniques that healthcare financial managers can use to interpret and evaluate information from the statement of cash flows for more effective financial decision-making. The use of these techniques as a basis for systematically planning and controlling cash flows has the potential to benefit all healthcare organizations.

  10. International Trade Modelling Using Open Flow Networks: A Flow-Distance Based Analysis.

    Science.gov (United States)

    Shen, Bin; Zhang, Jiang; Li, Yixiao; Zheng, Qiuhua; Li, Xingsen

    2015-01-01

    This paper models and analyzes international trade flows using open flow networks (OFNs) with flow-distance approaches, which provide a novel perspective and effective tools for the study of international trade. We discuss the establishment of OFNs of international trade from two coupled viewpoints: that of trading commodity flow and that of money flow. Based on this model with flow-distance approaches, meaningful insights are gained. First, by introducing the concepts of trade trophic levels and niches, countries' roles and positions in the global supply chains (or value-added chains) can be evaluated quantitatively. We find that the distributions of trading "trophic levels" have a similar clustering pattern for different types of commodities, and we summarize some regularities between the money flow and commodity flow viewpoints. Second, we find that active and competitive countries trade a wide spectrum of products, while inactive and underdeveloped countries trade a limited variety of products. In addition, some atypical countries import many types of goods that the vast majority of countries do not need to import. Third, harmonic node centrality is proposed, and we find the phenomenon of centrality stratification. All the results illustrate the usefulness of the OFN model, with its network approaches, for investigating international trade flows.
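
    The trophic-level idea can be sketched as a linear solve on a column-normalized flow matrix: each node sits one level above the flow-weighted average of its suppliers. The 4-node network below is a toy stand-in for a trade network:

      # Trophic levels on an open flow network via a linear system.
      import numpy as np

      # F[i, j] = flow from node i to node j; nodes with no inflow sit at level 1
      F = np.array([[0.0, 8.0, 2.0, 0.0],
                    [0.0, 0.0, 6.0, 3.0],
                    [0.0, 0.0, 0.0, 5.0],
                    [0.0, 0.0, 0.0, 0.0]])

      inflow = F.sum(axis=0)
      W = np.divide(F, inflow, out=np.zeros_like(F), where=inflow > 0)  # column-normalized
      # levels L solve L = 1 + W^T L  =>  (I - W^T) L = 1
      levels = np.linalg.solve(np.eye(len(F)) - W.T, np.ones(len(F)))
      print(levels.round(2))  # source node ~1, downstream nodes higher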

  11. Analysis of ETMS Data Quality for Traffic Flow Management Decisions

    Science.gov (United States)

    Chatterji, Gano B.; Sridhar, Banavar; Kim, Douglas

    2003-01-01

    The data needed for air traffic flow management decision support tools is provided by the Enhanced Traffic Management System (ETMS). This includes both the tools that are in current use and the ones being developed for future deployment. Since the quality of decision support provided by all these tools will be influenced by the quality of the input ETMS data, an assessment of ETMS data quality is needed. Motivated by this desire, ETMS data quality is examined in this paper in terms of the unavailability of flight plans, deviation from the filed flight plans, departure delays, altitude errors and track data drops. Although many of these data quality issues are not new, little is known about their extent. A goal of this paper is to document the magnitude of data quality issues supported by numerical analysis of ETMS data. Guided by this goal, ETMS data for a 24-hour period were processed to determine the number of aircraft with missing flight plan messages at any given instant of time. Results are presented for aircraft above 18,000 feet altitude and also at all altitudes. Since deviation from filed flight plan is also a major cause of trajectory-modeling errors, statistics of deviations are presented. Errors in proposed departure times and ETMS-generated vertical profiles are also shown. A method for conditioning the vertical profiles for improving demand prediction accuracy is described. Graphs of actual sector counts obtained using these vertical profiles are compared with those obtained using the Host data for sectors in the Fort Worth Center to demonstrate the benefit of preprocessing. Finally, results are presented to quantify the extent of data drops. A method for propagating track positions during ETMS data drops is also described.

  13. Multiple correspondence analysis as a tool for analysis of large ...

    African Journals Online (AJOL)

  14. Physics Analysis Tools for the CMS experiment at LHC

    CERN Document Server

    Fabozzi, Francesco; Hegner, Benedikt; Lista, Luca

    2008-01-01

    The CMS experiment is expected to start data taking during 2008, and large data samples, on the petabyte scale, will be produced each year. The CMS Physics Tools package provides the CMS physicist with a powerful and flexible software layer for analysis of these huge datasets that is well integrated in the CMS experiment software. A core part of this package is the Candidate Model, providing a coherent interface to different types of data. Standard tasks such as combinatorial analyses, generic cuts, MC truth matching and constrained fitting are supported. Advanced template techniques enable the user to add missing features easily. We explain the underlying model and certain details of the implementation, and we present some use cases showing how the tools are currently used in generator and full simulation studies in preparation for the analysis of real data.

  15. SmashCommunity: A metagenomic annotation and analysis tool

    DEFF Research Database (Denmark)

    Arumugam, Manimozhiyan; Harrington, Eoghan D; Foerstner, Konrad U

    2010-01-01

    SUMMARY: SmashCommunity is a stand-alone metagenomic annotation and analysis pipeline suitable for data from Sanger and 454 sequencing technologies. It supports state-of-the-art software for essential metagenomic tasks such as assembly and gene prediction. It provides tools to estimate the quantitative phylogenetic and functional compositions of metagenomes, to compare the compositions of multiple metagenomes and to produce intuitive visual representations of such analyses. AVAILABILITY: SmashCommunity is freely available at http://www.bork.embl.de/software/smash CONTACT: bork@embl.de

  16. SABRE: A Tool for Stochastic Analysis of Biochemical Reaction Networks

    CERN Document Server

    Didier, Frederic; Mateescu, Maria; Wolf, Verena

    2010-01-01

    The importance of stochasticity within biological systems has been shown repeatedly during the last years and has raised the need for efficient stochastic tools. We present SABRE, a tool for stochastic analysis of biochemical reaction networks. SABRE implements fast adaptive uniformization (FAU), a direct numerical approximation algorithm for computing transient solutions of biochemical reaction networks. Biochemical reaction networks represent biological systems studied at a molecular level, and their reactions can be modeled as transitions of a Markov chain. SABRE accepts as input the formalism of guarded commands, which it interprets either as continuous-time or as discrete-time Markov chains. Besides operating in a stochastic mode, SABRE may also perform a deterministic analysis by directly computing a mean-field approximation of the system under study. We illustrate the different functionalities of SABRE by means of biological case studies.
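
    For orientation, the sketch below implements plain (non-adaptive) uniformization, the family of transient-analysis methods that FAU accelerates; the 3-state generator is a toy example, not a reaction network, and the truncation rule is deliberately crude:

      # Uniformization: pi(t) = sum_k Poisson(L*t; k) * pi0 * P^k, P = I + Q/L.
      import numpy as np
      from math import exp

      def uniformization(Q, pi0, t, eps=1e-10):
          L = max(-Q.diagonal())                 # uniformization rate Lambda
          P = np.eye(len(Q)) + Q / L             # uniformized DTMC
          term = exp(-L * t)                     # Poisson(L*t; 0)
          v, result, k = pi0.copy(), term * pi0, 0
          while term > eps or k < L * t:         # crude truncation criterion
              k += 1
              term *= L * t / k                  # Poisson(L*t; k), recursively
              v = v @ P
              result += term * v
          return result

      Q = np.array([[-2.0, 2.0, 0.0],
                    [1.0, -3.0, 2.0],
                    [0.0, 1.0, -1.0]])           # toy generator (rows sum to 0)
      pi0 = np.array([1.0, 0.0, 0.0])
      print(uniformization(Q, pi0, t=2.0).round(4))  # transient distribution, sums to ~1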

  17. A Performance Analysis Tool for PVM Parallel Programs

    Institute of Scientific and Technical Information of China (English)

    Chen Wang; Yin Liu; Changjun Jiang; Zhaoqing Zhang

    2004-01-01

    In this paper, we introduce the design and implementation of ParaVT, a visual performance analysis and parallel debugging tool. In ParaVT, we propose an automated instrumentation mechanism. Based on this mechanism, ParaVT automatically analyzes the performance bottlenecks of parallel applications and provides a visual user interface to monitor and analyze the performance of parallel programs. In addition, it supports certain extensions.

  18. Ethics Auditing and Conflict Analysis as Management Tools

    OpenAIRE

    2008-01-01

    This paper deals with management tools like conflict analysis and ethics auditing. Ethics auditing is understood as an opportunity and agreement to devise a system to inform on ethical corporate behaviour. This system essentially aims to increase the transparency and credibility of a company's commitment to ethics. At the same time, the process of elaborating this system allows us to introduce the moral dimension into the company's actions and decisions, thereby completing a key dimension of ...

  19. COMPARISON OF MALAYSIA MANUFACTURING COMPANIES BY FINANCIAL STATEMENT ANALYSIS TOOLS

    OpenAIRE

    MALEK, Afagh; Mohammadi, Maryam; NASSIRI, Fardokht

    2012-01-01

    One of the best ways to get the expected results from trading in the stock market is to acquire a good evaluation of companies' performance. Accordingly, this study aims at comparing the financial performance of the Lb Aluminium Berhad and Seal Incorporated Berhad manufacturing companies, which are listed in the main market of the Malaysian stock exchange. The data were gathered from the annual reports of the companies during the last three years and analysed with financial statement analysis tools, which are ...

  20. In silico tools for the analysis of antibiotic biosynthetic pathways

    DEFF Research Database (Denmark)

    Weber, Tilmann

    2014-01-01

    Natural products of bacteria and fungi are the most important source for antimicrobial drug leads. For decades, such compounds were exclusively found by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies only recently allowed the development of novel...... and tools are crucial for genome mining. In this review, a comprehensive overview is given on programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data....

  1. THRSTER: A THRee-STream Ejector Ramjet Analysis and Design Tool

    Science.gov (United States)

    Chue, R. S.; Sabean, J.; Tyll, J.; Bakos, R. J.

    2000-01-01

    An engineering tool for analyzing ejectors in rocket based combined cycle (RBCC) engines has been developed. A key technology for multi-cycle RBCC propulsion systems is the ejector, which functions as the compression stage of the ejector ramjet cycle. The THRee STream Ejector Ramjet analysis tool was developed to analyze the complex aerothermodynamic and combustion processes that occur in this device. The formulated model consists of three quasi-one-dimensional streams, one each for the ejector primary flow, the secondary flow, and the mixed region. The model space-marches through the mixer, combustor, and nozzle to evaluate the solution along the engine. In its present form, the model is intended for an analysis mode in which the diffusion rates of the primary and secondary streams into the mixed stream are stipulated. The model offers the ability to analyze the highly two-dimensional ejector flowfield while still benefiting from the simplicity and speed of an engineering tool. To validate the developed code, wall static pressure measurements from the Penn State and NASA-ART RBCC experiments were compared with the results generated by the code. The calculated solutions were generally found to be in satisfactory agreement with the pressure measurements along the engines, although further modeling effort may be required when a strong shock train is formed at the rocket exhaust. The range of parameters in which the code generates valid results is presented and discussed.

  2. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    Science.gov (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System could provide the foundation for a large-area crop-surveillance system that could identify

  3. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
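
    The reduced-form idea can be sketched in a few lines: sample many scenarios, run the expensive model on each, and fit a single regression that users evaluate instantly. The "CGE model" below is a stand-in function with invented variables, not E-CAT's actual model:

      # Reduced-form regression fitted to synthetic runs of a stand-in model.
      import numpy as np

      rng = np.random.default_rng(42)

      def cge_model(magnitude, duration, resilience):
          """Stand-in for an expensive CGE run returning GDP loss in $B."""
          return (5.0 * magnitude + 2.0 * duration
                  - 3.0 * resilience * magnitude + rng.normal(0, 0.5))

      # 1) generate synthetic data from many simulated scenarios
      X = rng.uniform([1, 1, 0], [10, 30, 1], size=(500, 3))
      y = np.array([cge_model(*row) for row in X])

      # 2) estimate the reduced-form equation by ordinary least squares
      A = np.column_stack([X, X[:, 0] * X[:, 2], np.ones(len(X))])  # with interaction
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)

      # 3) rapid turnaround: evaluate the fitted equation, not the CGE model
      scenario = np.array([7.0, 14.0, 0.4])
      features = np.append(scenario, [scenario[0] * scenario[2], 1.0])
      print(f"approx. GDP loss: ${features @ coef:.1f}B")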

  4. The RUMBA software: tools for neuroimaging data analysis.

    Science.gov (United States)

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  5. Fractal analysis of flow of the river Warta

    Science.gov (United States)

    Radziejewski, Maciej; Kundzewicz, Zbigniew W.

    1997-12-01

    A long time series (170 years) of daily flows of the river Warta (Poland) is subjected to fractal analysis. A binary variable (renewal stream) illustrating excursions of the process of flow is examined. The raw series is subjected to de-seasonalization and normalization. Fractal dimensions of crossings of Warta flows are determined using a novel variant of the box-counting method. Temporal variability of the flow process is studied by determining fractal dimensions for shifted horizons of 10 or 30 years in length. Spectral properties are compared between the time series of flows and the fractional Brownian motion, which describes both the fractal structure of the process and the Hurst phenomenon. The approach may be useful in further studies of the non-stationarity of the process of flow, analysis of extreme hydrological events, and synthetic flow generation.
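
    The box-counting computation generalizes readily. The sketch below is a generic implementation (not the authors' novel variant), estimating the box-counting dimension of a one-dimensional point set such as the threshold-crossing times of a de-seasonalized flow series; the crossing times themselves are synthetic.

        # Generic box-counting dimension estimate for a 1-D point set.
        import numpy as np

        rng = np.random.default_rng(1)
        t = np.sort(rng.random(2000))      # hypothetical crossing times in [0, 1)

        sizes = 2.0 ** -np.arange(1, 10)   # box widths 1/2, 1/4, ..., 1/512
        counts = [len(np.unique(np.floor(t / s))) for s in sizes]  # non-empty boxes

        # Slope of log N(s) against log(1/s) estimates the dimension.
        slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
        print(f"estimated box-counting dimension: {slope:.2f}")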

  6. Analysis of Stokes flow through periodic permeable tubules

    Directory of Open Access Journals (Sweden)

    A.M. Siddiqui

    2017-03-01

    Full Text Available This article reports the detailed analysis of the Stokes flow through permeable tubes. The objective of this investigation was to search for exact solutions to the Stokes flow and thereby observe the effects on radial flow component, provided the permeability on the tubular surface is an elementary trigonometric function. Mathematical expressions for the pressure distribution, velocity components, volume flux, average wall shear stress and leakage flux are presented explicitly. Graphical analysis of the fluid flow is presented for a set of parametric values. Important conclusions are drawn for Stokes flow through tubes with low as well as high permeability. The classical Poiseuille flow is presented as a limiting case of this immense study of Stokes flow.

  7. Mean flow stability analysis of oscillating jet experiments

    CERN Document Server

    Oberleithner, Kilian; Soria, Julio

    2014-01-01

    Linear stability analysis is applied to the mean flow of an oscillating round jet with the aim of investigating the robustness and accuracy of mean flow stability wave models. The jet's axisymmetric mode is excited at the nozzle lip through a sinusoidal modulation of the flow rate at amplitudes ranging from 0.1 % to 100 %. The instantaneous flow field is measured via particle image velocimetry and decomposed into a mean and periodic part utilizing proper orthogonal decomposition. Local linear stability analysis is applied to the measured mean flow adopting a weakly nonparallel flow approach. The resulting global perturbation field is carefully compared to the measurements in terms of spatial growth rate, phase velocity, and phase and amplitude distribution. It is shown that the stability wave model accurately predicts the excited flow oscillations during their entire growth phase and during a large part of their decay phase. The stability wave model applies over a wide range of forcing amplitudes, showing no pr...
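
    The mean/periodic split via proper orthogonal decomposition reduces to a singular value decomposition of the snapshot matrix. In the sketch below, a random matrix stands in for the PIV velocity snapshots, and the leading mode pair approximates the periodic part; sizes and mode counts are arbitrary assumptions.

        # Snapshot POD via economy-size SVD (illustrative data only).
        import numpy as np

        rng = np.random.default_rng(2)
        n_points, n_snapshots = 4096, 200              # flattened field, time steps
        U = rng.normal(size=(n_points, n_snapshots))   # stand-in for PIV snapshots

        mean_flow = U.mean(axis=1, keepdims=True)
        fluctuations = U - mean_flow

        # Columns of phi are spatial POD modes; s weights their energy.
        phi, s, vt = np.linalg.svd(fluctuations, full_matrices=False)
        energy = s**2 / np.sum(s**2)
        print("energy captured by first 3 modes:", np.round(energy[:3], 3))

        # Periodic part approximated by reconstruction from the leading mode pair.
        periodic = phi[:, :2] @ np.diag(s[:2]) @ vt[:2, :]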

  8. Judo match analysis,a powerful coaching tool, basic and advanced tools

    CERN Document Server

    Sacripanti, A

    2013-01-01

    In this second paper on match analysis, we analyze the competition steps in depth, showing the evolution of this tool at the National Federation level on the basis of our first classification. Match analysis is the most important source of technical assessment, and studying competition with this tool is essential for coaches, who can obtain useful information for their coaching from it. Match analysis is today the master key in situation sports like judo, usefully supporting the difficult task of the coach, above all for national or Olympic coaching teams. This paper presents a deeper study of judo competitions at a high level, from both the male and female points of view, explaining, in light of biomechanics, not only the evolution of throws over time and the introduction of innovative and chaotic techniques, but also the evolution of fighting style in these high-level competitions, both connected with the growth of this Olympic sport in the world arena. It is shown how new interesting ways are opened by this powerful coac...

  9. ANALYSIS OF DEBRIS FLOW BEHAVIOR USING AIRBORNE LIDAR AND IMAGE DATA

    Directory of Open Access Journals (Sweden)

    G. Kim

    2016-06-01

    Full Text Available The frequency of debris flow events caused by severe rainstorms has increased in Korea. LiDAR provides high-resolution topographical data that can represent the land surface more effectively than other methods. This study describes the analysis of geomorphologic changes using digital surface models derived from airborne LiDAR and aerial image data acquired before and after a debris flow event in the southern part of Seoul, South Korea in July 2011. During this event, 30 houses were buried, 116 houses were damaged, and 22 human casualties were reported. Longitudinal and cross-sectional profiles of the debris flow path reconstructed from digital surface models were used to analyze debris flow behaviors such as landslide initiation, transport, erosion, and deposition. LiDAR technology integrated with GIS is a very useful tool for understanding debris flow behavior.

  10. Analysis of Debris Flow Behavior Using Airborne LIDAR and Image Data

    Science.gov (United States)

    Kim, G.; Yune, C. Y.; Paik, J.; Lee, S. W.

    2016-06-01

    The frequency of debris flow events caused by severe rainstorms has increased in Korea. LiDAR provides high-resolution topographical data that can represent the land surface more effectively than other methods. This study describes the analysis of geomorphologic changes using digital surface models derived from airborne LiDAR and aerial image data acquired before and after a debris flow event in the southern part of Seoul, South Korea in July 2011. During this event, 30 houses were buried, 116 houses were damaged, and 22 human casualties were reported. Longitudinal and cross-sectional profiles of the debris flow path reconstructed from digital surface models were used to analyze debris flow behaviors such as landslide initiation, transport, erosion, and deposition. LiDAR technology integrated with GIS is a very useful tool for understanding debris flow behavior.
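
    The geomorphic-change computation at the core of such studies is a surface-model difference. A minimal sketch follows, with synthetic elevation grids standing in for the pre- and post-event LiDAR digital surface models, an assumed 1 m cell size, and an assumed 0.5 m change-detection threshold; real data would be read with a raster library.

        # DSM differencing sketch: negative change = erosion, positive = deposition.
        import numpy as np

        rng = np.random.default_rng(3)
        dsm_before = rng.normal(100.0, 5.0, size=(500, 500))            # elevations, m
        dsm_after = dsm_before + rng.normal(0.0, 0.3, size=(500, 500))

        dz = dsm_after - dsm_before
        cell_area = 1.0 * 1.0                    # assumed 1 m grid spacing

        erosion_volume = -dz[dz < -0.5].sum() * cell_area    # cut, m^3
        deposition_volume = dz[dz > 0.5].sum() * cell_area   # fill, m^3
        print(f"erosion: {erosion_volume:.0f} m^3, "
              f"deposition: {deposition_volume:.0f} m^3")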

  11. From continuous flow analysis to programmable Flow Injection techniques. A history and tutorial of emerging methodologies.

    Science.gov (United States)

    Ruzicka, Jaromir Jarda

    2016-09-01

    Automation of reagent-based assays, also known as Flow Analysis, is based on sample processing in which a sample flows towards and through a detector for monitoring of its components. The Achilles heel of this methodology is that the majority of FA techniques use constant continuous forward flow to transport the sample - an approach that continually consumes reagents and generates chemical waste. Therefore, the purpose of this report is to highlight recent developments in flow programming that not only save reagents, but also lead, by means of advanced sample processing, to selective and sensitive assays based on stop-flow measurement. Flow programming combined with a novel approach to data harvesting yields a new approach to single-standard calibration and avoids interference caused by refractive index. Finally, flow programming is useful for sample preparation, such as rapid, extensive sample dilution. The principles are illustrated by selected references to an available online tutorial: http://www.flowinjectiontutorial.com/.

  12. Progress toward the analysis of complex propulsion installation flow phenomenon

    Science.gov (United States)

    Kern, P. R. A.; Hopcroft, R. G.

    1983-01-01

    A trend toward replacement of parametric model testing with parametric analysis for the design of aircraft is driven by the rapidly escalating cost of wind tunnel testing, the increasing availability of large fast computers, and powerful numerical flow algorithms. For the complex flow phenomena characteristic of propulsion installations, it is now necessary to employ both parametric analysis and testing in design procedures. Powerful flow analysis techniques are available to predict local flow phenomena, but employing them is very expensive. It is, therefore, necessary to link these analyses with less powerful and less expensive procedures for an accurate analysis of propulsion installation flowfields. The interfacing and coupling processes needed, however, are not yet available. The present investigation is concerned with progress made in the development of suitable linking methods. Attention is given to methods of analysis for predicting the flow around a nacelle coupled to a highly swept wing.

  13. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately, there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses Federal metering data analysis needs and some existing tools.

  14. Shot planning and analysis tools on the NIF project

    Energy Technology Data Exchange (ETDEWEB)

    Beeler, R. [Lawrence Livermore National Laboratory, Livermore, CA (United States); Casey, A., E-mail: casey20@llnl.gov [Lawrence Livermore National Laboratory, Livermore, CA (United States); Conder, A.; Fallejo, R.; Flegel, M.; Hutton, M.; Jancaitis, K.; Lakamsani, V.; Potter, D.; Reisdorf, S.; Tappero, J.; Whitman, P.; Carr, W.; Liao, Z. [Lawrence Livermore National Laboratory, Livermore, CA (United States)

    2012-12-15

    Highlights: • Target shots in NIF, dozens a month, vary widely in laser and target configuration. • A planning tool helps select shot sequences that optimize valuable facility time. • Fabrication and supply of targets, diagnostics, etc. are integrated into the plan. • Predictive modeling of aging parts (e.g., optics) aids maintenance decision support. • We describe the planning/analysis tool and its use in NIF experimental operations. - Abstract: Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high over-all operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface to selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Campaign Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modeling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics

  15. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    CERN Document Server

    Battaglieri, M; Celentano, A; Chung, S -U; D'Angelo, A; De Vita, R; Döring, M; Dudek, J; Eidelman, S; Fegan, S; Ferretti, J; Fox, G; Galata, G; Garcia-Tecocoatzi, H; Glazier, D I; Grube, B; Hanhart, C; Hoferichter, M; Hughes, S M; Ireland, D G; Ketzer, B; Klein, F J; Kubis, B; Liu, B; Masjuan, P; Mathieu, V; McKinnon, B; Mitchell, R; Nerling, F; Paul, S; Pelaez, J R; Rademacker, J; Rizzo, A; Salgado, C; Santopinto, E; Sarantsev, A V; Sato, T; Schlüter, T; da Silva, M L L; Stankovic, I; Strakovsky, I; Szczepaniak, A; Vassallo, A; Walford, N K; Watts, D P; Zana, L

    2014-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third meeting, ATHOS3, is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near...

  16. CLUSTERING ANALYSIS OF DEBRIS-FLOW STREAMS

    Institute of Scientific and Technical Information of China (English)

    Yuan-Fan TSAI; Huai-Kuang TSAI; Cheng-Yan KAO

    2004-01-01

    The Chi-Chi earthquake in 1999 caused disastrous landslides, which triggered numerous debris flows and killed hundreds of people. A critical rainfall intensity line for each debris-flow stream is studied to prevent such a disaster. However, setting rainfall lines from incomplete data is difficult, so this study considered eight critical factors to group streams, such that streams within a cluster have similar rainfall lines. A genetic algorithm is applied to group 377 debris-flow streams selected from the center of an area affected by the Chi-Chi earthquake. These streams are grouped into seven clusters with different characteristics. The results reveal that the proposed method effectively groups debris-flow streams.
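
    As a rough illustration of the grouping step, the sketch below clusters 377 synthetic eight-factor feature vectors into seven groups. Note that it substitutes k-means for the genetic algorithm actually used in the paper, purely to show the data flow from factor matrix to cluster labels.

        # Stand-in clustering of debris-flow streams (k-means, not the paper's GA).
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(4)
        features = rng.random((377, 8))   # 377 streams x 8 critical factors (synthetic)

        X = StandardScaler().fit_transform(features)
        labels = KMeans(n_clusters=7, n_init=10, random_state=0).fit_predict(X)
        print("streams per cluster:", np.bincount(labels))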

  17. ECCS flow verification to support transient analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kovach, C.; Jacobs, R.H.; Ballard, J.E. [Commonwealth Edison Co., Chicago, IL (United States). Nuclear Fuel Services Dept.

    1994-12-31

    The RETRAN code has been used to develop a model of the Emergency Core Cooling System (ECCS). The model was developed in order to provide conservative injection flow data to be used in various LOCA and non-LOCA analyses and evaluations and to ensure that ECCS pump runout does not occur. The analyses were also needed in order to address a number of ECCS performance issues identified by Westinghouse. These issues include how previous analyses modeled miniflow, RCP seal injection, ECCS branch line resistance, pump suction boost during recirculation, injection line flow imbalances, and, of particular importance, ECCS flow measurement inaccuracies. In turn, these issues directly impact pump runout concerns, Technical Specification verification, and ECCS injection flow during transient conditions. The RETRAN ECCS model has proven to be quite versatile, easy to use, and requires only minimal information about the physical construction and performance of the ECCS system.

  18. Stereo Scene Flow for 3D Motion Analysis

    CERN Document Server

    Wedel, Andreas

    2011-01-01

    This book presents methods for estimating optical flow and scene flow motion with high accuracy, focusing on the practical application of these methods in camera-based driver assistance systems. Clearly and logically structured, the book builds from basic themes to more advanced concepts, culminating in the development of a novel, accurate and robust optic flow method. Features: reviews the major advances in motion estimation and motion analysis, and the latest progress of dense optical flow algorithms; investigates the use of residual images for optical flow; examines methods for deriving mot

  19. Finite element analysis of inviscid subsonic boattail flow

    Science.gov (United States)

    Chima, R. V.; Gerhart, P. M.

    1981-01-01

    A finite element code for analysis of inviscid subsonic flows over arbitrary nonlifting planar or axisymmetric bodies is described. The code solves a novel primitive variable formulation of the coupled irrotationality and compressible continuity equations. Results for flow over a cylinder, a sphere, and a NACA 0012 airfoil verify the code. Computed subcritical flows over an axisymmetric boattailed afterbody compare well with finite difference results and experimental data. Iterative coupling with an integral turbulent boundary layer code shows strong viscous effects on the inviscid flow. Improvements in code efficiency and extensions to transonic flows are discussed.

  20. Flow networks analysis and optimization of repairable flow networks, networks with disturbed flows, static flow networks and reliability networks

    CERN Document Server

    Todinov, Michael T

    2013-01-01

    Repairable flow networks are a new area of research, which analyzes the repair and flow disruption caused by failures of components in static flow networks. This book addresses a gap in current network research by developing the theory, algorithms and applications related to repairable flow networks and networks with disturbed flows. The theoretical results presented in the book lay the foundations of a new generation of ultra-fast algorithms for optimizing the flow in networks after failures or congestion, and the high computational speed creates the powerful possibility of optimal control

  1. Graphical tools for network meta-analysis in STATA.

    Directory of Open Access Journals (Sweden)

    Anna Chaimani

    Full Text Available Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness, network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and the presentation of the results in a concise and understandable way are all challenging aspects of the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

  2. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  3. Graphical tools for network meta-analysis in STATA.

    Science.gov (United States)

    Chaimani, Anna; Higgins, Julian P T; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness, network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and the presentation of the results in a concise and understandable way are all challenging aspects of the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

  4. Hemodynamic flow modeling through an abdominal aorta aneurysm using data mining tools.

    Science.gov (United States)

    Filipovic, Nenad; Ivanovic, Milos; Krstajic, Damjan; Kojic, Milos

    2011-03-01

    Geometrical changes of blood vessels, called aneurysms, often occur in humans, with possibly catastrophic outcomes. The blood flow is then enormously affected, as are the hemodynamic interaction forces acting on the arterial wall. These forces are the cause of wall rupture. A mechanical quantity characteristic of the blood-wall interaction is the wall shear stress, which also has direct physiological effects on endothelial cell behavior. Therefore, it is very important to have insight into the blood flow and shear stress distribution when an aneurysm has developed, in order to help correlate the mechanical conditions with the pathogenesis of pathological changes in the blood vessels. This insight can further help in improving the prevention of cardiovascular disease evolution. Computational fluid dynamics (CFD) has generally been used as a tool to generate results for the mechanical conditions within blood vessels with and without aneurysms. However, aneurysms are very patient-specific, and reliable results from CFD analyses can be obtained only through a cumbersome and time-consuming process of computational model generation followed by huge computations. In order to make the CFD analyses efficient and suitable for future everyday clinical practice, we have employed data mining (DM) techniques. The focus was to combine the CFD and DM methods for the estimation of the wall shear stresses in an abdominal aorta aneurysm (AAA) under prescribed geometrical changes. Additionally, computing on the grid infrastructure was performed to improve efficiency, since thousands of CFD runs were needed for creating machine learning data. We used several DM techniques and found that our DM models provide good prediction of the shear stress at the AAA in comparison with full CFD model results on real patient data.
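
    The CFD-plus-data-mining idea amounts to training a fast surrogate on precomputed runs so that new cases can be scored without a full simulation. In the hypothetical sketch below, invented geometric descriptors and a synthetic wall-shear-stress response stand in for the real CFD training set.

        # Surrogate-model sketch (illustrative data, not the paper's DM models).
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(5)
        n_runs = 2000
        # Hypothetical descriptors: aneurysm diameter, length, neck angle.
        X = rng.uniform([20, 40, 0], [70, 120, 60], size=(n_runs, 3))
        wss = 2.5 - 0.03 * X[:, 0] + 0.01 * X[:, 1] + rng.normal(0, 0.1, n_runs)

        X_train, X_test, y_train, y_test = train_test_split(X, wss, random_state=0)
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(X_train, y_train)
        print(f"surrogate R^2 on held-out runs: {model.score(X_test, y_test):.3f}")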

  5. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    Science.gov (United States)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architecture and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, subsystem familiarity on the part of the intended user group and in the analysis of results is assumed. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  6. Use of Grid Tools to Support CMS Distributed Analysis

    CERN Document Server

    Fanfani, A; Anjum, A; Barrass, T; Bonacorsi, D; Bunn, J; Corvo, M; Darmenov, N; De Filippis, N; Donno, F; Donvito, G; Eulisse, G; Fanzago, F; Filine, A; Grandi, C; Hernández, J M; Innocente, V; Jan, A; Lacaprara, S; Legrand, I; Metson, S; Newman, H; Silvestris, L; Steenberg, C; Stockinger, H; Taylor, L; Thomas, M; Tuura, L; Van Lingen, F; Wildish, T

    2004-01-01

    In order to prepare the Physics Technical Design Report, due by the end of 2005, the CMS experiment needs to simulate, reconstruct and analyse about 100 million events, corresponding to more than 200 TB of data. The data will be distributed to several Computing Centres. In order to provide access to the whole data sample to all the world-wide dispersed physicists, CMS is developing a layer of software that uses the grid tools provided by the LCG project to gain access to data and resources, and that aims to provide physicists with a user-friendly interface for submitting analysis jobs. The GRID tools used are both those already available in the LCG-2 release and those being developed in the framework of the ARDA project. This work describes the current status and the future development...

  7. Measurement of anterior and posterior circulation flow contributions to cerebral blood flow. An ultrasound-derived volumetric flow analysis.

    Science.gov (United States)

    Boyajian, R A; Schwend, R B; Wolfe, M M; Bickerton, R E; Otis, S M

    1995-01-01

    Ultrasound-derived volumetric flow analysis may be useful in answering questions of basic physiological interest in the cerebrovascular circulation. Using this technique, the authors have sought to describe quantitatively the complete concurrent flow relations among all four arteries supplying the brain. The aim of this study of normal subjects was to determine the relative flow contributions of the anterior (internal carotid arteries) and posterior (vertebral arteries) cerebral circulation. Comparisons between the observed and theoretically expected anterior and posterior flow distribution would provide an opportunity to assess traditional rheological conceptions in vivo. Pulsed color Doppler ultrasonography was used to measure mean flow rates in the internal carotid and vertebral arteries in 21 normal adults. The anterior circulation (internal carotid arteries bilaterally) carried 82% of the brain's blood supply and comprised 67% of the total vascular cross-sectional area. These values demonstrate precise concordance between observations in vivo and the theoretically derived (Hagen-Poiseuille) expected flow distribution. These cerebrovascular findings support the traditional conception of macroscopic blood flow. Further studies using ultrasound-derived volumetric analysis of the brain's arterial flow relations may illuminate the vascular pathophysiology underlying aging, cerebral ischemia, and dementias.
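
    The Hagen-Poiseuille concordance reported above can be checked in a few lines. Assuming equal vessel lengths, a common pressure gradient and the same vessel count in each circulation, volumetric flow scales with the fourth power of radius, hence with the square of cross-sectional area, so a 67% share of area predicts roughly an 80% share of flow, close to the observed 82%.

        # Worked check of the Poiseuille expectation (assumptions as stated above;
        # this is not the authors' actual calculation).
        anterior_area, posterior_area = 0.67, 0.33   # fractions of total area

        q_ant = anterior_area ** 2                   # relative flow, Q ~ A^2
        q_post = posterior_area ** 2
        anterior_share = q_ant / (q_ant + q_post)
        print(f"predicted anterior flow share: {anterior_share:.1%}")   # ~80%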

  8. Networking Sensor Observations, Forecast Models & Data Analysis Tools

    Science.gov (United States)

    Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

    2009-12-01

    This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of a wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations, and reconciled through a US Forest Service SOAP web service, are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations and from a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We described the process for enabling the fire location, smoke forecast, smoke observation, and

  9. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    Science.gov (United States)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

    The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used NASA-developed open-source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); the .Net and C# version is used for development. It leverages shared code samples from the World Wind community, and the COAST 2.0 enhancement direction is based on coastal science community feedback and a needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into a multi-layered, multi-temporal spatial context.

  10. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for a program predicting atmospheric chemical kinetics with sensitivity analysis is presented. The new direct method of calculating the first-order sensitivity coefficients, applying sparse matrix technology to chemical kinetics, is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity coefficient equations, and their Jacobian analytical expressions are generated automatically from a chemical mechanism. The kinetic representation for the model equation and its sensitivity coefficient equations, and their Jacobian matrix, is presented. Various FORTRAN subroutines in packages, such as SLODE, modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
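
    The direct method amounts to integrating the sensitivity equations dS/dt = J·S + ∂f/∂k alongside the model equation. The minimal example below uses a single first-order decay reaction, with SciPy's stiff BDF integrator standing in for the Gear-type procedure; the mechanism and rate constant are invented.

        # Direct-method sensitivity for y' = -k*y; S = dy/dk obeys S' = -k*S - y.
        import numpy as np
        from scipy.integrate import solve_ivp

        k = 0.5  # hypothetical rate constant

        def rhs(t, z):
            y, s = z                  # state and its sensitivity dy/dk
            return [-k * y,           # model equation
                    -k * s - y]       # J*S + df/dk, with J = -k and df/dk = -y

        sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], method="BDF", rtol=1e-8)
        exact = -sol.t * np.exp(-k * sol.t)   # analytic sensitivity for comparison
        print("max abs error:", np.abs(sol.y[1] - exact).max())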

  11. ON THE ANALYSIS OF IMPEDANCE-DRIVEN REVERSE FLOW DYNAMICS

    Directory of Open Access Journals (Sweden)

    LEE V. C.-C.

    2017-02-01

    Full Text Available The impedance pump is a simple valve-less pumping mechanism in which an elastic tube is joined to a more rigid tube at both ends. Inducing a periodic asymmetrical compression on the elastic tube produces a unidirectional flow within the system. This pumping concept offers a low-energy, low-noise alternative, which makes it an effective driving mechanism, especially for micro-fluidic systems. In addition, the wave-based mechanism through which pumping occurs confers many benefits in terms of simplicity of design and manufacturing. Adjustment of simple parameters such as the excitation frequency or compression location will reverse the direction of flow, providing a very versatile range of flow outputs. This paper describes the experimental analysis of such impedance-driven flow, with emphasis on the dynamical study of the reverse flow in an open-loop environment. In this study, tapered sections with converging steps are introduced at both ends of the elastic tube to amplify the magnitude of the reverse flow. The study shows that the reverse peak flow is significant, estimated at 23% lower than the forward peak flow. The reverse-flow dynamics, on the other hand, exhibit characteristics different from those of the forward flow. Study of the flow characteristics shows that the tapered sections alter the impedance within the system and hence induce a higher flow in the reverse direction.

  12. Quantitative lateral flow strip assays as User-Friendly Tools To Detect Biomarker Profiles For Leprosy

    Science.gov (United States)

    van Hooij, Anouk; Tjon Kon Fat, Elisa M.; Richardus, Renate; van den Eeden, Susan J. F.; Wilson, Louis; de Dood, Claudia J.; Faber, Roel; Alam, Korshed; Richardus, Jan Hendrik; Corstjens, Paul L. A. M.; Geluk, Annemieke

    2016-01-01

    Leprosy is a debilitating, infectious disease caused by Mycobacterium leprae. Despite the availability of multidrug therapy, transmission is unremitting. Thus, early identification of M. leprae infection is essential to reduce transmission. The immune response to M. leprae is determined by host genetics, resulting in paucibacillary (PB) and multibacillary (MB) leprosy associated with dominant cellular or humoral immunity, respectively. This spectral pathology of leprosy compels detection of immunity to M. leprae to be based on multiple, diverse biomarkers. In this study we have applied quantitative, user-friendly lateral flow assays (LFAs) for four immune markers (anti-PGL-I antibodies, IL-10, CCL4 and IP-10) to whole blood samples from a longitudinal BCG vaccination field-trial in Bangladesh. Different biomarker profiles, in contrast to single markers, distinguished M. leprae infected from non-infected test groups, patients from household contacts (HHC) and endemic controls (EC), or MB from PB patients. The test protocol presented in this study, merging detection of innate, adaptive cellular as well as humoral immunity, thus provides a convenient tool to measure specific biomarker profiles for M. leprae infection and leprosy utilizing a field-friendly technology. PMID:27682181

  13. A study on fluid flow simulation in the cooling systems of machine tools

    Science.gov (United States)

    Olaru, I.

    2016-08-01

    This paper analyses the types of coolants, their correct selection, and their delivery to the processing area, with the goal of controlling the temperature resulting from the cutting operation and choosing suitable cutting modes. A temperature in the working area above a certain level can be harmful to the quality of the resulting surface and can shorten the life of the cutting tool. The coolant chosen can be a combination of different cooling fluids, so as to achieve better cooling of the cutting area while also properly lubricating it. The flow parameters of the coolant are influenced by the nature of the fluid or fluids used and by the geometry of the nozzle, which is generally convergent-divergent in order to achieve better dispersion of the coolant/lubricant over the area being machined. Using a smaller amount of fluid is important for economy, because lubricants are quite expensive. A minimal amount of lubricant can also be better for the environment and the operator's health, because coolants in contact with an overheated machined surface may give off substantial amounts of gases that are not always beneficial to health.

  14. Application of effective discharge analysis to environmental flow decision-making

    Science.gov (United States)

    McKay, S. Kyle; Freeman, Mary C.; Covich, A.P.

    2016-01-01

    Well-informed river management decisions rely on an explicit statement of objectives, repeatable analyses, and a transparent system for assessing trade-offs. These components may then be applied to compare alternative operational regimes for water resource infrastructure (e.g., diversions, locks, and dams). Intra- and inter-annual hydrologic variability further complicates these already complex environmental flow decisions. Effective discharge analysis (developed in studies of geomorphology) is a powerful tool for integrating temporal variability of flow magnitude and associated ecological consequences. Here, we adapt the effectiveness framework to include multiple elements of the natural flow regime (i.e., timing, duration, and rate-of-change) as well as two flow variables. We demonstrate this analytical approach using a case study of environmental flow management based on long-term (60 years) daily discharge records in the Middle Oconee River near Athens, GA, USA. Specifically, we apply an existing model for estimating young-of-year fish recruitment based on flow-dependent metrics to an effective discharge analysis that incorporates hydrologic variability and multiple focal taxa. We then compare three alternative methods of environmental flow provision. Percentage-based withdrawal schemes outcompete other environmental flow methods across all levels of water withdrawal and ecological outcomes.

  15. Application of Effective Discharge Analysis to Environmental Flow Decision-Making

    Science.gov (United States)

    McKay, S. Kyle; Freeman, Mary C.; Covich, Alan P.

    2016-06-01

    Well-informed river management decisions rely on an explicit statement of objectives, repeatable analyses, and a transparent system for assessing trade-offs. These components may then be applied to compare alternative operational regimes for water resource infrastructure (e.g., diversions, locks, and dams). Intra- and inter-annual hydrologic variability further complicates these already complex environmental flow decisions. Effective discharge analysis (developed in studies of geomorphology) is a powerful tool for integrating temporal variability of flow magnitude and associated ecological consequences. Here, we adapt the effectiveness framework to include multiple elements of the natural flow regime (i.e., timing, duration, and rate-of-change) as well as two flow variables. We demonstrate this analytical approach using a case study of environmental flow management based on long-term (60 years) daily discharge records in the Middle Oconee River near Athens, GA, USA. Specifically, we apply an existing model for estimating young-of-year fish recruitment based on flow-dependent metrics to an effective discharge analysis that incorporates hydrologic variability and multiple focal taxa. We then compare three alternative methods of environmental flow provision. Percentage-based withdrawal schemes outcompete other environmental flow methods across all levels of water withdrawal and ecological outcomes.
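
    The effectiveness calculation itself is simple to sketch: bin the daily flow record, weight each bin's frequency by a response function, and locate the peak of the product. The synthetic lognormal record and power-law effectiveness curve below are placeholders, not the paper's fish-recruitment model.

        # Bare-bones effective discharge computation on synthetic data.
        import numpy as np

        rng = np.random.default_rng(7)
        flows = rng.lognormal(mean=3.0, sigma=0.8, size=60 * 365)  # 60 yr daily, m^3/s

        counts, edges = np.histogram(flows, bins=50)
        midpoints = 0.5 * (edges[:-1] + edges[1:])

        effectiveness = midpoints ** 1.5     # assumed response proportional to Q^1.5
        product = counts * effectiveness

        q_eff = midpoints[np.argmax(product)]
        print(f"effective discharge ~ {q_eff:.1f} m^3/s")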

  16. Flow analysis of C. elegans swimming

    Science.gov (United States)

    Montenegro-Johnson, Thomas; Gagnon, David; Arratia, Paulo; Lauga, Eric

    2015-11-01

    Improved understanding of microscopic swimming has the potential to impact numerous biomedical and industrial processes. A crucial means of analyzing these systems is through experimental observation of flow fields, from which it is important to be able to accurately deduce swimmer physics such as power consumption, drag forces, and efficiency. We examine the swimming of the nematode worm C. elegans, a model system for undulatory micro-propulsion. Using experimental data of swimmer geometry and kinematics, we employ the regularized stokeslet boundary element method to simulate the swimming of this worm outside the regime of slender-body theory. Simulated flow fields are then compared with experimentally extracted values confined to the swimmer beat plane, demonstrating good agreement. We finally address the question of how to estimate three-dimensional flow information from two-dimensional measurements.
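
    The regularized stokeslet evaluation at the heart of such simulations can be written compactly. The sketch below implements a standard Cortez-type regularized Stokeslet kernel; the node positions, forces and regularization parameter are invented, and no boundary-condition solve is shown.

        # Velocity at points X due to regularized point forces F at nodes X0.
        import numpy as np

        def reg_stokeslet_velocity(X, X0, F, eps, mu=1.0):
            """Regularized Stokeslet flow: X (n,3), X0 (m,3), F (m,3)."""
            r = X[:, None, :] - X0[None, :, :]           # (n, m, 3) displacements
            r2 = np.sum(r * r, axis=-1)                  # (n, m) squared distances
            d = (r2 + eps**2) ** 1.5
            term1 = (r2 + 2 * eps**2) / d                # multiplies F directly
            term2 = np.einsum("nmj,mj->nm", r, F) / d    # (F . r) / d
            u = term1[..., None] * F[None, :, :] + term2[..., None] * r
            return u.sum(axis=1) / (8 * np.pi * mu)

        # Toy usage: two opposing point forces, velocity sampled at two points.
        X0 = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
        F = np.array([[0.0, 1.0, 0.0], [0.0, -1.0, 0.0]])
        X = np.array([[0.5, 0.5, 0.0], [0.5, -0.5, 0.0]])
        print(reg_stokeslet_velocity(X, X0, F, eps=0.05))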

  17. GLIDER: Free tool imagery data visualization, analysis and mining

    Science.gov (United States)

    Ramachandran, R.; Graves, S. J.; Berendes, T.; Maskey, M.; Chidambaram, C.; Hogan, P.; Gaskin, T.

    2009-12-01

    Satellite imagery can be analyzed to extract thematic information, which has increasingly been used as a source of information for making policy decisions. The uses of such thematic information can vary from military applications, such as detecting assets of interest, to science applications, such as characterizing land-use/land-cover change at local, regional and global scales. However, extracting thematic information from satellite imagery is a non-trivial task. It requires a user to preprocess the data by applying operations for radiometric and geometric corrections. The user also needs to be able to visualize the data and apply different image enhancement operations to digitally improve the images and identify subtle information that might otherwise be missed. Finally, the user needs to apply different information extraction algorithms to the imagery to obtain the thematic information. At present, there are limited tools that provide users with the capability to easily extract and exploit the information contained within satellite imagery. This presentation introduces GLIDER, a free software tool addressing this void. GLIDER provides users with an easy-to-use tool to visualize, analyze and mine satellite imagery. GLIDER allows users to visualize and analyze satellite imagery in its native sensor view, an important capability because any transformation to either a geographic coordinate system or a projected coordinate system entails spatial and intensity interpolation and hence loss of information. GLIDER allows users to perform their analysis in the native sensor view without any loss of information. GLIDER provides users with a full suite of image processing algorithms that can be used to enhance the satellite imagery. It also provides pattern recognition and data mining algorithms for information extraction. GLIDER allows its users to project satellite data and the analysis/mining results onto a globe and overlay additional data layers. Traditional analysis

  18. Theory, methods and tools for determining environmental flows for riparian vegetation: Riparian vegetation-flow response guilds

    Science.gov (United States)

    Merritt, D.M.; Scott, M.L.; Leroy, Poff N.; Auble, G.T.; Lytle, D.A.

    2010-01-01

    Riparian vegetation composition, structure and abundance are governed to a large degree by river flow regime and flow-mediated fluvial processes. Streamflow regime exerts selective pressures on riparian vegetation, resulting in adaptations (trait syndromes) to specific flow attributes. Widespread modification of flow regimes by humans has resulted in extensive alteration of riparian vegetation communities. Some of the negative effects of altered flow regimes on vegetation may be reversed by restoring components of the natural flow regime. 2. Models have been developed that quantitatively relate components of the flow regime to attributes of riparian vegetation at the individual, population and community levels. Predictive models range from simple statistical relationships, to more complex stochastic matrix population models and dynamic simulation models. Of the dozens of predictive models reviewed here, most treat one or a few species, have many simplifying assumptions such as stable channel form, and do not specify the time-scale of response. In many cases, these models are very effective in developing alternative streamflow management plans for specific river reaches or segments but are not directly transferable to other rivers or other regions. 3. A primary goal in riparian ecology is to develop general frameworks for prediction of vegetation response to changing environmental conditions. The development of riparian vegetation-flow response guilds offers a framework for transferring information from rivers where flow standards have been developed to maintain desirable vegetation attributes, to rivers with little or no existing information. 4. We propose to organise riparian plants into non-phylogenetic groupings of species with shared traits that are related to components of hydrologic regime: life history, reproductive strategy, morphology, adaptations to fluvial disturbance and adaptations to water availability. Plants from any river or region may be grouped

  19. Through flow analysis of pumps and fans

    Science.gov (United States)

    Neal, A. N.

    1980-08-01

    Incompressible through flow calculations in axial, mixed and centrifugal flow pumps and fans are described. An iterative scheme is used. A simple blade to blade model is applied on the surfaces of revolution defined by the meridional streamlines. This defines the fluid properties and the mean stream surface (S2 surface) for the next meridional solution. A computer program is available allowing the method to be applied for design purposes. APL is used for input and output and FORTRAN IV for computation. A typical calculation requires 30 sec of Univac 1100 time.

  20. Basic statistical tools in research and data analysis

    Science.gov (United States)

    Ali, Zulfiqar; Bhaskar, S Bala

    2016-01-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. The statistical analysis gives meaning to the meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.

  1. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.

  2. Basic statistical tools in research and data analysis.

    Science.gov (United States)

    Ali, Zulfiqar; Bhaskar, S Bala

    2016-09-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. The statistical analysis gives meaning to the meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.

  3. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. The statistical analysis gives meaning to the meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
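
    The parametric/non-parametric pairing summarized in the article is easy to demonstrate: the sketch below runs Student's t-test and its rank-based counterpart, the Mann-Whitney U test, on the same two synthetic samples.

        # Parametric vs non-parametric comparison on synthetic data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        group_a = rng.normal(5.0, 1.0, 30)
        group_b = rng.normal(5.8, 1.0, 30)

        t_stat, t_p = stats.ttest_ind(group_a, group_b)
        u_stat, u_p = stats.mannwhitneyu(group_a, group_b)
        print(f"t-test p = {t_p:.4f}, Mann-Whitney p = {u_p:.4f}")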

  4. GOMA: functional enrichment analysis tool based on GO modules

    Science.gov (United States)

    Huang, Qiang; Wu, Ling-Yun; Wang, Yong; Zhang, Xiang-Sun

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results. PMID:23237213
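
    A common building block behind such enrichment scores is the hypergeometric test, sketched below with invented gene counts; GOMA's additional step of grouping terms into modules via an optimization model is not shown.

        # Generic GO-term enrichment p-value via the hypergeometric distribution.
        from scipy.stats import hypergeom

        N = 20000   # genes in the background
        K = 150     # background genes annotated with the GO term
        n = 300     # genes in the study set
        k = 12      # study genes carrying the annotation

        # P(X >= k): chance of seeing at least k annotated genes at random.
        p_value = hypergeom.sf(k - 1, N, K, n)
        print(f"enrichment p-value: {p_value:.2e}")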

  5. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Full Text Available Computerized accounting systems in recent years have seen an increase in complexity due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and Data Mining a multidimensional data analysis can be performed, fraud can be detected, and knowledge hidden in data can be discovered, ensuring that such information is useful for decision making within the organization. In the literature there are many definitions for data mining, but all boil down to the same idea: a process that extracts new information from large data collections, information that would be very difficult to obtain without the aid of data mining tools. Information obtained by the data mining process has the advantage that it not only responds to the question of what is happening but at the same time argues and shows why certain things are happening. In this paper we wish to present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.

  6. Chromosomes in the flow to simplify genome analysis.

    Science.gov (United States)

    Doležel, Jaroslav; Vrána, Jan; Safář, Jan; Bartoš, Jan; Kubaláková, Marie; Simková, Hana

    2012-08-01

    Nuclear genomes of human, animals, and plants are organized into subunits called chromosomes. When isolated into aqueous suspension, mitotic chromosomes can be classified using flow cytometry according to light scatter and fluorescence parameters. Chromosomes of interest can be purified by flow sorting if they can be resolved from other chromosomes in a karyotype. The analysis and sorting are carried out at rates of 10^2-10^4 chromosomes per second, and for complex genomes such as wheat the flow sorting technology has been ground-breaking in reducing genome complexity for genome sequencing. The high sample rate provides an attractive approach for karyotype analysis (flow karyotyping) and the purification of chromosomes in large numbers. In characterizing the chromosome complement of an organism, the high number that can be studied using flow cytometry allows for a statistically accurate analysis. Chromosome sorting plays a particularly important role in the analysis of nuclear genome structure and the analysis of particular and aberrant chromosomes. Other attractive but not well-explored features include the analysis of chromosomal proteins, chromosome ultrastructure, and high-resolution mapping using FISH. Recent results demonstrate that chromosome flow sorting can be coupled seamlessly with DNA array and next-generation sequencing technologies for high-throughput analyses. The main advantages are targeting the analysis to a genome region of interest and a significant reduction in sample complexity. As flow sorters can also sort single copies of chromosomes, shotgun sequencing DNA amplified from them enables the production of haplotype-resolved genome sequences. This review explains the principles of flow cytometric chromosome analysis and sorting (flow cytogenetics), discusses the major uses of this technology in genome analysis, and outlines future directions.

  7. Speed-Flow Analysis for Interrupted Oversaturated Traffic Flow with Heterogeneous Structure for Urban Roads

    Directory of Open Access Journals (Sweden)

    Hemant Kumar Sharma

    2012-06-01

    Full Text Available Speed-flow functions have been developed by several transportation experts to accurately predict the speed of urban road networks. The HCM Speed-Flow Curve, BPR Curve, MTC Speed-Flow Curve and Akçelik Speed-Flow Curve are notable efforts to define the shape of the speed-flow curve. However, the complexity of driver behaviour, interactions among different types of vehicles, lateral clearance, the correlation of driver psychology with vehicular characteristics, and the interdependence of the various traffic variables have led to continuous development and refinement of speed-flow curves. The problem gets more tedious in the case of urban roads with heterogeneous traffic, oversaturated flow and a signalized network (which includes some unsignalized intersections as well. This paper presents a speed-flow analysis for urban roads with interrupted flow comprising heterogeneous traffic. A model has been developed for heterogeneous traffic under constraints of roadway geometry, vehicle characteristics, driving behaviour and traffic controls. The model developed in this paper predicts speed, delay, average queue and maximum queue estimates for urban roads and quantifies congestion for the oversaturated condition. The investigation details the oversaturated portion of the flow in particular.
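
    Of the curves named above, the BPR function has the simplest closed form, t = t0 * (1 + a * (v/c)^b). The sketch below evaluates it with the classic default parameters (a = 0.15, b = 4); these defaults are not calibrated for the heterogeneous, oversaturated conditions modeled in the paper.

        # Classic BPR volume-delay function (default parameters, illustrative only).
        def bpr_travel_time(v, c, t0, a=0.15, b=4.0):
            """Link travel time at volume v, capacity c, free-flow time t0."""
            return t0 * (1.0 + a * (v / c) ** b)

        t0 = 2.0                          # free-flow travel time, minutes (assumed)
        for vc in (0.5, 0.9, 1.1):        # volume/capacity ratios
            t = bpr_travel_time(vc, 1.0, t0)
            print(f"v/c = {vc:.1f}: time = {t:.2f} min, speed ratio = {t0 / t:.2f}")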

  8. Micro fibre optic flow checker for the medical analysis application.

    Science.gov (United States)

    Wang, Danping

    2007-01-01

    Two micro fibre optic flow checkers are presented in this paper. They are used in medical analysis to control a solvent flow with a resolution down to 1 µl/min. A fibre optic sensor and a hydraulic system are the principal components of these flow checkers. This paper describes the principle and the experimental setup, and gives the linearity, repeatability and stability results.

  9. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  10. Principles and tools for collaborative entity-based intelligence analysis.

    Science.gov (United States)

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.

  11. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, used for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components such as the heat exchanger, Joule-Thomson valve, turboexpander and compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction in the plant under the constraints of the other parameters. The analysis results give a clear idea for deciding various parameter values before implementation of the actual plant in the field, and an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
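
    As a hand check one might run alongside such a simulation, the classic Linde-Hampson liquid-yield balance can be computed directly; the sketch below uses the textbook relation y = (h1 - h2)/(h1 - hf), with purely illustrative enthalpy values rather than results from the paper or from HYSYS.

      def linde_liquid_yield(h_low_p_gas, h_high_p_gas, h_sat_liquid):
          """Mass fraction liquefied, y = (h1 - h2) / (h1 - hf), from an energy
          balance around the heat exchanger, JT valve and liquid separator."""
          return (h_low_p_gas - h_high_p_gas) / (h_low_p_gas - h_sat_liquid)

      # Illustrative numbers (kJ/kg): h1 = low-pressure gas at ambient T,
      # h2 = high-pressure gas after isothermal compression, hf = saturated liquid.
      y = linde_liquid_yield(h_low_p_gas=300.0, h_high_p_gas=270.0, h_sat_liquid=-125.0)
      print(f"liquid yield: {y:.3f}")  # ~0.071, i.e. ~7% of the compressed stream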

  12. Anaphe—OO Libraries and Tools for Data Analysis

    Institute of Scientific and Technical Information of China (English)

    O. Couet; B. Ferrero-Merlino; et al.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries, a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component, in combination with the use of shared libraries for their implementation, provides an easy integration of existing libraries into modern scripting languages and thus allows for rapid application development. This integration is simplified even further by using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. This paper will give an overview of the architecture and design choices and will present the current status and future developments of the project.

  13. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    Science.gov (United States)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  14. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    Science.gov (United States)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of the perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and a Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel is presented. The results are in the form of community noise metrics.

  15. Riparian trees as common denominators across the river flow spectrum: are ecophysiological methods useful tools in environmental flow assessments?

    CSIR Research Space (South Africa)

    Schachtschneider, K

    2014-04-01

    Full Text Available ... geohydrological and geomorphological conditions. This paper tests physiological differences among trees along rivers with varying flow regimes. In this study 3 parameters were selected and tested, namely wood density, specific leaf area and water use efficiency...

  16. Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Stregy, Seth [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Dasilva, Ana [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Yilmaz, Serkan [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Saha, Pradip [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Loewen, Eric [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States)

    2015-10-29

    This report provides a broad historical review of EM Pump development and details of MATRIX development under this project. It summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. The report provides information on Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model within Task 4, the analysis code was updated and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the improved model can evaluate are: space constraints; voltage capability of the insulation system; maximum flux density through the iron; flow rate and outlet pressure; and efficiency and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: the status of analysis model development; improvements made to older simulations; and comparison to experimental data.

  17. NOKIA PERFORMANCE AND CASH FLOW ANALYSIS

    Directory of Open Access Journals (Sweden)

    Moscviciov Andrei

    2011-12-01

    Full Text Available In this paper the author presents ways to analyze the performance of Nokia. Based on a system of indicators, the key aspects of performance are highlighted, namely: operational activity, financial balance, and cash flows.

  18. Migration Flows: Measurement, Analysis and Modeling

    NARCIS (Netherlands)

    Willekens, F.J.; White, Michael J.

    2016-01-01

    This chapter is an introduction to the study of migration flows. It starts with a review of major definition and measurement issues. Comparative studies of migration are particularly difficult because different countries define migration differently and measurement methods are not harmonized.

  19. Analyzing highway flow patterns using cluster analysis

    NARCIS (Netherlands)

    Weijermars, Wendy; van Berkum, Eric C.; Pfliegl, R.

    2005-01-01

    Historical traffic patterns can be used for the prediction of traffic flows, as input for macroscopic traffic models, for the imputation of missing or erroneous data and as a basis for traffic management scenarios. This paper investigates the determination of historical traffic patterns by means of cluster analysis.

  20. LTE uplink scheduling - Flow level analysis

    NARCIS (Netherlands)

    Dimitrova, D.C.; Berg, J.L. van den; Heijenk, G.; Litjens, R.

    2011-01-01

    Long Term Evolution (LTE) is a cellular technology foreseen to extend the capacity and improve the performance of current 3G cellular networks. A key mechanism in the LTE traffic handling is the packet scheduler, which is in charge of allocating resources to active flows in both the frequency and time domains.

  3. A survey of tools for the analysis of quantitative PCR (qPCR) data

    Directory of Open Access Journals (Sweden)

    Stephan Pabinger

    2014-09-01

    Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.

  4. Simulation Process Analysis of Rubber Shock Absorber for Machine Tool

    Directory of Open Access Journals (Sweden)

    Chai Rong Xia

    2016-01-01

    Full Text Available The simulation of a rubber shock absorber for a machine tool was studied. A simple material model of rubber was obtained through the finite element analysis software ABAQUS. The compression speed and the hardness of the rubber material were considered to obtain the deformation law of the rubber shock absorber. The locations of fatigue were identified from the simulation results. The results show that the fatigue positions are distributed in the corners of the shock absorber. The degree of deformation increases with increasing compression speed, and the hardness of the rubber material is proportional to the deformation.

  5. 3D-Aided-Analysis Tool for Lunar Rover

    Institute of Scientific and Technical Information of China (English)

    ZHANG Peng; LI Guo-peng; REN Xin; LIU Jian-jun; GAO Xing-ye; ZOU Xiao-duan

    2013-01-01

    3D-Aided-Analysis Tool (3DAAT), a virtual reality system, is built up in this paper. 3DAAT integrates kinematic and dynamic models of the rover as well as a real lunar surface terrain model. The modeling methods proposed in this paper include constructing the lunar surface, constructing 3D models of the lander and rover, and building the kinematic model of the rover body. Photogrammetry techniques and remote sensing information are used to generate the terrain model of the lunar surface. According to the implementation results, 3DAAT is an effective assistance system for making exploration plans and analyzing the status of the rover.

  6. Schema for the LANL infrasound analysis tool, infrapy

    Energy Technology Data Exchange (ETDEWEB)

    Dannemann, Fransiska Kate [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Marcillo, Omar Eduardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-14

    The purpose of this document is to define the schema used for the operation of the infrasound analysis tool, infrapy. The tables described by this document extend the CSS3.0 or KB core schema to include information required for the operation of infrapy. This document is divided into three sections, the first being this introduction. Section two defines eight new, infrasonic data processing-specific database tables. Both internal (ORACLE) and external formats for the attributes are defined, along with a short description of each attribute. Section three of the document shows the relationships between the different tables by using entity-relationship diagrams.
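
    As a purely hypothetical illustration of what extending a CSS3.0-style schema can look like in practice, the sketch below creates one invented detection table in SQLite; the table name, columns and values are made up for this example and are not the infrapy schema defined in the document.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("""
          CREATE TABLE fd_results (          -- hypothetical detection table
              fdid      INTEGER PRIMARY KEY, -- detection identifier
              sta       TEXT    NOT NULL,    -- station code (CSS3.0 style)
              time      REAL    NOT NULL,    -- epoch time of detection onset
              endtime   REAL    NOT NULL,    -- epoch time of detection end
              fstat     REAL,                -- F-statistic of the detection
              trace_vel REAL,                -- trace velocity (km/s)
              back_az   REAL                 -- back azimuth (degrees)
          )
      """)
      conn.execute("INSERT INTO fd_results VALUES (1, 'I57US', 1.5e9, 1.5e9 + 60, 4.2, 0.34, 212.0)")
      print(conn.execute("SELECT sta, back_az FROM fd_results").fetchall())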

  7. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi Fukushima

    2014-11-01

    Full Text Available One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data covering the genome, transcriptome, proteome, and metabolome, together with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolism.
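
    As a generic sketch of the network analysis step highlighted above (not code from the paper), the following builds a small hypothetical gene-metabolite association network with networkx and ranks nodes by degree centrality.

      import networkx as nx

      g = nx.Graph()
      # Hypothetical omics-derived associations (gene -- metabolite edges).
      g.add_edges_from([
          ("gene_A", "metabolite_1"),
          ("gene_A", "metabolite_2"),
          ("gene_B", "metabolite_2"),
          ("gene_C", "metabolite_3"),
          ("metabolite_2", "metabolite_3"),  # e.g. a shared pathway step
      ])

      # Rank nodes by degree centrality to find network hubs.
      centrality = nx.degree_centrality(g)
      for node, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
          print(f"{node}: {score:.2f}")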

  8. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
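
    For context on what a battery degradation model couples to such usage and climate data, the sketch below implements a generic Arrhenius/power-law capacity-fade form, Q_loss = B * exp(-Ea/(R*T)) * Ah^z, with coefficients in the spirit of published LFP fade models; it is an illustrative stand-in, not NREL's BLAST degradation model.

      import math

      def capacity_fade_pct(throughput_ah, temp_k, b=30330.0, ea=31700.0, z=0.55):
          """Percent capacity loss as a function of charge throughput (Ah) and
          cell temperature (K): Q_loss = B * exp(-Ea/(R*T)) * Ah**z."""
          R = 8.314  # gas constant, J/(mol*K)
          return b * math.exp(-ea / (R * temp_k)) * throughput_ah ** z

      # Example: ~10,000 Ah of throughput at 25 C vs 35 C.
      print(capacity_fade_pct(10_000, 298.15))  # ~13% loss (illustrative)
      print(capacity_fade_pct(10_000, 308.15))  # faster fade at higher temperature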

  9. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    Science.gov (United States)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force them to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict the on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.

  10. Numerical Tools for Multicomponent, Multiphase, Reactive Processes: Flow of CO2 in Porous Medium

    Energy Technology Data Exchange (ETDEWEB)

    Khattri, Sanjay Kumar

    2006-07-01

    The thesis is concerned with numerically simulating multicomponent, multiphase, reactive transport in heterogeneous porous media. Such processes are ubiquitous; examples are the deposition of greenhouse gases, the flow of hydrocarbons, and groundwater remediation. Understanding such processes is important from a social and economic point of view. For the success of geological sequestration, an accurate estimation of the migration patterns of greenhouse gases is essential. Due to ever-increasing computer power, computational mathematics has become an important tool for predicting the dynamics of porous media fluids. Numerical and mathematical modelling of processes in a domain requires grid generation in the domain, discretization of the continuum equations on the generated grid, solution of the resulting linear or nonlinear system of discrete equations and finally visualization of the results. The thesis is composed of three chapters and eight papers. Chapter 2 presents two techniques for generating structured quadrilateral and hexahedral meshes, called algebraic and elliptic methods. Algebraic techniques are by far the simplest and most computationally efficient methods for grid generation. Transfinite interpolation operators are a kind of algebraic grid generation technique. In this chapter, many transfinite interpolation operators for grid generation are derived from 1D projection operators. Some important properties of hexahedral elements are also mentioned; these properties are useful in the discretization of partial differential equations on hexahedral meshes, improving the quality of the hexahedral mesh, mesh generation and visualization. Chapter 3 is about CO2 flow in porous media. In this chapter, we present the mathematical models and their discretization for capturing the major physical processes associated with CO2 deposition in geological formations. Some important simulations of practical applications in 2D and 3D are presented.
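
    Since transfinite interpolation is fully algebraic, it fits in a few lines; the sketch below implements the standard 2D Boolean-sum TFI formula for four parametrized boundary curves, with an invented bulged-square example (not a grid from the thesis).

      import numpy as np

      def tfi_grid(bottom, top, left, right, n_xi=11, n_eta=11):
          """Blend four boundary curves (each mapping [0,1] -> (x, y)) into an
          interior structured grid via 2D transfinite interpolation."""
          xi = np.linspace(0.0, 1.0, n_xi)
          eta = np.linspace(0.0, 1.0, n_eta)
          grid = np.zeros((n_eta, n_xi, 2))
          for j, e in enumerate(eta):
              for i, u in enumerate(xi):
                  # Linear blend of the two boundary-curve families...
                  linear = (1 - e) * bottom(u) + e * top(u) \
                         + (1 - u) * left(e) + u * right(e)
                  # ...minus the doubly counted bilinear corner contribution.
                  corners = (1 - u) * (1 - e) * bottom(0.0) + u * (1 - e) * bottom(1.0) \
                          + (1 - u) * e * top(0.0) + u * e * top(1.0)
                  grid[j, i] = linear - corners
          return grid

      # Example: unit square with a sinusoidally bulged top boundary.
      bottom = lambda u: np.array([u, 0.0])
      top    = lambda u: np.array([u, 1.0 + 0.2 * np.sin(np.pi * u)])
      left   = lambda v: np.array([0.0, v])
      right  = lambda v: np.array([1.0, v])
      print(tfi_grid(bottom, top, left, right)[5, 5])  # an interior grid point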

  12. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Full Text Available Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use.

  13. Screening of Gas-Cooled Reactor Thermal-Hydraulic and Safety Analysis Tools and Experimental Database

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Kim, Min Hwan; Lee, Seung Wook (and others)

    2007-08-15

    This report is the final report of the I-NERI Project 'Screening of Gas-cooled Reactor Thermal Hydraulic and Safety Analysis Tools and Experimental Database', jointly carried out by KAERI, ANL and INL. In this study, we developed the basic technologies required to develop and validate the VHTR TH/safety analysis tools and evaluated the TH/safety database information. The research tasks consist of: 1) code qualification methodology (INL); 2) high-level PIRTs for a major nucleus set of events (KAERI, ANL, INL); 3) initial scaling and scoping analysis (ANL, KAERI, INL); 4) filtering of TH/safety tools (KAERI, INL); 5) evaluation of TH/safety database information (KAERI, INL, ANL); and 6) key scoping analysis (KAERI). The code qualification methodology identifies the role of PIRTs in the R&D process and the bottom-up and top-down code validation methods. Since the design of the VHTR is still evolving, we generated the high-level PIRTs referencing the 600 MWth block-type GT-MHR and the 400 MWth pebble-type PBMR. The nucleus set of events that represents the VHTR safety and operational transients consists of the enveloping scenarios of HPCC (high pressure conduction cooling: loss of primary flow), LPCC/Air-Ingress (low pressure conduction cooling: loss of coolant), LC (load changes: power maneuvering), ATWS (anticipated transients without scram: reactivity insertion), WS (water ingress: water-interfacing system break) and HU (hydrogen-side upset: loss of heat sink). The initial scaling analysis defines dimensionless parameters that need to be reflected in mixed convection modeling, and the initial scoping analysis provided the reference system transients used in the PIRT generation. For the PIRT phenomena, we evaluated the modeling capability of the candidate TH/safety tools and derived model improvement needs. By surveying and evaluating the TH/safety database information, a tools V&V matrix has been developed.

  14. Substance Flow Analysis of Mercury in China

    Science.gov (United States)

    Hui, L. M.; Wang, S.; Zhang, L.; Wang, F. Y.; Wu, Q. R.

    2015-12-01

    In previous studies, the emission of anthropogenic atmospheric Hg in China, both overall and by individual sector, has been examined extensively. However, more Hg may be released in solid wastes than to air. Hg stored in solid wastes can be released to the air again when those wastes undergo high-temperature processing, or can cause local pollution if they are stacked casually for a long time. To trace the fate of Hg in China, this study developed the substance flow of Hg in 2010 covering all the sectors summarized in Table 1. As shown in Figure 1, the total Hg input is 2825 t; unintentional input, mined Hg, and recycled Hg account for 57%, 32% and 11%, respectively. Figure 2 provides detailed information on the substance flow of Hg. Byproducts from one sector may be used as raw materials of another, causing cross flows of Hg between sectors. The Hg input to cement production is 303 t, of which 34% comes from coal and limestone, 33% from non-ferrous smelting, 23% from coal combustion, 7% from iron and steel production and 3% from mercury mining. Hg flowing to recycled-Hg production is 639 t, mainly from Hg contained in waste activated carbon and mercuric chloride catalyst from VCM production and in acid sludge from non-ferrous smelting. There are 20 t of mercury flowing from spent mercury-added products to incineration. Figures 1 and 2 also show that 46% of the output Hg belongs to "lagged release", meaning this part of the mercury might be released later. The lagged-release Hg includes 809 t contained in stacked byproducts from coal combustion, non-ferrous smelting, iron and steel production, Al production, cement production and mercury mining; 161 t stored in the pipelines of VCM production; 10 t in fluorescent lamps that are in use; and 314 t stored in materials waiting to be handled in recycled-mercury plants. In addition, 112 t of Hg is stored in landfills and 129 t is exported abroad with the export of mercury-added products.
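
    Because the abstract quotes both the component flows and the 46% "lagged release" share, the bookkeeping can be checked directly; the sketch below only re-adds the figures stated above.

      # Figures taken directly from the abstract; the script is just bookkeeping.
      total_input_t = 2825.0
      input_shares = {"unintentional": 0.57, "mined": 0.32, "recycled": 0.11}

      lagged_release_t = {
          "stacked byproducts": 809.0,
          "VCM pipeline inventory": 161.0,
          "fluorescent lamps in use": 10.0,
          "awaiting recycled-Hg treatment": 314.0,
      }

      print({k: round(total_input_t * v) for k, v in input_shares.items()})
      lagged = sum(lagged_release_t.values())          # 1294 t
      print(lagged, f"{lagged / total_input_t:.0%}")   # ~46% of the total input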

  15. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis.

    Science.gov (United States)

    Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben

    2016-01-01

    Network analysis can make variant analysis better. There are existing tools, like HotNet2 and dmGWAS, that can provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f).

  16. Computational Analysis of Multi-Rotor Flows

    Science.gov (United States)

    Yoon, Seokkwan; Lee, Henry C.; Pulliam, Thomas H.

    2016-01-01

    Interactional aerodynamics of multi-rotor flows has been studied for a quadcopter representing a generic quad tilt-rotor aircraft in hover. The objective of the present study is to investigate the effects of the separation distances between rotors, and also of the fuselage and wings, on the performance and efficiency of multi-rotor systems. Three-dimensional unsteady Navier-Stokes equations are solved using a spatially 5th-order-accurate scheme, dual-time stepping, and the Detached Eddy Simulation turbulence model. The results show that the separation distances as well as the wings have significant effects on the vertical forces of quadrotor systems in hover. Understanding interactions in multi-rotor flows would help improve the design of next-generation multi-rotor drones.

  17. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    Science.gov (United States)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low-level modeling features to large-scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms.

  18. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Full Text Available Abstract Background Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides, and after correction for various intervening variables, loss or gain in the test DNA can be indicated from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone JAVA application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap.

  19. Analysis of the Mobilization of Debris Flows

    Science.gov (United States)

    1974-10-01

    as lateral ridges plastered along the canyon walls. The debris flow mobilized in a grass-covered swale surrounded by a moderately dense growth of...water apparently rushes out of the channels much as water from a firehose and strikes the talus. The erosive power of water issuing from a firehose...normal floods. The typical mudspate-track does not, however, readily associate itself with the ravine of a permanent or powerful mountain stream.

  20. Stacks: an analysis tool set for population genomics.

    Science.gov (United States)

    Catchen, Julian; Hohenlohe, Paul A; Bassham, Susan; Amores, Angel; Cresko, William A

    2013-06-01

    Massively parallel short-read sequencing technologies, coupled with powerful software platforms, are enabling investigators to analyse tens of thousands of genetic markers. This wealth of data is rapidly expanding and allowing biological questions to be addressed with unprecedented scope and precision. The sizes of the data sets are now posing significant data processing and analysis challenges. Here we describe an extension of the Stacks software package to efficiently use genotype-by-sequencing data for studies of populations of organisms. Stacks now produces core population genomic summary statistics and SNP-by-SNP statistical tests. These statistics can be analysed across a reference genome using a smoothed sliding window. Stacks also now provides several output formats for several commonly used downstream analysis packages. The expanded population genomics functions in Stacks will make it a useful tool to harness the newest generation of massively parallel genotyping data for ecological and evolutionary genetics.
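
    As a generic sketch of the smoothed sliding-window idea described above (not Stacks' implementation), the following kernel-smooths a per-SNP statistic such as Fst along a chromosome; the positions and values are random placeholders.

      import numpy as np

      def smooth_stat(positions_bp, values, window_bp=150_000):
          """Gaussian-kernel smoothing of a per-SNP statistic along the genome."""
          sigma = window_bp / 3.0
          smoothed = np.empty_like(values, dtype=float)
          for i, center in enumerate(positions_bp):
              # Weight every SNP by its distance from the window center.
              w = np.exp(-0.5 * ((positions_bp - center) / sigma) ** 2)
              smoothed[i] = np.sum(w * values) / np.sum(w)
          return smoothed

      positions = np.sort(np.random.default_rng(1).integers(0, 5_000_000, 200))
      fst = np.random.default_rng(2).beta(0.5, 5.0, 200)  # toy per-SNP Fst values
      print(smooth_stat(positions, fst)[:5])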

  1. Message Correlation Analysis Tool for NOvA

    CERN Document Server

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic realtime correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
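
    To make the pattern-plus-rule idea concrete, here is a hypothetical miniature in Python: a regex pattern matches log messages, and a correlation rule fires when enough matches fall inside a time window. None of the patterns, node names or thresholds come from the NOvA system.

      import re
      from collections import deque

      PATTERN = re.compile(r"ERROR .*buffer overflow on node (\w+)")  # hypothetical pattern
      WINDOW_S, THRESHOLD = 10.0, 3                                   # hypothetical rule

      recent = deque()  # (timestamp, node) of recent matching messages

      def on_message(timestamp, text):
          m = PATTERN.search(text)
          if not m:
              return
          recent.append((timestamp, m.group(1)))
          # Drop matches that fell out of the correlation window.
          while recent and timestamp - recent[0][0] > WINDOW_S:
              recent.popleft()
          if len(recent) >= THRESHOLD:
              print(f"RULE FIRED: {len(recent)} buffer overflows within {WINDOW_S}s")

      for t, msg in [(0.0, "ERROR x buffer overflow on node dcm01"),
                     (2.0, "INFO run continues"),
                     (4.0, "ERROR y buffer overflow on node dcm02"),
                     (6.0, "ERROR z buffer overflow on node dcm03")]:
          on_message(t, msg)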

  2. Advanced computational tools for 3-D seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A. [Oak Ridge National Lab., TN (United States)] [and others]

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  3. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design phase of the product development process.

  4. PyRAT (python radiography analysis tool): overview

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Jerawan C [Los Alamos National Laboratory]; Temple, Brian A [Los Alamos National Laboratory]; Buescher, Kevin L [Los Alamos National Laboratory]

    2011-01-14

    PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs, to solve problems such as the hybrid nonlinear inverse problem. The optimization software library used is NOMAD (nonsmooth optimization by the MADS algorithm). Some of PyRAT's features are: (1) a hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) an optimization-based inversion approach with the goal of identifying unknown object configurations (the MVO problem); (3) use of Python libraries for radiographic image processing and analysis; (4) use of the Tikhonov regularization method for the linear inverse problem to recover partial information about object configurations; (5) use of a priori knowledge of problem solutions to define the feasible region and discrete neighbors for the MVO problem (initial data analysis + material library -> a priori knowledge); and (6) use of the NOMAD (C++ version) software.
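
    As a small self-contained example of the Tikhonov-regularized linear inversion mentioned in feature (4), the sketch below minimizes ||Ax - b||^2 + lambda^2 ||x||^2 via the regularized normal equations; the system here is random and purely illustrative, not a radiographic model.

      import numpy as np

      def tikhonov_solve(A, b, lam):
          """Solve the regularized normal equations (A^T A + lam^2 I) x = A^T b."""
          n = A.shape[1]
          return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

      rng = np.random.default_rng(0)
      A = rng.normal(size=(50, 20))
      x_true = rng.normal(size=20)
      b = A @ x_true + 0.05 * rng.normal(size=50)  # noisy measurements

      # Reconstruction error for a few regularization strengths.
      for lam in (0.0, 0.1, 1.0):
          x_hat = tikhonov_solve(A, b, lam)
          print(lam, np.linalg.norm(x_hat - x_true))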

  5. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners, and the net energy analysis technique is an excellent accounting system for the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool for the energy planners considering their electric power options in the future.
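
    As a minimal sketch of the accounting idea (all figures invented placeholders), the net energy ratio is simply the lifetime energy delivered divided by the energy invested:

      def net_energy_ratio(energy_out_gj, energy_invested_gj):
          """Energy delivered per unit of energy invested (dimensionless)."""
          return energy_out_gj / energy_invested_gj

      e_out = 9.0e8  # lifetime electricity delivered, GJ (placeholder)
      e_in = 6.0e7   # construction + fuel cycle + O&M energy, GJ (placeholder)
      print(f"net energy ratio: {net_energy_ratio(e_out, e_in):.1f}")  # 15.0
      print(f"net energy yield: {e_out - e_in:.2e} GJ")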

  6. Immobilized Bioluminescent Reagents in Flow Injection Analysis.

    Science.gov (United States)

    Nabi, Abdul

    Available from UMI in association with The British Library. Bioluminescent reactions exhibit two important characteristics from an analytical viewpoint: they are selective and highly sensitive. Furthermore, bioluminescent emissions are easily measured with a simple flow-through detector based on a photomultiplier tube, and the rapid and reproducible mixing of sample and expensive reagent is best achieved by a flow injection manifold. The two most important bioluminescent systems are the enzyme (luciferase)/substrate (luciferin) combinations extracted from fireflies (Photinus pyralis) and marine bacteria (Vibrio harveyi), which require ATP and NAD(P)H respectively as cofactors. Reactions that generate or consume these cofactors can also be coupled to the bioluminescent reaction to provide assays for a wide range of clinically important species. A flow injection manifold for the study of bioluminescent reactions is described, as are procedures for the extraction, purification and immobilization of firefly and bacterial luciferase and oxidoreductase. Results are presented for the determination of ATP using the firefly system and the determination of other enzymes and substrates participating in ATP-converting reactions, e.g. creatine kinase, ATP-sulphurylase, pyruvate kinase, creatine phosphate, pyrophosphate and phosphoenolpyruvate. Similarly, results are presented for the determination of NAD(P)H, FMN, FMNH2 and several dehydrogenases which produce NAD(P)H, and their substrates, e.g. alcohol, L-lactate, L-malate, L-glutamate, glucose-6-phosphate and primary bile acids.

  7. Interactive visualization and analysis of transitional flow.

    Science.gov (United States)

    Johnson, Gregory P; Calo, Victor M; Gaither, Kelly P

    2008-01-01

    A stand-alone visualization application has been developed by a multi-disciplinary, collaborative team with the sole purpose of creating an interactive exploration environment allowing turbulent flow researchers to experiment and validate hypotheses using visualization. This system has specific optimizations in data management, computation caching, and visualization, allowing interactive exploration of datasets on the order of 1 TB in size. Using this application, the user (co-author Calo) is able to interactively visualize and analyze all regions of a transitional flow volume, including the laminar, transitional and fully turbulent regions. The underlying goal of the visualizations produced from these transitional flow simulations is to localize turbulent spots in the laminar region of the boundary layer, determine under which conditions they form, and follow their evolution. The initiation of turbulent spots, which ultimately lead to full turbulence, was located via a proposed feature detection condition and verified by experimental results. The conditions under which these turbulent spots form and coalesce are validated and presented.

  8. Nitrogen Flow Analysis in Huizhou, South China

    Science.gov (United States)

    Ma, Xiaobo; Wang, Zhaoyin; Yin, Zegao; Koenig, Albert

    2008-03-01

    Eutrophication due to uncontrolled discharges of nitrogen and phosphorus has become a serious pollution problem in many Chinese rivers. This article analyzes the nitrogen flow in Huizhou City in the East River watershed in south China. The material accounting method was applied to investigate the nitrogen flows related to human activities, covering both the natural and anthropogenic systems. In Huizhou City, the nonpoint-source pollution was quantified by the export coefficient method and the domestic discharge was estimated as the product of the per capita nitrogen contribution and the population. The research was based on statistical information and field data from 1998 for Huizhou City. The results indicated that the major nitrogen flows in this area were river loads, fertilizer and feedstuff imports, atmospheric deposition, animal manure volatilization, and processes related to burning and other emissions. In 1998, about 40% of the nitrogen was retained in the system and could result in potential environmental problems. Nitrogen export was mainly by rivers, which account for about 57% of the total nitrogen exported. Comparisons made between the East River and the Danube and Yangtze Rivers show that the unit-area nitrogen export was of the same magnitude and the per capita nitrogen export was comparable.
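
    The export coefficient method named above reduces to a weighted sum, load = sum over land uses of (coefficient x area); the sketch below uses invented land uses and coefficients, not the Huizhou values.

      export_coeff_kg_per_ha = {  # kg N / (ha * yr), hypothetical coefficients
          "paddy field": 12.0,
          "orchard": 18.0,
          "forest": 2.5,
          "urban": 9.0,
      }
      area_ha = {"paddy field": 40_000, "orchard": 25_000, "forest": 90_000, "urban": 15_000}

      # Nonpoint-source load: coefficient times area, summed over land uses.
      load_t = sum(export_coeff_kg_per_ha[u] * area_ha[u] for u in area_ha) / 1000.0
      print(f"nonpoint-source N export: {load_t:.0f} t/yr")  # 1290 t/yr here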

  9. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to the physicists and computer scientists developing the simulation codes and runtimes, respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling study of HPC applications; 2. Evaluation of programming models; 3. Hardening of performance tools; 4. Performance modeling of irregular codes; and 5. Statistical analysis of historical performance data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big-data issues when analyzing performance on leadership-computing-class systems and to assist the HPC community in making the most effective use of these resources.

  10. XQCAT eXtra Quark Combined Analysis Tool

    CERN Document Server

    Barducci, D; Buchkremer, M; Marrouche, J; Moretti, S; Panizzi, L

    2015-01-01

    XQCAT (eXtra Quark Combined Analysis Tool) is a tool aimed at determining exclusion Confidence Levels (eCLs) for scenarios of new physics characterised by the presence of one or multiple heavy extra quarks (XQ) which interact through Yukawa couplings with any of the Standard Model (SM) quarks. The code uses a database of efficiencies for pre-simulated processes of Quantum Chromo-Dynamics (QCD) pair production and on-shell decays of extra quarks. In version 1.0 of XQCAT the efficiencies have been computed for a set of seven publicly available search results from the CMS experiment, and the package is subject to future updates to include further searches by both the ATLAS and CMS collaborations. The input to the code is a text file in which the masses, branching ratios (BRs) and dominant chirality of the couplings of the new quarks are provided. The output of the code is the eCL of the test point for each implemented experimental analysis, considered individually and, when possible, in statistical combination.

  11. Airspace Flow Program Modeling in the Future ATC Concept Evaluation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Airspace Flow Program (AFP) is a new Traffic Flow Management (TFM) control technique that has entered operation in 2006. AFPs use two existing technologies,...

  12. PLAGIARISM DETECTION PROBLEMS AND ANALYSIS OF SOFTWARE TOOLS FOR THEIR SOLUTION

    Directory of Open Access Journals (Sweden)

    V. I. Shynkarenko

    2017-02-01

    Full Text Available Purpose. This study is aimed at: (1) defining plagiarism in texts in formal and natural languages and building a taxonomy of plagiarism; (2) identifying the major problems of plagiarism detection when using automated tools to solve them; (3) analyzing and systematizing the information obtained through the review, testing, and analysis of existing detection systems. Methodology. To identify the requirements for plagiarism detection software, methods of analysis of normative documentation (the legislative base) and of competitive tools were applied. To check the requirements, testing methods and reviews of GUI interfaces were used. Findings. The paper considers the concept of plagiarism and the issues of its proliferation and classification. A review of existing systems for identifying plagiarism is given, covering desktop applications and online resources, highlighting their functional characteristics, the formats of input and output data and the constraints on them, and their customization and access features. A detailed breakdown of the system requirements is made. Originality. The authors propose schemes that complement the existing hierarchical taxonomy of plagiarism. The analysis of existing systems is done in terms of functionality and the possibilities for use on large amounts of data. Practical value. The practical significance is determined by the breadth of the problem of plagiarism in various fields. In Ukraine, the legal framework for the fight against plagiarism is developing, which requires the active development, improvement, and delivery of relevant software. This work contributes to the solution of these problems. The review of existing anti-plagiarism programs, together with the study of experience in the field and an updated concept of plagiarism, allows the functional requirements and the input and output of the developed software to be articulated more fully, and the features of such software to be identified.

  13. CRITICA: coding region identification tool invoking comparative analysis

    Science.gov (United States)

    Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)

    1999-01-01

    Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).
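
    As a rough illustration of the dicodon-bias component described above (our own simplification, not CRITICA's actual implementation), the following Python sketch scores a candidate reading frame by in-frame hexanucleotide log-odds estimated from putative coding and non-coding training sequences; the training sets and pseudocount are illustrative assumptions.

    ```python
    import math
    from collections import Counter

    def hexamer_logodds(coding_seqs, noncoding_seqs, pseudo=1.0):
        """Log-odds of each hexanucleotide (dicodon) in coding vs. other contexts."""
        def counts(seqs, step):
            c = Counter()
            for s in seqs:
                for i in range(0, len(s) - 5, step):
                    c[s[i:i + 6]] += 1
            return c, sum(c.values())

        cod, n_cod = counts(coding_seqs, 3)      # in-frame dicodons
        non, n_non = counts(noncoding_seqs, 1)   # hexamers in all contexts
        vocab = set(cod) | set(non)
        return {h: math.log((cod[h] + pseudo) / (n_cod + pseudo * len(vocab)))
                 - math.log((non[h] + pseudo) / (n_non + pseudo * len(vocab)))
                for h in vocab}

    def frame_score(seq, logodds):
        """Sum of dicodon log-odds over one candidate reading frame."""
        return sum(logodds.get(seq[i:i + 6], 0.0) for i in range(0, len(seq) - 5, 3))
    ```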

  15. Numerical Analysis of Turbulent Flows in Channels of Complex Geometry

    Science.gov (United States)

    Farbos De Luzan, Charles

    The current study follows a systematic, validated approach to applied fluid mechanics problems in order to evaluate the ability of different computational fluid dynamics (CFD) codes to serve as relevant design tools. This systematic approach involves operations such as grid sensitivity analyses, turbulence-model comparisons and appropriate wall treatments, in order to define case-specific optimal parameters for industrial applications. A validation effort is performed in each study, with particle image velocimetry (PIV) experimental results as the validating metric. The first part of the dissertation lays down the principles of validation and presents the details of a grid sensitivity analysis as well as a turbulence-model benchmark. The models are available in commercial solvers, and in most cases the default values of the equation constants are retained. The validation experimental data are taken with a hot wire and have served as a reference to validate multiple turbulence models for turbulent flows in channels. In the second part, the study of a coaxial piping system compares a set of steady Reynolds-Averaged Navier-Stokes (RANS) turbulence models, namely the one-equation Spalart-Allmaras model and the two-equation models standard k-epsilon, k-epsilon realizable, k-epsilon RNG, standard k-omega, k-omega SST, and transition SST. The geometry of interest involves a transition from an annulus into a larger one, where highly turbulent phenomena such as recirculation and jet impingement occur. Based on a set of constraints defined in the analysis, a chosen model is tested on new designs in order to evaluate their performance. The third part of this dissertation addresses the steady-state flow patterns in a Viscosity-Sensitive Fluidic Diode (VSFD). This device is used in a fluidics application, and its originality lies in the fact that it does not require a control fluid in order to operate. This section will discuss the …

  16. Numerical analysis of complex fluid-flow systems

    Science.gov (United States)

    Holland, R. L.

    1980-01-01

    A very flexible computer-assisted numerical analysis is used to solve the dynamic fluid-flow equations characterizing a computer-controlled heat-dissipation system developed for Spacelab. Losses caused by bends, tees, fittings, valves, and the like are easily included, and the analysis can solve both steady-state and transient cases. It can also interact with a parallel thermal analysis.

  17. GENERALIZED VARIATIONAL OPTIMIZATION ANALYSIS FOR 2-D FLOW FIELD

    Institute of Scientific and Technical Information of China (English)

    HUANG Si-xun; XU Ding-hua; LAN Wei-ren; TENG Jia-jun

    2005-01-01

    The Variational Optimization Analysis Method (VOAM) for 2-D flow fields suggested by Sasaki is reviewed first. The VOAM can be used efficiently in most cases; however, when high-frequency noise is present in the 2-D flow field, it appears to be inefficient. In the present paper, based on Sasaki's VOAM, a Generalized Variational Optimization Analysis Method (GVOAM) is proposed using regularization ideas, which deals well with flow fields containing high-frequency noise. A numerical test shows that observational data can be both variationally optimized and filtered, and therefore the GVOAM is an efficient method.
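
    To make the regularization idea concrete, here is a minimal Python sketch, under our own assumptions rather than the paper's exact formulation, of a variational optimization of a noisy 2-D scalar field: it minimizes a misfit to the observations plus a Laplacian smoothness penalty, which damps high-frequency noise. The grid, penalty weight lam, and solver are illustrative.

    ```python
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import spsolve

    def laplacian_1d(n):
        """1-D second-difference operator (discrete d^2/dx^2, unit spacing)."""
        off = np.ones(n - 1)
        return sp.diags([off, -2.0 * np.ones(n), off], [-1, 0, 1], format="csr")

    def variational_filter(obs, lam=1.0):
        """Solve min_u ||u - obs||^2 + lam ||L u||^2, with L the 2-D Laplacian."""
        ny, nx = obs.shape
        L = sp.kron(sp.eye(ny), laplacian_1d(nx)) + sp.kron(laplacian_1d(ny), sp.eye(nx))
        A = (sp.eye(ny * nx) + lam * (L.T @ L)).tocsc()
        return spsolve(A, obs.ravel()).reshape(ny, nx)

    # Example: a smooth field corrupted by high-frequency noise.
    x = np.linspace(0.0, 1.0, 64)
    field = np.sin(2 * np.pi * x)[None, :] * np.cos(2 * np.pi * x)[:, None]
    noisy = field + 0.2 * np.random.default_rng(0).standard_normal(field.shape)
    filtered = variational_filter(noisy, lam=2.0)
    ```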

  18. Flow field analysis of high-speed helium turboexpander for cryogenic refrigeration and liquefaction cycles

    Science.gov (United States)

    Sam, Ashish Alex; Ghosh, Parthasarathi

    2017-03-01

    Turboexpanders constitute one of the vital components of Claude-cycle-based helium refrigerators and liquefiers, which are gaining increasing technological importance. These turboexpanders, of radial-inflow configuration, are generally high-speed micro turbines, owing to the low molecular weight and density of helium. Any improvement in the efficiency of these machines requires a detailed understanding of the flow field. Computational Fluid Dynamics (CFD) analysis has emerged as a necessary tool for determining the flow fields in cryogenic turboexpanders, which is often not possible through experiments. In the present work, a three-dimensional transient flow analysis of a cryogenic turboexpander for helium refrigeration and liquefaction cycles was performed using Ansys CFX®, to understand the flow field of a high-speed helium turboexpander, which in turn will help in making appropriate decisions regarding modifications of the established design methodology for improved efficiency of these machines. The turboexpander is designed based on Balje's ns-ds diagram and the inverse-design blade-profile generation formalism prescribed by Hasselgruber and Balje. The analyses include the study of several losses, their origins, the increase in entropy due to these losses, the quantification of losses, and the effects of various geometrical parameters on these losses. Through the flow field analysis it was observed that, in the nozzle, flow separation at the nozzle blade suction side and trailing-edge vortices result in loss generation, which calls for a better nozzle blade profile. The turbine wheel flow field analysis revealed that significant geometrical parameters of the turbine wheel blade, such as the blade inlet angle, blade profile, tip clearance height and trailing-edge thickness, need to be optimised for improved performance of the turboexpander. The detailed flow field analysis in this paper can be used to improve the mean-line design methodology for turboexpanders used …

  19. Numerical simulation of effect of rotational tool with screw on material flow behavior of friction stir welding of Ti6Al4V alloy

    Institute of Scientific and Technical Information of China (English)

    Shude JI; Aili ZOU; Yumei YUE; Guohong LUAN; Yanye JIN; Fu LI

    2012-01-01

    A rotational tool is put forward which is composed of a one-spiral-flute shoulder and a rotational pin with a screw. Using the turbulent model of the FLUENT software, the material plastic flow behavior during friction stir welding of Ti6Al4V alloy is investigated by numerical simulation, and the effect of the rotational tool geometry on material flow during welding is then obtained. The results show that the flow direction of the material near the rotational tool mainly follows the rotational direction of the tool, and that the material near the tool flows more violently than in other regions. For the tapered rotational pin, the flow velocity of the material inside the workpiece decreases with increasing distance from the workpiece surface because of the change of pin diameter. For the rotational tool, the flute added to the shoulder and the screw added to the pin can greatly increase the material flow velocity during welding, and the peak flow velocity appears on the flute or the screw. Moreover, the rotational tool with the one-spiral-flute shoulder is better than the tool with the concentric-circles-flute shoulder. Decreasing the width of the pin screw and increasing the diameter of the pin tip are both beneficial for increasing the flow velocity.

  20. Modal and nonmodal stability analysis of electrohydrodynamic flow with and without cross-flow

    CERN Document Server

    Zhang, Mengqi; Wu, Jian; Schmid, Peter J; Quadrio, Maurizio

    2015-01-01

    We report the results of a complete modal and nonmodal linear stability analysis of electrohydrodynamic (EHD) flow for the problem of electroconvection in the strong-injection region. Convective cells are formed by the Coulomb force in an insulating liquid residing between two plane electrodes subject to unipolar injection. Besides pure electroconvection, we also consider the case where a cross-flow is present, generated by a streamwise pressure gradient, in the form of a laminar Poiseuille flow. The effect of charge diffusion, often neglected in previous linear stability analyses, is included in the present study, and a transient growth analysis, rarely considered in EHD, is carried out. In the case without cross-flow, non-zero charge diffusion leads to a lower linear stability threshold and thus to a more unstable flow. The transient growth, though enhanced by increasing charge diffusion, remains small and hence cannot fully account for the discrepancy of the linear stability threshold between theoretical a...

  1. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Ivan Bekavac; Daniela Garbin Praničević

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytic...

  2. IPMP 2013--a comprehensive data analysis tool for predictive microbiology.

    Science.gov (United States)

    Huang, Lihan

    2014-02-03

    Predictive microbiology is an area of applied research in food science that uses mathematical models to predict the changes in the population of pathogenic or spoilage microorganisms in foods exposed to complex environmental changes during processing, transportation, distribution, and storage. It finds applications in shelf-life prediction and risk assessments of foods. The objective of this research was to describe the performance of a new user-friendly comprehensive data analysis tool, the Integrated Pathogen Modeling Program (IPMP 2013), recently developed by the USDA Agricultural Research Service. This tool allows users, without detailed programming knowledge, to analyze experimental kinetic data and fit the data to known mathematical models commonly used in predictive microbiology. Data curves previously published in the literature were used to test the models in IPMP 2013. The accuracies of the data analysis and models derived from IPMP 2013 were compared in parallel to commercial or open-source statistical packages, such as SAS® or R. Several models were analyzed and compared, including a three-parameter logistic model for growth curves without lag phases, reduced Huang and Baranyi models for growth curves without stationary phases, growth models for complete growth curves (Huang, Baranyi, and re-parameterized Gompertz models), survival models (linear, re-parameterized Gompertz, and Weibull models), and secondary models (Ratkowsky square-root, Huang square-root, Cardinal, and Arrhenius-type models). The comparative analysis suggests that the results from IPMP 2013 were equivalent to those obtained from SAS® or R. This work suggests that IPMP 2013 can be used as a free alternative to SAS®, R, or other more sophisticated statistical packages for model development in predictive microbiology.
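
    As an illustration of the kind of curve fitting such tools perform, the Python sketch below (not IPMP itself) fits a three-parameter logistic growth curve, in one common parameterization, to kinetic data with scipy; the time points, log-counts, and starting values are invented for the example.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, ymax, y0, k):
        """Three-parameter logistic growth curve (one common parameterization):
        log-count rises from y0 toward ymax with rate constant k, no lag phase."""
        return ymax / (1.0 + (ymax / y0 - 1.0) * np.exp(-k * t))

    # Hypothetical log10 CFU/g measurements over time (hours).
    t = np.array([0, 2, 4, 6, 8, 12, 16, 24], dtype=float)
    y = np.array([3.1, 3.6, 4.4, 5.3, 6.1, 7.2, 7.8, 8.0])

    popt, pcov = curve_fit(logistic, t, y, p0=[8.0, 3.0, 0.3])
    perr = np.sqrt(np.diag(pcov))  # standard errors of the fitted parameters
    ```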

  3. Cellular barcoding tool for clonal analysis in the hematopoietic system.

    Science.gov (United States)

    Gerrits, Alice; Dykstra, Brad; Kalmykowa, Olga J; Klauke, Karin; Verovskaya, Evgenia; Broekhuis, Mathilde J C; de Haan, Gerald; Bystrykh, Leonid V

    2010-04-01

    Clonal analysis is important for many areas of hematopoietic stem cell research, including in vitro cell expansion, gene therapy, and cancer progression and treatment. A common approach to measure clonality of retrovirally transduced cells is to perform integration site analysis using Southern blotting or polymerase chain reaction-based methods. Although these methods are useful in principle, they generally provide a low-resolution, biased, and incomplete assessment of clonality. To overcome those limitations, we labeled retroviral vectors with random sequence tags or "barcodes." On integration, each vector introduces a unique, identifiable, and heritable mark into the host cell genome, allowing the clonal progeny of each cell to be tracked over time. By coupling the barcoding method to a sequencing-based detection system, we could identify major and minor clones in 2 distinct cell culture systems in vitro and in a long-term transplantation setting. In addition, we demonstrate how clonal analysis can be complemented with transgene expression and integration site analysis. This cellular barcoding tool permits a simple, sensitive assessment of clonality and holds great promise for future gene therapy protocols in humans, and any other applications when clonal tracking is important.

  4. Climate Informed Low Flow Frequency Analysis Using Nonstationary Modeling

    Science.gov (United States)

    Liu, D.; Guo, S.; Lian, Y.

    2014-12-01

    Stationarity is often assumed for frequency analysis of low flows in water resources management and planning. However, many studies have shown that flow characteristics, particularly the frequency spectrum of extreme hydrologic events, have been modified by climate change and human activities, and that conventional frequency analysis which ignores these non-stationary characteristics may lead to costly designs. The analysis presented in this paper is based on more than 100 years of daily flow data from the Yichang gaging station, 44 kilometers downstream of the Three Gorges Dam. The Mann-Kendall trend test under the scaling hypothesis showed that the annual low flows had a significant monotonic trend, and an abrupt change point was identified in 1936 by the Pettitt test. Climate-informed low flow frequency analysis and the divided-and-combined method are employed to account for the impacts of related climate variables and the nonstationarities in annual low flows. Without prior knowledge of the probability density function for the gaging station, six distribution functions, including the Generalized Extreme Value (GEV), Pearson Type III, Gumbel, Gamma, Lognormal, and Weibull distributions, were tested to find the best fit, with the local likelihood method used to estimate the parameters. Analyses show that the GEV had the best fit for the observed low flows. This study has also shown that climate-informed low flow frequency analysis is able to exploit the link between climate indices and low flows, which accounts for the dynamic features relevant to reservoir management and provides more accurate and reliable designs for infrastructure and water supply.
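
    To reproduce the distribution-fitting step in outline, here is a minimal Python sketch, under our own assumptions rather than the authors' local-likelihood procedure, that fits a GEV distribution to an annual low-flow series and evaluates a design quantile; the sample data are synthetic.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    # Synthetic annual minimum flows (m^3/s); replace with the observed series.
    annual_low = rng.gumbel(loc=4000.0, scale=500.0, size=100)

    # Maximum-likelihood GEV fit (scipy's genextreme uses the c = -xi convention).
    shape, loc, scale = stats.genextreme.fit(annual_low)

    # 100-year low-flow design value: the flow with 1% non-exceedance probability.
    q100 = stats.genextreme.ppf(0.01, shape, loc=loc, scale=scale)

    # Goodness of fit via the Kolmogorov-Smirnov statistic.
    ks = stats.kstest(annual_low, "genextreme", args=(shape, loc, scale))
    ```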

  5. Gradient Flow Analysis on MILC HISQ Ensembles

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Nathan [Washington U., St. Louis; Bazavov, Alexei [Brookhaven; Bernard, Claude [Washington U., St. Louis; DeTar, Carleton [Utah U.; Foley, Justin [Utah U.; Gottlieb, Steven [Indiana U.; Heller, Urs M. [APS, New York; Hetrick, J. E. [U. Pacific, Stockton; Komijani, Javad [Washington U., St. Louis; Laiho, Jack [Syracuse U.; Levkova, Ludmila [Utah U.; Oktay, M. B. [Utah U.; Sugar, Robert [UC, Santa Barbara; Toussaint, Doug [Arizona U.; Van de Water, Ruth S. [Fermilab; Zhou, Ran [Fermilab

    2014-11-14

    We report on a preliminary scale determination with gradient-flow techniques on the $N_f = 2 + 1 + 1$ HISQ ensembles generated by the MILC collaboration. The ensembles include four lattice spacings, ranging from 0.15 to 0.06 fm, and both physical and unphysical values of the quark masses. The scales $\sqrt{t_0}/a$ and $w_0/a$ are computed using Symanzik flow and the cloverleaf definition of $\langle E \rangle$ on each ensemble. Then both scales and the meson masses $aM_\pi$ and $aM_K$ are adjusted for mistunings in the charm mass. Using a combination of continuum chiral perturbation theory and a Taylor series ansatz in the lattice spacing, the results are simultaneously extrapolated to the continuum and interpolated to physical quark masses. Our preliminary results are $\sqrt{t_0} = 0.1422(7)$ fm and $w_0 = 0.1732(10)$ fm. We also find the continuum mass-dependence of $w_0$.
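
    The scales used here are conventionally defined through the dimensionless quantity $t^2\langle E(t)\rangle$: $t_0$ is the flow time at which it reaches 0.3, and $w_0^2$ is the flow time at which $t\,\mathrm{d}/\mathrm{d}t\,[t^2\langle E(t)\rangle]$ reaches 0.3. A minimal Python sketch of extracting both scales from a tabulated flow history follows; the $\langle E \rangle$ profile below is a placeholder, not MILC data.

    ```python
    import numpy as np

    def scales_from_flow(t, E, target=0.3):
        """Return t0/a^2 and (w0/a)^2 from t^2 <E(t)> measured on the lattice.

        t0: flow time where t^2 <E> = target.
        w0^2: flow time where t d/dt (t^2 <E>) = target.
        """
        f = t**2 * E
        t0 = np.interp(target, f, t)  # f must be monotonically increasing here
        w = t[1:-1] * (f[2:] - f[:-2]) / (t[2:] - t[:-2])  # central difference
        w0_sq = np.interp(target, w, t[1:-1])
        return t0, w0_sq

    # Placeholder flow history; in practice <E> comes from the Symanzik flow
    # with the cloverleaf discretization, averaged over the ensemble.
    t = np.linspace(0.01, 5.0, 500)
    E = 0.1 / t  # dummy profile, for illustration only
    t0, w0_sq = scales_from_flow(t, E)
    ```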

  6. Gradient Flow Analysis on MILC HISQ Ensembles

    CERN Document Server

    Bazavov, A; Brown, N; DeTar, C; Foley, J; Gottlieb, Steven; Heller, U M; Hetrick, J E; Komijani, J; Laiho, J; Levkova, L; Oktay, M; Sugar, R L; Toussaint, D; Van de Water, R S; Zhou, R

    2014-01-01

    We report on a preliminary scale determination with gradient-flow techniques on the $N_f = 2 + 1 + 1$ HISQ ensembles generated by the MILC collaboration. The ensembles include four lattice spacings, ranging from 0.15 to 0.06 fm, and both physical and unphysical values of the quark masses. The scales $\sqrt{t_0}/a$ and $w_0/a$ are computed using Symanzik flow and the cloverleaf definition of $\langle E \rangle$ on each ensemble. Then both scales and the meson masses $aM_\pi$ and $aM_K$ are adjusted for mistunings in the charm mass. Using a combination of continuum chiral perturbation theory and a Taylor series ansatz in the lattice spacing, the results are simultaneously extrapolated to the continuum and interpolated to physical quark masses. Our preliminary results are $\sqrt{t_0} = 0.1422(7)$ fm and $w_0 = 0.1732(10)$ fm. We also find the continuum mass-dependence of $w_0$.

  7. Stochastic uncertainty analysis for unconfined flow systems

    Science.gov (United States)

    Liu, Gaisheng; Zhang, Dongxiao; Lu, Zhiming

    2006-01-01

    A new stochastic approach proposed by Zhang and Lu (2004), called the Karhunen-Loeve decomposition-based moment equation (KLME) method, has been extended to solving nonlinear, unconfined flow problems in randomly heterogeneous aquifers. This approach is based on an innovative combination of Karhunen-Loeve decomposition, polynomial expansion, and perturbation methods. The random log-transformed hydraulic conductivity field ln K_S is first expanded into a series in terms of orthogonal Gaussian standard random variables, with coefficients obtained from the eigenvalues and eigenfunctions of the covariance function of ln K_S. Next, the head h is decomposed as a perturbation expansion series h = ∑ A^(m), where A^(m) represents the mth-order head term with respect to the standard deviation of ln K_S. Then A^(m) is further expanded into a polynomial series of m products of orthogonal Gaussian standard random variables whose coefficients A^(m)_{i1,i2,...,im} are deterministic and solved sequentially from low to high expansion orders using MODFLOW-2000. Finally, the statistics of head and flux are computed using simple algebraic operations on A^(m)_{i1,i2,...,im}. A series of numerical test results in 2-D and 3-D unconfined flow systems indicated that the KLME approach is effective in estimating the mean and (co)variance of both heads and fluxes and requires much less computational effort than the traditional Monte Carlo simulation technique. Copyright 2006 by the American Geophysical Union.
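
    To illustrate the Karhunen-Loeve step, the sketch below (our own 1-D simplification, not the authors' code) computes the eigenpairs of a discretized exponential covariance and draws a realization of the log-conductivity field from a truncated expansion; the grid size, variance, correlation length, and number of retained modes are illustrative.

    ```python
    import numpy as np

    # 1-D grid and exponential covariance C(x, y) = sigma^2 exp(-|x - y| / ell).
    n, sigma2, ell = 200, 1.0, 0.2
    x = np.linspace(0.0, 1.0, n)
    C = sigma2 * np.exp(-np.abs(x[:, None] - x[None, :]) / ell)

    # Discrete Karhunen-Loeve: eigenpairs of the covariance matrix.
    vals, vecs = np.linalg.eigh(C)
    order = np.argsort(vals)[::-1]            # largest eigenvalues first
    vals, vecs = vals[order], vecs[:, order]

    # Truncated expansion: ln K = mean + sum_i sqrt(lambda_i) * xi_i * phi_i(x).
    m = 20                                    # number of retained modes
    rng = np.random.default_rng(1)
    xi = rng.standard_normal(m)               # orthogonal standard Gaussians
    lnK = 0.0 + (vecs[:, :m] * np.sqrt(vals[:m])) @ xi
    ```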

  8. STRESS ANALYSIS IN CUTTING TOOLS COATED TiN AND EFFECT OF THE FRICTION COEFFICIENT IN TOOL-CHIP INTERFACE

    Directory of Open Access Journals (Sweden)

    Kubilay ASLANTAŞ

    2003-02-01

    Full Text Available Coated tools are regularly used in today's metal cutting industry because it is well known that thin, hard coatings can reduce tool wear and improve tool life and productivity. Such coatings have significantly contributed to improvements in cutting economies and cutting tool performance through lower tool wear and reduced cutting forces. TiN coatings in particular have high strength and low friction coefficients. During the cutting process, a low friction coefficient reduces damage to the cutting tool. In addition, the maximum stress values between coating and substrate also decrease as the friction coefficient decreases. In the present study, a stress analysis is carried out for an HSS (High Speed Steel) cutting tool coated with TiN. The effect of the friction coefficient between tool and chip on the stresses developed at the cutting tool surface and at the interface of the coating and the HSS is investigated. An attempt is also made to determine the damage zones during the cutting process. The finite element method is used for the solution of the problem, and the FRANC2D finite element program is selected for the numerical solutions.

  9. Hybrid Information Flow Analysis for Programs with Arrays

    Directory of Open Access Journals (Sweden)

    Gergö Barany

    2016-07-01

    Full Text Available Information flow analysis checks whether certain pieces of (confidential) data may affect the results of computations in unwanted ways and thus leak information. Dynamic information flow analysis adds instrumentation code to the target software to track flows at run time and raise alarms if a flow policy is violated; hybrid analyses combine this with preliminary static analysis. Using a subset of C as the target language, we extend previous work on hybrid information flow analysis that handled pointers to scalars. Our extended formulation handles arrays, pointers to array elements, and pointer arithmetic. Information flow through arrays of pointers is tracked precisely, while arrays of non-pointer types are summarized efficiently. A prototype of our approach is implemented using the Frama-C program analysis and transformation framework. Work on a full machine-checked proof of the correctness of our approach using Isabelle/HOL is well underway; we present the existing parts and sketch the rest of the correctness argument.
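
    As a toy illustration of the array policy described above (precise per-element tracking for arrays of pointers, a single summary taint for arrays of scalars), here is a small Python sketch of a dynamic taint-tracking shadow state; it is our own simplification, not the Frama-C-based prototype.

    ```python
    class TaintState:
        """Shadow state for dynamic information flow tracking.

        Scalar arrays carry a single summary taint set; arrays of references
        keep one taint set per element, mirroring the precise/summarized split.
        """
        def __init__(self):
            self.scalar_taint = {}   # variable name -> set of source labels
            self.array_summary = {}  # scalar array name -> summary taint set
            self.ref_array = {}      # pointer array name -> list of taint sets

        def assign_scalar(self, dst, taints):
            self.scalar_taint[dst] = set(taints)

        def store_scalar_array(self, arr, taints):
            # Summarized: any store merges into one taint set for the array.
            self.array_summary.setdefault(arr, set()).update(taints)

        def load_scalar_array(self, arr):
            return set(self.array_summary.get(arr, set()))

        def store_ref_array(self, arr, index, taints):
            # Precise: each element of an array of pointers has its own taints.
            cells = self.ref_array.setdefault(arr, [])
            while len(cells) <= index:
                cells.append(set())
            cells[index] = set(taints)

        def check_sink(self, taints, policy):
            if taints & policy["forbidden"]:
                raise RuntimeError(f"information flow violation: {taints}")

    # Usage: a secret stored into a scalar array taints every later load from it.
    st = TaintState()
    st.store_scalar_array("buf", {"secret"})
    st.check_sink(st.load_scalar_array("buf"), {"forbidden": {"secret"}})  # raises
    ```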

  10. Lagrangian analysis of fluid transport in empirical vortex ring flows

    OpenAIRE

    Shadden, Shawn C.; Dabiri, John O.; Marsden, Jerrold E.

    2006-01-01

    In this paper we apply dynamical systems analyses and computational tools to fluid transport in empirically measured vortex ring flows. Measurements of quasisteadily propagating vortex rings generated by a mechanical piston-cylinder apparatus reveal lobe dynamics during entrainment and detrainment that are consistent with previous theoretical and numerical studies. In addition, the vortex ring wake of a free-swimming Aurelia aurita jellyfish is measured and analyzed in the framework of dynami...

  11. Performance Analysis of a Fluidic Axial Oscillation Tool for Friction Reduction with the Absence of a Throttling Plate

    Directory of Open Access Journals (Sweden)

    Xinxin Zhang

    2017-04-01

    Full Text Available An axial oscillation tool has proved effective in solving problems associated with high friction and torque in the sliding drilling of complex wells. The fluidic axial oscillation tool, based on an output-fed bistable fluidic oscillator, is a type of axial oscillation tool which has become increasingly popular in recent years. The aim of this paper is to analyze the dynamic flow behavior of a fluidic axial oscillation tool without a throttling plate in order to evaluate its overall performance. In particular, the differences between the original design with a throttling plate and the current default design are analyzed in depth, and an improvement is expected for the latter. A commercial computational fluid dynamics code, Fluent, was used to predict the pressure drop and oscillation frequency of the fluidic axial oscillation tool. The results of the numerical simulations agree well with the corresponding experimental results. A sufficient pressure pulse amplitude combined with a low pressure drop is desired in this study; therefore, relative pulse amplitudes of pressure drop and displacement are introduced. A comparative analysis of the designs with and without a throttling plate indicates that when the supply flow rate is relatively low or higher than a certain value, the fluidic axial oscillation tool with a throttling plate exhibits better performance; otherwise, the tool without a throttling plate is the preferred alternative. In most operating circumstances, in terms of supply flow rate and pressure drop, the tool without a throttling plate performs better than the original design.

  12. Numerical Flow Analysis of a Hydraulic Gear Pump

    Science.gov (United States)

    Panta, Yogendra M.; Kim, Hyun W.; Pierson, Hazel M.

    2007-11-01

    The pressure that exists at the outlet port of a gear pump is a result of the system load created by resistance to the fluid flow. The flow pattern created inside an external gear pump by the motion of two oppositely rotating gears is deceptively complex, despite the simple geometry of the gear pump. The flow cannot be analyzed with the steady-state assumption usually employed for turbomachinery, even though the flow is essentially steady. Only a time-dependent, transient analysis with a moving dynamic meshing technique can predict the motion of the fluid flow against the very high adverse pressure distribution. Although this complexity of analysis is inherent in all positive displacement pumps, gear pumps pose an exceptional modeling challenge because two rotating components are housed within a stationary casing and the gears must remain in contact with each other at all times. Fluent, commercially available computational fluid dynamics (CFD) software, was used to analyze the flow of the gear pump. The CFD investigation produced significant information on flow patterns, velocity and pressure fields, and flow rates.

  13. Verification and Validation of the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-test process. In addition, we created approximately 3,000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper; here we present high-level test results in most areas and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software, with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  14. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-06-01

    Full Text Available Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being the fact that software for corpus analysis has been developed with the linguist in mind: such tools are generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators’ needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; and it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.

  15. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL; Parish, Esther S [ORNL; Nugent, Philip J [ORNL

    2016-01-01

    Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, the lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells and then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (containing spatial data, socio-economic and environmental data, and analytic data), a middle layer (handling data processing, model management, and GIS operations), and an application layer (providing climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.

  16. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-12-01

    The deployment and use of lithium-ion (Li-ion) batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge (SOC) histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory (NREL) has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite. This suite of tools pairs NREL’s high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.

  17. Lymphocyte and monocyte flow cytometry immunophenotyping as a diagnostic tool in uncharacteristic inflammatory disorders

    Directory of Open Access Journals (Sweden)

    Grip Olof

    2010-07-01

    Full Text Available Abstract Background Patients with uncharacteristic inflammatory symptoms, such as long-standing fatigue or pain or a prolonged fever, constitute a diagnostic and therapeutic challenge. The aim of the present study was to determine whether an extended immunophenotyping of lymphocytes and monocytes, including activation markers, can define disease-specific patterns and thus provide valuable diagnostic information for these patients. Methods Whole blood from patients with gram-negative bacteraemia, neuroborreliosis, tuberculosis, acute mononucleosis, influenza or a mixed connective tissue disorder, as diagnosed by routine culture and serology techniques, was analysed for lymphocyte and monocyte cell surface markers using a no-wash, no-lyse protocol for multi-colour flow cytometry. The immunophenotyping included the activation markers HLA-DR and CD40. Plasma levels of soluble TNF alpha receptors were analysed by ELISA. Results An informative pattern was obtained by combining two of the analysed parameters: (i) the fractions of HLA-DR-expressing CD4+ T cells and CD8+ T cells, respectively, and (ii) the level of CD40 on CD14+ CD16- monocytes. Patients infected with gram-negative bacteria or EBV showed a marked increase in monocyte CD40, while this effect was less pronounced for tuberculosis, borrelia and influenza. The bacterial agents could be distinguished from the viral agents by the T cell result, with CD4+ T cells reacting in bacterial infection and CD8+ T cells dominating for the viruses. Patients with mixed connective tissue disorders also showed increased activation, but with similar engagement of CD4+ and CD8+ T cells. Analysis of soluble TNF alpha receptors was less informative due to a large inter-individual variation. Conclusion Immunophenotyping combining the fractions of HLA-DR-expressing T cell subpopulations with the level of CD40 on monocytes produces an informative pattern, differentiating between infections of …

  18. Multivariate analysis of bistable flow; Analisis multivariable de flujo biestable

    Energy Technology Data Exchange (ETDEWEB)

    Castillo D, R.; Ortiz V, J.; Ruiz E, J.A. [ININ, 52750 La Marquesa, Estado de Mexico (Mexico); Calleros M, G. [CFE, Alto LUcero, Veracruz (Mexico)]. e-mail: rcd@nuclear.inin.mx

    2007-07-01

    In this work, an analysis of bistable flow using a multivariate autoregressive model is presented. Bistable flow occurs in boiling water reactors with external recirculation pumps and appears in the discharge header of a recirculation loop, upstream of the jet pumps. The phenomenon has two flow patterns, one with greater hydraulic losses than the other; at irregular time intervals, the flow switches between the patterns randomly. The NOISE program, under development at ININ, was used; it applies a multivariate autoregressive model to determine the autoregression coefficients, which contain the dynamic information of the signals and are subsequently used to obtain the relative power contribution, establishing the influence that exists among the analyzed variables. A bistable flow event that occurred in a BWR5 at operating conditions of 80% power and 69% of total core flow was analyzed. The signals included the flow noise in each of the 20 jet pumps, the output of an average power range monitor, the recirculation drive flows, the controllers and positions of the flow control valves of the loops, the instrumentation signals of the recirculation pumps (power, current, pressure drop and suction temperature), and the buses supplying voltage to the pump motors. Among the main results, it was found that the bistable flow phenomenon affects the pressure drop across the recirculation pump of the loop in which it occurs, which affects the drive flow in that loop, to which the opening system of that loop's recirculation flow control valve responds. (Author)
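
    To illustrate the multivariate autoregressive technique (not the NOISE code itself), the Python sketch below fits a vector autoregression to a few signals and computes the forecast-error variance decomposition, which plays the role of the relative power contribution discussed above; the synthetic signals and their couplings are placeholders.

    ```python
    import numpy as np
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(0)
    n = 2000
    # Synthetic stand-ins for plant signals: pump pressure drop, drive flow,
    # and jet-pump flow noise, with a simple built-in lagged coupling.
    dp = rng.standard_normal(n)
    drive = 0.6 * np.roll(dp, 1) + 0.4 * rng.standard_normal(n)
    jet = 0.5 * np.roll(drive, 1) + 0.5 * rng.standard_normal(n)
    data = np.column_stack([dp, drive, jet])[1:]  # drop the wrapped first sample

    model = VAR(data)
    res = model.fit(maxlags=10, ic="aic")  # order chosen by information criterion

    # Forecast-error variance decomposition: how much of each signal's noise
    # power is contributed by innovations in each of the other signals.
    fevd = res.fevd(20)
    print(fevd.decomp[2, -1])  # contributions to signal 3 at horizon 20
    ```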

  19. GATA: a graphic alignment tool for comparative sequence analysis

    Directory of Open Access Journals (Sweden)

    Nix David A

    2005-01-01

    Full Text Available Abstract Background Several problems exist with current methods used to align DNA sequences for comparative sequence analysis. Most dynamic programming algorithms assume that conserved sequence elements are collinear. This assumption appears valid when comparing orthologous protein-coding sequences. Functional constraints on proteins provide strong selective pressure against sequence inversions and minimize sequence duplications and feature shuffling. For non-coding sequences this collinearity assumption is often invalid. For example, enhancers contain clusters of transcription factor binding sites that change in number, orientation, and spacing during evolution, yet the enhancer retains its activity. Dot plot analysis is often used to estimate non-coding sequence relatedness. Yet dot plots do not actually align sequences and thus cannot account well for base insertions or deletions. Moreover, they lack an adequate statistical framework for comparing sequence relatedness and are limited to pairwise comparisons. Lastly, dot plots and dynamic programming text outputs fail to provide an intuitive means for visualizing DNA alignments. Results To address some of these issues, we created a stand-alone, platform-independent, graphic alignment tool for comparative sequence analysis (GATA, http://gata.sourceforge.net/). GATA uses the NCBI-BLASTN program and extensive post-processing to identify all small sub-alignments above a low cut-off score. These are graphed as two shaded boxes, one for each sequence, connected by a line using the coordinate system of their parent sequence. Shading and colour are used to indicate score and orientation. A variety of options exist for querying, modifying and retrieving conserved sequence elements. Extensive gene annotation can be added to both sequences using a standardized General Feature Format (GFF) file. Conclusions GATA uses the NCBI-BLASTN program in conjunction with post-processing to exhaustively align two DNA …

  20. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools and exploring their functionalities and ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms in either an IT or a marketing branch. The paper contributes by highlighting the support for management that web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  1. Computational Analysis of a Variable Ejector Flow

    Institute of Scientific and Technical Information of China (English)

    H.D. KIM; J.H. LEE; T.SETOGUCHI; S. MATSUO

    2006-01-01

    The present study addresses a variable ejector which can improve the ejector efficiency and control the recirculation ratio under a fixed operating pressure ratio. The variable ejector is a facility for obtaining a specific recirculation ratio under a given operating pressure ratio by varying the ejector throat area ratio. Numerical simulations are carried out to provide an understanding of the flow characteristics inside the variable ejector. Sonic and supersonic nozzles are adopted as the primary driving nozzles in the ejector system, and a movable cone cylinder, inserted into a conventional ejector-diffuser system, is used to change the ejector throat area ratio. The numerical simulations are based on a fully implicit finite volume scheme for the compressible, Reynolds-Averaged Navier-Stokes equations. The results show that the variable ejector can control the recirculation ratio at a fixed operating pressure ratio.

  2. Program ELM: A tool for rapid thermal-hydraulic analysis of solid-core nuclear rocket fuel elements

    Science.gov (United States)

    Walton, James T.

    1992-01-01

    This report reviews the state of the art of thermal-hydraulic analysis codes and presents a new code, Program ELM, for the analysis of fuel elements. ELM is a concise computational tool for modeling the steady-state thermal-hydraulics of propellant flow through fuel element coolant channels in a nuclear thermal rocket reactor with axial coolant passages. The program was developed as a tool to swiftly evaluate the various heat transfer coefficient and friction factor correlations for turbulent pipe flow with heat addition that have been used in previous programs. A consistent comparison of these correlations was thus performed, as well as a comparison with data from the NRX reactor experiments of the Nuclear Engine for Rocket Vehicle Applications (NERVA) project. This report describes the ELM algorithm, input/output, and validation efforts, and provides a listing of the code.
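
    As an example of the kind of correlation ELM compares, the sketch below evaluates the classic Dittus-Boelter Nusselt-number correlation and a Blasius-type friction factor for turbulent pipe flow; this is a generic illustration, not ELM's actual correlation set, and the coolant conditions are placeholders.

    ```python
    def dittus_boelter(re, pr, heating=True):
        """Nu = 0.023 Re^0.8 Pr^n, with n = 0.4 when heating the fluid, 0.3 when cooling."""
        n = 0.4 if heating else 0.3
        return 0.023 * re**0.8 * pr**n

    def blasius_friction(re):
        """Darcy friction factor f = 0.316 Re^-0.25 for smooth turbulent pipe flow."""
        return 0.316 * re**-0.25

    # Placeholder channel conditions: Reynolds and Prandtl numbers of the coolant.
    re, pr = 5.0e4, 0.7
    nu = dittus_boelter(re, pr)   # Nusselt number
    h = nu * 0.25 / 2.5e-3        # h = Nu * k / D, with k [W/m-K] and D [m] assumed
    f = blasius_friction(re)
    ```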

  3. Input Range Testing for the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is the test input to be attempted for each field. The third type of information is the allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is VERY important to note that the tests below must be performed for both the Graphical User Interface and the script! The examples are illustrated from a scripting perspective, because it is simpler to write up; however, the tests must be performed for both interfaces to GMAT.

  4. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.
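
    To give a flavor of the outlier-detection step such packages implement (the package described here is in R; the sketch below uses Python's scikit-learn as a stand-in), robust Mahalanobis distances from a Minimum Covariance Determinant fit flag multivariate outliers; the data and cutoff are illustrative.

    ```python
    import numpy as np
    from scipy.stats import chi2
    from sklearn.covariance import MinCovDet

    rng = np.random.default_rng(7)
    p = 5
    X = rng.standard_normal((200, p))
    X[:10] += 4.0                     # contaminate ten rows to act as outliers

    mcd = MinCovDet(random_state=0).fit(X)
    d2 = mcd.mahalanobis(X)           # squared robust Mahalanobis distances
    cutoff = chi2.ppf(0.975, df=p)    # usual chi-square threshold
    outliers = np.flatnonzero(d2 > cutoff)
    ```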

  5. SINEBase: a database and tool for SINE analysis.

    Science.gov (United States)

    Vassetzky, Nikita S; Kramerov, Dmitri A

    2013-01-01

    SINEBase (http://sines.eimb.ru) integrates the revisited body of knowledge about short interspersed elements (SINEs). A set of formal definitions concerning SINEs was introduced. All available sequence data were screened through these definitions and the genetic elements misidentified as SINEs were discarded. As a result, 175 SINE families have been recognized in animals, flowering plants and green algae. These families were classified by the modular structure of their nucleotide sequences and the frequencies of different patterns were evaluated. These data formed the basis for the database of SINEs. The SINEBase website can be used in two ways: first, to explore the database of SINE families, and second, to analyse candidate SINE sequences using specifically developed tools. This article presents an overview of the database and the process of SINE identification and analysis.

  6. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that must be solved for software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which UML offers through the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors’ research examines the abilities of modern UML modelling tools to lay out the UML sequence diagram automatically and analyses them according to the criteria required for diagram perception.

  7. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    Directory of Open Access Journals (Sweden)

    Matthew V Caruana

    Full Text Available Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparisons of the archaeological record with extractive foraging behaviors in nonhuman primates have focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches to percussive tools have been hampered by a lack of standardized methodologies. Central to this issue has been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach, using three-dimensional scanning and geographical information systems software, to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences about early hominins from percussive damage patterns.

  8. Abstract Interfaces for Data Analysis Component Architecture for Data Analysis Tools

    CERN Document Server

    Barrand, G; Dönszelmann, M; Johnson, A; Pfeiffer, A

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interfaces and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising the re-use and maintainability of each component individually. The interfaces have been defined in Java and C++ and i...

  9. Betweenness as a Tool of Vulnerability Analysis of Power System

    Science.gov (United States)

    Rout, Gyanendra Kumar; Chowdhury, Tamalika; Chanda, Chandan Kumar

    2016-12-01

    Complex network theory finds application in the analysis of power grids, as both share some common characteristics. Using this theory, critical elements of a power network can be found. As the vulnerabilities of the elements of the network decide the vulnerability of the total network, in this paper the vulnerability of each element is studied using two complex network measures: betweenness centrality and extended betweenness. Betweenness centrality considers only the topological structure of the power system, whereas extended betweenness is based on both topological and physical properties of the system. In the latter case, electrical properties such as electrical distance, line flow limits, transmission capacities of lines and the PTDF matrix are included. The standard IEEE 57-bus system has been studied using the above-mentioned indices, and the resulting conclusions are discussed.
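
    A minimal Python sketch of the topological variant, betweenness centrality on a small test network, follows; it uses networkx on a toy graph rather than the IEEE 57-bus system, and the line reactances used as edge weights (a rough proxy for electrical distance) are made-up values.

    ```python
    import networkx as nx

    # Toy power network: nodes are buses, edges are lines weighted by reactance,
    # used here as a crude proxy for electrical distance.
    G = nx.Graph()
    lines = [(1, 2, 0.1), (2, 3, 0.2), (1, 3, 0.3), (3, 4, 0.1),
             (4, 5, 0.2), (2, 5, 0.4)]
    G.add_weighted_edges_from(lines, weight="x")

    # Topological betweenness: fraction of weighted shortest paths through a node.
    node_bc = nx.betweenness_centrality(G, weight="x", normalized=True)
    edge_bc = nx.edge_betweenness_centrality(G, weight="x", normalized=True)

    # Rank candidates for the most vulnerable buses and lines.
    print(sorted(node_bc, key=node_bc.get, reverse=True))
    print(sorted(edge_bc, key=edge_bc.get, reverse=True))
    ```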

  10. FACTORIAL CORRESPONDENCES ANALYSIS – A TOOL IN TOURISM MOTIVATION RESEARCH

    Directory of Open Access Journals (Sweden)

    Ion Danut I. JUGANARU

    2016-05-01

    Full Text Available This study analyzes the distribution of tourist flows in 2014 from 25 European countries across three main categories of trip purpose, and assumes that there are differences or similarities between the tourists’ countries of residence and their trip purposes. "Purpose" is a multidimensional concept used in marketing research, most often for understanding consumer behavior and for identifying market segments or customer target groups united by similar characteristics. Since the choice/purchase decision is based on purposes, knowledge of them proves useful in designing strategies to increase the level of customer satisfaction. The statistical method used in this paper is factorial correspondence analysis. In our opinion, identifying by this method the existence of differences or similarities between the tourists’ countries of residence and their trip purposes can represent a useful step in studying the tourism market and in choosing or reformulating strategies.
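
    For the statistically inclined reader, here is a compact Python sketch of correspondence analysis on a countries-by-purposes contingency table via the SVD of standardized residuals; the table of counts is invented for illustration.

    ```python
    import numpy as np

    # Hypothetical contingency table: rows = countries, cols = trip purposes
    # (e.g., leisure, business, other); counts are invented for illustration.
    N = np.array([[520., 130., 50.],
                  [310., 220., 70.],
                  [150.,  60., 90.],
                  [400., 100., 80.]])

    P = N / N.sum()                       # correspondence matrix
    r, c = P.sum(axis=1), P.sum(axis=0)   # row and column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals

    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    row_coords = (U * sv) / np.sqrt(r)[:, None]   # principal row coordinates
    col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]
    inertia = sv**2 / (sv**2).sum()       # share of total inertia per axis
    ```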

  11. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis

    DEFF Research Database (Denmark)

    Callot, Laurent Abdelkader Francois; Paldam, Martin

    Meta-analysis studies a set of estimates of one parameter with three basic tools: the funnel diagram, which is the distribution of the estimates as a function of their precision; the funnel asymmetry test, FAT; and the meta average, of which the PET is an estimate. The FAT-PET MRA is a meta-regression analysis …

  12. System-of-Systems Technology-Portfolio-Analysis Tool

    Science.gov (United States)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  13. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    Science.gov (United States)

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise, which is very common in industrial environments. The PFA Toolbox can be used to address those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) it provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise; (b) it provides tools to easily plot the results as interval estimates or flux distributions; (c) it is composed of simple functions that MATLAB users can apply in flexible ways; (d) it includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty; and (e) it can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available toolbox that is able to perform Interval and Possibilistic MFA estimations.
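
    To sketch what an MFA estimation involves, the code below uses a generic least-squares formulation under our own assumptions (Python instead of MATLAB, and none of the toolbox's interval or possibilistic machinery): unmeasured fluxes are estimated from the steady-state stoichiometric constraint and a measured flux; the network is invented.

    ```python
    import numpy as np

    # Toy stoichiometric matrix S (metabolites x reactions); steady state: S v = 0.
    S = np.array([[ 1., -1.,  0., -1.],
                  [ 0.,  1., -1.,  0.],
                  [ 0.,  0.,  1., -1.]])   # invented network for illustration

    measured = [0]            # indices of measured fluxes
    vm = np.array([10.0])     # measured uptake flux (imprecise in practice)

    unmeasured = [j for j in range(S.shape[1]) if j not in measured]
    Su, Sm = S[:, unmeasured], S[:, measured]

    # Steady state S v = 0  =>  Su @ vu = -Sm @ vm; solve in the least-squares sense.
    vu, *_ = np.linalg.lstsq(Su, -Sm @ vm, rcond=None)

    v = np.zeros(S.shape[1])
    v[measured], v[unmeasured] = vm, vu
    residual = S @ v          # near zero when the system is consistent
    ```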

  14. Behavioral analysis of network flow traffic

    OpenAIRE

    Heller, Mark D.

    2010-01-01

    Approved for public release, distribution unlimited Network Behavior Analysis (NBA) is a technique to enhance network security by passively monitoring aggregate traffic patterns and noting unusual action or departures from normal operations. The analysis is typically performed offline, due to the huge volume of input data, in contrast to conventional intrusion prevention solutions based on deep packet inspection, signature detection, and real-time blocking. After establishing a benchmar...
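
    As a generic illustration of the baseline-and-departure idea described above (our sketch, not code from the thesis), aggregate per-interval flow counts can be compared against a benchmark distribution; all numbers here are invented.

```python
# Flag intervals whose aggregate flow count departs from a learned baseline.
import statistics

baseline = [1020, 980, 1110, 995, 1043, 1078, 990, 1005]  # flows per minute
mu = statistics.mean(baseline)
sigma = statistics.stdev(baseline)

def is_anomalous(flows_per_minute, threshold=3.0):
    """True if the interval departs from normal operations by > threshold sigmas."""
    return abs(flows_per_minute - mu) / sigma > threshold

print(is_anomalous(1050), is_anomalous(4200))  # -> False True
```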

  15. Construction of estimated flow- and load-duration curves for Kentucky using the Water Availability Tool for Environmental Resources (WATER)

    Science.gov (United States)

    Unthank, Michael D.; Newson, Jeremy K.; Williamson, Tanja N.; Nelson, Hugh L.

    2012-01-01

    Flow- and load-duration curves were constructed from the model outputs of the U.S. Geological Survey's Water Availability Tool for Environmental Resources (WATER) application for streams in Kentucky. The WATER application was designed to access multiple geospatial datasets to generate more than 60 years of statistically based streamflow data for Kentucky. The WATER application enables a user to graphically select a site on a stream and generate an estimated hydrograph and flow-duration curve for the watershed upstream of that point. The flow-duration curves are constructed by calculating the exceedance probability of the modeled daily streamflows. User-defined water-quality criteria and (or) sampling results can be loaded into the WATER application to construct load-duration curves that are based on the modeled streamflow results. Estimates of flow and streamflow statistics were derived from TOPographically Based Hydrological MODEL (TOPMODEL) simulations in the WATER application. A modified TOPMODEL code, SDP-TOPMODEL (Sinkhole Drainage Process-TOPMODEL), was used to simulate daily mean discharges over the period of record for 5 karst and 5 non-karst watersheds in Kentucky in order to verify the calibrated model. A statistical evaluation of the model's verification simulations shows that calibration criteria established by previous WATER application reports were met, thus ensuring the model's ability to provide acceptably accurate estimates of discharge at gaged and ungaged sites throughout Kentucky. The flow-duration intervals are expressed as a percentage, with zero corresponding to the highest stream discharge in the streamflow record. Load-duration curves are constructed by applying the loading equation (Load = Flow*Water-quality criterion) at each flow interval.
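
    Both constructions described above are short enough to sketch. The following Python fragment uses synthetic flows and a placeholder criterion (the 0.5 is ours, not a Kentucky value): modeled daily flows are ranked to obtain exceedance probabilities, then the record's loading equation is applied at each interval.

```python
# Flow-duration curve: exceedance probability of modeled daily streamflows.
# Load-duration curve: Load = Flow * water-quality criterion at each interval.
import numpy as np

# Stand-in for ~60 years of modeled daily streamflow (synthetic, lognormal).
daily_flow = np.random.default_rng(1).lognormal(mean=2.0, sigma=1.0, size=60 * 365)

flows = np.sort(daily_flow)[::-1]          # highest discharge first
exceedance = 100.0 * np.arange(1, flows.size + 1) / (flows.size + 1)  # percent

criterion = 0.5                            # hypothetical water-quality criterion
load = flows * criterion                   # load-duration curve ordinates

# The smallest exceedance (~0 %) corresponds to the highest discharge, as above.
print(exceedance[:3].round(4), flows[:3].round(1), load[:3].round(1))
```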

  16. Multi-tool design and analysis of an automotive HUD

    Science.gov (United States)

    Irving, Bruce; Hasenauer, David; Mulder, Steve

    2016-10-01

    Design and analysis of an optical system is often a multidisciplinary task, and can involve the use of specialized software packages for imaging, mechanics, and illumination. This paper will present a case study on the design and analysis of a basic heads-up display (HUD) for automotive use. The emphasis will be on the special requirements of a HUD visual system and on the tools and techniques needed to accomplish the design. The first section of this paper will present an overview of the imaging design using commercially available imaging design software. Topics addressed in this section include modeling the windshield, visualizing the imaging performance, using constraints and freeform surfaces to improve the system, and meeting specific visual performance specifications with design/analysis methods. The second section will address the use of a CAD program to design a basic mechanical structure to support and protect the optics. This section will also discuss some of the issues and limitations involved in translating data between a CAD program and a lens design or illumination program. Typical issues that arise include the precision of optical surface prescriptions, surface and material properties, and the management of large data files. In the final section, the combined optical and mechanical package will be considered, using an illumination design program for stray light analysis. The stray light analysis will be directed primarily toward finding, visualizing, and quantifying unexpected ray paths. Techniques for sorting optical ray paths by path length, power, and elements or materials encountered will be discussed, along with methods for estimating the impact of stray light on the optical system performance.

  17. FlowFP: A Bioconductor Package for Fingerprinting Flow Cytometric Data

    OpenAIRE

    Wade T. Rogers; Herbert A. Holyst

    2009-01-01

    A new software package called flowFP for the analysis of flow cytometry data is introduced. The package, which is tightly integrated with other Bioconductor software for analysis of flow cytometry, provides tools to transform raw flow cytometry data into a form suitable for direct input into conventional statistical analysis and empirical modeling software tools. The approach of flowFP is to generate a description of the multivariate probability distribution function of flow cytometry data i...

  18. FUNDAMENTAL ANALYSIS AND DISCOUNTED FREE CASH FLOW VALUATION OF STOCKS AT MACEDONIAN STOCK EXCHANGE

    Directory of Open Access Journals (Sweden)

    Nadica Ivanovska

    2014-06-01

    Full Text Available We examine the valuation performance of the Discounted Free Cash Flow model (DFCF) at the Macedonian Stock Exchange (MSE) in order to determine whether this model offers a significant level of accuracy and relevance for determining stock values. We find that stock values calculated with the DFCF model are very close to average market prices, which suggests that market prices oscillate near their fundamental values. We conclude that DFCF models are useful tools for calculating companies' enterprise values over the long term. The analysis of our results derived from stock valuation with the DFCF model, together with the comparison against average market stock prices, suggests that the discounted cash flow model is a relatively reliable valuation tool for stock analyses at the MSE.
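
    For readers who want the arithmetic behind the abstract: a textbook DFCF valuation discounts forecast free cash flows plus a terminal value at the discount rate. The sketch below uses invented numbers and the generic Gordon-growth form, not the paper's actual inputs for MSE-listed companies.

```python
# Generic discounted free cash flow (DFCF) valuation sketch; all inputs invented.
fcf = [120.0, 132.0, 140.0, 151.0, 160.0]    # forecast free cash flows
wacc = 0.10                                  # discount rate
g = 0.02                                     # perpetual growth rate

pv_explicit = sum(cf / (1 + wacc) ** t for t, cf in enumerate(fcf, start=1))
terminal = fcf[-1] * (1 + g) / (wacc - g)    # Gordon-growth terminal value
pv_terminal = terminal / (1 + wacc) ** len(fcf)

enterprise_value = pv_explicit + pv_terminal
print(round(enterprise_value, 1))            # -> about 1792
```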

  19. ASSESSMENT OF PLASTIC FLOWS AND STOCKS IN SERBIA USING MATERIAL FLOW ANALYSIS

    Directory of Open Access Journals (Sweden)

    Goran Vujić

    2010-01-01

    Full Text Available Material flow analysis (MFA) was used to assess the amounts of plastic material flows and stocks that are annually produced, consumed, imported, exported, collected, recycled, and disposed of in landfills in Serbia. The analysis revealed that approximately 269,000 tons of plastic materials are directly disposed of in uncontrolled landfills in Serbia without any pretreatment, and that significant amounts of these materials have already accumulated in the landfills. The substantial amounts of landfilled plastics represent not only a loss of valuable resources but also a serious threat to the environment and human health; if the trend of direct plastic landfilling continues, Serbia will face grave consequences.

  20. Usage of a Responsible Gambling Tool: A Descriptive Analysis and Latent Class Analysis of User Behavior.

    Science.gov (United States)

    Forsström, David; Hesser, Hugo; Carlbring, Per

    2016-09-01

    Gambling is a common pastime around the world. Most gamblers can engage in gambling activities without negative consequences, but some run the risk of developing an excessive gambling pattern. Excessive gambling has severe negative economic and psychological consequences, which makes the development of responsible gambling strategies vital to protecting individuals from these risks. One such strategy is responsible gambling (RG) tools. These tools track an individual's gambling history and supply personalized feedback, and they might be one way to decrease excessive gambling behavior. However, research is lacking in this area and little is known about the usage of these tools. The aim of this article is to describe user behavior and to investigate whether there are different subclasses of users by conducting a latent class analysis. The user behavior of 9528 online gamblers who voluntarily used an RG tool was analyzed. The number of visits to the site, self-tests made, and advice used were the observed variables included in the latent class analysis. Descriptive statistics show that, overall, the functions of the tool had high initial usage and low repeated usage. Latent class analysis yielded five distinct classes of users: self-testers, multi-function users, advice users, site visitors, and non-users. Multinomial regression revealed that the classes were associated with different risk levels of excessive gambling. The self-testers and multi-function users used the tool to a greater extent and were found to have a greater risk of excessive gambling than the other classes.

  1. A Flow-Sensitive Analysis of Privacy Properties

    DEFF Research Database (Denmark)

    Nielson, Hanne Riis; Nielson, Flemming

    2007-01-01

    that information I send to some service is never leaked to another service - unless I give my permission? We shall develop a static program analysis for the pi-calculus and show how it can be used to give privacy guarantees like the ones requested above. The analysis records the explicit information flow...

  2. Experimental resource pulses influence social-network dynamics and the potential for information flow in tool-using crows.

    Science.gov (United States)

    St Clair, James J H; Burns, Zackory T; Bettaney, Elaine M; Morrissey, Michael B; Otis, Brian; Ryder, Thomas B; Fleischer, Robert C; James, Richard; Rutz, Christian

    2015-01-01

    Social-network dynamics have profound consequences for biological processes such as information flow, but are notoriously difficult to measure in the wild. We used novel transceiver technology to chart association patterns across 19 days in a wild population of the New Caledonian crow--a tool-using species that may socially learn, and culturally accumulate, tool-related information. To examine the causes and consequences of changing network topology, we manipulated the environmental availability of the crows' preferred tool-extracted prey, and simulated, in silico, the diffusion of information across field-recorded time-ordered networks. Here we show that network structure responds quickly to environmental change and that novel information can potentially spread rapidly within multi-family communities, especially when tool-use opportunities are plentiful. At the same time, we report surprisingly limited social contact between neighbouring crow communities. Such scale dependence in information-flow dynamics is likely to influence the evolution and maintenance of material cultures.

  3. Multiphase flow in lab on chip devices: A real tool for the future

    NARCIS (Netherlands)

    Shui, Lingling; Pennathur, S.; Pennathur, Sumita; Eijkel, Jan C.T.; van den Berg, Albert

    2008-01-01

    Many applications for lab on a chip (LOC) devices require the use of two or more fluids that are either not chemically related (e.g. oil and water) or in different phases (e.g. liquid and gas). Utilizing multiphase flow in LOC devices allows for both the fundamental study of multiphase flow and the

  4. The Role of Flow Experience and CAD Tools in Facilitating Creative Behaviours for Architecture Design Students

    Science.gov (United States)

    Dawoud, Husameddin M.; Al-Samarraie, Hosam; Zaqout, Fahed

    2015-01-01

    This study examined the role of flow experience in intellectual activity with an emphasis on the relationship between flow experience and creative behaviour in design using CAD. The study used confluence and psychometric approaches because of their unique abilities to depict a clear image of creative behaviour. A cross-sectional study…

  6. Using digital electronic design flow to create a Genetic Design Automation tool.

    Science.gov (United States)

    Gendrault, Y; Madec, M; Wlotzko, V; Andraud, M; Lallement, C; Haiech, J

    2012-01-01

    Synthetic bio-systems are becoming increasingly complex, and their development is lengthy and expensive. In microelectronics, likewise, the design process of very complex circuits has benefited from many years of experience and is now partly automated through Electronic Design Automation (EDA) tools. The two areas present analogies that can be used to create a Genetic Design Automation tool inspired by the EDA tools used in digital electronics. Such a tool would allow moving away from totally manual design of bio-systems to assisted design. This ambitious project is presented in this paper, with a deep focus on the tool that automatically generates models of bio-systems directly usable in electronic simulators.

  7. Run-Time Data-Flow Analysis

    Institute of Scientific and Technical Information of China (English)

    李剑慧; 臧斌宇; 吴蓉; 朱传琪

    2002-01-01

    Parallelizing compilers have made great progress in recent years. However, there still remains a gap between the current ability of parallelizing compilers and their final goals. In order to achieve maximum parallelism, run-time techniques have been used in parallelizing compilers over the last few years. First, this paper presents a basic run-time privatization method. The definition of run-time dead code is given and its side effect is discussed. To eliminate the imprecision caused by run-time dead code, backward data-flow information must be used. The Proteus Test, which can use backward information at run time, is then presented to exploit more dynamic parallelism. A variation of the Proteus Test, the Advanced Proteus Test, is also offered to achieve partial parallelism. The Proteus Test was implemented in the parallelizing compiler AFT. At the end of the paper, the program fpppp.f of the SPEC95fp benchmark is taken as an example to show the effectiveness of the Proteus Test.
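
    As a generic illustration of what a run-time privatization test checks (our sketch, not the Proteus Test itself): an array can be privatized for a parallel loop only if no iteration reads an element before writing it, since such a read would consume a value flowing in from another iteration.

```python
# Inspector sketch for run-time privatization testing (generic illustration).
def privatizable(accesses):
    """accesses: one list per iteration of ('R'|'W', element_index) events."""
    for iteration in accesses:
        written = set()
        for kind, idx in iteration:
            if kind == 'W':
                written.add(idx)
            elif idx not in written:
                # Read before any write in this iteration: a cross-iteration
                # flow is possible, so the array cannot be privatized.
                return False
    return True

print(privatizable([[('W', 0), ('R', 0)], [('W', 0), ('R', 0)]]))  # True
print(privatizable([[('R', 0), ('W', 0)]]))                        # False
```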

  8. The Landlab v1.0 OverlandFlow component: a Python tool for computing shallow-water flow across watersheds

    Science.gov (United States)

    Adams, Jordan M.; Gasparini, Nicole M.; Hobley, Daniel E. J.; Tucker, Gregory E.; Hutton, Eric W. H.; Nudurupati, Sai S.; Istanbulluoglu, Erkan

    2017-04-01

    Representation of flowing water in landscape evolution models (LEMs) is often simplified compared to hydrodynamic models, as LEMs make assumptions reducing physical complexity in favor of computational efficiency. The Landlab modeling framework can be used to bridge the divide between complex runoff models and more traditional LEMs, creating a new type of framework not commonly used in the geomorphology or hydrology communities. Landlab is a Python-language library that includes tools and process components that can be used to create models of Earth-surface dynamics over a range of temporal and spatial scales. The Landlab OverlandFlow component is based on a simplified inertial approximation of the shallow water equations, following the solution of de Almeida et al. (2012). This explicit two-dimensional hydrodynamic algorithm simulates a flood wave across a model domain, where water discharge and flow depth are calculated at all locations within a structured (raster) grid. Here, we illustrate how the OverlandFlow component contained within Landlab can be applied as a simplified event-based runoff model and how to couple the runoff model with an incision model operating on decadal timescales. Examples of flow routing on both real and synthetic landscapes are shown. Hydrographs from a single storm at multiple locations in the Spring Creek watershed, Colorado, USA, are illustrated, along with a map of shear stress applied on the land surface by flowing water. The OverlandFlow component can also be coupled with the Landlab DetachmentLtdErosion component to illustrate how the non-steady flow routing regime impacts incision across a watershed. The hydrograph and incision results are compared to simulations driven by steady-state runoff. Results from the coupled runoff and incision model indicate that runoff dynamics can impact landscape relief and channel concavity, suggesting that, on landscape evolution timescales, the OverlandFlow model may lead to differences in
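
    A minimal driver for the component looks roughly like the sketch below. Landlab is open source and the calls follow its published tutorials, but field names and method signatures have shifted between Landlab releases, so treat the exact API (xy_spacing, calc_time_step, run_one_step) as assumptions to check against your installed version.

```python
# Sketch: route a shallow ponded layer across a synthetic tilted plane with
# the Landlab OverlandFlow component (not the paper's Spring Creek setup).
from landlab import RasterModelGrid
from landlab.components import OverlandFlow

grid = RasterModelGrid((32, 32), xy_spacing=10.0)       # 10 m cells
z = grid.add_zeros("topographic__elevation", at="node")
z += grid.x_of_node * 0.01                              # gentle uniform slope
grid.add_zeros("surface_water__depth", at="node")
grid.at_node["surface_water__depth"] += 0.05            # 5 cm of ponded water

of = OverlandFlow(grid, steep_slopes=True)

elapsed, run_time = 0.0, 600.0                          # simulate 10 minutes
while elapsed < run_time:
    dt = of.calc_time_step()   # adaptive stable step for the explicit scheme
    of.run_one_step(dt=dt)
    elapsed += dt

print(grid.at_node["surface_water__depth"].max())       # depth lives on nodes
```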

  9. Scanning probe microscopy beyond imaging: a general tool for quantitative analysis.

    Science.gov (United States)

    Liscio, Andrea

    2013-04-15

    A simple, fast and general approach for the quantitative analysis of scanning probe microscopy (SPM) images is reported. As a proof of concept, it is used to determine with a high degree of precision the value of observables such as (1) the height, (2) the flowing current and (3) the corresponding surface potential (SP) of flat nanostructures such as gold electrodes, organic semiconductor architectures and graphenic sheets. Although histogram analysis, or frequency count (Fc), is the most common mathematical tool used to analyse SPM images, a rigorous analytical treatment of it has been lacking. By using the mathematical relationship between the Fc and the collected data, the proposed method allows quantitative information to be gained on observable values close to the noise level. For instance, the thickness of nanostructures deposited on very rough substrates can be quantified, making it possible to distinguish the contribution of an adsorbed nanostructure from that of the underlying substrate. Being non-numerical, this versatile analytical approach is a useful and general tool for quantitative analysis of the Fc that enables all signals acquired and recorded by an SPM data array to be studied with high precision.
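
    The Fc idea lends itself to a compact numerical illustration. In this invented example (a synthetic two-level height image, and a numerical shortcut rather than the paper's analytical treatment), the thickness of a flat nanostructure is read off as the separation between the two peaks of the height histogram:

```python
# Recover a nanostructure step height from the frequency count (histogram)
# of a synthetic SPM height image; all values are invented.
import numpy as np

rng = np.random.default_rng(0)
substrate = rng.normal(0.0, 0.3, 20000)   # noisy substrate level, nm
feature = rng.normal(2.1, 0.3, 5000)      # noisy nanostructure level, nm
image = np.concatenate([substrate, feature])

counts, edges = np.histogram(image, bins=200)
centers = 0.5 * (edges[:-1] + edges[1:])

mid = centers.size // 2                    # split the range between the levels
h_sub = centers[:mid][counts[:mid].argmax()]
h_feat = centers[mid:][counts[mid:].argmax()]
print(f"estimated step height: {h_feat - h_sub:.2f} nm")   # close to 2.1 nm
```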

  10. SHAPA: An interactive software tool for protocol analysis applied to aircrew communications and workload

    Science.gov (United States)

    James, Jeffrey M.; Sanderson, Penelope M.; Seidler, Karen S.

    1990-01-01

    As modern transport environments become increasingly complex, issues such as crew communication, interaction with automation, and workload management have become crucial. Much research is being focused on holistic aspects of social and cognitive behavior, such as the strategies used to handle workload, the flow of information, the scheduling of tasks, and the verbal and non-verbal interactions between crew members. Traditional laboratory performance measures no longer sufficiently meet the needs of researchers addressing these issues. Observational techniques, however, are better equipped to capture the type of data needed and to build models of the requisite level of sophistication. Presented here is SHAPA, an interactive software tool for performing both verbal and non-verbal protocol analysis. It has been developed with the idea of affording researchers the closest possible degree of engagement with protocol data. The researcher can configure SHAPA to encode protocols using any desired theoretical framework or encoding vocabulary. SHAPA allows protocol analysis to be performed at any level of analysis, and it supplies a wide variety of tools for data aggregation and manipulation. The output generated by SHAPA can be used alone or in combination with other performance variables to build a rich picture of the influences on sequences of verbal or non-verbal behavior.

  11. Sustainable development in the building industry: an analysis and assessment tool for design of disassembly

    Science.gov (United States)

    Graubner, Carl-Alexander; Reiche, Katja

    2001-02-01

    Ecologically Sustainable Development (ESD) has been embraced by governments worldwide, and as building plays a key role in development, it is implicated in this movement. Consideration of the whole life cycle of a building is a major aspect when assessing its sustainability. While the reduction of operating energy and the optimization of building material selection have been a main focus of research in Europe, the consideration of maintenance during operation and of the demolition of a building at the end of its life has usually been neglected. Aiming for sustainability, the conservation of materials and energy must be realized by applying a closed-system approach on a long-term time scale. Building materials are therefore to be recycled, building elements are to be reused, and buildings are to be made more flexible. Designing to facilitate the disassembly of building elements is expected to improve the sustainability of buildings. A tool for the assessment of building elements has been developed that focuses on connection selection, its influence on material and energy flows, and the quality of building waste materials. The assessment of material production and erection processes using Life Cycle Assessment is complemented by a qualitative/quantitative classification of demolition processes and disposal scenarios, considering environmental, economic and technical aspects. An analysis of floor elements has confirmed that Design for Disassembly is very promising for improving the sustainability of buildings, but that the improvement potential can differ considerably. Details of the analysis tool developed and an analysis of building elements are presented in this article.

  12. FLOW TESTING AND ANALYSIS OF THE FSP-1 EXPERIMENT

    Energy Technology Data Exchange (ETDEWEB)

    Hawkes, Grant L.; Jones, Warren F.; Marcum, Wade; Weiss, Aaron; Howard, Trevor

    2017-06-01

    The U.S. High Performance Research Reactor Conversions fuel development team is focused on developing and qualifying the uranium-molybdenum (U-Mo) alloy monolithic fuel to support conversion of domestic research reactors to low-enriched uranium. Several previous irradiations have demonstrated the favorable behavior of the monolithic fuel. The Full Scale Plate 1 (FSP-1) fuel plate experiment will be irradiated in the northeast (NE) flux trap of the Advanced Test Reactor (ATR). This fueled experiment contains six aluminum-clad fuel plates consisting of monolithic U-Mo fuel meat. Flow testing and hydraulic analysis have been performed on the FSP-1 experiment to be irradiated in the ATR at the Idaho National Laboratory (INL). A flow test mockup of the FSP-1 experiment was completed at Oregon State University. Results of several flow test experiments are compared with analyses; this paper shows that the hydraulic analyses are nearly identical to the flow test results. A water velocity of 14.0 meters per second between the fuel plates is targeted, and comparisons between the FSP-1 measurements and this target are discussed. This flow rate dominates the flow characteristics of the experiment and model; separate branch flows have minimal effect on the overall experiment. A square flow orifice was installed to control the flow rate through the experiment. Four different orifices were tested, and a flow-versus-delta-P curve for each orifice is reported herein. Fuel plates with depleted uranium in the fuel meat zone were used in one of the flow tests, which was performed to evaluate vibration during flow testing with actual fuel meat densities. Fuel plate deformation tests were also performed and are reported.
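
    The flow-versus-delta-P curves mentioned above follow the standard sharp-edged orifice relation Q = Cd·A·sqrt(2·ΔP/ρ). The sketch below inverts it for ΔP; the discharge coefficient and orifice size are illustrative assumptions, not FSP-1 values.

```python
# Pressure drop across a square orifice from the standard orifice equation
# (illustrative constants; not the tested FSP-1 orifice geometry).
import math

rho = 998.0    # water density, kg/m^3
cd = 0.62      # typical sharp-edged orifice discharge coefficient (assumed)
side = 0.02    # 20 mm square orifice (assumed)
area = side * side

def delta_p(q):
    """Pressure drop in Pa for volumetric flow q in m^3/s."""
    return rho / 2.0 * (q / (cd * area)) ** 2

for q in (0.002, 0.004, 0.006):
    print(f"Q = {q:.3f} m^3/s -> dP = {delta_p(q) / 1000:.1f} kPa")
```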

  13. Three generations of Flow Injection Analysis

    DEFF Research Database (Denmark)

    Hansen, Elo Harald; Chomchoei, Roongrat; Long, Xiangbao

    2004-01-01

    Since the introduction of FIA, a series of further developments have resulted in Sequential Injection Analysis (SIA) and, most recently, the Lab-on-Valve (LOV) methodology. These are described here, and examples are given of how the methods can be used to determine trace concentrations of metals in complex...

  14. Flow Field Analysis of Submerged Horizontal Plate Type Breakwater

    Institute of Scientific and Technical Information of China (English)

    张志强; 栾茂田; 王科

    2013-01-01

    A submerged horizontal plate can be considered a new type of breakwater. In order to reveal the wave-elimination mechanism of this type of breakwater, the boundary element method is used to investigate the velocity field around the plate in detail. The flow field analysis shows that the interaction between the incident wave and the reverse flow caused by the submerged plate leads to the formation of a wave-elimination area on both sides of the plate. The velocity magnitude of the flow field is reduced, and this is the main reason for the wave elimination.

  15. Automated High-Dimensional Flow Cytometric Data Analysis

    Science.gov (United States)

    Pyne, Saumyadipta; Hu, Xinli; Wang, Kui; Rossin, Elizabeth; Lin, Tsung-I.; Maier, Lisa; Baecher-Allan, Clare; McLachlan, Geoffrey; Tamayo, Pablo; Hafler, David; de Jager, Philip; Mesirov, Jill

    Flow cytometry is widely used for single-cell interrogation of surface and intracellular protein expression by measuring the fluorescence intensity of fluorophore-conjugated reagents. We focus on the recently developed procedure of Pyne et al. (2009, Proceedings of the National Academy of Sciences USA 106, 8519-8524) for automated high-dimensional flow cytometric analysis called FLAME (FLow analysis with Automated Multivariate Estimation). It introduced novel finite mixture models of heavy-tailed and asymmetric distributions to identify and model cell populations in a flow cytometric sample. This approach robustly addresses the complexities of flow data without the need for transformation or projection to lower dimensions. It also addresses the critical task of matching cell populations across samples, which enables downstream analysis. It thus facilitates the application of flow cytometry to new biological and clinical problems. To facilitate pipelining with standard bioinformatic applications such as high-dimensional visualization, subject classification or outcome prediction, FLAME has been incorporated into the GenePattern package of the Broad Institute. Thereby, analysis of flow data can be approached similarly to other genomic platforms. We also consider some new work that proposes a rigorous and robust solution to the registration problem by a multi-level approach that allows us to model and register cell populations simultaneously across a cohort of high-dimensional flow samples. This new approach is called JCM (Joint Clustering and Matching). It enables direct and rigorous comparisons across different time points or phenotypes in a complex biological study, as well as classification of new patient samples in a more clinical setting.
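
    FLAME's defining ingredient is its skew, heavy-tailed mixture components, which ordinary Gaussian mixtures lack. Still, a two-component Gaussian mixture (scikit-learn, invented two-channel data) conveys the basic idea of identifying cell populations as mixture components:

```python
# Simplified stand-in for mixture-model gating: fit a Gaussian mixture to
# synthetic two-channel events and read populations off the components.
# (FLAME itself fits skew/heavy-tailed mixtures, not Gaussians.)
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
pop_a = rng.normal([2.0, 5.0], 0.4, size=(3000, 2))   # e.g. two marker channels
pop_b = rng.normal([6.0, 1.5], 0.6, size=(1500, 2))
events = np.vstack([pop_a, pop_b])

gmm = GaussianMixture(n_components=2, random_state=0).fit(events)
labels = gmm.predict(events)
print(gmm.means_.round(2))        # recovered population centers
print(np.bincount(labels))        # events per population
```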

  16. Generalized Analysis Tools for Multi-Spacecraft Missions

    Science.gov (United States)

    Chanteur, G. M.

    2011-12-01

    Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 1990s to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data" published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly, Eds., 1998). On the one hand, the approach using methods of least squares has the advantage of applying to any number of spacecraft [1], but it is not convenient for analytical computation, especially when considering the error analysis. On the other hand, the barycentric approach is powerful, as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2], but it appears limited to clusters of four spacecraft. Moreover, the barycentric approach allows one to derive theoretical formulas for the errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al. [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be enabled. The weights given to the spacecraft make it possible to minimize the influence of a spacecraft whose location or data quality is not appropriate, or simply to extract subsets of spacecraft from the cluster. The estimators presented in [2] are generalized within this new frame, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 307-322, ISSI
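
    For the classical four-spacecraft case, the reciprocal vectors and the linear gradient estimator built from them are easy to verify numerically. The sketch below follows the standard tetrahedron formulas from the ISSI volume cited above; the weighted generalization proposed in the abstract is not reproduced here.

```python
# Reciprocal vectors of a spacecraft tetrahedron and the gradient estimator
# grad(f) ~ sum_a k_a * f_a, which is exact for linear fields.
import numpy as np

def reciprocal_vectors(r):
    """r: (4, 3) spacecraft positions; returns k: (4, 3) reciprocal vectors."""
    k = np.zeros((4, 3))
    for a in range(4):
        b, c, d = (i for i in range(4) if i != a)
        cross = np.cross(r[c] - r[b], r[d] - r[b])
        k[a] = cross / np.dot(r[a] - r[b], cross)
    return k

rng = np.random.default_rng(2)
r = rng.normal(size=(4, 3))            # a generically non-coplanar tetrahedron
g_true = np.array([1.0, -2.0, 0.5])    # gradient of a linear scalar field
f = r @ g_true                         # field sampled at the four spacecraft

k = reciprocal_vectors(r)
print(np.allclose(k.sum(axis=0), 0.0))                      # sum of k_a is zero
print(np.allclose((k * f[:, None]).sum(axis=0), g_true))    # gradient recovered
```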

  17. Designing an Exploratory Text Analysis Tool for Humanities and Social Sciences Research

    Science.gov (United States)

    Shrikumar, Aditi

    2013-01-01

    This dissertation presents a new tool for exploratory text analysis that attempts to improve the experience of navigating and exploring text and its metadata. The design of the tool was motivated by the unmet need for text analysis tools in the humanities and social sciences. In these fields, it is common for scholars to have hundreds or thousands…

  18. Bioanalyzer: An Efficient Tool for Sequence Retrieval, Analysis and Manipulation

    Directory of Open Access Journals (Sweden)

    Hassan Tariq

    2010-12-01

    Full Text Available Bioanalyzer provides a combination of tools that have not previously been assembled together. The software offers a set of tools that can be important for different researchers. The aim of developing this kind of software is to provide a unique set of tools on one platform, in a more efficient and better way than the software or web tools currently available. It is a stand-alone application, so it can save the time and effort needed to locate individual tools on the net. Its flexible design makes it easy to expand in the future. We will make it publicly available soon.

  19. Fatigue in cold-forging dies: Tool life analysis

    DEFF Research Database (Denmark)

    Skov-Hansen, P.; Bay, Niels; Grønbæk, J.

    1999-01-01

    In the present investigation it is shown how the tool life of heavily loaded cold-forging dies can be predicted. Low-cycle fatigue and fatigue crack growth testing of the tool materials are used in combination with finite element modelling to obtain predictions of tool lives. In the models... the number of forming cycles is calculated first to crack initiation and then during crack growth to fatal failure. An investigation of a critical die insert in an industrial cold-forging tool as regards the influence of notch radius, the amount and method of pre-stressing and the selected tool material...
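
    The prediction scheme sketched in the abstract splits tool life into an initiation phase and a growth phase. As a hedged numerical illustration (all constants invented, not the investigated die insert), the growth phase follows from integrating the Paris law da/dN = C·ΔK^m with ΔK = Y·Δσ·sqrt(π·a):

```python
# Toy tool-life estimate: initiation cycles plus Paris-law crack growth.
import math

C, m = 1.0e-11, 3.0       # Paris constants (hypothetical; dK in MPa*sqrt(m))
Y = 1.12                  # geometry factor (assumed)
d_sigma = 400.0           # stress range at the die insert, MPa
a0, ac = 0.1e-3, 2.0e-3   # initial and critical crack lengths, m
n_init = 5.0e4            # cycles to crack initiation (e.g. from LCF tests)

def growth_cycles(a_start, a_end, steps=100_000):
    """Numerically integrate dN = da / (C * dK**m) from a_start to a_end."""
    da = (a_end - a_start) / steps
    n, a = 0.0, a_start
    for _ in range(steps):
        dk = Y * d_sigma * math.sqrt(math.pi * a)
        n += da / (C * dk ** m)
        a += da
    return n

print(f"predicted life: {n_init + growth_cycles(a0, ac):.3g} cycles")
```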

  20. Trade-Space Analysis Tool for Constellations (TAT-C)

    Science.gov (United States)

    Le Moigne, Jacqueline; Dabney, Philip; de Weck, Olivier; Foreman, Veronica; Grogan, Paul; Holland, Matthew; Hughes, Steven; Nag, Sreeja

    2016-01-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches, and hosted payloads, there has been growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a priori science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: How many spacecraft should be included in the constellation? Which design has the best cost-risk value? The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the variables trade space for pre-defined science, cost, and risk goals and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms rather than one at a time. This paper describes the overall architecture of TAT-C, including: a User Interface (UI) interacting with multiple users (scientists, mission designers, or program managers); and an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator, which works first with inputs from the Knowledge Base and then, in collaboration with the Orbit Coverage, Reduction Metrics, and Cost Risk modules, generates multiple potential architectures and their associated characteristics. TAT-C leverages the General Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. The current version of TAT-C includes uniform Walker constellations as well as ad hoc constellations, and its cost model represents an aggregate model consisting of