WorldWideScience

Sample records for flow analysis tool

  1. User-friendly Tool for Power Flow Analysis and Distributed ...

    African Journals Online (AJOL)

    Akorede

    AKOREDE et al: TOOL FOR POWER FLOW ANALYSIS AND DISTRIBUTED GENERATION OPTIMISATION. 23 ... greenhouse gas emissions and the current deregulation of electric energy ..... Visual composition and temporal behaviour of GUI.

  2. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology, but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
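    The PEST calibration step described above can be sketched in miniature: minimize the squared differences between simulated and measured flows, plus a regularization term that limits conductivity variance within a lithology. The toy forward model, zone thicknesses, and the crude coordinate-descent search below are illustrative assumptions only, not AnalyzeHOLE's or PEST's actual code.

```python
# Hypothetical sketch of a PEST-style calibration objective: squared misfit
# between simulated and measured borehole flows plus Tikhonov-like
# regularisation toward a uniform hydraulic conductivity. Illustrative only.

def simulate_flows(k_by_zone):
    """Toy forward model: cumulative borehole flow proportional to the
    transmissivity (K * thickness) of each screened zone below."""
    thickness = [10.0, 5.0, 10.0]               # m, one entry per zone (assumed)
    q, total = [], 0.0
    for k, b in zip(k_by_zone, thickness):
        total += k * b                          # add this zone's contribution
        q.append(total)
    return q

def objective(k_by_zone, q_obs, lam=0.1):
    """Sum of squared flow residuals + penalty on K variance."""
    q_sim = simulate_flows(k_by_zone)
    misfit = sum((s - o) ** 2 for s, o in zip(q_sim, q_obs))
    k_mean = sum(k_by_zone) / len(k_by_zone)
    penalty = lam * sum((k - k_mean) ** 2 for k in k_by_zone)
    return misfit + penalty

def calibrate(q_obs, n_zones=3, iters=200, step=0.05):
    """Crude coordinate-descent stand-in for PEST's gradient search."""
    k = [1.0] * n_zones
    for _ in range(iters):
        for i in range(n_zones):
            for delta in (step, -step):
                trial = k[:]
                trial[i] = max(1e-6, trial[i] + delta)
                if objective(trial, q_obs) < objective(k, q_obs):
                    k = trial
    return k
```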

  3. Flow injection analysis: Emerging tool for laboratory automation in radiochemistry

    International Nuclear Information System (INIS)

    Egorov, O.; Ruzicka, J.; Grate, J.W.; Janata, J.

    1996-01-01

    Automation of routine and serial assays is a common practice in the modern analytical laboratory, while it is virtually nonexistent in the field of radiochemistry. Flow injection analysis (FIA) is a general solution-handling methodology that has been extensively used for automation of routine assays in many areas of analytical chemistry. Reproducible automated solution handling and on-line separation capabilities are among several distinctive features that make FI a very promising, yet underutilized, tool for automation in analytical radiochemistry. The potential of the technique is demonstrated through the development of an automated ⁹⁰Sr analyzer and its application in the analysis of tank waste samples from the Hanford site. Sequential injection (SI), the latest generation of FIA, is used to rapidly separate ⁹⁰Sr from interfering radionuclides and deliver the separated Sr zone to a flow-through liquid scintillation detector. The separation is performed on a minicolumn containing Sr-specific sorbent extraction material, which selectively retains Sr under acidic conditions. The ⁹⁰Sr is eluted with water, mixed with scintillation cocktail, and sent through the flow cell of a flow-through counter, where ⁹⁰Sr radioactivity is detected as a transient signal. Both peak area and peak height can be used for quantification of sample radioactivity. Alternatively, stopped-flow detection can be performed to improve detection precision for low-activity samples. The authors' current research activities are focused on expansion of radiochemical applications of FIA methodology, with an ultimate goal of creating a set of automated methods that will cover the basic needs of radiochemical analysis at the Hanford site. The results of preliminary experiments indicate that FIA is a highly suitable technique for the automation of chemically more challenging separations, such as separation of actinide elements
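    The peak-area and peak-height quantification mentioned above is straightforward to sketch: integrate the baseline-corrected transient signal. This is a generic illustration of the idea, not the analyzer's actual software; the sample signal and baseline handling are assumptions.

```python
# Minimal sketch of quantifying a transient detector signal, as in FIA:
# peak height and trapezoidal peak area above a baseline, from evenly
# sampled readings. Illustrative, not the published instrument code.

def peak_height(signal, baseline=0.0):
    """Maximum signal excursion above the baseline."""
    return max(signal) - baseline

def peak_area(signal, dt=1.0, baseline=0.0):
    """Trapezoidal integration of the baseline-corrected transient."""
    s = [max(0.0, x - baseline) for x in signal]
    return sum((s[i] + s[i + 1]) * dt / 2.0 for i in range(len(s) - 1))
```

    For a triangular transient sampled at unit intervals, both quantities are easy to verify by hand, which is why either can serve for calibration against standards of known activity.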

  4. The entropy concept. A powerful tool for multiphase flow analysis

    International Nuclear Information System (INIS)

    Kolev, Nikolay Ivanov

    2007-01-01

    This work summarizes the system of partial differential equations describing multiphase, multi-component flows in arbitrary geometry, including porous structures, with arbitrary thermal and mechanical interactions among the fields and between each field and the structure. Each of the fluids is treated as a universal mixture of miscible and immiscible components. The system contains the rigorously derived entropy equations, which are used instead of the primitive form of energy conservation. Based on well-established mathematical theorems, the equations are local volume- and time-averaged. The so-called volume conservation equation, which establishes close coupling between the pressure and density changes of all participating velocity fields, is presented; it replaces one of the mass conservation equations. The system is solved within the computer code system IVA, together with a large number of constitutive relationships for closing it in arbitrary geometry. The extensive validation on many hundreds of simple and complex experiments, including many industrial applications, demonstrates the versatility and power of this analytical tool for designing complex processes in industry and analyzing complex processes in nature. (author)

  5. Direct implementation of an axial-flow helium gas turbine tool in a system analysis tool for HTGRs

    International Nuclear Information System (INIS)

    Kim, Ji Hwan; No, Hee Cheon; Kim, Hyeun Min; Lim, Hong Sik

    2008-01-01

    This study concerns the development of dynamic models for a high-temperature gas-cooled reactor (HTGR) through direct implementation of a gas turbine analysis code with a transient analysis code. We have developed a streamline curvature analysis code based on the Newton-Raphson numerical application (SANA) to analyze the off-design performance of helium gas turbines under conditions of normal operation. The SANA code performs a detailed two-dimensional analysis by means of throughflow calculation with allowances for losses in axial-flow multistage compressors and turbines. To evaluate the performance in the steady-state and load transient of HTGRs, we developed GAMMA-T by implementing SANA in the transient system code, GAMMA, which is a multidimensional, multicomponent analysis tool for HTGRs. The reactor, heat exchangers, and connecting pipes were designed with a one-dimensional thermal-hydraulic model that uses the GAMMA code. We assessed GAMMA-T by comparing its results with the steady-state results of the GTHTR300 of JAEA. We concluded that the results are in good agreement, including the results of the vessel cooling bypass flow and the turbine cooling flow
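    The Newton-Raphson iteration at the core of a streamline-curvature solver such as SANA can be sketched for a scalar unknown; the real code solves a coupled system across stations, but the update rule is the same. The finite-difference derivative and the worked equation below are illustrative assumptions, not SANA's implementation.

```python
# Generic Newton-Raphson driver: find x such that residual(x) = 0, using a
# finite-difference approximation to the derivative (a 1x1 "Jacobian").
# Illustrative sketch only; SANA applies this to throughflow residuals.

def newton_raphson(residual, x0, h=1e-6, tol=1e-10, max_iter=50):
    x = x0
    for _ in range(max_iter):
        f = residual(x)
        if abs(f) < tol:
            break                               # residual driven to zero
        dfdx = (residual(x + h) - f) / h        # numerical derivative
        x -= f / dfdx                           # Newton update
    return x
```

    As a usage example, solving x² − 2 = 0 from x₀ = 1 converges to √2 in a handful of iterations, which is the quadratic convergence that makes the method attractive for off-design performance calculations.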

  6. User-friendly Tool for Power Flow Analysis and Distributed Generation Optimisation in Radial Distribution Networks

    Directory of Open Access Journals (Sweden)

    M. F. Akorede

    2017-06-01

    The intent of power distribution companies (DISCOs) is to deliver electric power to their customers in an efficient and reliable manner, with minimal energy loss cost. One major way to minimise power loss on a given power system is to install distributed generation (DG) units on the distribution networks. However, to maximise benefits, it is highly crucial for a DISCO to ensure that these DG units are of optimal size and sited in the best locations on the network. This paper gives an overview of a software package developed in this study, called Power System Analysis and DG Optimisation Tool (PFADOT). The main purpose of the graphical user interface-based package is to guide a DISCO in finding the optimal size and location for DG placement in radial distribution networks. The package, which is also suitable for load flow analysis, employs the GUI feature of MATLAB. Three objective functions are formulated into a single optimisation problem and solved with a fuzzy genetic algorithm to simultaneously obtain the DG optimal size and location. The accuracy and reliability of the developed tool were validated using several radial test systems, and the results obtained were evaluated against similar existing packages cited in the literature; the results are impressive and the tool is computationally efficient.
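    The kind of objective such a tool optimises can be sketched with a deliberately simplified model: total I²R loss on a radial feeder, approximated at constant nominal voltage, with a DG unit offsetting local demand. The chain-feeder topology, the data, and the exhaustive search (standing in for the fuzzy genetic algorithm) are all invented for illustration.

```python
# Hedged sketch of a DG-placement objective: total branch I^2*R loss on a
# simple chain feeder, with per-unit power and a flat nominal voltage.
# Branch i feeds node i and every node downstream of it. Illustrative only.

def feeder_loss(branch_r, node_p, v_nom=1.0, dg_node=None, dg_p=0.0):
    """Total loss (p.u.) for a chain feeder; DG offsets demand at its node."""
    inj = list(node_p)
    if dg_node is not None:
        inj[dg_node] -= dg_p                    # DG supplies local load
    loss = 0.0
    for i in range(len(branch_r)):
        p_down = sum(inj[i:])                   # power carried by branch i
        loss += branch_r[i] * (p_down / v_nom) ** 2
    return loss

def best_dg_site(branch_r, node_p, dg_p):
    """Exhaustive search over nodes: a toy stand-in for the fuzzy GA."""
    return min(range(len(node_p)),
               key=lambda k: feeder_loss(branch_r, node_p, dg_node=k, dg_p=dg_p))
```

    On a three-node feeder with uniform loads, placing the DG at the far end relieves every upstream branch, which is why end-of-feeder siting often wins in such toy cases.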

  7. Python tools for rapid development, calibration, and analysis of generalized groundwater-flow models

    Science.gov (United States)

    Starn, J. J.; Belitz, K.

    2014-12-01

    National-scale water-quality data sets for the United States have been available for several decades; however, groundwater models to interpret these data are available for only a small percentage of the country. Generalized models may be adequate to explain and project groundwater-quality trends at the national scale by using regional-scale models (defined as watersheds at or between the HUC-6 and HUC-8 levels). Coast-to-coast data such as the National Hydrography Dataset Plus (NHD+) make it possible to extract the basic building blocks for a model anywhere in the country. IPython notebooks have been developed to automate the creation of generalized groundwater-flow models from the NHD+. The notebook format allows rapid testing of methods for model creation, calibration, and analysis. Capabilities within the Python ecosystem greatly speed up the development and testing of algorithms. GeoPandas is used for very efficient geospatial processing. Raster processing includes the Geospatial Data Abstraction Library and image-processing tools. Model creation is made possible through Flopy, a versatile input and output writer for several MODFLOW-based flow and transport model codes. Interpolation, integration, and map plotting included in the standard Python tool stack also are used, making the notebook a comprehensive platform within which to build and evaluate general models. Models with alternative boundary conditions, number of layers, and cell spacing can be tested against one another and evaluated by using water-quality data. Novel calibration criteria were developed by comparing modeled heads to land-surface and surface-water elevations. Information, such as predicted age distributions, can be extracted from general models and tested for its ability to explain water-quality trends. Groundwater ages then can be correlated with horizontal and vertical hydrologic position, a relation that can be used for statistical assessment of likely groundwater-quality conditions
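    The calibration criteria mentioned above compare modeled heads with land-surface and surface-water elevations. One plausible form of such a criterion is sketched below: penalise cells where the simulated water table rises above the land surface or drops below the elevation of the draining stream. The specific scoring rule is an assumption for illustration, not the authors' published criterion.

```python
# Hedged sketch of a head-plausibility criterion: zero when every modeled
# head lies between the stream elevation and the land surface, with a
# squared penalty for each violation. Illustrative assumption only.

def head_criterion(heads, land_surface, stream_elev):
    """Return a misfit score over cells; 0 means all heads are plausible."""
    score = 0.0
    for h, ls, se in zip(heads, land_surface, stream_elev):
        if h > ls:                  # water table above ground: implausible
            score += (h - ls) ** 2
        elif h < se:                # head below nearby stream: implausible
            score += (se - h) ** 2
    return score
```

    A criterion of this shape needs no head observations at all, which is what makes it usable for generalized models built anywhere in the country from national data sets.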

  8. A spreadsheet tool for the analysis of flows in small-scale water piping networks

    CSIR Research Space (South Africa)

    Adedeji, KB

    2017-07-01

    …and the hybrid method, to mention but a few, to solve a system of partly linear and partly non-linear hydraulic equations. In this paper, the authors demonstrate the use of Excel solver to verify the Hardy Cross method for the analysis of flow in water piping...
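    The Hardy Cross scheme that such a spreadsheet iterates can be sketched directly: for each loop, compute the head losses h = rQ|Q|ⁿ⁻¹ around the loop and apply the correction ΔQ = −Σh / (n Σ|h/Q|) until the losses balance. The single-loop, two-pipe example below (resistances and the initial flow split are invented) uses n = 2.

```python
# Sketch of a single-loop Hardy Cross iteration for pipe-network flow.
# Flows are signed by loop traversal direction; h = r*Q*|Q|^(n-1).
# Illustrative data; not the paper's spreadsheet.

def hardy_cross(flows, r, n=2.0, tol=1e-9, max_iter=100):
    flows = list(flows)
    for _ in range(max_iter):
        h = [ri * q * abs(q) ** (n - 1) for ri, q in zip(r, flows)]   # head losses
        dh = [n * ri * abs(q) ** (n - 1) for ri, q in zip(r, flows)]  # dh/dQ terms
        dq = -sum(h) / sum(dh)              # loop correction
        flows = [q + dq for q in flows]     # same correction for every loop member
        if abs(dq) < tol:
            break
    return flows
```

    For two parallel pipes with r₁ = 1 and r₂ = 4 sharing a total inflow of 10, the balanced split is 20/3 and 10/3, since equal head loss requires Q₁ = 2Q₂; note that applying one ΔQ to every loop member preserves continuity at the nodes.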

  9. Using Cognitive Work Analysis to fit decision support tools to nurse managers' work flow.

    Science.gov (United States)

    Effken, Judith A; Brewer, Barbara B; Logue, Melanie D; Gephart, Sheila M; Verran, Joyce A

    2011-10-01

    To better understand the environmental constraints on nurse managers that impact their need for and use of decision support tools, we conducted a Cognitive Work Analysis (CWA). A complete CWA includes system analyses at five levels: work domain, decision-making procedures, decision-making strategies, social organization/collaboration, and worker skill level. Here we describe the results of the Work Domain Analysis (WDA) portion in detail, then integrate the WDA with other portions of the CWA, reported previously, to generate a more complete picture of the nurse manager's work domain. Data for the WDA were obtained from semi-structured interviews with nurse managers, division directors, CNOs, and other managers (n = 20) on 10 patient care units in three Arizona hospitals. The WDA described the nurse manager's environment in terms of the constraints it imposes on the nurse manager's ability to achieve targeted outcomes through organizational goals and priorities, functions, and processes, as well as work objects and resources (e.g., people, equipment, technology, and data). Constraints were identified and summarized through qualitative thematic analysis. The results highlight the competing priorities and the external and internal constraints that today's nurse managers must satisfy as they try to improve quality and safety outcomes on their units. Nurse managers receive a great deal of data, much of it in electronic format. Although dashboards were perceived as helpful because they integrated some data elements, no decision support tools were available to help nurse managers with planning or answering "what if" questions. The results suggest both the need for additional decision support to manage the growing complexity of the environment, and the constraints the environment places on the design of that technology if it is to be effective. Limitations of the study include the small homogeneous sample and the reliance on interview data targeting safety and quality.

  10. Analysis and prediction of flow from local source in a river basin using a Neuro-fuzzy modeling tool.

    Science.gov (United States)

    Aqil, Muhammad; Kita, Ichiro; Yano, Akira; Nishiyama, Soichi

    2007-10-01

    Traditionally, the multiple linear regression technique has been one of the most widely used models in simulating hydrological time series. However, when nonlinear phenomena are significant, multiple linear regression fails to develop an appropriate predictive model. Recently, neuro-fuzzy systems have gained much popularity for calibrating nonlinear relationships. This study evaluated the potential of a neuro-fuzzy system as an alternative to the traditional statistical regression technique for the purpose of predicting flow from a local source in a river basin. The effectiveness of the proposed identification technique was demonstrated through a simulation study of the river flow time series of the Citarum River in Indonesia. Furthermore, in order to quantify the uncertainty associated with the estimation of river flow, a Monte Carlo simulation was performed. As a comparison, a multiple linear regression analysis that was being used by the Citarum River Authority was also examined using various statistical indices. The simulation results using 95% confidence intervals indicated that the neuro-fuzzy model consistently underestimated the magnitude of high flows, while the low and medium flow magnitudes were estimated closer to the observed data. The comparison of the prediction accuracy of the neuro-fuzzy and linear regression methods indicated that the neuro-fuzzy approach was more accurate in predicting river flow dynamics. The neuro-fuzzy model was able to improve the root mean square error (RMSE) and mean absolute percentage error (MAPE) values of the multiple linear regression forecasts by about 13.52% and 10.73%, respectively. Considering its simplicity and efficiency, the neuro-fuzzy model is recommended as an alternative tool for modeling of flow dynamics in the study area.
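    The RMSE and MAPE statistics used to compare the two models follow their standard definitions and are simple to compute; the two-point sample data below are invented purely to exercise the functions.

```python
import math

# Standard error metrics used to compare forecasts with observations:
# root mean square error and mean absolute percentage error.

def rmse(obs, sim):
    """Root mean square error of simulated vs. observed values."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def mape(obs, sim):
    """Mean absolute percentage error, in percent (obs must be nonzero)."""
    return 100.0 / len(obs) * sum(abs((o - s) / o) for o, s in zip(obs, sim))
```

    Note that MAPE weights errors relative to the observed magnitude, which is why a model that underestimates rare high flows can still score well when low and medium flows dominate the record.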

  11. S-ProvFlow: provenance model and tools for scalable and adaptive analysis pipelines in geoscience.

    Science.gov (United States)

    Spinuso, A.; Mihajlovski, A.; Atkinson, M.; Filgueira, R.; Klampanos, I.; Sanchez, S.

    2017-12-01

    The reproducibility of scientific findings is essential to improve the quality and application of modern data-driven research. Delivering such reproducibility is challenging in the context of systems handling large data-streams with sophisticated computational methods. Similarly, the SKA (Square Kilometer Array) will collect an unprecedented volume of radio-wave signals that will have to be reduced and transformed into derived products, with impact on space-weather research. This highlights the importance of having cross-discipline mechanisms at the producer's side that rely on usable lineage data to support validation and traceability of the new artifacts. To be informative, provenance has to describe each method's abstractions and their implementation as mappings onto distributed platforms and their concurrent execution, capturing relevant internal dependencies at runtime. Producers and intelligent toolsets should be able to exploit the produced provenance, steering real-time monitoring activities and inferring adaptations of methods at runtime. We present a model of provenance (S-PROV) that extends W3C PROV and ProvONE, broadening coverage of provenance to aspects related to distribution, scale-up, and steering of stateful streaming operators in analytic pipelines. This is supported by a technical framework for tuneable and actionable lineage, ensuring its relevance to the users' interests and fostering its rapid exploitation to facilitate research practices. By applying concepts such as provenance typing and profiling, users define rules to capture common provenance patterns and activate selective controls based on domain metadata. The traces are recorded in a document store with index optimisation, and a web API serves advanced interactive tools (S-ProvFlow, https://github.com/KNMI/s-provenance). These allow different classes of consumers to rapidly explore the provenance data. The system, which contributes to the SKA-Link initiative, within technology and

  12. User-friendly tool for power flow analysis and distributed generation ...

    African Journals Online (AJOL)

    The intent of power distribution companies (DISCOs) is to deliver electric power to their ... One major way to minimise power loss on a given power system is to install ... The accuracy and reliability of the developed tool was validated using ...

  13. Spatial Harmonic Decomposition as a tool for unsteady flow phenomena analysis

    International Nuclear Information System (INIS)

    Duparchy, A; Guillozet, J; De Colombel, T; Bornard, L

    2014-01-01

    Hydropower is already the largest single renewable electricity source today, but its further development will face new deployment constraints such as large-scale projects in emerging economies and the growth of intermittent renewable energy technologies. The potential role of hydropower as a grid stabilizer leads to operating hydro power plants in 'off-design' zones. As a result, new methods of analyzing the associated unsteady phenomena are needed to improve the design of hydraulic turbines. The key idea of the development is to compute a spatial description of a phenomenon by combining several sensor signals. Spatial harmonic decomposition (SHD) extends the concept of so-called synchronous and asynchronous pulsations by projecting sensor signals onto a linearly independent set of modal functions. This mathematical approach is very generic, as it can be applied to any linear distribution of a scalar quantity defined on a closed curve. After a mathematical description of SHD, this paper discusses the impact of instrumentation and provides tools to understand SHD signals. Then, as an example of a practical application, SHD is applied to a model test measurement in order to capture and describe dynamic pressure fields. In particular, the spatial description of the phenomena provides new tools to separate the part of pressure fluctuations that contributes to output-power instability or mechanical stresses. The study of machine stability in the partial-load operating range in turbine mode, and the comparison between the gap pressure field and radial thrust behavior during turbine-brake operation, are both relevant illustrations of SHD's contribution
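    For the special case of N sensors equally spaced around a closed curve, the projection onto spatial modes reduces to discrete Fourier coefficients over the sensor ring: mode 0 captures the synchronous (uniform) pulsation, higher modes capture rotating or standing patterns. Real SHD handles arbitrary sensor layouts via a least-squares projection; the uniform spacing below is a simplifying assumption for illustration.

```python
import math

# Hedged sketch of spatial harmonic decomposition for N equally spaced
# pressure sensors: simultaneous samples are projected onto cos/sin modes
# around the circumference. Uniform spacing assumed; illustrative only.

def spatial_modes(samples, max_mode):
    """Return {m: (a_m, b_m)} cosine/sine coefficients over the sensor ring."""
    n = len(samples)
    modes = {}
    for m in range(max_mode + 1):
        a = sum(p * math.cos(2 * math.pi * m * j / n) for j, p in enumerate(samples))
        b = sum(p * math.sin(2 * math.pi * m * j / n) for j, p in enumerate(samples))
        scale = 1.0 / n if m == 0 else 2.0 / n   # mode 0 is the mean
        modes[m] = (a * scale, b * scale)
    return modes
```

    With 8 sensors, modes up to 3 are resolvable without aliasing (the Nyquist limit for the ring is N/2), which is one way instrumentation choices constrain what SHD can separate.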

  14. High Performance Flow Analysis and Control Tools for Aerial Vehicles, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the project is to develop an open architecture, computer aided analysis and control design toolbox for distributed parameter systems, in particular,...

  15. Design of a Domain-Specific Language for Material Flow Analysis using Microsoft DSL tools: An Experience Paper

    DEFF Research Database (Denmark)

    Zarrin, Bahram; Baumeister, Hubert

    2014-01-01

    Material Flow Analysis (MFA) is the procedure of measuring and assessing the mass flows of matter (solid waste, water, food...) and substances (carbon, phosphorus...) within a process or a system over a period of time. In this paper we propose a Domain-Specific Language (DSL) to model MFA in a ...

  16. Flow Injection/Sequential Injection Analysis Systems: Potential Use as Tools for Rapid Liver Diseases Biomarker Study

    Directory of Open Access Journals (Sweden)

    Supaporn Kradtap Hartwell

    2012-01-01

    Flow injection/sequential injection analysis (FIA/SIA) systems are suitable for carrying out automatic wet chemical/biochemical reactions with reduced volume and time consumption. Various parts of the system, such as the pump, valve, and reactor, may be built or adapted from available materials. Therefore, such systems can be built at lower cost than other instrumentation-based analysis systems. Their applications for determination of biomarkers for liver diseases have been demonstrated in various formats of operation, but only a few and limited types of biomarkers have been used as model analytes. This paper summarizes these applications for different types of reactions as a guide for using flow-based systems in more biomarker and/or multibiomarker studies.

  17. Flow chemistry vs. flow analysis.

    Science.gov (United States)

    Trojanowicz, Marek

    2016-01-01

    The flow mode of conducting chemical syntheses facilitates chemical processes through the use of on-line analytical monitoring of the occurring reactions, the application of solid-supported reagents to minimize downstream processing, and computerized control systems to perform multi-step sequences. These are exactly the attributes of flow analysis, which has held a solid place in modern analytical chemistry for the last several decades. The following review paper, based on 131 references to original papers as well as pre-selected reviews, presents basic aspects, selected instrumental achievements, and developmental directions of the rapidly growing field of continuous-flow chemical synthesis. Interestingly, many of them might potentially be employed in the development of new methods in flow analysis too. In this paper, examples of the application of flow analytical measurements for on-line monitoring of flow syntheses are indicated, and perspectives for a wider application of real-time analytical measurements are discussed.

  18. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  19. Estimating nutrient releases from agriculture in China: An extended substance flow analysis framework and a modeling tool

    International Nuclear Information System (INIS)

    Chen, M.; Chen, J.; Sun, F.

    2010-01-01

    Agriculture related pollution has attracted the attention of policy makers as well as scientists in China as its contribution to water impairment has increased, and quantitative information at the national and regional levels is being sought to support decision making. However, traditional approaches are either time-consuming, expensive (e.g. national surveys) or oversimplified and crude (e.g. coefficient methods). Therefore, this study proposed an extended substance flow analysis (SFA) framework to estimate nutrient releases from agricultural and rural activities in China by depicting the nutrient flows in Chinese agro-ecosystems. The six-step process proposed herein includes: (a) system definition; (b) model development; (c) database development; (d) model validation; (e) results interpretation; and (f) uncertainty analysis. The developed Eubolism (Elementary Unit based nutrient Balance mOdeLIng in agro-ecoSysteM) model combined a nutrient balance module with an emission inventory module to quantify the nutrient flows in the agro-ecosystem. The model was validated and then applied to estimate the total agricultural nutrient loads, identify the contribution of different agricultural and rural activities and different land use types to the total loads, and analyze the spatial pattern of agricultural nutrient emissions in China. These results could provide an entire picture of agricultural pollution at the national level and be used to support policy making. Furthermore, uncertainties associated with the structure of the elementary units, spatial resolution, and inputs/parameters were also analyzed to evaluate the robustness of the model results.
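    The nutrient-balance module at the heart of such an SFA framework amounts to a mass balance over each elementary unit: what enters, minus what leaves in useful products, minus what accumulates, is the surplus available for release to water and air. The unit, flow names, and numbers below are invented for illustration.

```python
# Toy elementary-unit nutrient balance in the spirit of substance flow
# analysis: release = inputs - useful outputs - accumulation.
# All flows hypothetical, in kt N/yr; not the Eubolism model itself.

def unit_release(inputs, useful_outputs, accumulation=0.0):
    """Mass balance for one elementary unit of the agro-ecosystem."""
    return sum(inputs.values()) - sum(useful_outputs.values()) - accumulation

# Example unit: cropland receiving fertiliser, manure, and deposition,
# exporting nutrients in the harvest, with some soil accumulation.
cropland = unit_release(
    inputs={"fertiliser": 30.0, "manure": 8.0, "deposition": 2.0},
    useful_outputs={"harvest": 25.0},
    accumulation=5.0,
)
```

    Chaining such units (livestock, households, cropland, surface waters) and summing their releases is what turns a set of simple balances into a national emission inventory.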

  20. Culvert Analysis Program Graphical User Interface 1.0--A preprocessing and postprocessing tool for estimating flow through culverts

    Science.gov (United States)

    Bradley, D. Nathan

    2013-01-01

    The peak discharge of a flood can be estimated from the elevation of high-water marks near the inlet and outlet of a culvert after the flood has occurred. This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks on trees or buildings. When combined with the cross-sectional geometry of the channel upstream from the culvert and the culvert size, shape, roughness, and orientation, the high-water marks define a water-surface profile that can be used to estimate the peak discharge by using the methods described by Bodhaine (1968). This type of measurement is in contrast to a “direct” measurement of discharge made during the flood where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a streamgage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the streamgage, resulting in more accurate computation of high flows. The Culvert Analysis Program (CAP) (Fulford, 1998) is a command-line program written in Fortran for computing peak discharges and culvert rating surfaces or curves. CAP reads input data from a formatted text file and prints results to another formatted text file. Preparing and correctly formatting the input file may be time-consuming and prone to errors. This document describes the CAP graphical user interface (GUI)—a modern, cross-platform, menu-driven application that prepares the CAP input file, executes the program, and helps the user interpret the output
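    Bodhaine (1968) distinguishes several culvert-flow types, each with its own discharge equation selected from the high-water-mark geometry. As a simplified illustration only (not CAP's actual computation), the submerged-inlet case reduces to an orifice-type relation, Q = C·A·√(2gh):

```python
import math

# Orifice-type estimate of culvert discharge from high-water-mark data,
# for the submerged-inlet flow type. Simplified illustration; CAP selects
# among several flow types and equations per Bodhaine (1968).

def orifice_discharge(area_m2, head_m, cd=0.6, g=9.81):
    """Q = Cd * A * sqrt(2 g h), with h the headwater elevation above the
    centre of the culvert opening and Cd a discharge coefficient."""
    return cd * area_m2 * math.sqrt(2.0 * g * head_m)
```

    The discharge coefficient and the head definition here are textbook assumptions; in practice they depend on inlet geometry and roughness, which is exactly the bookkeeping the CAP input file (and now its GUI) exists to manage.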

  1. Physics analysis tools

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools, such as a programming language, to high-level tools, such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, analysis is broken down into five main stages. The categories are also classified as areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages; thus it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display-and-fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed

  2. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  3. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  4. Signal flow analysis

    CERN Document Server

    Abrahams, J R; Hiller, N

    1965-01-01

    Signal Flow Analysis provides information pertinent to the fundamental aspects of signal flow analysis. This book discusses the basic theory of signal flow graphs and shows their relation to the usual algebraic equations. Organized into seven chapters, the book begins with an overview of the properties of a flow graph. The text then demonstrates how flow graphs can be applied to a wide range of electrical circuits that do not involve amplification. Other chapters deal with the parameters as well as circuit applications of transistors. The book also discusses the variety of circuits using ther
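    The relation between a signal flow graph and its algebraic equations can be sketched numerically: each branch gain contributes a term to the node equations x = Gᵀx + b, and solving that linear system reproduces what Mason's gain formula gives in closed form. A minimal sketch for a single negative-feedback loop (gains invented):

```python
import numpy as np

G_amp, H = 10.0, 0.5           # forward gain and feedback fraction (invented)
n = 3                          # nodes: 0 = source, 1 = summing node, 2 = output
gain = np.zeros((n, n))        # gain[i, j] = branch gain from node i to node j
gain[0, 1] = 1.0
gain[1, 2] = G_amp
gain[2, 1] = -H                # negative feedback branch

b = np.zeros(n)
b[0] = 1.0                     # unit input injected at the source node
x = np.linalg.solve(np.eye(n) - gain.T, b)   # node equations x = G^T x + b
transfer = x[2] / x[0]
print(round(transfer, 4))      # 10/(1 + 10*0.5) ≈ 1.6667
```

    The result matches Mason's formula for one forward path and one loop, G/(1 + GH).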

  5. COMPUTER MODELING IN DEFORM-3D FOR ANALYSIS OF PLASTIC FLOW IN HIGH-SPEED HOT EXTRUSION OF BIMETALLIC FORMATIVE PARTS OF DIE TOOLING

    Directory of Open Access Journals (Sweden)

    I. V. Kachanov

    2015-01-01

    The modern development of industrial production is closely connected with the use of science-based and high technologies that ensure the competitiveness of manufactured products on the world market. Energy and resource saving is also a pressing problem, one that can be addressed by introducing new technological processes and creating new materials that increase productivity through automation and improved tool life. Developing and implementing such technologies is often time-consuming, involving complex calculations and experimental investigations. Simulation modelling of materials processing using modern software products serves as an alternative to experimental and theoretical methods of research. The aim of this paper is to compare experimental results obtained while producing bimetallic samples of a forming tool by high-speed hot extrusion with the results of computer simulation using the DEFORM-3D package and the finite element method. Comparative analysis of the plastic flow of real and model samples has shown that the obtained models provide a high-quality, reliable picture of plastic flow during high-speed hot extrusion. Modeling in DEFORM-3D makes it possible to avoid complex calculations and to significantly reduce the number of experimental studies required when developing new technological processes.

  6. Contamination Analysis Tools

    Science.gov (United States)

    Brieda, Lubos

    2015-01-01

    This talk presents three tools developed recently for contamination analysis: (1) an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; (2) a Java RGA extractor, which can load multiple SRS .ana files and extract pressure vs. time data; and (3) a C++ contamination simulation code, a 3D particle-tracing code for modeling the transport of dust particulates and molecules. The simulation code uses residence time to determine whether molecules stick; particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  7. Immunological Tools: Engaging Students in the Use and Analysis of Flow Cytometry and Enzyme-linked Immunosorbent Assay (ELISA)

    Science.gov (United States)

    Ott, Laura E.; Carson, Susan

    2014-01-01

    Flow cytometry and enzyme-linked immunosorbent assay (ELISA) are commonly used techniques associated with clinical and research applications within the immunology and medical fields. The use of these techniques is becoming increasingly valuable in many life science and engineering disciplines as well. Herein, we report the development and…

  8. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool. It has the following features: it uses a hybrid dynamic and steady-state approach to simulating cascading outage sequences that includes fast dynamic and slower steady-state events; it integrates dynamic models with protection scheme models for generation, transmission, and load; and it models special protection systems (SPSs)/remedial action schemes (RASs) as well as automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
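    The steady-state half of such a hybrid cascading simulation can be reduced to a simple loop: solve the power flow, trip everything above its rating, and re-solve until no new trips occur. A toy sketch of that loop only (the real tool drives PSS/E; the two-line system and ratings below are invented):

```python
# Two parallel lines carry a 220-MW transfer, split equally among survivors.
limits = {"L1": 100.0, "L2": 150.0}     # line ratings, MW (invented)

def toy_flows(tripped):
    """Stand-in for a power flow solve: returns {line: (flow, limit)}."""
    alive = [l for l in limits if l not in tripped]
    share = 220.0 / len(alive) if alive else 0.0
    return {l: (share, limits[l]) for l in alive}

def cascade(flows_fn, max_iter=10):
    """Trip overloaded lines, re-solve, repeat until no new trips."""
    tripped = set()
    for _ in range(max_iter):
        flows = flows_fn(tripped)
        new = {name for name, (flow, limit) in flows.items()
               if abs(flow) > limit}
        if not new:
            break
        tripped |= new                   # cascade: remove and re-solve
    return tripped

print(sorted(cascade(toy_flows)))        # ['L1', 'L2']: losing L1 overloads L2
```

    Here the first solve overloads only L1; its loss shifts the full transfer onto L2, which then trips on the next pass, which is the cascading mechanism the DCAT automates at scale.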

  9. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
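    The core arithmetic behind the frequency response measure is the ratio of the MW response to the A-to-B frequency change, conventionally expressed in MW per 0.1 Hz. A sketch with invented event numbers (consult BAL-003-1 for the exact Value A/Value B measurement rules):

```python
# Invented event numbers; see BAL-003-1 for the Value A / Value B definitions.
f_a = 60.000      # pre-disturbance frequency (Value A), Hz
f_b = 59.950      # stabilizing-period frequency (Value B), Hz
delta_mw = 120.0  # MW response of the balancing authority

frm = delta_mw / ((f_a - f_b) / 0.1)   # frequency response, MW per 0.1 Hz
print(round(frm, 1))                   # ≈ 240.0
```

    The tool automates exactly this kind of computation over a database of under-frequency events to establish a baseline.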

  10. The Cash Flow as Financial Management Tool For Small Businesses

    Directory of Open Access Journals (Sweden)

    Osmar Siena

    2015-06-01

    Full Text Available This study is engaged on the axis of Financial Management, with research into the factors controlling corporation in small business finance. It has as main objective to analyze the cash flow tool as a tool for financial management and specific process to describe the use of the Cash Flow tool objectives; analyze the feasibility of implementing the Cash Flow tool as an instrument of financial management and suggest proposals for suitability for deployment of Cash Flows as a financial management system. Facing these objectives the research uses the precedence of qualitative methodology and applies the instruments on-site visit, interview and questionnaire to collect data. Descriptive analysis that confront the theoretical basis and the data obtained from research is used. With the completion of the analysis the following results were achieved: description of business processes researched; identifying the needs and forms of control currently used and presentation of improvement measures for the adjustment of non-conformities identified. The study contributes to both the academic improvement by analyzing the real situation of the company, as well as it serves as a recommendation to companies embracing similar difficulties in financial management.

  11. Draper Station Analysis Tool

    Science.gov (United States)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Webbrowser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation based design and verification of complex dynamic systems.

  12. Using Crossflow for Flow Measurements and Flow Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gurevich, A.; Chudnovsky, L.; Lopeza, A. [Advanced Measurement and Analysis Group Inc., Ontario (Canada); Park, M. H. [Sungjin Nuclear Engineering Co., Ltd., Gyeongju (Korea, Republic of)

    2016-10-15

    Ultrasonic cross-correlation flow measurement is based on measuring the transport time of turbulent structures. The cross-correlation flow meter CROSSFLOW is designed and manufactured by Advanced Measurement and Analysis Group Inc. (AMAG) and is used around the world for various flow measurements. In particular, CROSSFLOW has been used for boiler feedwater flow measurements, including Measurement Uncertainty Recovery (MUR) reactor power uprates in 14 nuclear reactors in the United States and in Europe. More than 100 CROSSFLOW transducers are currently installed in CANDU reactors around the world, including Wolsung NPP in Korea, for flow verification in ShutDown System (SDS) channels. Other CROSSFLOW applications include reactor coolant gross flow measurement, reactor channel flow measurement in all channels of CANDU reactors, boiler blowdown flow measurement, and service water flow measurement. Cross-correlation flow measurement is a robust ultrasonic flow measurement tool used in nuclear power plants around the world for various applications. Mathematical modeling of CROSSFLOW agrees well with laboratory test results and can be used as a tool for determining the effect of flow conditions on CROSSFLOW output and for designing and optimizing laboratory testing, in order to ensure traceability of field flow measurements to laboratory testing within the desired uncertainty.
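    The underlying principle is easy to demonstrate: two sensors a known distance apart see the same turbulent structures separated by a transport delay, and that delay is the lag at which their cross-correlation peaks. A synthetic sketch of the principle (illustrative only, not AMAG's implementation; all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1000.0                     # sample rate, Hz
spacing = 0.30                  # sensor spacing, m (invented)
true_delay = 0.025              # transport delay, s -> true velocity 12 m/s

sig = rng.normal(size=4000)     # turbulence "signature" seen by both sensors
lag = int(true_delay * fs)
upstream = sig[lag:]            # aligned so the downstream sensor sees the
downstream = sig[:-lag]         # same signal delayed by `lag` samples

corr = np.correlate(downstream, upstream, mode="full")
tau = (np.argmax(corr) - (len(upstream) - 1)) / fs   # delay at the peak
print(spacing / tau)            # ≈ 12 m/s
```

    Velocity then follows as spacing divided by the recovered delay; a real meter must additionally map this bulk transit velocity to a flow rate through calibration.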

  13. Hurricane Data Analysis Tool

    Science.gov (United States)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT, as is the TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets: globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures) from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of these data, beginning in February 2000. This high-temporal-resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of

  14. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  15. Usefulness of DC power flow for active power flow analysis with flow controlling devices

    NARCIS (Netherlands)

    Van Hertem, D.; Verboomen, J.; Purchala, K.; Belmans, R.; Kling, W.L.

    2006-01-01

    DC power flow is a commonly used tool for contingency analysis. Recently, due to its simplicity and robustness, it has also become increasingly used for real-time dispatch and techno-economic analysis of power systems. It is a simplification of a full power flow, considering only active power.
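    In the DC approximation, voltage magnitudes are fixed, lines are lossless, and active power flows follow the linear relation P = B'θ, so a contingency case reduces to one sparse linear solve. A minimal sketch on an invented 3-bus network:

```python
import numpy as np

# line list: (from_bus, to_bus, reactance x in p.u.); data invented
lines = [(0, 1, 0.1), (1, 2, 0.2), (0, 2, 0.2)]
P = np.array([0.0, -1.0, 0.5])     # net injections in p.u.; bus 0 is the slack

n_bus = 3
B = np.zeros((n_bus, n_bus))       # susceptance (B') matrix
for i, j, x in lines:
    b = 1.0 / x
    B[i, i] += b
    B[j, j] += b
    B[i, j] -= b
    B[j, i] -= b

theta = np.zeros(n_bus)
theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])   # slack angle held at 0

flows = {(i, j): (theta[i] - theta[j]) / x for i, j, x in lines}
print({k: float(round(v, 3)) for k, v in flows.items()})
```

    For contingency screening, each outage simply removes a line from the list before rebuilding B' and re-solving, which is why the method is fast and robust enough for repeated use.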

  16. Time analysis of the containerized cargo flow in the logistic chain using simulation tools: the case of the Port of Seville (Spain)

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz Aguilar, J.J.; Turias, J.I.; Cerban, M.; Gonzalez, M.J.; Pulido, A.

    2016-07-01

    The logistic chain that connects the capital of Spain (Madrid) with the Canary Islands has the Port of Seville as its port node, which makes it possible to switch from one transport mode (railway) to another (maritime) at the port's container terminal. Some constraints, such as the operational time window that restricts freight train access to the port to a certain time slot, or the need to reverse the train before entering the port, generate significant time delays in the intermodal chain. A time analysis of the process is necessary in order to identify the critical points. A simulation of the whole process, from the goods departing the origin station by train until they leave the Port of Seville by ship to the Canary Islands, is performed. To this end, a queuing network model was developed to simulate the travel time of the cargo. The database comprises daily departures of goods trains and daily departures of vessels (including times of docking, berthing, and loading/unloading cargo). The final objective of this work is twofold: first, to provide a validated model of the containerized cargo flow, and second, to demonstrate that this kind of queuing model can become a powerful supporting tool in making decisions about future investments. (Author)
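    The structure of such a travel-time simulation can be sketched as a chain of service and waiting stages, with the gate time window appearing as a deterministic wait for cargo arriving outside it. The stages, distributions, and window below are invented placeholders, far simpler than the paper's calibrated model, and trains are assumed to depart at midnight so that trip duration doubles as clock time:

```python
import random

random.seed(1)

WINDOW_OPEN, WINDOW_CLOSE = 8.0, 20.0       # port gate time window, hours (invented)

def travel_time_hours():
    train = random.uniform(6, 8)            # rail leg duration
    arrival = train % 24                    # clock time at the port gate
    if WINDOW_OPEN <= arrival <= WINDOW_CLOSE:
        wait_gate = 0.0
    else:
        wait_gate = (24.0 - arrival + WINDOW_OPEN) % 24.0
    handling = random.uniform(2, 4)         # terminal handling / train reversal
    wait_ship = random.uniform(0, 24)       # wait for the next vessel departure
    return train + wait_gate + handling + wait_ship

samples = [travel_time_hours() for _ in range(10_000)]
mean = sum(samples) / len(samples)
print(round(mean, 1))                       # mean door-to-ship time, hours
```

    Replacing the placeholder distributions with the observed departure and handling times from the database is what turns a sketch like this into a validated queuing model.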

  17. Microgrid Analysis Tools Summary

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Antonio [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Haase, Scott G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mathur, Shivani [Formerly NREL

    2018-03-05

    The over-arching goal of the Alaska Microgrid Partnership is to reduce the total imported fuel used to secure all energy services in Alaska's remote microgrids by at least 50%, without increasing system life-cycle costs, while also improving overall system reliability, security, and resilience. One goal of the partnership is to investigate whether a combination of energy efficiency and high-contribution (from renewable energy) power systems can reduce total imported energy usage by 50% while reducing life-cycle costs and improving reliability and resiliency. This presentation provides an overview of the following four renewable energy optimization tools (information is from the respective tool websites, tool developers, and author experience): the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) Tool, and the Hybrid Optimization Model for Electric Renewables (HOMER).

  18. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, “Suite of open-source applications and models for advanced synchrophasor analysis”) and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies the modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
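    Mode identification of the kind described can be illustrated with a log-decrement estimate on a synthetic ringdown: frequency from the spacing of successive peaks, damping from their decay. This is a textbook method applied to invented data, not necessarily OBAT's algorithm:

```python
import math

fs = 30.0                        # PMU reporting rate, samples/s
f0, sigma = 0.7, -0.15           # synthetic 0.7 Hz mode with decay (invented)
x = [math.exp(sigma * n / fs) * math.cos(2 * math.pi * f0 * n / fs)
     for n in range(300)]        # 10 s of noiseless ringdown

# frequency: average spacing of successive local maxima
peaks = [n for n in range(1, len(x) - 1) if x[n - 1] < x[n] > x[n + 1]]
periods = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
f_est = 1.0 / (sum(periods) / len(periods))

# damping: log ratio of successive peak amplitudes over their time spacing
decay = [math.log(x[b] / x[a]) / ((b - a) / fs)
         for a, b in zip(peaks, peaks[1:])]
sigma_est = sum(decay) / len(decay)
damping_ratio = -sigma_est / math.hypot(sigma_est, 2 * math.pi * f_est)
print(round(f_est, 3), round(damping_ratio, 3))   # ≈ 0.7 Hz, ≈ 0.034
```

    Production tools use more robust estimators (e.g. Prony or matrix-pencil methods) that tolerate noise and multiple simultaneous modes, but they recover the same (frequency, damping) pair per mode.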

  19. NOAA's Inundation Analysis Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...

  20. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale: up to 10000 jobs were processed in one day. The experience gained operating the system on several grid flavours was essential to performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's distributed analysis efforts in a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  1. Rule-Based Multidisciplinary Tool for Unsteady Reacting Real-Fluid Flows, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — A design and analysis computational tool is proposed for simulating unsteady reacting flows in combustor devices used in reusable launch vehicles. Key aspects...

  2. Security constrained optimal power flow by modern optimization tools

    African Journals Online (AJOL)

    Security constrained optimal power flow by modern optimization tools. ... International Journal of Engineering, Science and Technology ...

  3. Channel CAT: A Tactical Link Analysis Tool

    National Research Council Canada - National Science Library

    Coleman, Michael

    1997-01-01

    ... This thesis produced the Channel Capacity Analysis Tool (Channel CAT), an automated tool for the analysis of design decisions in developing client-server software...

  4. Physics Analysis Tools Workshop 2007

    CERN Multimedia

    Elizabeth Gallas,

    The ATLAS PAT (Physics Analysis Tools) group evaluates, develops and tests software tools for the analysis of physics data, consistent with the ATLAS analysis and event data models. Following on from earlier PAT workshops in London (2004), Tucson (2005) and Tokyo (2006), this year's workshop was hosted by the University of Bergen in Norway on April 23-28 with more than 60 participants. The workshop brought together PAT developers and users to discuss the available tools with an emphasis on preparing for data taking. At the start of the week, workshop participants, laptops and power converters in-hand, jumped headfirst into tutorials, learning how to become trigger-aware and how to use grid computing resources via the distributed analysis tools Panda and Ganga. The well organised tutorials were well attended and soon the network was humming, providing rapid results to the users and ample feedback to the developers. A mid-week break was provided by a relaxing and enjoyable cruise through the majestic Norwegia...

  5. Automatized material and radioactivity flow control tool in decommissioning process

    International Nuclear Information System (INIS)

    Rehak, I.; Vasko, M.; Daniska, V.; Schultz, O.

    2009-01-01

    In this presentation, the automatized material and radioactivity flow control tool in the decommissioning process is discussed. It is concluded that: computer simulation of the decommissioning process is one of the important attributes of the computer code Omega; the integral material and radioactivity flow tools are among the basic tools for computer optimisation of decommissioning waste processing; all calculated material parameters are stored at each point of the calculation process and can be viewed; the computer code Omega represents an open, modular system that can be improved; and the module for optimisation of decommissioning waste processing will be improved in the frame of improving material procedures and scenarios.

  6. Work flow management systems. Selection of Platforms and tools

    International Nuclear Information System (INIS)

    Munoz Garcia, M.

    1997-01-01

    This paper addresses a formal procedure for selecting the platform and tools necessary to implement a work flow system in a company's organisation. The proposed method is based on a preliminary study to ascertain the company's requirements; in other words, the tool is selected on the basis of the environment in which it is to be used, making it essential to know the frequency of use, the types of tasks to be executed, the complexity of the work flow, etc. Once the preliminary study has been performed, the formal selection method does not differ greatly from that for selecting any other tool. The objective is to establish a series of weighted parameters so that each candidate configuration can be assessed and one finally selected. Lastly, the paper discusses some practical considerations which became evident during the selection of a work flow management tool for our own company. (Author)

  7. Three-Phase Unbalanced Load Flow Tool for Distribution Networks

    DEFF Research Database (Denmark)

    Demirok, Erhan; Kjær, Søren Bækhøj; Sera, Dezso

    2012-01-01

    This work develops a three-phase unbalanced load flow tool tailored for radial distribution networks, based on Matlab®. The tool can be used to assess steady-state voltage variations, thermal limits of grid components, and power losses in radial MV-LV networks with photovoltaic (PV) generators, where most of the systems are single phase. New ancillary services such as static reactive power support by PV inverters can also be merged with the load flow solution tool, and thus the impact of various reactive power control strategies on steady-state grid operation can be simply investigated. The performance of the load flow solution tool, in the sense of the resulting bus voltage magnitudes, is compared and validated against the IEEE 13-bus test feeder.
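    Radial load flow tools of this kind commonly use a backward/forward sweep: accumulate branch currents from the feeder end toward the source, then update bus voltages from the source outward, iterating to convergence. A single-phase sketch on an invented 3-bus chain (the cited tool solves all three phases, including unbalance):

```python
# Single-phase backward/forward sweep on a 3-bus radial chain; impedances and
# loads are invented round numbers.
def sweep(V_source, Z, S_load, iters=20):
    """Z[i]: series impedance of the branch into bus i; S_load[i]: load in VA."""
    n = len(Z)
    V = [V_source] * n
    for _ in range(iters):
        # backward sweep: branch currents accumulated from the feeder end
        I = [(S_load[i] / V[i]).conjugate() for i in range(n)]
        for i in range(n - 2, -1, -1):
            I[i] += I[i + 1]
        # forward sweep: voltage drops applied from the source outward
        V_prev = V_source
        for i in range(n):
            V[i] = V_prev - Z[i] * I[i]
            V_prev = V[i]
    return V

V = sweep(230 + 0j, [0.5 + 0.2j] * 3, [1000 + 300j] * 3)
print([round(abs(v), 1) for v in V])   # voltages fall monotonically down the feeder
```

    The sweep exploits the radial topology directly, which is why it is preferred over Newton-Raphson for distribution feeders with high R/X ratios.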

  8. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    Energy Technology Data Exchange (ETDEWEB)

    Cem Sarica; Holden Zhang

    2006-05-31

    The development of oil and gas fields in deep waters (5000 ft and more) will become more common in the future. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique, whether at topside, seabed or bottom-hole, to know inlet conditions such as flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. Therefore, the development of a new generation of multiphase flow predictive tools is needed. The overall objective of the proposed study is to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict flow characteristics such as flow patterns, phase distributions, and pressure gradients encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). In the current multiphase modeling approach, flow patterns and flow behavior (pressure gradient and phase fractions) are modeled separately. Thus, different models based on different physics are employed, causing inaccuracies and discontinuities. Moreover, oil and water are treated as a pseudo single phase, ignoring the distinct characteristics of each and often resulting in inaccurate designs that lead to operational problems. In this study, a new model is being developed through a theoretical and experimental study employing a revolutionary approach. The

  9. Physics Analysis Tools Workshop Report

    CERN Multimedia

    Assamagan, K A

    A Physics Analysis Tools (PAT) workshop was held at the University of Tokyo in Tokyo, Japan on May 15-19, 2006. Unlike the previous ones, this workshop brought together the core PAT developers and ATLAS users. The workshop was attended by 69 people from various institutions: Australia 5, Canada 1, China 6, CERN 4, Europe 7, Japan 32, Taiwan 3, USA 11. The agenda consisted of a 2-day tutorial for users, a 0.5-day user feedback discussion session between users and developers, and a 2-day core PAT workshop devoted to issues in Physics Analysis Tools activities. The tutorial, attended by users and developers, covered the following grounds: Event Selection with the TAG; Event Selection Using the Athena-Aware NTuple; Event Display; Interactive Analysis within ATHENA; Distributed Analysis; Monte Carlo Truth Tools; Trigger-Aware Analysis; Event View. By many accounts, the tutorial was useful. This workshop was the first time that the ATLAS Asia-Pacific community (Taiwan, Japan, China and Australia) go...

  10. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    Energy Technology Data Exchange (ETDEWEB)

    Tulsa Fluid Flow

    2008-08-31

    The development of fields in deep waters (5000 ft and more) is a common occurrence. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique, whether employed at topside, seabed or bottom-hole, to know inlet conditions such as the flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. The overall objective was to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict flow characteristics such as flow patterns, phase distributions, and pressure gradients encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). The project was conducted in two periods. In Period 1 (four years), gas-oil-water flow in pipes was investigated to understand the fundamental physical mechanisms describing the interaction between the gas, oil, and water phases under flowing conditions, and a unified model was developed utilizing a novel modeling approach. A gas-oil-water pipe flow database including field and laboratory data was formed in Period 2 (one year). The database was utilized in model performance demonstration. Period 1 primarily consisted of the development of a unified model and software to predict gas-oil-water flow, and experimental studies of the gas-oil-water project, including flow behavior description and

  11. Subcubic Control Flow Analysis Algorithms

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Van Horn, David

    We give the first direct subcubic algorithm for performing control flow analysis of higher-order functional programs. Despite the long-held belief that inclusion-based flow analysis could not surpass the "cubic bottleneck," we apply known set compression techniques to obtain an algorithm that runs in time O(n^3/log n) on a unit-cost random-access memory model machine. Moreover, we refine the initial flow analysis into two more precise analyses incorporating notions of reachability. We give subcubic algorithms for these more precise analyses and relate them to an existing analysis from...
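    Inclusion-based flow analysis propagates sets of abstract values along subset constraints until a fixed point is reached; the naive worklist version below is the cubic-time baseline that set compression techniques improve on. The constraint system and value names are invented for illustration:

```python
def flow_analysis(constraints, seeds):
    """constraints: edges (a, b) meaning flow(a) is a subset of flow(b)."""
    flow = {v: set() for edge in constraints for v in edge}
    for var, vals in seeds.items():
        flow.setdefault(var, set()).update(vals)
    work = list(flow)
    while work:
        a = work.pop()
        for (x, y) in constraints:
            # propagate along a's outgoing constraints until nothing changes
            if x == a and not flow[a] <= flow[y]:
                flow[y] |= flow[a]
                work.append(y)
    return flow

# e.g. abstract closures lam1, lam2 flowing through f -> g -> h
result = flow_analysis([("f", "g"), ("g", "h")],
                       {"f": {"lam1"}, "g": {"lam2"}})
print(sorted(result["h"]))   # ['lam1', 'lam2']
```

    With n variables, n-sized sets, and up to n² constraints, this loop is O(n³) in the worst case; the paper's contribution is representing and merging the sets so that the log-factor is shaved off.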

  12. The laminar flow tube reactor as a quantitative tool for nucleation studies: Experimental results and theoretical analysis of homogeneous nucleation of dibutylphthalate

    International Nuclear Information System (INIS)

    Mikheev, Vladimir B.; Laulainen, Nels S.; Barlow, Stephan E.; Knott, Michael; Ford, Ian J.

    2000-01-01

    A laminar flow tube reactor was designed and constructed to provide an accurate, quantitative measurement of the nucleation rate as a function of supersaturation and temperature. Measurements of nucleation of a supersaturated vapor of dibutylphthalate have been made for the temperature range from -30.3 to +19.1 °C. A thorough analysis of the possible sources of experimental uncertainty (such as defining the correct value of the initial vapor concentration, temperature boundary conditions on the reactor walls, accuracy of the calculations of the thermodynamic parameters of the nucleation zone, and particle concentration measurement) is given. Both isothermal and isobaric nucleation rates were measured. The experimental data obtained were compared with the measurements of other experimental groups and with theoretical predictions made on the basis of the self-consistency correction nucleation theory. Theoretical analysis, based on the first and second nucleation theorems, is also presented. The critical cluster size and the excess internal energy of the critical cluster are obtained. (c) 2000 American Institute of Physics

  13. Cluster Flow: A user-friendly bioinformatics workflow tool [version 1; referees: 3 approved]

    Directory of Open Access Journals (Sweden)

    Philip Ewels

    2016-12-01

    Pipeline tools are becoming increasingly important within the field of bioinformatics. Using a pipeline manager to manage and run workflows comprised of multiple tools reduces workload and makes analysis results more reproducible. Existing tools require significant work to install and get running, typically needing pipeline scripts to be written from scratch before running any analysis. We present Cluster Flow, a simple and flexible bioinformatics pipeline tool designed to be quick and easy to install. Cluster Flow comes with 40 modules for common NGS processing steps, ready to work out of the box. Pipelines are assembled using these modules with a simple syntax that can be easily modified as required. Core helper functions automate many common NGS procedures, making running pipelines simple. Cluster Flow is available under the GNU GPLv3 license on GitHub. Documentation, examples and an online demo are available at http://clusterflow.io.

  14. Information Flow Analysis for VHDL

    DEFF Research Database (Denmark)

    Tolstrup, Terkel Kristian; Nielson, Flemming; Nielson, Hanne Riis

    2005-01-01

    We describe a fragment of the hardware description language VHDL that is suitable for implementing the Advanced Encryption Standard algorithm. We then define an Information Flow analysis as required by the international standard Common Criteria. The goal of the analysis is to identify the entire...... information flow through the VHDL program. The result of the analysis is presented as a non-transitive directed graph that connects those nodes (representing either variables or signals) where an information flow might occur. We compare our approach to that of Kemmerer and conclude that our approach yields...

  15. Flow Injection Analysis

    DEFF Research Database (Denmark)

    Hansen, Elo Harald

    1998-01-01

    Learning objectives: * To provide an introduction to automated assays * To describe the basic principles of FIA * To demonstrate the capabilities of FIA in relation to batch assays and conventional continuous flow systems * To show that FIA allows one to augment existing analytical techniques * To show...... how FIA offers novel analytical procedures which are not feasible by conventional means * To highlight the potentials of FIA in selected practical assays...

  16. Channel CAT: A Tactical Link Analysis Tool

    Science.gov (United States)

    1997-09-01

    Naval Postgraduate School, Monterey, California. Master's thesis by Michael Glenn Coleman, September 1997: "Channel CAT: A Tactical Link Analysis Tool." The thesis presents the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client...

  17. Waste flow analysis and life cycle assessment of integrated waste management systems as planning tools: Application to optimise the system of the City of Bologna.

    Science.gov (United States)

    Tunesi, Simonetta; Baroni, Sergio; Boarini, Sandro

    2016-09-01

    The results of this case study are used to argue that waste management planning should follow a detailed process, adequately confronting the complexity of waste management problems and the specificity of each urban area and of regional/national situations. To support the development or completion of integrated waste management systems, this article proposes a planning method based on: (1) the detailed analysis of waste flows and (2) the application of a life cycle assessment to compare alternative scenarios and optimise solutions. The evolution of the City of Bologna waste management system is used to show how this approach can be applied to assess which elements improve environmental performance. The assessment of the contribution of each waste management phase in the Bologna integrated waste management system has proven that the changes applied from 2013 to 2017 result in a significant improvement of the environmental performance, mainly as a consequence of the optimised integration between materials and energy recovery: Global Warming Potential at 100 years (GWP100) diminishes from 21,949 to -11,169 t CO2-eq per year, and abiotic resource depletion improves from -403 to -520 t antimony-eq per year. This study analyses the collection phase in great detail. Outcomes provide specific operational recommendations to policy makers, showing: (a) the relevance of the choice of the materials forming the bags for 'door-to-door' collection (for non-recycled low-density polyethylene bags, 22 kg CO2-eq per tonne of waste); (b) the relatively low environmental impacts associated with underground tanks (3.9 kg CO2-eq per tonne of waste); (c) the relatively low impact of big street containers with respect to plastic bags (2.6 kg CO2-eq per tonne of waste). © The Author(s) 2016.

  18. Computational fluid dynamics analysis of a mixed flow pump impeller

    African Journals Online (AJOL)

    ATHARVA

    International Journal of Engineering, Science and Technology ... From the CFD analysis software and advanced post processing tools the complex flow inside the ... The numerical simulation can provide quite accurate information on the fluid ...

  19. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time-consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time-consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps, and operational parameters through a user-friendly GUI. The second part of the reload safety analysis calculations is handled by CycleKit, a code linked with our core physics code ANDREA. Through CycleKit, large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification, and displays the output in clickable HTML format. Using this set of tools for reload safety analysis simplifies

  20. Flows method in global analysis

    International Nuclear Information System (INIS)

    Duong Minh Duc.

    1994-12-01

    We study the gradient flows method for W^{r,p}(M,N), where M and N are Riemannian manifolds and r may be less than m/p. We localize some global analysis problems by constructing gradient flows which only change the value of any u in W^{r,p}(M,N) in a local chart of M. (author). 24 refs

  1. Empirical flow parameters : a tool for hydraulic model validity

    Science.gov (United States)

    Asquith, William H.; Burley, Thomas E.; Cleveland, Theodore G.

    2013-01-01

    The objectives of this project were: (1) to determine and present, from existing data in Texas, relations between observed stream flow, topographic slope, mean section velocity, and other hydraulic factors, to produce charts such as Figure 1 and to produce empirical distributions of the various flow parameters, providing a methodology to "check if model results are way off!"; (2) to produce a statistical regional tool to estimate mean velocity or other selected parameters for storm flows or other conditional discharges at ungauged locations (most bridge crossings) in Texas, providing a secondary way to compare such values to a conventional hydraulic modeling approach; (3) to present ancillary values such as Froude number, stream power, Rosgen channel classification, sinuosity, and other selected characteristics (readily determinable from existing data) to provide additional information to engineers concerned with the hydraulic-soil-foundation component of transportation infrastructure.

  2. Buck Creek River Flow Analysis

    Science.gov (United States)

    Dhanapala, Yasas; George, Elizabeth; Ritter, John

    2009-04-01

    Buck Creek, flowing through Springfield, Ohio, has a number of low-head dams currently in place that cause safety issues and sometimes make it impossible for recreational boaters to pass through. The safety issues include the back eddies created by the dams, known as "drowning machines," and the hydraulic jumps. In this study we model the flow of Buck Creek using topographical and flow data provided by the Geology Department of Wittenberg University. The flow is analyzed using the Hydrologic Engineering Center's River Analysis System software (HEC-RAS). As a first step, a model of the river near Snyder Park was created with the current structure in place for validation purposes. The low-head dam is then replaced with four drop structures with V-notch overflow gates, and the river bed is altered to reflect plunge pools after each drop structure. This analysis provides insight into how the flow will behave after the changes are made. In addition, a sediment transport analysis is being conducted to provide information about the stability of these structures.
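For the V-notch overflow gates mentioned above, the discharge over each drop structure can be estimated with the standard sharp-crested weir equation. This is a generic textbook sketch, not taken from the Buck Creek HEC-RAS model, and the discharge coefficient is a typical assumed value.

```python
import math

def v_notch_discharge(head_m, notch_angle_deg=90.0, cd=0.58):
    """Discharge (m^3/s) over a sharp-crested V-notch weir:
    Q = Cd * (8/15) * sqrt(2g) * tan(theta/2) * h^(5/2).
    Cd ~ 0.58 is a typical value for a 90-degree notch."""
    g = 9.81
    return (cd * (8.0 / 15.0) * math.sqrt(2.0 * g)
            * math.tan(math.radians(notch_angle_deg) / 2.0)
            * head_m ** 2.5)
```

Because discharge scales with head to the 5/2 power, small changes in upstream head translate into large changes in flow over the gate.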

  3. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  4. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work, completed using these tools, on the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools covered in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing performed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  5. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information for optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions, and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: demonstration of the ash transformation model, upgrading of spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on their fate in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence, along with the heterogeneous and homogeneous interactions of the organically associated elements, must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning of the SEM image. The image analysis of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity
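As an illustration of the kind of Newton-Raphson balance calculation such spreadsheets automate, the sketch below solves a simple adiabatic flame temperature from an energy balance with a temperature-dependent heat capacity. The coefficients are made-up illustrative numbers, not EERC data or methods.

```python
def newton_raphson(f, dfdx, x0, tol=1e-8, max_iter=50):
    """Generic Newton-Raphson root finder: iterate x -= f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Example: adiabatic flame temperature from the energy balance
#   integral of cp(T) dT from T0 to T = q_release,
# with a linear cp(T) = a + b*T (illustrative coefficients only).
a, b = 1000.0, 0.2            # J/(kg*K), J/(kg*K^2)
T0, q_release = 300.0, 2.0e6  # K, J/kg released by combustion
f = lambda T: a * (T - T0) + 0.5 * b * (T**2 - T0**2) - q_release
dfdx = lambda T: a + b * T    # derivative of the balance w.r.t. T
T_ad = newton_raphson(f, dfdx, x0=1500.0)
```

The quadratic convergence of Newton-Raphson is what makes it practical to embed such solvers in interactive spreadsheet tools.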

  6. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    Directory of Open Access Journals (Sweden)

    Helen V. Hsieh

    2017-05-01

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor's office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid, and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads, and sample pads), biological reagents (e.g., antibodies, blocking reagents, and buffers), and the design of delivery geometry. In this paper, we review conventional optimization and then focus on the latter, outlining analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity, and manufacturing robustness.

  7. STRING 3: An Advanced Groundwater Flow Visualization Tool

    Science.gov (United States)

    Schröder, Simon; Michel, Isabel; Biedert, Tim; Gräfe, Marius; Seidel, Torsten; König, Christoph

    2016-04-01

    The visualization of 3D groundwater flow is a challenging task. Previous versions of our software STRING [1] solely focused on intuitive visualization of complex flow scenarios for non-professional audiences. STRING, developed by Fraunhofer ITWM (Kaiserslautern, Germany) and delta h Ingenieurgesellschaft mbH (Witten, Germany), provides the necessary means for visualization of both 2D and 3D data on planar and curved surfaces. In this contribution we discuss how to extend this approach to a full 3D tool, and its challenges, in continuation of Michel et al. [2]. This elevates STRING from a post-production to an exploration tool for experts. In STRING, moving pathlets provide an intuition of the velocity and direction of both steady-state and transient flows. The visualization concept is based on the Lagrangian view of the flow. To capture every detail of the flow, an advanced method for intelligent, time-dependent seeding is used, building on the Finite Pointset Method (FPM) developed by Fraunhofer ITWM. Lifting our visualization approach from 2D into 3D raises many new challenges. With the implementation of a seeding strategy for 3D, one of the major problems has already been solved (see Schröder et al. [3]). As pathlets only provide an overview of the velocity field, other means are required for the visualization of additional flow properties. We suggest the use of Direct Volume Rendering and isosurfaces for scalar features. In this regard we were able to develop an efficient approach for combining the raytraced rendering of the volume with regular OpenGL geometries, achieved through the use of Depth Peeling or A-Buffers for the rendering of transparent geometries. Animation of pathlets requires a strict boundary of the simulation domain; hence, STRING needs to extract the boundary, even from unstructured data, if it is not provided. In 3D we additionally need a good visualization of the boundary itself. For this the silhouette based on the angle of
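The Lagrangian pathlet idea can be sketched in a few lines: seed a particle and integrate it through the velocity field, collecting positions as it moves. This is a minimal 2D illustration with explicit Euler steps, not STRING's actual FPM-based seeding or integration scheme.

```python
def advect_pathlet(seed, velocity, dt=0.01, steps=50):
    """Integrate one pathlet through a steady velocity field with
    explicit Euler steps, returning the list of visited positions."""
    path = [seed]
    x, y = seed
    for _ in range(steps):
        u, v = velocity(x, y)           # sample the field at the current point
        x, y = x + dt * u, y + dt * v   # Euler step
        path.append((x, y))
    return path

# Solid-body rotation about the origin as a simple test field.
rotation = lambda x, y: (-y, x)
path = advect_pathlet((1.0, 0.0), rotation)
```

In a production tool one would use a higher-order integrator (e.g. Runge-Kutta) and interpolate velocities from the simulation mesh rather than an analytic field.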

  8. Sustainability Tools Inventory - Initial Gaps Analysis | Science ...

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite's analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4

  9. Flow analysis of HANARO flow simulated test facility

    International Nuclear Information System (INIS)

    Park, Yong-Chul; Cho, Yeong-Garp; Wu, Jong-Sub; Jun, Byung-Jin

    2002-01-01

    The HANARO, a multi-purpose research reactor of 30 MWth open-tank-in-pool type, has been under normal operation since its initial criticality in February 1995. Many experiments must be performed safely to promote the utilization of the HANARO. A flow simulated test facility is being developed for the endurance testing of reactivity control units for extended lifetimes and the verification of the structural integrity of experimental facilities prior to loading in the HANARO. This test facility is composed of three major parts: a half-core structure assembly, a flow circulation system, and a support system. The half-core structure assembly is composed of a plenum, grid plate, core channel with flow tubes, chimney, and dummy pool. The flow channels are to be fitted with flow orifices to simulate core channels. This test facility must reproduce flow characteristics similar to those of the HANARO. This paper therefore describes a computational analysis of the flow behavior of the test facility. The computational flow analysis was performed to verify the flow structure and similarity of this test facility, assuming that the flow rates and pressure differences of the core channel are constant. The shapes of the flow orifices were determined by a trial-and-error method based on the design requirements of the core channel. A computational analysis code with the standard k-ε turbulence model was applied to the three-dimensional analysis. The results of the flow simulation showed flow characteristics similar to those of the HANARO and satisfied the design requirements of this test facility. The shape of the flow orifices used in this numerical simulation can be adapted to manufacturing requirements. The flow rate and the pressure difference through the core channel proved by this simulation can be used as design requirements for the flow system. The analysis results will be verified against the results of the flow test after construction of the flow system. (author)
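The trial-and-error orifice sizing described above amounts to iterating on a diameter until the orifice reproduces a target pressure drop at the design flow rate. The sketch below uses the standard sharp-edged orifice relation and bisection; the discharge coefficient and all numbers are generic textbook values, not HANARO design data.

```python
import math

def orifice_pressure_drop(flow_rate, diameter, rho=998.0, cd=0.61):
    """Pressure drop (Pa) across a sharp-edged orifice from the standard
    discharge relation Q = Cd * A * sqrt(2 * dp / rho); Cd = 0.61 is a
    typical sharp-edge value."""
    area = math.pi * diameter**2 / 4.0
    v_eff = flow_rate / (cd * area)
    return rho * v_eff**2 / 2.0

def size_orifice(flow_rate, target_dp, d_lo=1e-3, d_hi=0.2):
    """Bisect on diameter until the orifice matches a target pressure drop.
    Pressure drop decreases monotonically with diameter."""
    for _ in range(60):
        d_mid = 0.5 * (d_lo + d_hi)
        if orifice_pressure_drop(flow_rate, d_mid) > target_dp:
            d_lo = d_mid   # opening too small -> drop too high -> enlarge
        else:
            d_hi = d_mid
    return 0.5 * (d_lo + d_hi)
```

A CFD-based sizing loop replaces the algebraic relation with a full simulation, but the outer iteration has the same structure.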

  10. CASH FLOW-FINANCIAL PLANNING TOOL IN THE TOURISM UNITS

    Directory of Open Access Journals (Sweden)

    Boby COSTI

    2017-05-01

    This paper addresses cash flow as a financial planning tool, calculating cash flow by the indirect method within a company in the field of tourism. It examines the organization of treasury accounting, which records the existence and movement of short-term investments, the availability of accounts at banks, short-term bank loans, and other cash items. Access to more detailed and clearer information supports an upward trend for the tourism company. Standardization of definitions helps ensure that all parties use the same terms and concepts with little or no variation. This is essential for developers and contractors in different geographical regions of the world and different countries when they discuss issues of tourism and travel.
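The indirect method mentioned above starts from net income and works back to operating cash flow by adding back non-cash charges and adjusting for working-capital changes. A minimal sketch with a few illustrative line items (a real cash flow statement carries many more adjustments):

```python
def operating_cash_flow_indirect(net_income, depreciation,
                                 delta_receivables, delta_inventory,
                                 delta_payables):
    """Operating cash flow by the indirect method: net income, plus
    non-cash charges, adjusted for working-capital movements."""
    return (net_income
            + depreciation          # non-cash expense added back
            - delta_receivables     # rising receivables tie up cash
            - delta_inventory       # rising inventory ties up cash
            + delta_payables)       # rising payables free up cash
```

For example, net income of 100 with 20 of depreciation, receivables up 10, inventory up 5, and payables up 15 yields an operating cash flow of 120.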

  11. Protein analysis tools and services at IBIVU

    Directory of Open Access Journals (Sweden)

    Brandt Bernd W.

    2011-06-01

    In recent years, several new tools applicable to protein analysis have been made available on the IBIVU web site. Recently, a number of tools, ranging from multiple sequence alignment construction to domain prediction, have been updated and/or extended with services for programmatic access using SOAP. We provide an overview of these tools and their application.

  12. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    Science.gov (United States)

    Hendricks, Eric S.

    2016-01-01

    The Object-Oriented Turbomachinery Analysis Code (OTAC) is a new meanline/streamline turbomachinery modeling tool being developed at NASA GRC. During the development process, a limitation of the code was discovered in relation to the analysis of choked flow in axial turbines. This paper describes the relevant physics for choked flow as well as the changes made to OTAC to enable analysis in this flow regime.
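The relevant physics for choked flow is that, once the throat reaches sonic conditions, mass flow depends only on the stagnation state and throat area. The sketch below evaluates the standard compressible-flow relation for choked mass flow; it illustrates the physics OTAC must handle, and is not OTAC's own implementation.

```python
import math

def choked_mass_flow(area, p0, t0, gamma=1.4, r_gas=287.0):
    """Choked mass flow (kg/s) through a throat of given area (m^2) from
    stagnation pressure p0 (Pa) and temperature t0 (K):
    mdot = A * p0 * sqrt(gamma / (R * T0)) * (2/(gamma+1))^((gamma+1)/(2(gamma-1)))."""
    term = (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))
    return area * p0 * math.sqrt(gamma / (r_gas * t0)) * term
```

Because mass flow becomes independent of downstream pressure at choke, a meanline solver has to switch from matching static pressure to matching this fixed mass flow, which is the limitation discussed in the paper.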

  13. Affordances of agricultural systems analysis tools

    NARCIS (Netherlands)

    Ditzler, Lenora; Klerkx, Laurens; Chan-Dentoni, Jacqueline; Posthumus, Helena; Krupnik, Timothy J.; Ridaura, Santiago López; Andersson, Jens A.; Baudron, Frédéric; Groot, Jeroen C.J.

    2018-01-01

    The increasingly complex challenges facing agricultural systems require problem-solving processes and systems analysis (SA) tools that engage multiple actors across disciplines. In this article, we employ the theory of affordances to unravel what tools may furnish users, and how those affordances

  14. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
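Several of the checks described, such as unreachable code detection, reduce to reachability over the program's control-flow graph. A minimal sketch of that core idea (the example CFG is hypothetical; the actual 8086/68000 analysers work on assembler-specific representations):

```python
def unreachable_blocks(cfg, entry):
    """Depth-first reachability over a control-flow graph given as
    {block: [successor blocks]}; any block never visited from the
    entry is unreachable code."""
    seen, stack = set(), [entry]
    while stack:
        block = stack.pop()
        if block in seen:
            continue
        seen.add(block)
        stack.extend(cfg.get(block, []))
    return set(cfg) - seen

# Hypothetical CFG: 'dead' has no path from the entry block.
cfg = {"start": ["loop"], "loop": ["loop", "end"], "end": [], "dead": ["end"]}
```

The same traversal, recorded per edge rather than per node, also yields the inter-module call dependency information the tool set reports.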

  15. Economic and Financial Analysis Tools | Energy Analysis | NREL

    Science.gov (United States)

    Use these economic and financial analysis tools. Job and Economic Development Impact (JEDI) Model: use these easy-to-use, spreadsheet-based tools to analyze the economic impacts of constructing and operating power generation and biofuel plants at the

  16. Post-Flight Data Analysis Tool

    Science.gov (United States)

    George, Marina

    2018-01-01

    A software tool that facilitates the retrieval and analysis of post-flight data. This allows our team and other teams to effectively and efficiently analyze and evaluate post-flight data in order to certify commercial providers.

  17. Quick Spacecraft Thermal Analysis Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  18. Microparticle tracking velocimetry as a tool for microfluidic flow measurements

    Science.gov (United States)

    Salipante, Paul; Hudson, Steven D.; Schmidt, James W.; Wright, John D.

    2017-07-01

    The accurate measurement of flows in microfluidic channels is important for commercial and research applications. We compare the accuracy of flow measurement techniques over a wide range of flows. Flow measurements made using holographic microparticle tracking velocimetry (µPTV) and a gravimetric flow standard over the range of 0.5-100 nL/s agree within 0.25%, well within the uncertainty of the two flow systems. Two commercial thermal flow sensors were used as the intermediaries (transfer standards) between the two flow measurement systems. The gravimetric flow standard was used to calibrate the thermal flow sensors by measuring the rate of change of the mass of liquid in a beaker on a micro-balance as it fills. The holographic µPTV flow measurements were made in a rectangular channel, and the flow was seeded with 1 µm diameter polystyrene spheres. The volumetric flow was calculated using the Hagen-Poiseuille solution for a rectangular channel. The uncertainty of both flow measurement systems is given. For the gravimetric standard, relative uncertainty increased with decreasing flows due to surface tension forces between the pipette carrying the flow and the free surface of the liquid in the beaker. The uncertainty of the holographic µPTV measurements did not vary significantly over the measured flow range, and these measurements are thus especially useful at low flow velocities.
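The Hagen-Poiseuille solution for a rectangular channel is a classic series expression relating pressure drop to volumetric flow. A sketch of that conversion (generic formula, not the authors' code; dimensions and fluid properties below are illustrative):

```python
import math

def rectangular_channel_flow(dp, length, width, height, mu, terms=20):
    """Volumetric flow (m^3/s) for pressure-driven laminar flow in a
    rectangular duct (height <= width), from the classic series solution:
    Q = (h^3 w dp)/(12 mu L) * [1 - (192 h)/(pi^5 w) * sum over odd n of
    tanh(n pi w / (2h)) / n^5]."""
    series = sum(math.tanh(n * math.pi * width / (2.0 * height)) / n**5
                 for n in range(1, 2 * terms, 2))
    correction = 1.0 - (192.0 * height / (math.pi**5 * width)) * series
    return (height**3 * width * dp) / (12.0 * mu * length) * correction
```

For a square cross-section the bracketed correction evaluates to about 0.42, while for very wide, shallow channels it approaches 1 and the formula reduces to the parallel-plate result.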

  19. The RUBA Watchdog Video Analysis Tool

    DEFF Research Database (Denmark)

    Bahnsen, Chris Holmberg; Madsen, Tanja Kidholm Osmann; Jensen, Morten Bornø

    We have developed a watchdog video analysis tool called RUBA (Road User Behaviour Analysis) to use for processing of traffic video. This report provides an overview of the functions of RUBA and gives a brief introduction into how analyses can be made in RUBA.

  20. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the pediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). The content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between computerized and manual (traditional) methods. The development of these tools contributes to filling existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allowed the analysis of different corpora.

  1. PIE Nacelle Flow Analysis and TCA Inlet Flow Quality Assessment

    Science.gov (United States)

    Shieh, C. F.; Arslan, Alan; Sundaran, P.; Kim, Suk; Won, Mark J.

    1999-01-01

    This presentation includes three topics: (1) Analysis of isolated boattail drag; (2) Computation of Technology Concept Airplane (TCA)-installed nacelle effects on aerodynamic performance; and (3) Assessment of TCA inlet flow quality.

  2. Photogrammetry Tool for Forensic Analysis

    Science.gov (United States)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, a laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and its associated software work well under certain conditions, but making use of a full 3D photogrammetry system requires a different approach. When using multiple cubes whose locations relative to each other are unknown, the procedure to merge the data from each cube is as follows: (1) mark a reference point on cube 1, then mark points on cube 2 as unknowns; this locates cube 2 in cube 1's coordinate system. (2) Mark reference points on cube 2, then mark points on cube 1 as unknowns; this locates cube 1 in cube 2's coordinate system. (3) Continue this procedure for all combinations of cubes. (4) Merge the coordinates of all of the found coordinate systems into a single global coordinate system. To achieve maximum accuracy, measurements are made in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used; when measuring the location of objects relative to a global coordinate system, the merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration for objects near a cube.
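The merging step amounts to chaining rigid transforms: once cube 2's pose is known in cube 1's frame, any point expressed in cube 2's frame can be mapped into cube 1's (global) frame by composition. A minimal 2D sketch of that bookkeeping (the real system works in 3D with rotation matrices or quaternions):

```python
import math

def compose(t_ab, t_bc):
    """Compose two planar rigid transforms (theta, tx, ty), where t_ab maps
    frame B coordinates into frame A. The result maps frame C into frame A:
    rotation angles add, and t_bc's translation is rotated into frame A."""
    th1, x1, y1 = t_ab
    th2, x2, y2 = t_bc
    c, s = math.cos(th1), math.sin(th1)
    return (th1 + th2, x1 + c * x2 - s * y2, y1 + s * x2 + c * y2)

def apply(t, point):
    """Map a point from the transform's source frame into its target frame."""
    th, tx, ty = t
    c, s = math.cos(th), math.sin(th)
    x, y = point
    return (c * x - s * y + tx, s * x + c * y + ty)
```

Composing pairwise cube-to-cube transforms along any chain back to a chosen reference cube yields each cube's pose in the single global coordinate system.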

  3. Clustering and Flow Conservation Monitoring Tool for Software Defined Networks

    Directory of Open Access Journals (Sweden)

    Jesús Antonio Puente Fernández

    2018-04-01

    Full Text Available Prediction systems present challenges on two fronts: on the one hand, the relation between video quality and observed session features; on the other, dynamic changes in video quality. Software Defined Networks (SDN) is a new concept of network architecture that provides the separation of the control plane (controller) and data plane (switches) in network devices. Due to the existence of the southbound interface, it is possible to deploy monitoring tools to obtain the network status and retrieve a collection of statistics. Therefore, achieving the most accurate statistics depends on a strategy for monitoring and requesting information from network devices. In this paper, we propose an enhanced algorithm for requesting statistics to measure the traffic flow in SDN networks. The algorithm is based on grouping network switches into clusters according to their number of ports in order to apply different monitoring techniques. Such grouping avoids monitoring queries to network switches with common characteristics and thus omits redundant information. In this way, the present proposal decreases the number of monitoring queries to switches, improving the network traffic and preventing switch overload. We have tested our optimization in a video streaming simulation using different types of videos. The experiments and comparison with traditional monitoring techniques demonstrate the feasibility of our proposal, maintaining similar values while decreasing the number of queries to the switches.

  4. Clustering and Flow Conservation Monitoring Tool for Software Defined Networks.

    Science.gov (United States)

    Puente Fernández, Jesús Antonio; García Villalba, Luis Javier; Kim, Tai-Hoon

    2018-04-03

    Prediction systems present challenges on two fronts: on the one hand, the relation between video quality and observed session features; on the other, dynamic changes in video quality. Software Defined Networks (SDN) is a new concept of network architecture that provides the separation of the control plane (controller) and data plane (switches) in network devices. Due to the existence of the southbound interface, it is possible to deploy monitoring tools to obtain the network status and retrieve a collection of statistics. Therefore, achieving the most accurate statistics depends on a strategy for monitoring and requesting information from network devices. In this paper, we propose an enhanced algorithm for requesting statistics to measure the traffic flow in SDN networks. The algorithm is based on grouping network switches into clusters according to their number of ports in order to apply different monitoring techniques. Such grouping avoids monitoring queries to network switches with common characteristics and thus omits redundant information. In this way, the present proposal decreases the number of monitoring queries to switches, improving the network traffic and preventing switch overload. We have tested our optimization in a video streaming simulation using different types of videos. The experiments and comparison with traditional monitoring techniques demonstrate the feasibility of our proposal, maintaining similar values while decreasing the number of queries to the switches.
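    The port-count clustering idea can be illustrated with a short sketch (this is not the authors' implementation; switch IDs and port counts are invented):

```python
from collections import defaultdict

# Illustrative sketch: group switches into clusters by their number of
# ports, then send one statistics request per cluster representative
# instead of polling every switch individually.

switches = {               # hypothetical switch-id -> port count
    "s1": 4, "s2": 4, "s3": 8, "s4": 8, "s5": 8, "s6": 2,
}

clusters = defaultdict(list)
for sw, ports in switches.items():
    clusters[ports].append(sw)

# one monitoring query per cluster rather than per switch
queries = [members[0] for members in clusters.values()]
print(f"{len(switches)} switches -> {len(queries)} monitoring queries")
```

    With six switches falling into three port-count clusters, the number of monitoring queries drops from six to three, which is the query-reduction effect the abstract describes.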

  5. A GIS-based Computational Tool for Multidimensional Flow Velocity by Acoustic Doppler Current Profilers

    International Nuclear Information System (INIS)

    Kim, D; Winkler, M; Muste, M

    2015-01-01

    Acoustic Doppler Current Profilers (ADCPs) provide efficient and reliable measurements of riverine flow characteristics compared to other tools. In addition to the originally targeted discharge measurements, ADCPs are increasingly utilized to assess river flow characteristics. The newly developed VMS (Velocity Mapping Software) aims at providing an efficient process for quality assurance, mapping velocity vectors for visualization, and facilitating comparison with physical and numerical model results. VMS was designed to provide efficient and smooth workflows for processing groups of transects. The software allows the user to select a group of files and subsequently to conduct statistical and graphical quality assurance on the files as a group or individually, as appropriate. VMS also enables spatial averaging in the horizontal and vertical planes for ADCP data in single or multiple transects over the same or consecutive cross sections. The analysis results are displayed in numerical and graphical formats. (paper)

  6. Multifractal Analysis for the Teichmueller Flow

    Energy Technology Data Exchange (ETDEWEB)

    Meson, Alejandro M., E-mail: meson@iflysib.unlp.edu.ar; Vericat, Fernando, E-mail: vericat@iflysib.unlp.edu.ar [Instituto de Fisica de Liquidos y Sistemas Biologicos (IFLYSIB) CCT-CONICET, La Plata-UNLP and Grupo de Aplicaciones Matematicas y Estadisticas de la Facultad de Ingenieria (GAMEFI) UNLP (Argentina)

    2012-03-15

    We present a multifractal description of Teichmueller flows. A key ingredient is the Rauzy-Veech-Zorich reduction theory, which allows the problem to be treated in the setting of suspension flows over subshifts. To perform the multifractal analysis we implement a thermodynamic formalism for suspension flows over countable alphabet subshifts that differs slightly from the one developed by Barreira and Iommi.

  7. SBAT. A stochastic BPMN analysis tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe...... a formalised variant of this language extended to support the addition of intention preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest...

  8. Random signal tomographical analysis of two-phase flow

    International Nuclear Information System (INIS)

    Han, P.; Wesser, U.

    1990-01-01

    This paper reports on radiation tomography, which is a useful tool for studying the internal structures of two-phase flow. However, general tomography analysis gives only time-averaged results, hence much information is lost. As a result, it is sometimes difficult to identify the flow regime; for example, the time-averaged picture does not significantly change as an annular flow develops from a slug flow. A two-phase flow diagnostic technique based on random signal tomographical analysis is developed. It extracts more information by studying the statistical variation of the measured signal with time. Local statistical parameters, including mean value, variance, skewness, flatness, etc., are reconstructed from the information obtained by a general tomography technique. More important information is provided by the results. Not only can the void fraction be easily calculated, but the flow pattern can also be identified more objectively and more accurately. The experimental setup is introduced. It consisted of a two-phase flow loop, an X-ray system, a fan-like five-beam detector system, and a signal acquisition and processing system. In the experiment, for both horizontal and vertical test sections (aluminum and steel tubes with Di/Do = 40/45 mm), different flow situations are realized by independently adjusting air and water mass flow. Through a glass tube connected with the test section, some typical flow patterns are visualized and compared with the reconstruction results.
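    The local statistical parameters named above (mean value, variance, skewness, flatness) can be computed for one beam's time series as follows; this is an illustrative sketch on synthetic data, not the paper's reconstruction code:

```python
import math

# For the time series measured along one detector beam, compute the four
# moments used to characterize flow-regime fluctuations. The signal here
# is a synthetic two-level oscillation mimicking intermittent flow.

def moments(x):
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    std = math.sqrt(var)
    skew = sum((v - mean) ** 3 for v in x) / (n * std ** 3)
    flat = sum((v - mean) ** 4 for v in x) / (n * var ** 2)  # kurtosis
    return mean, var, skew, flat

signal = [0.1, 0.9, 0.2, 0.8, 0.15, 0.85, 0.1, 0.9]
mean, var, skew, flat = moments(signal)
print(f"mean={mean:.3f} var={var:.4f} skew={skew:.3f} flatness={flat:.3f}")
```

    A flatness well below 3 (the Gaussian value), as for this bimodal signal, is the kind of statistical signature that helps distinguish intermittent regimes such as slug flow from steadier ones.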

  9. Intuitive Visualization of Transient Flow: Towards a Full 3D Tool

    Science.gov (United States)

    Michel, Isabel; Schröder, Simon; Seidel, Torsten; König, Christoph

    2015-04-01

    Currently, STRING can generate animations of single 2D cuts, either planar or curved surfaces, through 3D simulation domains. To provide a general tool for experts that also enables direct exploration and analysis of large 3D flow fields, the software needs to be extended with intuitive as well as interactive visualizations of entire 3D flow domains. The current research on this project, which is funded by the Federal Ministry for Economic Affairs and Energy (Germany), is presented.

  10. ANALYSIS AND ACCOUNTING OF TOTAL CASH FLOW

    Directory of Open Access Journals (Sweden)

    MELANIA ELENA MICULEAC

    2012-01-01

    Full Text Available In order to reach the objective of supplying relevant information regarding the liquidity inflows and outflows during a financial exercise, the total cash flow analysis must include the analysis of the cashable result from operations, of payments and receipts related to investment and financing decisions of the last exercise, as well as the analysis of treasury variation (of cash items). The management of total cash flows ensures the correlation of current liquidity flows resulting from receipts with the flows of payments, in order to ensure continuity of payment of matured obligations.
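    The treasury-variation identity described above reduces to a one-line computation (figures invented for illustration):

```python
# Minimal illustration of the total cash flow identity: the treasury
# (cash) variation over an exercise equals the sum of the operating,
# investing and financing net flows.

flows = {"operating": 120.0, "investing": -75.0, "financing": 30.0}
treasury_variation = sum(flows.values())
print(f"treasury variation = {treasury_variation:+.1f}")  # +75.0
```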

  11. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
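    As a minimal illustration of the least squares method the book develops, here is a one-predictor fit via the closed-form normal-equation solution (data are synthetic):

```python
# Fit y = b0 + b1 * x by ordinary least squares for a single predictor,
# using the closed-form slope and intercept formulas.

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 4.1, 5.9, 8.2, 9.9]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
# slope = S_xy / S_xx, intercept from the point (mx, my)
b1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
      / sum((xi - mx) ** 2 for xi in x))
b0 = my - b1 * mx
print(f"y = {b0:.3f} + {b1:.3f} x")
```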

  12. Basic Functional Analysis Puzzles of Spectral Flow

    DEFF Research Database (Denmark)

    Booss-Bavnbek, Bernhelm

    2011-01-01

    We explain an array of basic functional analysis puzzles on the way to general spectral flow formulae and indicate a direction of future topological research for dealing with these puzzles.

  13. Modular Control Flow Analysis for Libraries

    DEFF Research Database (Denmark)

    Probst, Christian W.

    2002-01-01

    One problem in analyzing object oriented languages is that the exact control flow graph is not known statically due to dynamic dispatching. However, this is needed in order to apply the large class of known interprocedural analyses. Control Flow Analysis in the object oriented setting aims....

  14. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.

    1993-05-01

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties.

  15. Analysis tools for discovering strong parity violation at hadron colliders

    International Nuclear Information System (INIS)

    Backovic, Mihailo; Ralston, John P.

    2011-01-01

    Several arguments suggest parity violation may be observable in high energy strong interactions. We introduce new analysis tools to describe the azimuthal dependence of multiparticle distributions, or 'azimuthal flow'. Analysis uses the representations of the orthogonal group O(2) and dihedral groups D_N necessary to define parity completely in two dimensions. Classification finds that collective angles used in event-by-event statistics represent inequivalent tensor observables that cannot generally be represented by a single 'reaction plane'. Many new parity-violating observables exist that have never been measured, while many parity-conserving observables formerly lumped together are now distinguished. We use the concept of 'event-shape sorting' to suggest separating right- and left-handed events, and we discuss the effects of transverse and longitudinal spin. The analysis tools are statistically robust, and can be applied equally to low or high multiplicity events at the Tevatron, RHIC or RHIC Spin, and the LHC.

  16. Paramedir: A Tool for Programmable Performance Analysis

    Science.gov (United States)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  17. Content analysis in information flows

    Energy Technology Data Exchange (ETDEWEB)

    Grusho, Alexander A. [Institute of Informatics Problems of Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, Vavilova str., 44/2, Moscow (Russian Federation); Faculty of Computational Mathematics and Cybernetics, Moscow State University, Moscow (Russian Federation); Grusho, Nick A.; Timonina, Elena E. [Institute of Informatics Problems of Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, Vavilova str., 44/2, Moscow (Russian Federation)

    2016-06-08

    The paper deals with the architecture of a content recognition system. To analyze the problem, a stochastic model of content recognition in information flows was built. We proved that under certain conditions it is possible to solve a part of the problem correctly with probability 1 by viewing a finite section of the information flow. This means that a good architecture consists of two steps. The first step correctly determines certain subsets of contents, while the second step may demand much more time for a true decision.

  18. The CANDU alarm analysis tool (CAAT)

    Energy Technology Data Exchange (ETDEWEB)

    Davey, E C; Feher, M P; Lupton, L R [Control Centre Technology Branch, ON (Canada)

    1997-09-01

    AECL undertook the development of a software tool to assist alarm system designers and maintainers based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: Reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and, make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules, and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs.

  19. The CANDU alarm analysis tool (CAAT)

    International Nuclear Information System (INIS)

    Davey, E.C.; Feher, M.P.; Lupton, L.R.

    1997-01-01

    AECL undertook the development of a software tool to assist alarm system designers and maintainers based on feedback from several utilities and design groups. The software application is called the CANDU Alarm Analysis Tool (CAAT) and is being developed to: Reduce by one half the effort required to initially implement and commission alarm system improvements; improve the operational relevance, consistency and accuracy of station alarm information; record the basis for alarm-related decisions; provide printed reports of the current alarm configuration; and, make day-to-day maintenance of the alarm database less tedious and more cost-effective. The CAAT assists users in accessing, sorting and recording relevant information, design rules, and decisions, and provides reports in support of alarm system maintenance, analysis of design changes, or regulatory inquiry. The paper discusses the need for such a tool, outlines the application objectives and principles used to guide tool development, describes how specific tool features support user design and maintenance tasks, and relates the lessons learned from early application experience. (author). 4 refs, 2 figs

  20. Gaseous slip flow analysis of a micromachined flow sensor for ultra small flow applications

    OpenAIRE

    Jang, Jaesung; Wereley, Steven

    2007-01-01

    The velocity slip of a fluid at a wall is one of the most typical phenomena in microscale gas flows. This paper presents a flow analysis considering the velocity slip in a capacitive micro gas flow sensor based on pressure difference measurements along a microchannel. The tangential momentum accommodation coefficient (TMAC) measurements of a particular channel wall in planar microchannels will be presented, while previous micro gas flow studies have been based on the same TMACs on both wal...

  1. Decision Analysis Tools for Volcano Observatories

    Science.gov (United States)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
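    The multiple-branch event-tree idea underlying the first tool can be sketched as multiplying conditional probabilities along a branch and reading off the probability of the outcome of interest; the branch names and probabilities below are purely illustrative and not taken from the EXPLORIS project:

```python
# One branch of a hypothetical volcanic event tree: each entry is the
# conditional probability of that node given its parent. The probability
# of the leaf outcome is the product along the branch.

branch = {
    "unrest": 1.0,
    "eruption | unrest": 0.3,
    "explosive | eruption": 0.4,
    "flows reach town | explosive": 0.1,
}

p_impact = 1.0
for p in branch.values():
    p_impact *= p
print(f"P(flows reach town) = {p_impact:.4f}")
```

    A full event tree sums such products over every branch leading to the loss scenario, which is what makes the architecture useful for risk-informed planning of exclusion zones.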

  2. Robust-mode analysis of hydrodynamic flows

    Science.gov (United States)

    Roy, Sukesh; Gord, James R.; Hua, Jia-Chen; Gunaratne, Gemunu H.

    2017-04-01

    The emergence of techniques to extract high-frequency high-resolution data introduces a new avenue for modal decomposition to assess the underlying dynamics, especially of complex flows. However, this task requires the differentiation of robust, repeatable flow constituents from noise and other irregular features of a flow. Traditional approaches involving low-pass filtering and principal component analysis have shortcomings. The approach outlined here, referred to as robust-mode analysis, is based on Koopman decomposition. Three applications to (a) a counter-rotating cellular flame state, (b) variations in financial markets, and (c) turbulent injector flows are provided.

  3. Two-phase flow characteristics analysis code: MINCS

    International Nuclear Information System (INIS)

    Watanabe, Tadashi; Hirano, Masashi; Akimoto, Masayuki; Tanabe, Fumiya; Kohsaka, Atsuo.

    1992-03-01

    The two-phase flow characteristics analysis code MINCS (Modularized and INtegrated Code System) has been developed to provide a computational tool for analyzing two-phase flow phenomena in one-dimensional ducts. In MINCS, nine types of two-phase flow models, from a basic two-fluid nonequilibrium (2V2T) model to a simple homogeneous equilibrium (1V1T) model, can be used under the same numerical solution method. The numerical technique is based on the implicit finite difference method to enhance numerical stability. The code structure is highly modularized, so that new constitutive relations and correlations can be easily implemented into the code and hence evaluated. A flow pattern can be fixed regardless of flow conditions, and state equations or steam tables can be selected. It is, therefore, easy to calculate physical or numerical benchmark problems. (author)

  4. Computational analysis of the flow field downstream of flow conditioners

    Energy Technology Data Exchange (ETDEWEB)

    Erdal, Asbjoern

    1997-12-31

    Technological innovations are essential for maintaining the competitiveness of gas companies, and metering technology is one important area. This thesis shows that computational fluid dynamics techniques can be a valuable tool for examining several parameters that may affect the performance of a flow conditioner (FC). Previous design methods, such as screen theory, could not provide a fundamental understanding of how an FC works. The thesis shows, among other things, that the flow pattern through a complex geometry, like a 19-hole plate FC, can be simulated with good accuracy by a k-{epsilon} turbulence model. The calculations illuminate how variations in pressure drop, overall porosity, grading of porosity across the cross-section, and the number of holes affect the performance of FCs. These questions have been studied experimentally by researchers for a long time. Now an understanding of the important mechanisms behind efficient FCs emerges from the predictions. 179 ref., 110 figs., 8 tabs.

  5. Flow Analysis for the Falkner–Skan Wedge Flow

    DEFF Research Database (Denmark)

    Bararnia, H; Haghparast, N; Miansari, M

    2012-01-01

    In this article an analytical technique, namely the homotopy analysis method (HAM), is applied to solve the momentum and energy equations in the case of a two-dimensional incompressible flow passing over a wedge. The trial and error method and Padé approximation strategies have been used to obtai...

  6. Channel flow analysis. [velocity distribution throughout blade flow field

    Science.gov (United States)

    Katsanis, T.

    1973-01-01

    The design of a proper blade profile requires calculation of the blade row flow field in order to determine the velocities on the blade surfaces. An analysis theory is presented for several methods used for this calculation and associated computer programs that were developed are discussed.

  7. flowAI: automatic and interactive anomaly discerning tools for flow cytometry data.

    Science.gov (United States)

    Monaco, Gianni; Chen, Hao; Poidinger, Michael; Chen, Jinmiao; de Magalhães, João Pedro; Larbi, Anis

    2016-08-15

    Flow cytometry (FCM) is widely used in both clinical and basic research to characterize cell phenotypes and functions. The latest FCM instruments analyze up to 20 markers of individual cells, producing high-dimensional data. This requires the use of the latest clustering and dimensionality reduction techniques to automatically segregate cell sub-populations in an unbiased manner. However, automated analyses may lead to false discoveries due to inter-sample differences in quality and properties. We present an R package, flowAI, containing two methods to clean FCM files from unwanted events: (i) an automatic method that adopts algorithms for the detection of anomalies and (ii) an interactive method with a graphical user interface implemented into an R shiny application. The general approach behind the two methods consists of three key steps to check and remove suspected anomalies that derive from (i) abrupt changes in the flow rate, (ii) instability of signal acquisition and (iii) outliers in the lower limit and margin events in the upper limit of the dynamic range. For each file analyzed our software generates a summary of the quality assessment from the aforementioned steps. The software presented is an intuitive solution seeking to improve the results not only of manual but also and in particular of automatic analysis on FCM data. R source code available through Bioconductor: http://bioconductor.org/packages/flowAI/ CONTACTS: mongianni1@gmail.com or Anis_Larbi@immunol.a-star.edu.sg Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
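    Step (i) of the cleaning approach, detecting abrupt changes in the flow rate, can be sketched as flagging acquisition bins whose event rate deviates strongly from the median rate; the threshold and data below are illustrative and are not flowAI's actual algorithm or defaults:

```python
# Flag acquisition bins whose event rate deviates from the median rate
# by more than a fractional tolerance, marking them for removal.

def flag_flow_rate(events_per_bin, tol=0.5):
    """Return indices of bins whose rate deviates > tol (fractional)."""
    ordered = sorted(events_per_bin)
    median = ordered[len(ordered) // 2]
    return [i for i, r in enumerate(events_per_bin)
            if abs(r - median) > tol * median]

rates = [100, 98, 103, 250, 101, 99, 12, 102]  # two anomalous bins
print(flag_flow_rate(rates))  # [3, 6]
```

    Events falling in the flagged bins would then be excluded before clustering or dimensionality reduction, which is the intent of the quality-control step the abstract describes.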

  8. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan

    2012-11-01

    Full Text Available This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to code the content of the sampled chapters of the books inductively. After that the findings from this coding process were combined with principles of text analysis as derived from the literature, specifically focusing on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we forward in the article is for systematic and well theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs130170

  9. The analysis of exergy and cash flow

    International Nuclear Information System (INIS)

    Weimin, H.

    1989-01-01

    The paper presents an analysis of the economic content of the exergy parameter and a thermodynamic analogy for the analysis of cash flow, and sets out reasonable foundations for the analysis of heat economy. Ideas for optimum design combining heat economic analysis and investment policy are also put forward

  10. Security constrained optimal power flow by modern optimization tools

    African Journals Online (AJOL)

    The main objective of optimal power flow (OPF) functions is to optimize .... It is characterized as the propagation of plants, and this happens by the union of gametes. ... ss and different variables, for example, wind, nearby fertilization can have a critic.

  11. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. Five evaluations are available: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for a random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
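    One of the evaluations listed, closed-loop eigenvalues, can be sketched for a 2x2 state-feedback example via the characteristic polynomial; the matrices and gains below are invented and MATRIXx itself is not used:

```python
import cmath

# Closed-loop eigenvalues of Acl = A - B*K for a 2x2 state matrix,
# computed from trace and determinant (roots of s^2 - tr*s + det).

A = [[0.0, 1.0], [-2.0, -3.0]]   # open-loop state matrix (hypothetical)
B = [0.0, 1.0]                   # input column
K = [2.0, 1.0]                   # state-feedback gains (hypothetical)

Acl = [[A[i][j] - B[i] * K[j] for j in range(2)] for i in range(2)]
tr = Acl[0][0] + Acl[1][1]
det = Acl[0][0] * Acl[1][1] - Acl[0][1] * Acl[1][0]
disc = cmath.sqrt(tr * tr - 4 * det)
eigs = ((tr + disc) / 2, (tr - disc) / 2)
print(eigs)  # negative real parts indicate a stable closed loop
```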

  12. Flow Injection Analysis in Industrial Biotechnology

    DEFF Research Database (Denmark)

    Hansen, Elo Harald; Miró, Manuel

    2009-01-01

    Flow injection analysis (FIA) is an analytical chemical continuous-flow (CF) method which in contrast to traditional CF-procedures does not rely on complete physical mixing (homogenisation) of the sample and the reagent(s) or on attaining chemical equilibria of the chemical reactions involved. Ex...

  13. GenFlow: generic flow for integration, management and analysis of molecular biology data

    Directory of Open Access Journals (Sweden)

    Marcio Katsumi Oikawa

    2004-01-01

    Full Text Available A large number of DNA sequencing projects all over the world have yielded a fantastic amount of data, whose analysis is currently a big challenge for computational biology. The limiting step in this task is the integration of large volumes of data stored in highly heterogeneous repositories of genomic and cDNA sequences, as well as gene expression results. Solving this problem requires automated analytical tools to optimize operations and efficiently generate knowledge. This paper presents an information flow model, called GenFlow, that can tackle this analytical task.

  14. Flow analysis techniques for phosphorus: an overview.

    Science.gov (United States)

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review on the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely: segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also carried out. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are herein classified according to the instrumental detection technique used, with the aim of facilitating their study and providing an overall scope. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soil extracts and cyanobacterial biofilms, is tabulated.

  15. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    Full Text Available The polishing process is one of the most critical manufacturing processes in metal part production because it determines the final quality of the product. Free-form surface polishing is a manual process with many rejected parts, scrap generation and high time and energy consumption. Two research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed as a function of the polishing parameters in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive at the front, mounted at the end of a robot. A tool-to-part concept is used, which is useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling, aeronautics or automotive industries. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. A smaller amount of material can be removed in controlled areas of a three-dimensional workpiece.
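
    A simple way to see what such a material-removal model looks like is Preston's classic law, in which the removed depth is proportional to contact pressure, relative velocity and dwell time. The sketch below is illustrative only: the record does not reproduce the authors' fitted model, and all parameter values here are hypothetical.

    ```python
    def preston_removal(k_p, pressure, velocity, dwell_time):
        """Preston's law for polishing material removal (a classic model of the
        kind fitted in such studies; not the paper's own model):
        removed depth h = k_p * p * v * t."""
        return k_p * pressure * velocity * dwell_time

    # Hypothetical parameters: k_p in m^2/N, p in Pa, v in m/s, t in s
    h = preston_removal(k_p=1e-13, pressure=2e5, velocity=0.5, dwell_time=30)
    print(h * 1e6)  # removed depth in micrometres
    ```

    Fitting the Preston coefficient k_p per abrasive/material pair is then an ordinary regression of measured removal against p·v·t.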

  16. Modified and reverse radiometric flow injection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Myint, U; Ba, H; Khin, M M; Aung, K; Thida, [Yangon Univ. (Myanmar). Dept. of Chemistry; Toelgyessy, J [Slovak Technical Univ., Bratislava (Slovakia). Dept. of Environmental Science

    1994-06-01

    Determination of [sup 137]Cs and [sup 60]Co by using modified and reverse radiometric flow injection analysis is described. Two-component RFIA was also realized using [sup 60]Co and [sup 137]Cs radionuclides. (author) 2 refs.; 5 figs.

  17. Analysis of stratified flow mixing

    International Nuclear Information System (INIS)

    Soo, S.L.; Lyczkowski, R.W.

    1985-01-01

    The Creare 1/5-scale Phase II experiments, which model fluid and thermal mixing of relatively cold high-pressure injection (HPI) water into a cold leg of a full-scale pressurized water reactor (PWR) with loop flow, are analyzed, and it is found that they cannot achieve complete similarity with respect to the characteristic Reynolds and Froude numbers and the developing hydrodynamic entry length. Several analyses show that these experiments fall into two distinct regimes of mixing: momentum controlled and gravity controlled (stratification). 18 refs., 9 figs

  18. Debris flow early warning systems in Norway: organization and tools

    Science.gov (United States)

    Kleivane, I.; Colleuille, H.; Haugen, L. E.; Alve Glad, P.; Devoli, G.

    2012-04-01

    In Norway, shallow slides and debris flows occur as a combination of high-intensity precipitation, snowmelt, high groundwater levels and saturated soil. Many events have occurred in recent decades, often associated with flood events, especially in southern Norway, causing significant damage to roads, railway lines, buildings and other infrastructure (e.g. November 2000; August 2003; September 2005; November 2005; May 2008; June and December 2011). Since 1989 the Norwegian Water Resources and Energy Directorate (NVE) has operated a 24-hour flood forecasting service for the entire country. Since 2009 NVE has also been responsible for assisting regions and municipalities in the prevention of disasters posed by landslides and snow avalanches. Besides assisting the municipalities through the implementation of digital landslide inventories, susceptibility and hazard mapping, areal planning, preparation of guidelines, realization of mitigation measures and help during emergencies, NVE is developing a regional-scale debris flow warning system that uses hydrological models already available in the flood warning system. It is well known that rainfall thresholds alone are not sufficient to evaluate the hazard of debris flows and shallow slides, and that soil moisture conditions play a crucial role in the triggering conditions. Information on simulated soil and groundwater conditions and on water supply (rain and snowmelt) based on weather forecasts has proved to be a useful indicator of the potential occurrence of debris flows and shallow slides. Forecasts of runoff and freezing-thawing are also valuable information. The early warning system uses real-time measurements (discharge; groundwater level; soil water content and soil temperature; snow water equivalent; meteorological data) and model simulations (a spatially distributed version of the HBV model and an adapted version of a 1-D soil water and energy balance

  19. Whole cell quenched flow analysis.

    Science.gov (United States)

    Chiang, Ya-Yu; Haeri, Sina; Gizewski, Carsten; Stewart, Joanna D; Ehrhard, Peter; Shrimpton, John; Janasek, Dirk; West, Jonathan

    2013-12-03

    This paper describes a microfluidic quenched flow platform for the investigation of ligand-mediated cell surface processes with unprecedented temporal resolution. A roll-slip behavior caused by cell-wall-fluid coupling was documented and acts to minimize the compression and shear stresses experienced by the cell. This feature enables high-velocity (100-400 mm/s) operation without impacting the integrity of the cell membrane. In addition, rotation generates localized convection paths. This cell-driven micromixing effect causes the cell to become rapidly enveloped with ligands to saturate the surface receptors. High-speed imaging of the transport of a Janus particle and fictitious domain numerical simulations were used to predict millisecond-scale biochemical switching times. Dispersion in the incubation channel was characterized by microparticle image velocimetry and minimized by using a horizontal Hele-Shaw velocity profile in combination with vertical hydrodynamic focusing to achieve highly reproducible incubation times (CV = 3.6%). Microfluidic quenched flow was used to investigate the pY1131 autophosphorylation transition in the type I insulin-like growth factor receptor (IGF-1R). This predimerized receptor undergoes autophosphorylation within 100 ms of stimulation. Beyond this demonstration, the extreme temporal resolution can be used to gain new insights into the mechanisms underpinning a tremendous variety of important cell surface events.

  20. LDV measurement, flow visualization and numerical analysis of flow distribution in a close-coupled catalytic converter

    International Nuclear Information System (INIS)

    Kim, Duk Sang; Cho, Yong Seok

    2004-01-01

    Results from an experimental study of flow distribution in a close-coupled catalytic converter (CCC) are presented. The experiments were carried out with a flow measurement system specially designed for this study under steady and transient flow conditions. A pitot tube was used to measure the flow distribution at the exit of the first monolith; the flow distribution of the CCC was also measured by an LDV system and by flow visualization. Results from numerical analysis are also presented. Experimental results showed that the flow uniformity index decreases as the flow Reynolds number increases. Under steady flow conditions, the flow through each exhaust pipe produced flow concentrations in a specific region of the CCC inlet. The transient tests showed that the flows through the individual exhaust pipes, following the engine firing order, interacted with each other so that the flow distribution became uniform. The results of the numerical analysis agreed qualitatively with the experimental results, and supported and helped explain the flow in the entry region of the CCC.
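
    The flow uniformity index quoted in such converter studies is commonly defined from the deviation of local velocities from their mean; the exact formula is not given in the record, so the Weltens-type definition used below is an assumption.

    ```python
    import numpy as np

    def uniformity_index(v):
        """Flow uniformity index gamma (Weltens-type definition, assumed here):
        gamma = 1 - sum(|v_i - v_mean|) / (2 * n * v_mean).
        gamma = 1 for perfectly uniform flow; lower values mean maldistribution."""
        v = np.asarray(v, dtype=float)
        v_mean = v.mean()
        return 1.0 - np.abs(v - v_mean).sum() / (2.0 * len(v) * v_mean)

    uniform = uniformity_index([2.0, 2.0, 2.0, 2.0])  # perfectly uniform -> 1.0
    skewed = uniformity_index([4.0, 2.0, 1.0, 1.0])   # maldistributed -> 0.75
    print(uniform, skewed)
    ```

    Applied to the pitot-tube velocities over the monolith exit, this gives a single scalar that can be tracked against Reynolds number.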

  1. BBAT: Bunch and bucket analysis tool

    International Nuclear Information System (INIS)

    Deng, D.P.

    1995-01-01

    BBAT is written to meet the need for an interactive graphical tool to explore longitudinal phase space. It is designed for quickly testing new ideas or tricks, and is especially suitable for machine physicists and operations staff, both in the control room during machine studies and off-line for data analysis. The heart of the package is a set of C routines that do the number crunching. The graphics part is wired with the scripting language Tcl/Tk and BLT. The C routines are general enough that one can write new applications, such as an animation of the bucket as a machine parameter is varied via a sliding scale. BBAT deals with a single rf system. For a double rf system, one can use Dr. BBAT, which stands for Double rf Bunch and Bucket Analysis Tool. One use of Dr. BBAT is to visualize the process of bunch coalescing and flat bunch creation.
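
    The longitudinal phase space such a tool explores is governed by the standard single-rf one-turn map. The toy tracker below is not BBAT's C code, and the machine parameters are made up; it only shows the kind of bounded synchrotron motion inside a stationary bucket that BBAT visualizes.

    ```python
    import math

    def track(phi0, delta0, n_turns, *, h_eta=0.01, kick=1e-3, phi_s=0.0):
        """One-turn longitudinal map for a single-rf stationary bucket
        (illustrative parameters, not from BBAT):
          phase slip:   phi   += 2*pi * h*eta * delta
          energy kick:  delta += kick * (sin(phi_s) - sin(phi))
        Returns the (phi, delta) trajectory, turn by turn."""
        phi, delta = phi0, delta0
        traj = []
        for _ in range(n_turns):
            phi += 2.0 * math.pi * h_eta * delta
            delta += kick * (math.sin(phi_s) - math.sin(phi))
            traj.append((phi, delta))
        return traj

    # A particle launched inside the bucket oscillates about phi_s = 0
    traj = track(0.5, 0.0, 5000)
    print(max(abs(p) for p, _ in traj))
    ```

    Scanning `h_eta` or `kick` over turns is the map-level analogue of BBAT's sliding-scale bucket animation.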

  2. Systematic Evaluation of Uncertainty in Material Flow Analysis

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2014-01-01

    Material flow analysis (MFA) is a tool to investigate material flows and stocks in defined systems as a basis for resource management or environmental pollution control. Because of the diverse nature of sources and the varying quality and availability of data, MFA results are inherently uncertain....... Uncertainty analyses have received increasing attention in recent MFA studies, but systematic approaches for selection of appropriate uncertainty tools are missing. This article reviews existing literature related to handling of uncertainty in MFA studies and evaluates current practice of uncertainty analysis......) and exploratory MFA (identification of critical parameters and system behavior). Whereas mathematically simpler concepts focusing on data uncertainty characterization are appropriate for descriptive MFAs, statistical approaches enabling more-rigorous evaluation of uncertainty and model sensitivity are needed...
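
    One of the simpler uncertainty tools reviewed in such studies is Monte Carlo propagation of input uncertainty through the mass balance. The toy model below is a one-process MFA with hypothetical flows and transfer coefficients (not numbers from the article), of the kind appropriate for a descriptive MFA.

    ```python
    import random
    import statistics

    def mfa_monte_carlo(n_runs=20000, seed=1):
        """Monte Carlo uncertainty propagation for a toy one-process MFA:
        an input flow splits into a recycled and a landfilled flow via an
        uncertain transfer coefficient (all figures hypothetical)."""
        random.seed(seed)
        landfilled = []
        for _ in range(n_runs):
            inflow = random.gauss(100.0, 10.0)    # kt/yr, 10% standard deviation
            tc_recycle = random.gauss(0.6, 0.05)  # transfer coefficient to recycling
            landfilled.append(inflow * (1.0 - tc_recycle))
        return statistics.mean(landfilled), statistics.stdev(landfilled)

    mean, std = mfa_monte_carlo()
    print(f"landfilled flow: {mean:.1f} +/- {std:.1f} kt/yr")
    ```

    For exploratory MFA, the same runs can be reused to rank parameters by their contribution to the output variance.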

  3. SIMONE: Tool for Data Analysis and Simulation

    International Nuclear Information System (INIS)

    Chudoba, V.; Hnatio, B.; Sharov, P.; Papka, Paul

    2013-06-01

    SIMONE is a software tool based on the ROOT Data Analysis Framework, developed in a collaboration between FLNR JINR and iThemba LABS. It is intended for physicists planning experiments and analysing experimental data. The goal of the SIMONE framework is to provide a flexible, user-friendly, efficient and well-documented system for the simulation of a wide range of nuclear physics experiments. The most significant conditions and physical processes can be taken into account during simulation of an experiment. Users can create their own experimental setup from predefined detector geometries. Simulated data are made available in the same format as for the real experiment, allowing identical analysis of both experimental and simulated data. A significant reduction in time spent on experiment planning and data analysis is expected. (authors)

  4. LFSTAT - An R-Package for Low-Flow Analysis

    Science.gov (United States)

    Koffler, D.; Laaha, G.

    2012-04-01

    When analysing daily streamflow data with a focus on low flows and drought, the state of the art is well documented in the Manual on Low-flow Estimation and Prediction [1] published by the WMO. While it is clear what has to be done, it is less clear how to perform the analysis and make the calculation as reproducible as possible. Our software solution extends the high-performing open-source statistical package R to analyse daily streamflow data with a focus on low flows. As command-line based programs are not everyone's preference, we also offer a plug-in for the R Commander, an easy-to-use graphical user interface (GUI) for analysing data in R. Functionality includes estimation of the most important low-flow indices. Besides the standard flow indices, the BFI and recession constants can also be computed. The main applications of L-moment-based extreme value analysis and regional frequency analysis (RFA) are available. Calculation of streamflow deficits is another important feature. The most common graphics are prepared and can easily be modified according to the user's preferences. Graphics include hydrographs for different periods, flexible streamflow deficit plots, baseflow visualisation, flow duration curves and double mass curves, to name a few. The package uses an S3 class called lfobj (low-flow objects). Once these objects are created, the analysis can be performed by mouse click, and a script can be saved to make the analysis easily reproducible. At the moment we offer implementations of all major methods proposed in the WMO manual. Future plans include, e.g., report export to odt files using odfWeave. We hope to offer a tool that eases and structures the analysis of streamflow data focusing on low flows, and that makes the analysis transparent and communicable. The package is designed for hydrological research and water management practice, but can also be used to teach students the first steps in low-flow hydrology.
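
    Two of the most commonly quoted low-flow indices, the 95% exceedance flow Q95 and the 7-day minimum MAM(7), are easy to sketch outside R as well. The example below is an independent illustration (not lfstat code), applied to a made-up synthetic hydrograph.

    ```python
    import numpy as np

    def q95(daily_flow):
        """Q95: the flow exceeded on 95% of days (5th percentile of daily flows),
        a standard low-flow index."""
        return float(np.percentile(daily_flow, 5))

    def mam7(daily_flow):
        """MAM(7): minimum of the 7-day moving-average flow."""
        x = np.asarray(daily_flow, dtype=float)
        ma7 = np.convolve(x, np.ones(7) / 7.0, mode="valid")
        return float(ma7.min())

    # Synthetic one-year hydrograph: seasonal signal plus noise
    rng = np.random.default_rng(0)
    days = np.arange(365)
    flow = 10.0 + 8.0 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 0.5, 365)
    print(q95(flow), mam7(flow))
    ```

    Both indices fall near the seasonal minimum of about 2 flow units, as expected for this series.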

  5. FlowPing - The New Tool for Throughput and Stress Testing

    Directory of Open Access Journals (Sweden)

    Ondrej Vondrous

    2015-01-01

    Full Text Available This article presents a new tool for network throughput and stress testing. The FlowPing tool is easy to use, and its basic output is very similar to that of the standard Linux ping application. FlowPing is not limited to reachability or round-trip-time testing, but is capable of complex UDP-based throughput stress testing with rich reporting capabilities on both the client and server sides. The tool implements features that allow the user to perform tests with variable packet sizes and traffic rates, and all of these features can be combined in a single test run. This allows the user to apply and develop new methodologies for network throughput and stress testing. With FlowPing, it is easy to perform a test with a slowly increasing amount of network traffic and monitor the behavior of the network when congestion occurs.

  6. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    GO-FLOW is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. Recently, an integrated analysis framework based on GO-FLOW was developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism of the Japanese Government. This paper describes (a) an overview of the GO-FLOW methodology, (b) the procedure for treating a phased mission problem, (c) common cause failure analysis, (d) uncertainty analysis, and (e) the integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)

  7. Compressible turbulent flows: aspects of prediction and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Friedrich, R. [TU Muenchen, Garching (Germany). Fachgebiet Stroemungsmechanik

    2007-03-15

    Compressible turbulent flows are an important element of high-speed flight. Boundary layers developing along the fuselage and wings of an aircraft and along engine compressor and turbine blades are compressible and mostly turbulent. The high-speed flow around rockets and through rocket nozzles involves compressible turbulence and flow separation. Turbulent mixing and combustion in scramjet engines is another example where compressibility dominates the flow physics. Although compressible turbulent flows have attracted researchers since the 1950s, they are not completely understood. In particular, interactions between compressible turbulence and combustion lead to challenging, as yet unsolved problems. Direct numerical simulation (DNS) and large-eddy simulation (LES) are modern, powerful research tools which make it possible to mimic such flows in great detail and to analyze the underlying physical mechanisms, even those which cannot be accessed by experiment. The present lecture provides a short description of these tools and some of their numerical characteristics. It then describes DNS and LES results for fully developed channel and pipe flow and highlights effects of compressibility on the turbulence structure. The analysis of pressure fluctuations in such flows with isothermal cooled walls leads to the conclusion that the pressure-strain correlation tensor decreases in the wall layer and that the turbulence anisotropy increases, since the mean density falls off relative to the incompressible case. Similar increases in turbulence anisotropy due to compressibility are observed in inert and reacting temporal mixing layers. The nature of the pressure fluctuations is, however, two-faceted: while inert compressible mixing layers reveal wave-propagation effects in the pressure and density fluctuations, compressible reacting mixing layers seem to generate pressure fluctuations that are controlled by the time rate of change of heat release and mean density

  8. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various user communities, including energy and health. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level, such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of artificial intelligence to connect human and computer perceptions of how data and scientific techniques are applied, while multiprocessing simultaneous users' tasks. Future development includes expanding the data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and will thus allow climate studies to be conducted for international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System Reanalysis (CFSR) data and NOAA model output data, including output from the National Multi-Model Ensemble Prediction System (NMME) and longer-term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to ensure there is no redundancy in the development of tools that facilitate scientific advancement and the use of climate model information in applications. Validation and intercomparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open-sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).
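
    Of the analysis types listed, trend analysis is the simplest to illustrate. The sketch below fits a least-squares linear trend to a synthetic annual series; it is a generic illustration, not LCAT's implementation, and the data are invented.

    ```python
    import numpy as np

    def linear_trend(years, values):
        """Least-squares linear trend of an annual series: returns
        (slope per year, intercept), one of the basic local-climate
        analyses a tool like LCAT offers."""
        slope, intercept = np.polyfit(years, values, 1)
        return slope, intercept

    years = np.arange(1990, 2011)
    temps = 14.0 + 0.02 * (years - 1990)  # synthetic series warming 0.02 deg/yr
    slope, _ = linear_trend(years, temps)
    print(round(slope, 4))  # -> 0.02
    ```

    On real station data one would also test the slope's significance (e.g. with a t-test or Mann-Kendall test) before calling it a trend.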

  9. Programming heterogeneous MPSoCs tool flows to close the software productivity gap

    CERN Document Server

    Castrillón Mazo, Jerónimo

    2014-01-01

    This book provides embedded software developers with techniques for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs) capable of executing multiple applications simultaneously. It describes a set of algorithms and methodologies to narrow the software productivity gap, as well as an in-depth description of the underlying problems and challenges of today's programming practices. The authors present four different tool flows: a parallelism extraction flow for applications written using the C programming language, a mapping and scheduling flow for parallel applications, a special mapping flow for baseband applications in the context of Software Defined Radio (SDR), and a final flow for analyzing multiple applications at design time. The tool flows are evaluated on Virtual Platforms (VPs), which mimic different characteristics of state-of-the-art heterogeneous MPSoCs.   • Provides a novel set of algorithms and methodologies for programming heterogeneous Multi-Processor Systems-on-Chip (MPSoCs)...

  10. Reticulocyte analysis using flow cytometry.

    Science.gov (United States)

    Corberand, J X

    1996-12-01

    Automation of the reticulocyte count by means of flow cytometry has considerably improved the quality of this investigation. This article deals first with the reasons for the poor performance of the microscopic technique and with the physiological principles underlying the identification and classification of reticulocytes using RNA labeling. It then outlines the automated methods currently on the market, which can be classified into three categories: a) "general-purpose" cytofluorometers, which in clinical laboratories usually handle lymphocyte immunophenotyping; b) the only commercially available cytofluorometer dedicated to the reticulocyte count, which has the advantage of requiring no human intervention, as it merely needs to be fed with samples; c) hematology analyzers with specific modules for automatic counting of reticulocytes previously incubated with a non-fluorescent dye. Of the various fluorescent markers available, thiazole orange, DEQTC iodide and auramine are most often used for this basic hematology test. The quality of the count, the availability of new reticulocyte indices (maturation index, percentage of young reticulocytes) and the rapidity of the count give this test renewed value in the practical approach to the diagnosis of anemia, and also open new perspectives in the surveillance of aplastic anemia after chemotherapy or bone marrow grafting.

  11. IHT: Tools for Computing Insolation Absorption by Particle Laden Flows

    Energy Technology Data Exchange (ETDEWEB)

    Grout, R. W.

    2013-10-01

    This report describes IHT, a toolkit for computing radiative heat exchange between particles. Well suited for insolation absorption computations, it also has potential applications in combustion (sooting flames), biomass gasification processes and similar areas. The algorithm is based on the 'Photon Monte Carlo' approach and is implemented in a library that can be interfaced with a variety of computational fluid dynamics codes to analyze radiative heat transfer in particle-laden flows. The emphasis in this report is on the data structures and organization of IHT, for developers seeking to use the IHT toolkit to add Photon Monte Carlo capabilities to their own codes.
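
    The Photon Monte Carlo idea can be illustrated with a minimal absorbing-slab example: sample an exponential free path for each photon bundle and tally the ones absorbed before exiting. This sketch handles pure absorption in a homogeneous medium only; the real IHT library also deals with emission and discrete particle geometry, which are omitted here.

    ```python
    import math
    import random

    def absorbed_fraction(kappa, thickness, n_photons=200000, seed=42):
        """Photon Monte Carlo sketch for a purely absorbing homogeneous slab.
        Each photon travels an exponentially distributed free path with
        absorption coefficient kappa; the absorbed fraction should approach
        the analytic value 1 - exp(-kappa * thickness)."""
        random.seed(seed)
        absorbed = 0
        for _ in range(n_photons):
            # 1 - random() lies in (0, 1], so the log is always defined
            path = -math.log(1.0 - random.random()) / kappa
            if path < thickness:
                absorbed += 1
        return absorbed / n_photons

    frac = absorbed_fraction(kappa=2.0, thickness=1.0)
    print(frac)  # close to 1 - exp(-2) ~ 0.8647
    ```

    Comparing the Monte Carlo tally against the Beer-Lambert result is the standard sanity check before adding scattering or particle loading.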

  12. Continuous flow chemistry: a discovery tool for new chemical reactivity patterns.

    Science.gov (United States)

    Hartwig, Jan; Metternich, Jan B; Nikbin, Nikzad; Kirschning, Andreas; Ley, Steven V

    2014-06-14

    Continuous flow chemistry as a process intensification tool is well known. However, its ability to enable chemists to perform reactions which are not possible in batch is less well studied or understood. Here we present an example where a new reactivity pattern and extended reaction scope have been achieved by transferring a reaction from batch mode to flow. This new reactivity can be explained by the suppression of back-mixing and the precise control of temperature in a flow reactor setup.

  13. Continuous flow chemistry: a discovery tool for new chemical reactivity patterns

    OpenAIRE

    Hartwig, Jan; Metternich, Jan B.; Nikbin, Nikzad; Kirschning, Andreas; Ley, Steven V.

    2014-01-01

    Continuous flow chemistry as a process intensification tool is well known. However, its ability to enable chemists to perform reactions which are not possible in batch is less well studied or understood. Here we present an example where a new reactivity pattern and extended reaction scope have been achieved by transferring a reaction from batch mode to flow. This new reactivity can be explained by the suppression of back-mixing and the precise control of temperature in a flow reactor setup.

  14. Correlation dimension estimate and its potential use in analysis of gas-solid flows

    DEFF Research Database (Denmark)

    Yin, Chungen; Rosendahl, Lasse Aistrup; Kær, Søren Knudsen

    2005-01-01

    Gas-solid flows are nonlinear systems. Therefore state-space analysis, a tool developed within the framework of nonlinear dynamics, could provide more useful insights into complex gas-solid flows. One of the positive aspects of state-space analysis is that the major properties of a system can be ...
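
    The correlation dimension referred to here is usually estimated with the Grassberger-Procaccia algorithm: count the fraction C(r) of point pairs closer than r and read the dimension off the slope of log C(r) against log r. A minimal two-radius sketch (not the authors' code) on sets of known dimension:

    ```python
    import numpy as np

    def correlation_dimension(points, r1, r2):
        """Grassberger-Procaccia estimate: slope of log C(r) vs log r between
        two radii, where C(r) is the fraction of point pairs closer than r."""
        d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
        iu = np.triu_indices(len(points), k=1)   # distinct pairs only
        pair_d = d[iu]
        c1 = np.mean(pair_d < r1)
        c2 = np.mean(pair_d < r2)
        return (np.log(c2) - np.log(c1)) / (np.log(r2) - np.log(r1))

    rng = np.random.default_rng(0)
    line = np.column_stack([rng.uniform(0, 1, 1500), np.zeros(1500)])  # 1-D set
    plane = rng.uniform(0, 1, size=(1500, 2))                          # 2-D set
    print(correlation_dimension(line, 0.05, 0.2))   # near 1
    print(correlation_dimension(plane, 0.05, 0.2))  # near 2
    ```

    For a measured pressure or voidage signal one would first embed the time series (delay embedding) before applying the same pair-counting step.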

  15. Overview of the tool-flow for the Montium Processing Tile

    NARCIS (Netherlands)

    Smit, Gerardus Johannes Maria; Rosien, M.A.J.; Guo, Y.; Heysters, P.M.

    This paper presents an overview of a tool chain to support a transformational design methodology. The tool can be used to compile code written in a high-level source language, like C, to a coarse-grained reconfigurable architecture. The source code is first translated into a Control Data Flow Graph

  16. Control Flow Analysis for BioAmbients

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nielson, Hanne Riis; Priami, C.

    2007-01-01

    This paper presents a static analysis for investigating properties of biological systems specified in BioAmbients. We exploit the control flow analysis to decode the bindings of variables induced by communications and to build a relation of the ambients that can interact with each other. We...

  17. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    Full Text Available The purpose of this paper is to propose a methodology for setup analysis which can be implemented mainly in small and medium enterprises that are not yet convinced of the value of setup improvement. The methodology was developed after research which identified the problem: companies still struggle with long setup times, and many of them do nothing to decrease them. A long setup is not, by itself, a sufficient reason for companies to undertake any actions towards setup time reduction. To encourage companies to implement SMED, it is essential to analyse changeovers in order to uncover problems. The proposed methodology can genuinely encourage management to decide on a SMED implementation, as was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern setup analysis in a chosen area of a company, such as a work stand which is a bottleneck with many setups; the goal there is to convince management to begin actions to improve setups. The last three steps are related to a particular setup, and their goal is to reduce the setup time and the risk of problems which can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis and FMEA, among others, were used.

  18. Medical decision making tools: Bayesian analysis and ROC analysis

    International Nuclear Information System (INIS)

    Lee, Byung Do

    2006-01-01

    During the diagnostic process for various oral and maxillofacial lesions, we should consider the following: when should we order diagnostic tests? What tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make an optimal medical decision? To help clinicians make proper judgements, several decision-making tools are suggested. This article discusses the concept of diagnostic accuracy (sensitivity and specificity values) together with several decision-making tools, such as the decision matrix, ROC analysis and Bayesian analysis. The article also explains the introductory concepts of the ORAD program.
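
    The Bayesian part of such decision making reduces to combining a pre-test (prior) probability with the test's sensitivity and specificity. The worked example below uses hypothetical numbers, not figures from the article.

    ```python
    def post_test_probability(prevalence, sensitivity, specificity):
        """Bayes' theorem for a positive test result:
        P(disease | +) = sens*prev / (sens*prev + (1 - spec)*(1 - prev))."""
        true_pos = sensitivity * prevalence
        false_pos = (1.0 - specificity) * (1.0 - prevalence)
        return true_pos / (true_pos + false_pos)

    # Hypothetical lesion with 10% pre-test probability; test 90% sens, 80% spec
    p = post_test_probability(0.10, 0.90, 0.80)
    print(round(p, 3))  # -> 0.333
    ```

    The example illustrates the familiar lesson that a positive result on a moderately specific test still leaves substantial diagnostic uncertainty when the prior is low.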

  19. A regionally-linked, dynamic material flow modelling tool for rolled, extruded and cast aluminium products

    DEFF Research Database (Denmark)

    Bertram, M.; Ramkumar, S.; Rechberger, H.

    2017-01-01

    A global aluminium flow modelling tool, comprising nine trade linked regions, namely China, Europe, Japan, Middle East, North America, Other Asia, Other Producing Countries, South America and Rest of World, has been developed. The purpose of the Microsoft Excel-based tool is the quantification...... of regional stocks and flows of rolled, extruded and casting alloys across space and over time, giving the industry the ability to evaluate the potential to recycle aluminium scrap most efficiently. The International Aluminium Institute will update the tool annually and publish a visualisation of results...

  20. Standardised risk analysis as a communication tool

    International Nuclear Information System (INIS)

    Pluess, Ch.; Montanarini, M.; Bernauer, M.

    1998-01-01

    Full text of publication follows: several European countries require a risk analysis for the production, storage or transport of dangerous goods. This requirement imposes considerable administrative effort on some sectors of industry. In order to minimize the effort of such studies, a generic risk analysis for an industrial sector has proved helpful. Standardised procedures can then be derived for efficient performance of the risk investigations. This procedure was successfully established in Switzerland for natural gas transmission lines and fossil fuel storage plants. The development of the generic risk analysis involved an intense discussion between industry and the authorities about the assessment methodology and the acceptance criteria. This process finally led to scientifically consistent modelling tools for risk analysis and to improved communication from industry to the authorities and the public. As a recent example, the Holland-Italy natural gas transmission pipeline is presented, where this method was successfully employed. Although this pipeline traverses densely populated areas in Switzerland, using this established communication method the risk problems could be solved without delaying the planning process. (authors)

  1. PROMOTION OF PRODUCTS AND ANALYSIS OF MARKET OF POWER TOOLS

    Directory of Open Access Journals (Sweden)

    Sergey S. Rakhmanov

    2014-01-01

    Full Text Available The article describes the general situation of the power tools market, both in Russia and worldwide. It provides a comparative analysis of competitors, an analysis of the structure of the power tools market, and an assessment of the competitiveness of some major product lines. It also analyses the promotion methods used by tool-selling companies and offers a competitive analysis of the product range of Bosch, the leader in its segment of the power tools available on the Russian market.

  2. Numerical studies of the polymer melt flow in the extruder screw channel and the forming tool

    Science.gov (United States)

    Ershov, S. V.; Trufanova, N. M.

    2017-06-01

    To date, polymer compositions based on polyethylene or PVC are widely used as insulating materials. Processing these materials involves a number of problems in selecting rational extrusion regimes. To minimize the time and cost of determining the technological regime, mathematical modeling techniques are used. The paper discusses heat and mass transfer processes in the extruder screw channel, the output adapter and the cable head. During the study, coefficients were determined for three rheological models based on experimental viscosity versus shear rate data. A comparative analysis of the applicability of these viscosimetric laws for studying polymer melt flow during processing on extrusion equipment was also carried out. As a result of the numerical study, the temperature, viscosity and shear rate fields in the extruder screw channel and the forming tool were obtained.
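
    One rheological model commonly fitted to viscosity versus shear-rate data for polymer melts is the power-law (Ostwald-de Waele) model. The record does not state which three models the authors used, so the fit below is a generic sketch on synthetic data with made-up coefficients.

    ```python
    import numpy as np

    def fit_power_law(shear_rate, viscosity):
        """Fit the power-law (Ostwald-de Waele) model eta = K * gamma_dot**(n-1)
        by linear regression in log-log space. Returns (K, n); n < 1 means
        shear thinning, typical of polymer melts."""
        slope, intercept = np.polyfit(np.log(shear_rate), np.log(viscosity), 1)
        n = slope + 1.0                # power-law index
        K = float(np.exp(intercept))   # consistency coefficient, Pa*s^n
        return K, n

    gamma = np.logspace(0, 3, 20)          # shear rates 1 .. 1000 1/s
    eta = 10000.0 * gamma ** (0.4 - 1.0)   # synthetic melt: K = 10 kPa*s^n, n = 0.4
    K, n = fit_power_law(gamma, eta)
    print(round(K), round(n, 3))  # -> 10000 0.4
    ```

    The fitted (K, n) pair then feeds directly into the viscosity field of a screw-channel flow simulation.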

  3. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers Association) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  4. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    Gary Casuccio; Michael Potter; Fred Schwerer; Richard J. Fruehan; Dr. Scott Story

    2005-01-01

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers Association) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  5. Analysis of the Effectiveness of the Retire Tool When Deciding Between High 36 Retirement and Blended TSP Retirement

    Science.gov (United States)

    2016-12-01

    [Report documentation page boilerplate and table-of-contents fragments only. Recoverable headings: Value of the TSP; Elements of Retire Tool Analysis: Cash Flow Analysis, Cash Flow Comparison, Volatility Analysis.]

  6. PolNet: A Tool to Quantify Network-Level Cell Polarity and Blood Flow in Vascular Remodeling.

    Science.gov (United States)

    Bernabeu, Miguel O; Jones, Martin L; Nash, Rupert W; Pezzarossa, Anna; Coveney, Peter V; Gerhardt, Holger; Franco, Claudio A

    2018-05-08

    In this article, we present PolNet, an open-source software tool for the study of blood flow and cell-level biological activity during vessel morphogenesis. We provide an image acquisition, segmentation, and analysis protocol to quantify endothelial cell polarity in entire in vivo vascular networks. In combination, we use computational fluid dynamics to characterize the hemodynamics of the vascular networks under study. The tool enables, to our knowledge for the first time, a network-level analysis of polarity and flow for individual endothelial cells. To date, PolNet has proven invaluable for the study of endothelial cell polarization and migration during vascular patterning, as demonstrated by two recent publications. Additionally, the tool can be easily extended to correlate blood flow with other experimental observations at the cellular/molecular level. We release the source code of our tool under the Lesser General Public License. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  7. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics, such as: single- and multiple-point cutting processes; grinding; component accuracy and metrology; shear stress in cutting; cutting temperature and analysis; and chatter. They also address non-traditional machining, such as electrical discharge machining, electrochemical machining, and laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that re...

  8. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton [Albuquerque, NM; Phillips, Cynthia A [Albuquerque, NM

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
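    The attack-graph idea described above can be illustrated with a toy example: if each edge (attacker action) carries a weight such as effort, then a high-risk path is one of minimum total effort from the initial state to the goal state, which a shortest-path search finds. The graph, state names, and weights below are invented for illustration and are not from the patent:

```python
import heapq

# Toy attack graph: nodes are attack states, directed edges are attacker
# actions weighted by effort. All names and weights are hypothetical.
graph = {
    "start":        [("phish_user", 2.0), ("scan_dmz", 1.0)],
    "phish_user":   [("user_shell", 1.5)],
    "scan_dmz":     [("exploit_web", 3.0)],
    "user_shell":   [("escalate", 2.5)],
    "exploit_web":  [("escalate", 1.0)],
    "escalate":     [("domain_admin", 1.0)],
    "domain_admin": [],
}

def cheapest_attack(graph, src, dst):
    """Dijkstra search: minimum-total-effort path and its cost."""
    pq, seen = [(0.0, src, [src])], set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph[node]:
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

cost, path = cheapest_attack(graph, "start", "domain_admin")
print(cost, " -> ".join(path))  # 6.0 start -> scan_dmz -> exploit_web -> escalate -> domain_admin
```

    In the patent's terms, such minimum-weight paths approximate the "epsilon optimal paths" where countermeasures would be applied first.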

  9. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  10. Riparian Cottonwood Ecosystems and Regulated Flows in Kootenai and Yakima Sub-Basins : Volume III (Overview and Tools).

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, Bob; Braatne, Jeffrey H.

    2001-10-01

    Riparian vegetation, and especially cottonwood and willow plant communities, is dependent on normative flows, and especially the spring freshet, to provide conditions for recruitment. These plant communities therefore share much in common with a range of fish species that require natural flow conditions to stimulate reproduction. We applied tools and techniques developed in other areas to assess riparian vegetation in two very different sub-basins within the Columbia Basin. Our objectives were to: document the historic impact of human activity on alluvial floodplain areas in both sub-basins; provide an analysis of the impacts of flow regulation on riparian vegetation in two systems with very different flow regulation systems; demonstrate that altered spring flows will, in fact, result in recruitment to cottonwood stands, given other land-use impacts on each river and the limitations imposed by other flow requirements; and assess the applicability of remote sensing tools for documenting the distribution and health of cottonwood stands and riparian vegetation that can be used in other sub-basins.

  11. Detecting Human Hydrologic Alteration from Diversion Hydropower Requires Universal Flow Prediction Tools: A Proposed Framework for Flow Prediction in Poorly-gauged, Regulated Rivers

    Science.gov (United States)

    Kibler, K. M.; Alipour, M.

    2016-12-01

    Achieving the universal energy access Sustainable Development Goal will require great investment in renewable energy infrastructure in the developing world. Much growth in the renewable sector will come from new hydropower projects, including small and diversion hydropower in remote and mountainous regions. Yet, human impacts to hydrological systems from diversion hydropower are poorly described. Diversion hydropower is often implemented in ungauged rivers, thus detection of impact requires flow analysis tools suited to prediction in poorly-gauged and human-altered catchments. We conduct a comprehensive analysis of hydrologic alteration in 32 rivers developed with diversion hydropower in southwestern China. As flow data are sparse, we devise an approach for estimating streamflow during pre- and post-development periods, drawing upon a decade of research into prediction in ungauged basins. We apply a rainfall-runoff model, parameterized and forced exclusively with global-scale data, in hydrologically-similar gauged and ungauged catchments. Uncertain "soft" data are incorporated through fuzzy numbers and confidence-based weighting, and a multi-criteria objective function is applied to evaluate model performance. Testing indicates that the proposed framework returns superior performance (NSE = 0.77) as compared to models parameterized by rote calibration (NSE = 0.62). Confident that the models are providing "the right answer for the right reasons", our analysis of hydrologic alteration based on simulated flows indicates statistically significant hydrologic effects of diversion hydropower across many rivers. Mean annual flows, 7-day minimum and 7-day maximum flows decreased. Frequency and duration of flow exceeding Q25 decreased while duration of flows sustained below the Q75 increased substantially. Hydrograph rise and fall rates and flow constancy increased. The proposed methodology may be applied to improve diversion hydropower design in data-limited regions.
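    The NSE figures quoted above (0.77 vs. 0.62) are Nash-Sutcliffe efficiencies, a standard skill score computable from paired observed/simulated flow series. A small sketch with made-up flows, not the study's data:

```python
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    1.0 is a perfect fit; 0.0 means the model is no better than
    predicting the observed mean flow."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Illustrative daily flows (m^3/s); not data from the study.
obs = [12.0, 15.0, 30.0, 22.0, 18.0, 14.0]
sim = [11.0, 16.0, 27.0, 24.0, 17.0, 15.0]
print(round(nash_sutcliffe(obs, sim), 3))  # → 0.923
```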

  12. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy use and greenhouse gas (GHG) emissions. It also provides guidance on how to apply the tool.

  13. Optoelectronic iron detectors for pharmaceutical flow analysis.

    Science.gov (United States)

    Rybkowska, Natalia; Koncki, Robert; Strzelak, Kamil

    2017-10-25

    Compact flow-through optoelectronic detectors, fabricated by pairing light-emitting diodes, have been applied to develop economical flow-analysis systems for the determination of iron ions. Three analytical methods using different chromogens that selectively recognize iron ions have been compared. The ferrozine- and ferene S-based methods offer higher sensitivity and slightly lower detection limits than the method with 1,10-phenanthroline, but narrower ranges of linear response. Each system allows detection of iron in the micromolar concentration range with comparable sample throughput (20 injections per hour). The developed flow-analysis systems have been successfully applied to the determination of iron in diet supplements. Their utility for studying iron release from drug formulations has also been demonstrated. Copyright © 2017 Elsevier B.V. All rights reserved.
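    Determinations like these ultimately rest on a linear calibration of detector response against standard concentrations (the linear-response range mentioned above). A sketch of that step with hypothetical absorbance data; the concentrations, responses, and linear range are illustrative, not from the paper:

```python
def linfit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration standards: iron concentration (umol/L) vs.
# detector absorbance; response is ideally linear in concentration.
conc = [0.0, 5.0, 10.0, 20.0, 40.0]
absb = [0.002, 0.101, 0.198, 0.402, 0.801]
m, b = linfit(conc, absb)

# Read back an unknown sample from its absorbance.
unknown_abs = 0.300
print(round((unknown_abs - b) / m, 2))  # estimated concentration, ≈ 14.96 here
```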

  14. Sustainability Tools Inventory Initial Gap Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  15. Analysis tools for discovering strong parity violation at hadron colliders

    Science.gov (United States)

    Backović, Mihailo; Ralston, John P.

    2011-07-01

    Several arguments suggest parity violation may be observable in high energy strong interactions. We introduce new analysis tools to describe the azimuthal dependence of multiparticle distributions, or “azimuthal flow.” Analysis uses the representations of the orthogonal group O(2) and dihedral groups DN necessary to define parity completely in two dimensions. Classification finds that collective angles used in event-by-event statistics represent inequivalent tensor observables that cannot generally be represented by a single “reaction plane.” Many new parity-violating observables exist that have never been measured, while many parity-conserving observables formerly lumped together are now distinguished. We use the concept of “event-shape sorting” to suggest separating right- and left-handed events, and we discuss the effects of transverse and longitudinal spin. The analysis tools are statistically robust, and can be applied equally to low or high multiplicity events at the Tevatron, RHIC or RHIC Spin, and the LHC.
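    The collective angles and azimuthal-flow observables discussed here are conventionally built from event Q-vectors. As a generic illustration of that standard construction (not code from this paper), the following generates a toy event with an injected second-harmonic modulation and recovers the event-plane angle and the observed v2:

```python
import cmath
import math
import random

def q_vector(phis, n):
    """Event flow vector Q_n = sum_j exp(i*n*phi_j)."""
    return sum(cmath.exp(1j * n * p) for p in phis)

def event_plane_angle(phis, n):
    """Psi_n = (1/n) * arg(Q_n): the nth-harmonic collective angle."""
    return cmath.phase(q_vector(phis, n)) / n

def v_n_observed(phis, n):
    """Observed v_n = <cos n(phi - Psi_n)> with respect to the event plane."""
    psi = event_plane_angle(phis, n)
    return sum(math.cos(n * (p - psi)) for p in phis) / len(phis)

# Toy event: accept/reject sampling of angles with an elliptic (n=2)
# modulation dN/dphi ∝ 1 + 2*v2*cos(2*phi), with injected v2 = 0.1.
random.seed(1)
phis = []
while len(phis) < 2000:
    p = random.uniform(-math.pi, math.pi)
    if random.random() < (1 + 2 * 0.1 * math.cos(2 * p)) / 1.2:
        phis.append(p)

print(round(v_n_observed(phis, 2), 3))  # close to the injected v2 = 0.1
```

    The paper's point is that such single "reaction plane" constructions do not exhaust the O(2)/dihedral tensor observables, some of which are parity-odd; the sketch shows only the conventional baseline.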

  16. Retro-review of flow injection analysis

    DEFF Research Database (Denmark)

    Ruzicka, Jaromir; Hansen, Elo Harald

    2008-01-01

    It is indeed unusual for authors to review their own monograph – J. Ruzicka, E.H. Hansen, Flow Injection Analysis, 2nd Edition, Wiley, Chichester, West Sussex, UK, 1988. – and even more so if the book was published 20 years ago. Yet such an exercise might yield a perspective on the progress of an...

  17. Microfabricated tools for manipulation and analysis of magnetic microcarriers

    International Nuclear Information System (INIS)

    Tondra, Mark; Popple, Anthony; Jander, Albrecht; Millen, Rachel L.; Pekas, Nikola; Porter, Marc D.

    2005-01-01

    Tools for manipulating and detecting magnetic microcarriers are being developed with microscale features. Microfabricated giant magnetoresistive (GMR) sensors and wires are used for detection and for creating high local field gradients. Microfluidic structures are added to control flow and the positioning of samples and microcarriers. These tools are designed for work in analytical chemistry and biology.

  18. THE SMALL BODY GEOPHYSICAL ANALYSIS TOOL

    Science.gov (United States)

    Bercovici, Benjamin; McMahon, Jay

    2017-10-01

    The Small Body Geophysical Analysis Tool (SBGAT) that we are developing aims at providing scientists and mission designers with a comprehensive, easy-to-use, open-source analysis tool. SBGAT is meant for seamless generation of valuable simulated data originating from small-body shape models, combined with advanced shape-modification capabilities. The current status of SBGAT is as follows: The modular software architecture that was specified in the original SBGAT proposal was implemented in the form of two distinct packages: a dynamic library, SBGAT Core, containing the data structure and algorithm backbone of SBGAT, and SBGAT Gui, which wraps the former inside a VTK/Qt user interface to facilitate user/data interaction. This modular development facilitates maintenance and addition of new features. Note that SBGAT Core can be utilized independently of SBGAT Gui. SBGAT is presently hosted in a GitHub repository owned by SBGAT’s main developer. This repository is public and can be accessed at https://github.com/bbercovici/SBGAT. Along with the commented code, one can find the code documentation at https://bbercovici.github.io/sbgat-doc/index.html. This documentation is constantly updated to reflect new functionalities. SBGAT’s user’s manual is available at https://github.com/bbercovici/SBGAT/wiki. This document contains a comprehensive tutorial indicating how to retrieve, compile and run SBGAT from scratch. Some of the upcoming development goals are listed hereafter. First, SBGAT's dynamics module will be extended: the PGM algorithm is the only type of analysis method currently implemented, so future work will consist of broadening SBGAT’s capabilities with the spherical-harmonics expansion of the gravity field and the calculation of YORP coefficients. Second, synthetic measurements will soon be available within SBGAT. The software should be able to generate synthetic observations of different types (radar, lightcurve, point clouds

  19. Risk analysis as a decision tool

    International Nuclear Information System (INIS)

    Yadigaroglu, G.; Chakraborty, S.

    1985-01-01

    From 1983-1985 a lecture series entitled "Risk-benefit analysis" was held at the Swiss Federal Institute of Technology (ETH), Zurich, in cooperation with the Central Department for the Safety of Nuclear Installations of the Swiss Federal Agency of Energy Economy. In that setting, the value of risk-oriented evaluation models as a decision tool in safety questions was discussed on a broad basis. Experts of international reputation from the Federal Republic of Germany, France, Canada, the United States and Switzerland have contributed to this joint volume reporting on the uses of such models. Following an introductory synopsis on risk analysis and risk assessment, the book deals with practical examples in the fields of medicine, nuclear power, chemistry, transport and civil engineering. Particular attention is paid to the dialogue between analysts and decision makers, taking into account economic-technical aspects and social values. The recent chemical disaster in the Indian city of Bhopal again signals the necessity of such analyses. All the lectures were recorded individually. (orig./HP) [de

  20. Information flow analysis of interactome networks.

    Directory of Open Access Journals (Sweden)

    Patrycja Vasilyev Missiuro

    2009-04-01

    Full Text Available Recent studies of cellular networks have revealed modular organizations of genes and proteins. For example, in interactome networks, a module refers to a group of interacting proteins that form molecular complexes and/or biochemical pathways and together mediate a biological process. However, it is still poorly understood how biological information is transmitted between different modules. We have developed information flow analysis, a new computational approach that identifies proteins central to the transmission of biological information throughout the network. In the information flow analysis, we represent an interactome network as an electrical circuit, where interactions are modeled as resistors and proteins as interconnecting junctions. Construing the propagation of biological signals as flow of electrical current, our method calculates an information flow score for every protein. Unlike previous metrics of network centrality such as degree or betweenness that only consider topological features, our approach incorporates confidence scores of protein-protein interactions and automatically considers all possible paths in a network when evaluating the importance of each protein. We apply our method to the interactome networks of Saccharomyces cerevisiae and Caenorhabditis elegans. We find that the likelihood of observing lethality and pleiotropy when a protein is eliminated is positively correlated with the protein's information flow score. Even among proteins of low degree or low betweenness, high information scores serve as a strong predictor of loss-of-function lethality or pleiotropy. The correlation between information flow scores and phenotypes supports our hypothesis that the proteins of high information flow reside in central positions in interactome networks. We also show that the ranks of information flow scores are more consistent than that of betweenness when a large amount of noisy data is added to an interactome. Finally, we
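    The circuit analogy described above can be reproduced on a toy network: interactions become conductances (weighted by confidence score), a unit current is injected between a source and a sink protein, node voltages follow from the graph Laplacian, and each intermediate protein's score is the current passing through it. The four-protein network and confidence values below are invented for illustration; this is a sketch of the idea, not the authors' implementation:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (dense, pure Python)."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

# Toy interactome: (protein_i, protein_j, confidence); conductance = confidence.
edges = [(0, 1, 0.9), (1, 2, 0.8), (0, 2, 0.3), (2, 3, 0.9), (1, 3, 0.2)]
n = 4

# Graph Laplacian: L[i][i] = sum of incident conductances, L[i][j] = -c_ij.
L = [[0.0] * n for _ in range(n)]
for i, j, c in edges:
    L[i][i] += c; L[j][j] += c
    L[i][j] -= c; L[j][i] -= c

# Inject 1 unit of current at protein 0, extract it at protein 3:
# ground the sink by deleting its row/column, then solve for voltages.
src, snk = 0, 3
idx = [i for i in range(n) if i != snk]
A = [[L[i][j] for j in idx] for i in idx]
b = [1.0 if i == src else 0.0 for i in idx]
volt = {i: v for i, v in zip(idx, solve(A, b))}
volt[snk] = 0.0

# Sum of absolute edge currents at each node; by Kirchhoff's current law
# this sum equals 1.0 at the source and sink, and twice the through-current
# at intermediate nodes.
flow = {i: 0.0 for i in range(n)}
for i, j, c in edges:
    I = c * (volt[i] - volt[j])
    flow[i] += abs(I); flow[j] += abs(I)
for i in range(n):
    score = 1.0 if i in (src, snk) else flow[i] / 2.0
    print(f"protein {i}: information flow {score:.3f}")
```

    The full method averages such scores over many source/sink pairs, which is essentially current-flow (random-walk) betweenness weighted by interaction confidence.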

  1. A biological tool to assess flow connectivity in reference temporary streams from the Mediterranean Basin

    Energy Technology Data Exchange (ETDEWEB)

    Cid, N., E-mail: ncid@ub.edu [Grup de Recerca “Freshwater Ecology and Management (FEM)”, Departament d' Ecologia, Universitat de Barcelona, Catalonia (Spain); Verkaik, I. [Grup de Recerca “Freshwater Ecology and Management (FEM)”, Departament d' Ecologia, Universitat de Barcelona, Catalonia (Spain); García-Roger, E.M. [Grup de Recerca “Freshwater Ecology and Management (FEM)”, Departament d' Ecologia, Universitat de Barcelona, Catalonia (Spain); Institut Cavanilles de Biodiversitat i Biologia Evolutiva, Universitat de València (Spain); Rieradevall, M.; Bonada, N. [Grup de Recerca “Freshwater Ecology and Management (FEM)”, Departament d' Ecologia, Universitat de Barcelona, Catalonia (Spain); Sánchez-Montoya, M.M. [Department of Ecology and Hydrology, Regional Campus of International Excellence “Campus Mare Nostrum”—University of Murcia (Spain); Leibniz-Institute of Freshwater Ecology and Inland Fisheries (IGB), Berlin (Germany); Gómez, R.; Suárez, M.L.; Vidal-Abarca, M.R. [Department of Ecology and Hydrology, Regional Campus of International Excellence “Campus Mare Nostrum”—University of Murcia (Spain); Demartini, D.; Buffagni, A.; Erba, S. [Instituto di Ricerca Sulle Acque (CNR-IRSA) (Italy); Karaouzas, I.; Skoulikidis, N. [Hellenic Center for Marine Research (HCMR) (Greece); Prat, N. [Grup de Recerca “Freshwater Ecology and Management (FEM)”, Departament d' Ecologia, Universitat de Barcelona, Catalonia (Spain)

    2016-01-01

    Many streams in the Mediterranean Basin have temporary flow regimes. While the timing of seasonal drought is predictable, they undergo strong inter-annual variability in flow intensity. This high hydrological variability and the associated ecological responses challenge the ecological status assessment of temporary streams, particularly when setting reference conditions. This study examined the effects of flow connectivity on aquatic macroinvertebrates in seven reference temporary streams across the Mediterranean Basin where hydrological variability and flow conditions are well studied. We tested for the effect of flow cessation on two streamflow indices and on community composition, and, by performing random forest and classification tree analyses, we identified important biological predictors for classifying the aquatic state as either flowing or disconnected pools. Flow cessation was critical for one of the streamflow indices studied and for community composition. Macroinvertebrate families found to be important for classifying the aquatic state were Hydrophilidae, Simuliidae, Hydropsychidae, Planorbiidae, Heptageniidae and Gerridae. For biological traits, trait categories associated with feeding habits, food, locomotion and substrate relation were the most important and provided more accurate predictions compared to taxonomy. A combination of selected metrics and associated thresholds based on the most important biological predictors (i.e. the Bio-AS Tool) was proposed in order to assess the aquatic state in reference temporary streams, especially in the absence of hydrological data. Although further development is needed, the tool can be of particular interest for monitoring, restoration, and conservation purposes, representing an important step towards adequate management of temporary rivers not only in the Mediterranean Basin but also in other regions vulnerable to the effects of climate change. - Highlights: • The effect of flow connectivity on macroinvertebrate
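    The classification-tree analysis used to separate flowing sites from disconnected pools can be illustrated at its smallest scale: a single Gini-impurity split on one biological metric. The metric values and aquatic-state labels below are hypothetical, not the study's data:

```python
def gini(groups):
    """Weighted Gini impurity of a binary split (labels are 0/1)."""
    total = sum(len(g) for g in groups)
    score = 0.0
    for g in groups:
        if not g:
            continue
        p = sum(g) / len(g)  # fraction labelled 1 ("flowing")
        score += (1.0 - (p * p + (1.0 - p) ** 2)) * len(g) / total
    return score

def best_split(values, labels):
    """Single-feature classification-tree split: try each midpoint threshold."""
    pairs = sorted(zip(values, labels))
    best = (None, float("inf"))
    for k in range(1, len(pairs)):
        thr = (pairs[k - 1][0] + pairs[k][0]) / 2
        left = [l for v, l in pairs if v <= thr]
        right = [l for v, l in pairs if v > thr]
        g = gini([left, right])
        if g < best[1]:
            best = (thr, g)
    return best

# Hypothetical metric (e.g. relative abundance of a flow-dependent family)
# at sites observed flowing (1) or as disconnected pools (0).
metric = [0.02, 0.05, 0.08, 0.30, 0.42, 0.55, 0.61, 0.70]
state  = [0,    0,    0,    1,    1,    1,    1,    1]
thr, g = best_split(metric, state)
print(f"split at metric <= {thr:.2f}, Gini = {g:.3f}")
```

    A random forest repeats this split search on many bootstrap samples and feature subsets; the families and traits reported above are the features that most often produce such clean splits.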

  2. Gaseous slip flow analysis of a micromachined flow sensor for ultra small flow applications

    Science.gov (United States)

    Jang, Jaesung; Wereley, Steven T.

    2007-02-01

    The velocity slip of a fluid at a wall is one of the most typical phenomena in microscale gas flows. This paper presents a flow analysis considering the velocity slip in a capacitive micro gas flow sensor based on pressure difference measurements along a microchannel. Tangential momentum accommodation coefficient (TMAC) measurements of a particular channel wall in planar microchannels are presented, whereas previous micro gas flow studies have assumed the same TMACs on both walls. The sensors consist of a pair of capacitive pressure sensors, an inlet/outlet and a microchannel. The main microchannel is 128.0 µm wide, 4.64 µm deep and 5680 µm long, and operated under nearly atmospheric conditions where the outlet Knudsen number is 0.0137. The sensor was fabricated using silicon wet etching, ultrasonic drilling, deep reactive ion etching (DRIE) and anodic bonding. The capacitance change of the sensor and the mass flow rate of nitrogen were measured as the inlet-to-outlet pressure ratio was varied from 1.00 to 1.24. The measured maximum mass flow rate was 3.86 × 10⁻¹⁰ kg s⁻¹ (0.019 sccm) at the highest pressure ratio tested. As the pressure difference increased, both the capacitance of the differential pressure sensor and the flow rate through the main microchannel increased. The laminar friction constant f·Re, an important consideration in sensor design, varied from the incompressible no-slip case, and the mass sensitivity and resolution of the sensor are discussed. Using the current slip flow formulae, a microchannel with much smaller mass flow rates can be designed at the same pressure ratios.
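    As a rough cross-check on the quoted figures, a first-order slip, isothermal, compressible flow expression for a shallow rectangular channel (an Arkilic-type plane-Poiseuille formula) can be evaluated with the stated channel dimensions. The gas properties and accommodation coefficient below are assumed values, not the paper's fitted results:

```python
# mdot = w*h^3*p_out^2 / (24*mu*L*R*T) * [(P^2 - 1) + 12*sigma*Kn_out*(P - 1)],
# with sigma = (2 - alpha)/alpha the first-order slip coefficient.
w, h, L = 128.0e-6, 4.64e-6, 5680.0e-6  # channel width, depth, length (m)
p_out = 101325.0                        # outlet pressure (Pa), ~atmospheric
P = 1.24                                # inlet-to-outlet pressure ratio
mu = 1.76e-5                            # assumed N2 viscosity (Pa s)
R, T = 296.8, 295.0                     # N2 specific gas constant (J/kg/K), temperature (K)
alpha = 1.0                             # assumed full accommodation (TMAC = 1)
Kn_out = 0.0137                         # outlet Knudsen number (from the abstract)

sigma = (2.0 - alpha) / alpha
bracket = (P ** 2 - 1.0) + 12.0 * sigma * Kn_out * (P - 1.0)
mdot = w * h ** 3 * p_out ** 2 / (24.0 * mu * L * R * T) * bracket
print(f"{mdot:.2e} kg/s")  # same order as the measured 3.86e-10 kg/s
```

    With these assumed properties the estimate comes out within roughly 10% of the measured maximum flow rate, which is the kind of agreement slip-flow models are expected to give at Kn ≈ 0.01.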

  3. OpenFlow Deployment and Concept Analysis

    Directory of Open Access Journals (Sweden)

    Tomas Hegr

    2013-01-01

    Full Text Available Terms such as SDN and OpenFlow (OF) are often used in the research and development of data networks. This paper analyses the current deployment options for the OpenFlow protocol, the only real representative protocol that enables the implementation of Software-Defined Networking outside the academic world. It offers an insight into the current state of OpenFlow specification development at various levels, and presents the possible limitations associated with this concept in conjunction with the latest version (1.3) of the specification published by the ONF. The conclusion presents a demonstrative security application addressing the lack of IPv6 support in real network devices, since most of today's switches and controllers support only OF v1.0.

  4. Parallel Enhancements of the General Mission Analysis Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  5. Analysis of groundwater flow beneath ice sheets

    Energy Technology Data Exchange (ETDEWEB)

    Boulton, G. S.; Zatsepin, S.; Maillot, B. [Univ. of Edinburgh (United Kingdom). Dept. of Geology and Geophysics

    2001-03-01

    The large-scale pattern of subglacial groundwater flow beneath European ice sheets was analysed in a previous report, based on a two-dimensional flowline model. In this report, the analysis is extended to three dimensions by exploring the interactions between groundwater and tunnel flow. A theory is developed which suggests that the large-scale geometry of the hydraulic system beneath an ice sheet is a coupled, self-organising system. In this system, the pressure distribution along tunnels is a function of discharge derived from basal meltwater delivered to tunnels by groundwater flow, and the pressure along tunnels itself sets the base pressure which determines the geometry of catchments and flow towards the tunnel. The large-scale geometry of tunnel distribution is a product of the pattern of basal meltwater production and the transmissive properties of the bed. The tunnel discharge from the ice margin of the glacier, its seasonal fluctuation and the sedimentary characteristics of eskers are largely determined by the discharge of surface meltwater which penetrates to the bed in the terminal zone. The theory explains many of the characteristics of esker systems and can account for tunnel valleys. It is concluded that the large-scale hydraulic regime beneath ice sheets is largely a consequence of groundwater/tunnel flow interactions and that it is essentially similar to non-glacial hydraulic regimes. Experimental data from an Icelandic glacier, which demonstrate measured relationships between subglacial tunnel flow and groundwater flow during the transition from summer to winter for a modern glacier, and which support the general conclusions of the theory, are summarised in an appendix.
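    At its simplest, the groundwater-to-tunnel coupling discussed above involves Darcy inflow toward a tunnel treated as a cylindrical sink in the bed. A back-of-envelope sketch of that textbook relation, with illustrative parameter values not taken from the report:

```python
import math

# Steady-state radial Darcy inflow to a cylindrical tunnel, per metre of
# tunnel length: q = 2*pi*K*(h_far - h_tunnel) / ln(R/r).
# All parameter values below are illustrative assumptions.
K = 1e-6       # hydraulic conductivity of the bed (m/s)
dh = 50.0      # head difference, far field minus tunnel (m)
R = 500.0      # radius of influence (m)
r = 0.5        # tunnel radius (m)

q = 2.0 * math.pi * K * dh / math.log(R / r)
print(f"inflow per metre of tunnel: {q:.2e} m^3/s")
```

    In the report's coupled system, this inflow feeds tunnel discharge while the tunnel pressure in turn sets the head boundary condition, which is what makes the geometry self-organising.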

  6. Analysis of groundwater flow beneath ice sheets

    International Nuclear Information System (INIS)

    Boulton, G. S.; Zatsepin, S.; Maillot, B.

    2001-03-01

    The large-scale pattern of subglacial groundwater flow beneath European ice sheets was analysed in a previous report, based on a two-dimensional flowline model. In this report, the analysis is extended to three dimensions by exploring the interactions between groundwater and tunnel flow. A theory is developed which suggests that the large-scale geometry of the hydraulic system beneath an ice sheet is a coupled, self-organising system. In this system, the pressure distribution along tunnels is a function of discharge derived from basal meltwater delivered to tunnels by groundwater flow, and the pressure along tunnels itself sets the base pressure that determines the geometry of catchments and flow towards the tunnel. The large-scale geometry of tunnel distribution is a product of the pattern of basal meltwater production and the transmissive properties of the bed. The tunnel discharge from the ice margin of the glacier, its seasonal fluctuation and the sedimentary characteristics of eskers are largely determined by the discharge of surface meltwater which penetrates to the bed in the terminal zone. The theory explains many of the characteristics of esker systems and can account for tunnel valleys. It is concluded that the large-scale hydraulic regime beneath ice sheets is largely a consequence of groundwater/tunnel flow interactions and that it is essentially similar to non-glacial hydraulic regimes. Experimental data from an Icelandic glacier, which demonstrate measured relationships between subglacial tunnel flow and groundwater flow during the transition from summer to winter for a modern glacier, and which support the general conclusions of the theory, are summarised in an appendix.

  7. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed assuming laminar blood flow with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume software package, coupled with Solidworks, a modeling software package, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including a calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models, e.g. T-branches and angle-shaped geometries, were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.
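
    A quick first-order check on CFD wall shear stress results of this kind is the analytic value for fully developed laminar (Poiseuille) pipe flow. The sketch below is illustrative only: the flow rate, viscosity and radius are hypothetical, typical-order aortic values, not data from this study.

```python
import math

def poiseuille_wss(flow_rate, viscosity, radius):
    """Wall shear stress for fully developed laminar pipe flow:
    tau_w = 4 * mu * Q / (pi * R^3)."""
    return 4.0 * viscosity * flow_rate / (math.pi * radius ** 3)

# Hypothetical aortic values: Q = 5 L/min, mu = 3.5 mPa*s, R = 1.25 cm
q = 5.0e-3 / 60.0    # volumetric flow rate, m^3/s
mu = 3.5e-3          # dynamic viscosity, Pa*s
r = 0.0125           # vessel radius, m
tau = poiseuille_wss(q, mu, r)   # wall shear stress in Pa, order 0.1-1 Pa
```

A CFD wall shear stress far outside this order of magnitude, for comparable bulk flow conditions, would suggest a meshing or boundary-condition problem.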

  8. Developing a tool for assessing competency in root cause analysis.

    Science.gov (United States)

    Gupta, Priyanka; Varkey, Prathibha

    2009-01-01

    Root cause analysis (RCA) is a tool for identifying the key cause(s) contributing to a sentinel event or near miss. Although training in RCA is gaining popularity in medical education, there is no published literature on valid or reliable methods for assessing competency in the same. A tool for assessing competency in RCA was pilot tested as part of an eight-station Objective Structured Clinical Examination that was conducted at the completion of a three-week quality improvement (QI) curriculum for the Mayo Clinic Preventive Medicine and Endocrinology fellowship programs. As part of the curriculum, fellows completed a QI project to enhance physician communication of the diagnosis and treatment plan at the end of a patient visit. They had a didactic session on RCA, followed by process mapping of the information flow at the project clinic, after which fellows conducted an actual RCA using the Ishikawa fishbone diagram. For the RCA competency assessment, fellows performed an RCA regarding a scenario describing an adverse medication event and provided possible solutions to prevent such errors in the future. All faculty strongly agreed or agreed that they were able to accurately assess competency in RCA using the tool. Interrater reliability for the global competency rating and checklist scoring were 0.96 and 0.85, respectively. Internal consistency (Cronbach's alpha) was 0.76. Six of eight of the fellows found the difficulty level of the test to be optimal. Assessment methods must accompany education programs to ensure that graduates are competent in QI methodologies and are able to apply them effectively in the workplace. The RCA assessment tool was found to be a valid, reliable, feasible, and acceptable method for assessing competency in RCA. Further research is needed to examine its predictive validity and generalizability.

  9. Tool Wear Monitoring Using Time Series Analysis

    Science.gov (United States)

    Song, Dong Yeul; Ohara, Yasuhiro; Tamaki, Haruo; Suga, Masanobu

    A tool wear monitoring approach considering the nonlinear behavior of the cutting mechanism caused by tool wear and/or localized chipping is proposed, and its effectiveness is verified through cutting experiments and actual turning machining. Moreover, the variation in the surface roughness of the machined workpiece is also discussed using this approach. In this approach, the residual error between the actually measured vibration signal and the estimated signal obtained from a time series model corresponding to the dynamic model of cutting is introduced as the diagnostic feature. Consequently, it is found that the early tool wear state (i.e. flank wear under 40µm) can be monitored, and that the optimal tool exchange time and the tool wear state in actual turning machining can be judged from the change in this residual error. Moreover, the variation of surface roughness Pz in the range of 3 to 8µm can be estimated by monitoring the residual error.
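
    The residual-error idea above can be sketched as follows: fit a time series (autoregressive) model to vibration from a sharp tool, then watch the residual RMS of new signals against that model; a signal whose dynamics have drifted (here mimicked by an added nonlinear term) produces a larger residual. The signals and coefficients below are synthetic assumptions, not the authors' data.

```python
import math, random

def fit_ar2(x):
    """Least-squares fit of x[t] = a1*x[t-1] + a2*x[t-2] via normal equations."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for t in range(2, len(x)):
        y, x1, x2 = x[t], x[t-1], x[t-2]
        s11 += x1*x1; s12 += x1*x2; s22 += x2*x2
        b1 += x1*y;  b2 += x2*y
    det = s11*s22 - s12*s12
    return (s22*b1 - s12*b2)/det, (s11*b2 - s12*b1)/det

def residual_rms(x, a1, a2):
    """RMS one-step prediction error of the AR(2) model on signal x."""
    errs = [x[t] - a1*x[t-1] - a2*x[t-2] for t in range(2, len(x))]
    return math.sqrt(sum(e*e for e in errs) / len(errs))

random.seed(0)
# "Sharp tool" reference signal: a stable noisy oscillation (true AR(2) process)
sharp = [0.0, 1.0]
for _ in range(500):
    sharp.append(1.6*sharp[-1] - 0.9*sharp[-2] + random.gauss(0, 0.05))
a1, a2 = fit_ar2(sharp)
baseline = residual_rms(sharp, a1, a2)

# "Worn tool" signal: same linear dynamics plus a nonlinear disturbance
worn = [0.0, 1.0]
for _ in range(500):
    worn.append(1.6*worn[-1] - 0.9*worn[-2]
                + 0.2*math.sin(3*worn[-1]) + random.gauss(0, 0.05))
rw = residual_rms(worn, a1, a2)
# rw > baseline flags departure from the sharp-tool model
```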

  10. Microscopic analysis of hopper flow with ellipsoidal particles

    Science.gov (United States)

    Liu, Sida; Zhou, Zongyan; Zou, Ruiping; Pinson, David; Yu, Aibing

    2013-06-01

    Hoppers are widely used in process industries. With such widespread application, difficulties in achieving desired operational behaviors have led to extensive experimental and mathematical studies in the past decades. In particular, the discrete element method has become one of the most important simulation tools for design and analysis. So far, most studies have used spherical particles for computational convenience. In this work, ellipsoidal particles are used, as they can represent a wide range of particle shapes. Hopper flow with ellipsoidal particles is presented, highlighting the effect of particle shape on the microscopic properties.

  11. Numerical flow analysis of axial flow compressor for steady and unsteady flow cases

    Science.gov (United States)

    Prabhudev, B. M.; Satish kumar, S.; Rajanna, D.

    2017-07-01

    The performance of a jet engine depends on the performance of its compressor. This paper presents a numerical study of the performance characteristics of an axial compressor. The test rig is located at the CSIR laboratory, Bangalore. Flow domains are meshed and the fluid dynamic equations are solved using the ANSYS package. Analysis is carried out for six different speeds and for operating conditions such as choke, maximum efficiency, and near-stall points. Different plots are compared and the results are discussed. Shock displacement, vortex flows and leakage patterns are presented, along with an unsteady FFT plot and a time-step plot.

  12. Ball Bearing Analysis with the ORBIS Tool

    Science.gov (United States)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life, all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code correlates closely with reference predictions of bearing internal load distributions, stiffness, deflection and stresses.

  13. A methodology for online visualization of the energy flow in a machine tool

    DEFF Research Database (Denmark)

    Mohammadi, Ali; Züst, Simon; Mayr, Josef

    2017-01-01

    the machining process and by this increasing its energy efficiency. This study intends to propose a method capable of real-time monitoring of the entire set of energy flows in a CNC machine tool, including motors, pumps and cooling fluid. The structure of this approach is based on categorizing...

  14. Deep Packet/Flow Analysis using GPUs

    Energy Technology Data Exchange (ETDEWEB)

    Gong, Qian [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Wu, Wenji [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); DeMar, Phil [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2017-11-12

    Deep packet inspection (DPI) faces severe performance challenges in high-speed networks (40/100 GE) as it requires a large amount of raw computing power and high I/O throughput. Recently, researchers have tentatively used GPUs to address these issues and boost the performance of DPI. Typically, DPI applications involve highly complex operations at both the per-packet and per-flow data level, often in real time. The parallel architecture of GPUs fits exceptionally well for per-packet network traffic processing. However, for stateful network protocols such as TCP, the data stream needs to be reconstructed at the flow level to deliver a consistent content analysis. Since the flow-centric operations are not naturally parallelizable and often require large memory space for buffering out-of-sequence packets, they can be problematic for GPUs, whose memory is normally limited to several gigabytes. In this work, we present a highly efficient GPU-based deep packet/flow analysis framework. The proposed design includes purely GPU-implemented flow tracking and TCP stream reassembly. Instead of buffering and waiting for TCP packets to become in-sequence, our framework processes the packets in batches and uses a deterministic finite automaton (DFA) with a prefix-/suffix-tree method to detect patterns across out-of-sequence packets that happen to be located in different batches. Evaluation shows that our code can reassemble and forward tens of millions of packets per second and conduct stateful signature-based deep packet inspection at 55 Gbit/s using an NVIDIA K40 GPU.
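
    The batch-spanning pattern matching described above can be illustrated (in Python rather than on a GPU, with a hypothetical signature, and using a plain KMP automaton in place of the paper's prefix-/suffix-tree method) by keeping a per-flow automaton state alive between batches:

```python
PATTERN = b"EVIL"   # hypothetical signature, not from the paper

def build_failure(p):
    """KMP failure function: longest proper prefix of p[:i+1] that is a suffix."""
    fail, k = [0]*len(p), 0
    for i in range(1, len(p)):
        while k > 0 and p[i] != p[k]:
            k = fail[k-1]
        if p[i] == p[k]:
            k += 1
        fail[i] = k
    return fail

FAIL = build_failure(PATTERN)

def scan(state, chunk):
    """Advance the match automaton over chunk; return (new_state, matched?)."""
    hit = False
    for b in chunk:
        while state > 0 and b != PATTERN[state]:
            state = FAIL[state-1]
        if b == PATTERN[state]:
            state += 1
        if state == len(PATTERN):
            hit = True
            state = FAIL[state-1]
    return state, hit

# Per-flow state survives batch boundaries: "EV" arrives in one batch, "IL" in the next.
flow_state = 0
flow_state, hit1 = scan(flow_state, b"...EV")
flow_state, hit2 = scan(flow_state, b"IL...")
# hit2 is True even though neither batch alone contains the full signature
```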

  15. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  16. Cross-flow analysis of injection wells in a multilayered reservoir

    Directory of Open Access Journals (Sweden)

    Mohammadreza Jalali

    2016-09-01

    Natural and forced cross-flow is modeled for some injection wells in an oil reservoir located in the North Sea. The solution uses a transient implicit finite difference approach for multiple sand layers with different permeabilities separated by impermeable shale layers. Natural and forced cross-flow rates for each reservoir layer during shut-in are calculated and compared with different production logging tool (PLT) measurements. It appears that forced cross-flow is usually more prolonged and subject to a higher flow rate than natural cross-flow, and is thus worthy of more detailed analysis.
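
    The natural cross-flow balance during shut-in can be sketched with a simple steady radial Darcy model: each layer exchanges fluid with the wellbore in proportion to a transmissibility, and the shut-in wellbore pressure settles where the layer flows cancel. All layer properties below are hypothetical, and the steady model is a simplification of the paper's transient finite-difference solution.

```python
import math

def transmissibility(k, h, mu, rw, re):
    """Steady radial Darcy transmissibility T such that q = T*(p_layer - p_well),
    from q = 2*pi*k*h*(p_layer - p_well) / (mu * ln(re/rw))."""
    return 2.0 * math.pi * k * h / (mu * math.log(re / rw))

# Hypothetical sand layers: (permeability m^2, thickness m, layer pressure Pa)
layers = [(5e-13, 4.0, 2.10e7),
          (1e-13, 6.0, 2.00e7),
          (3e-13, 3.0, 1.95e7)]
mu, rw, re = 1e-3, 0.1, 100.0   # viscosity Pa*s, well and drainage radii m

T = [transmissibility(k, h, mu, rw, re) for k, h, _ in layers]
# During shut-in, the wellbore pressure settles where the layer flows balance:
p_well = sum(t * p for t, (_, _, p) in zip(T, layers)) / sum(T)
q = [t * (p - p_well) for t, (_, _, p) in zip(T, layers)]  # + into well, - out
# Natural cross-flow: the high-pressure layer feeds the low-pressure ones
```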

  17. Flow analysis of the ophthalmic artery

    Energy Technology Data Exchange (ETDEWEB)

    Harada, Kuniaki; Hashimoto, Masato; Bandoh, Michio; Odawara, Yoshihiro; Kamagata, Masaki; Shirase, Ryuji [Sapporo Medical Univ. (Japan). Hospital

    2003-02-01

    The purpose of this study was to analyze the hemodynamics of ophthalmic artery flow using phase contrast MR angiography (PC-MRA). A total of 14 eyes from 10 normal volunteers and a patient with normal tension glaucoma (NTG) were analyzed. The optimal conditions were repetition time (TR)/echo time (TE)/flip angle (FA)/nex=40 ms/minimum/90 deg/2, field of view (FOV)=6 cm, matrix size=256 x 256. The resistive index (RI) and pulsatility index (PI) values were significantly elevated in the patient with NTG compared to the control group. We therefore believe that PC-MRA may be a useful clinical tool for the assessment of the mechanism of NTG. (author)
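
    The two flow indices reported above have standard Doppler-style definitions in terms of peak systolic velocity (PSV), end-diastolic velocity (EDV) and the time-averaged mean velocity; they can be sketched directly. The velocity values below are hypothetical, not measurements from this study.

```python
def resistive_index(psv, edv):
    """RI = (peak systolic velocity - end-diastolic velocity) / peak systolic velocity."""
    return (psv - edv) / psv

def pulsatility_index(psv, edv, mean_v):
    """PI = (PSV - EDV) / time-averaged mean velocity."""
    return (psv - edv) / mean_v

# Hypothetical ophthalmic-artery velocities in cm/s
ri = resistive_index(35.0, 8.0)            # (35-8)/35
pi = pulsatility_index(35.0, 8.0, 17.0)    # (35-8)/17
```

Higher values of either index indicate greater downstream vascular resistance, which is why elevation in the NTG patient is of interest.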

  18. Forensic analysis of video steganography tools

    Directory of Open Access Journals (Sweden)

    Thomas Sloan

    2015-05-01

    Full Text Available Steganography is the art and science of concealing information in such a way that only the sender and intended recipient of a message are aware of its presence. Digital steganography has been used in the past on a variety of media including executable files, audio, text, games and, notably, images. Additionally, there is increasing research interest in the use of video as a medium for steganography, due to its pervasive nature and diverse embedding capabilities. In this work, we examine the embedding algorithms and other security characteristics of several video steganography tools. We show that all of them feature basic and severe security weaknesses. This is potentially a very serious threat to the security, privacy and anonymity of their users. It is important to highlight that most steganography users have perfectly legal and ethical reasons to employ it. Some common scenarios would include citizens in oppressive regimes whose freedom of speech is compromised, people trying to avoid massive surveillance or censorship, political activists, whistleblowers, journalists, etc. As a result of our findings, we strongly recommend ceasing any use of these tools and removing any contents that may have been hidden, and any carriers stored, exchanged and/or uploaded online. For many of these tools, carrier files will be trivial to detect, potentially compromising any hidden data and the parties involved in the communication. We finish this work by presenting our steganalytic results, which highlight a very poor current state of the art in practical video steganography tools. There is unfortunately a complete lack of secure and publicly available tools, and even commercial tools offer very poor security. We therefore encourage the steganography community to work towards the development of more secure and accessible video steganography tools, and to make them available to the general public.
The results presented in this work can also be seen as a useful

  19. Development of the GO-FLOW reliability analysis methodology for nuclear reactor system

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Kobayashi, Michiyuki

    1994-01-01

    Probabilistic Safety Assessment (PSA) is important in the safety analysis of technological systems and processes, such as nuclear plants, chemical and petroleum facilities, and aerospace systems. Event trees and fault trees are the basic analytical tools that have been most frequently used for PSAs. Several system analysis methods can be used in addition to, or in support of, the event- and fault-tree analysis. The need for more advanced methods of system reliability analysis has grown with the increased complexity of engineered systems. The Ship Research Institute has been developing a new reliability analysis methodology, GO-FLOW, which is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. The research was supported by the special research fund for Nuclear Technology, Science and Technology Agency, from 1989 to 1994. This paper describes the concept of Probabilistic Safety Assessment (PSA), an overview of various system analysis techniques, an overview of the GO-FLOW methodology, the GO-FLOW analysis support system, the procedure for treating a phased mission problem, a function for common cause failure analysis, a function for uncertainty analysis, a function for common cause failure analysis with uncertainty, and a system for printing out the results of GO-FLOW analysis in figure or table form. These functions are illustrated by analyzing sample systems, such as a PWR AFWS and a BWR ECCS. In the appendices, the structure of the GO-FLOW analysis programs and the meaning of the main variables defined in the GO-FLOW programs are described. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. With the development of the total GO-FLOW system, this methodology has become a powerful tool in a living PSA. (author) 54 refs

  20. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    DEFF Research Database (Denmark)

    Lorentzen, Torsten; Blanke, Mogens

    2004-01-01

    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended for use as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system’s…… The list of such variables and functional relations constitutes the system’s structure graph. Normal operation means all functional relations are intact. Should faults occur, one or more functional relations cease to be valid; in a structure graph, this is seen as the disappearance of one or more nodes of the graph. SaTool analyses the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of the possibility to diagnose faults and the ability to make autonomous recovery should faults occur.

  1. Steam Generator Analysis Tools and Modeling of Degradation Mechanisms

    International Nuclear Information System (INIS)

    Yetisir, M.; Pietralik, J.; Tapping, R.L.

    2004-01-01

    The degradation of steam generators (SGs) has a significant effect on nuclear heat transport system effectiveness and on the lifetime and overall efficiency of a nuclear power plant. Hence, quantification of the effects of degradation mechanisms is an integral part of a SG degradation management strategy. Numerical analysis tools such as THIRST, a 3-dimensional (3D) thermal-hydraulics code for recirculating SGs; SLUDGE, a 3D sludge prediction code; CHECWORKS, a flow-accelerated corrosion prediction code for nuclear piping; PIPO-FE, a SG tube vibration code; and VIBIC and H3DMAP, 3D non-linear finite-element codes to predict SG tube fretting wear, can be used to assess the impacts of various maintenance activities on SG thermal performance. These tools are also invaluable at the design stage, where they influence the design by determining margins or by helping the designers minimize or avoid known degradation mechanisms. In this paper, the aforementioned numerical tools and their application to degradation mechanisms in CANDU recirculating SGs are described. In addition, the following degradation mechanisms are identified and their effects on SG thermal efficiency and lifetime are quantified: primary-side fouling, secondary-side fouling, fretting wear, and flow-accelerated corrosion (FAC). Primary-side tube inner-diameter fouling has been a major contributor to SG thermal degradation. Using the results of thermal-hydraulic analysis and field data, fouling margins are calculated. The individual effects of primary- and secondary-side fouling are separated through analyses, which allow station operators to decide what type of maintenance activity to perform and when to perform it. Prediction of the fretting-wear rate of tubes allows designers to decide on the number and locations of support plates and U-bend supports.
The prediction of FAC rates for SG internals allows designers to select proper materials, and allows operators to adjust the SG maintenance

  2. Usefulness of the automatic quantitative estimation tool for cerebral blood flow: clinical assessment of the application software tool AQCEL.

    Science.gov (United States)

    Momose, Mitsuhiro; Takaki, Akihiro; Matsushita, Tsuyoshi; Yanagisawa, Shin; Yano, Kesato; Miyasaka, Tadashi; Ogura, Yuka; Kadoya, Masumi

    2011-01-01

    AQCEL enables automatic reconstruction of single-photon emission computed tomography (SPECT) images without image degradation, and quantitative analysis of cerebral blood flow (CBF) after the input of simple parameters. We ascertained the usefulness and quality of images obtained by the application software AQCEL in clinical practice. Twelve patients underwent brain perfusion SPECT using technetium-99m ethyl cysteinate dimer at rest and after acetazolamide (ACZ) loading. Images reconstructed using AQCEL were compared with those reconstructed using the conventional filtered back projection (FBP) method for qualitative estimation. Two experienced nuclear medicine physicians interpreted the image quality using the following visual scores: 0, same; 1, slightly superior; 2, superior. For quantitative estimation, the mean CBF values of the normal hemisphere of the 12 patients using ACZ calculated by the AQCEL method were compared with those calculated by the conventional method. The CBF values of the 24 regions of the 3-dimensional stereotaxic region of interest template (3DSRT) calculated by the AQCEL method at rest and after ACZ loading were compared to those calculated by the conventional method. No significant qualitative difference was observed between the AQCEL and conventional FBP methods in the rest study. The average score by the AQCEL method was 0.25 ± 0.45 and that by the conventional method was 0.17 ± 0.39 (P = 0.34). There was a significant qualitative difference between the AQCEL and conventional methods in the ACZ loading study. The average score for AQCEL was 0.83 ± 0.58 and that for the conventional method was 0.08 ± 0.29 (P = 0.003). During quantitative estimation using ACZ, the mean CBF values of 12 patients calculated by the AQCEL method were 3-8% higher than those calculated by the conventional method. The square of the correlation coefficient between these methods was 0.995. While comparing the 24 3DSRT regions of 12 patients, the squares of the correlation
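
    The squared correlation coefficient used above to compare the two reconstruction methods is the square of Pearson's r over paired regional CBF values. A minimal sketch, with hypothetical CBF values standing in for the study's data (the second series is made roughly 3-8% higher, mirroring the reported offset):

```python
def r_squared(x, y):
    """Square of Pearson's correlation coefficient between paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

# Hypothetical regional CBF values (ml/100 g/min) from the two methods
conventional = [42.0, 48.5, 51.0, 39.8, 55.2, 46.1]
aqcel        = [44.1, 50.9, 53.6, 41.7, 58.0, 48.4]   # ~5% higher throughout
r2 = r_squared(conventional, aqcel)
```

Note that a high r² indicates the two methods rank regions consistently even when one carries a systematic offset, which is exactly the pattern the abstract reports.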

  3. FlowMax: A Computational Tool for Maximum Likelihood Deconvolution of CFSE Time Courses.

    Directory of Open Access Journals (Sweden)

    Maxim Nikolaievich Shokhirev

    Full Text Available The immune response is a concerted dynamic multi-cellular process. Upon infection, the dynamics of lymphocyte populations are an aggregate of molecular processes that determine the activation, division, and longevity of individual cells. The timing of these single-cell processes is remarkably widely distributed with some cells undergoing their third division while others undergo their first. High cell-to-cell variability and technical noise pose challenges for interpreting popular dye-dilution experiments objectively. It remains an unresolved challenge to avoid under- or over-interpretation of such data when phenotyping gene-targeted mouse models or patient samples. Here we develop and characterize a computational methodology to parameterize a cell population model in the context of noisy dye-dilution data. To enable objective interpretation of model fits, our method estimates fit sensitivity and redundancy by stochastically sampling the solution landscape, calculating parameter sensitivities, and clustering to determine the maximum-likelihood solution ranges. Our methodology accounts for both technical and biological variability by using a cell fluorescence model as an adaptor during population model fitting, resulting in improved fit accuracy without the need for ad hoc objective functions. We have incorporated our methodology into an integrated phenotyping tool, FlowMax, and used it to analyze B cells from two NFκB knockout mice with distinct phenotypes; we not only confirm previously published findings at a fraction of the expended effort and cost, but reveal a novel phenotype of nfkb1/p105/50 in limiting the proliferative capacity of B cells following B-cell receptor stimulation. In addition to complementing experimental work, FlowMax is suitable for high throughput analysis of dye dilution studies within clinical and pharmacological screens with objective and quantitative conclusions.
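
    The dye-dilution principle FlowMax builds on is that CFSE fluorescence roughly halves with each cell division, so generation g peaks near f0/2**g. A minimal sketch of that backbone, ignoring the cell-to-cell variability and measurement noise that FlowMax models explicitly (the starting fluorescence is a hypothetical value):

```python
import math

def generation_peaks(f0, n_gens):
    """CFSE fluorescence halves at each division: generation g peaks at f0 / 2**g."""
    return [f0 / 2**g for g in range(n_gens)]

def assign_generation(f, f0):
    """Nearest-generation assignment on a log2 scale (noise-free idealization)."""
    return round(math.log2(f0 / f))

peaks = generation_peaks(1024.0, 5)       # starting fluorescence f0 = 1024 (hypothetical)
gen = assign_generation(260.0, 1024.0)    # a cell near the second-division peak
```

In real data the peaks overlap, which is why FlowMax fits a full fluorescence model rather than assigning generations by nearest peak.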

  4. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  5. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  6. Tool Supported Analysis of Web Services Protocols

    DEFF Research Database (Denmark)

    Marques, Abinoam P.; Ravn, Anders Peter; Srba, Jiri

    2011-01-01

    We describe an abstract protocol model suitable for modelling of web services and other protocols communicating via unreliable, asynchronous communication channels. The model is supported by a tool chain where the first step translates tables with state/transition protocol descriptions, often used...... e.g. in the design of web services protocols, into an intermediate XML format. We further translate this format into a network of communicating state machines directly suitable for verification in the model checking tool UPPAAL. We introduce two types of communication media abstractions in order...

  7. SIMMER as a safety analysis tool

    International Nuclear Information System (INIS)

    Smith, L.L.; Bell, C.R.; Bohl, W.R.; Bott, T.F.; Dearing, J.F.; Luck, L.B.

    1982-01-01

    SIMMER has been used for numerous applications in fast reactor safety, encompassing both accident and experiment analysis. Recent analyses of transition-phase behavior in potential core disruptive accidents have integrated SIMMER testing with the accident analysis. Results of both the accident analysis and the verification effort are presented as a comprehensive safety analysis program

  8. 5D Task Analysis Visualization Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  9. 5D Task Analysis Visualization Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  10. Lagrangian structure of flows in the Chesapeake Bay: challenges and perspectives on the analysis of estuarine flows

    Directory of Open Access Journals (Sweden)

    M. Branicki

    2010-03-01

    Full Text Available In this work we discuss applications of Lagrangian techniques to study transport properties of flows generated by shallow water models of estuarine flows. We focus on the flow in the Chesapeake Bay generated by Quoddy (see Lynch and Werner, 1991), a finite-element shallow water model adapted to the bay by Gross et al. (2001). The main goal of this analysis is to outline the potential benefits of using Lagrangian tools both for understanding transport properties of such flows and for validating the model output and identifying model deficiencies. We argue that the currently available 2-D Lagrangian tools, including the stable and unstable manifolds of hyperbolic trajectories and techniques exploiting 2-D finite-time Lyapunov exponent fields, are of limited use in the case of partially mixed estuarine flows. A further development and efficient implementation of three-dimensional Lagrangian techniques, as well as improvements in the shallow-water modelling of 3-D velocity fields, are required for reliable transport analysis in such flows. Some aspects of the 3-D trajectory structure in the Chesapeake Bay, based on the Quoddy output, are also discussed.

  11. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov (United States)

    NREL developed the following modeling, simulation, and analysis tools to investigate novel design goals (e.g., fuel economy versus performance) to find cost-competitive solutions. ADOPT Vehicle Simulator to analyze the performance and fuel economy of conventional and advanced light- and

  12. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    Science.gov (United States)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraph models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be
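
    The two digraph queries described above, effects of a failure set and possible causes of a symptom, reduce in the simplest case to forward and reverse reachability over the failure-propagation graph. The sketch below is a toy illustration with hypothetical node names; it ignores the AND/OR redundancy logic a real FEAT digraph encodes, so it treats every edge as an unconditional propagation path.

```python
from collections import deque

# Toy digraph: edge u -> v means "failure of u propagates to v" (hypothetical nodes)
EDGES = {
    "power":   ["pump_A", "pump_B"],
    "pump_A":  ["valve_1"],
    "pump_B":  ["valve_1"],
    "valve_1": ["engine"],
    "engine":  [],
}

def effects(failed):
    """All nodes reachable downstream from the failed set (failure effects)."""
    seen, queue = set(failed), deque(failed)
    while queue:
        for v in EDGES.get(queue.popleft(), []):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def possible_causes(symptom):
    """All nodes from which the symptom is reachable (candidate causes)."""
    reverse = {u: [] for u in EDGES}
    for u, vs in EDGES.items():
        for v in vs:
            reverse[v].append(u)
    seen, queue = {symptom}, deque([symptom])
    while queue:
        for u in reverse[queue.popleft()]:
            if u not in seen:
                seen.add(u)
                queue.append(u)
    return seen

eff = effects({"power"})            # a power failure reaches both pumps, the valve, the engine
causes = possible_causes("engine")  # every upstream node could explain an engine failure
```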

  13. Analysis of magnetohydrodynamic flow in annular duct

    International Nuclear Information System (INIS)

    Yoo, G.J.; Choi, H.K.; Eun, J.J.

    2004-01-01

    In various types of reactors, fluid must be circulated inside the vessel to serve as an efficient coolant. For a flowing metal coolant, an electromagnetic pump can be an efficient device for providing the driving force. Numerical analysis is performed for the magnetic and magnetohydrodynamic (MHD) flow fields in an electromagnetic pump. A finite volume method is applied to solve the governing equations of the magnetic field and the Navier-Stokes equations. Vector and scalar potential methods are adopted to obtain the electric and magnetic fields, and the resulting Lorentz force, in solving the Maxwell equations. The magnetic field and velocity distributions are found to be affected by the phase of the applied electric current and the magnitude of the Reynolds number. Computational results indicate that the magnetic flux distribution with changing phase of the input electric current is characterized by pairs of counter-rotating closed loops. The axial velocity distributions show S-type profiles in flows dominated by the r-direction Lorentz force. (authors)
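
    The paper solves the coupled field equations numerically, but the flattening effect of the Lorentz force on a duct velocity profile is captured by the classic analytic Hartmann solution for fully developed MHD flow between parallel plates, sketched here as a point of reference (not the paper's method):

```python
import math

def hartmann_profile(y, a=1.0, Ha=10.0):
    """Normalized velocity u(y)/u(0) for fully developed flow between
    parallel plates at y = -a, +a under a transverse magnetic field.
    Ha is the Hartmann number; this is the textbook analytic result."""
    return (math.cosh(Ha) - math.cosh(Ha * y / a)) / (math.cosh(Ha) - 1.0)
```

    Increasing Ha flattens the core of the profile and confines the shear to thin Hartmann layers at the walls, which is the qualitative behaviour MHD duct computations must reproduce.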

  14. Analysis of anisotropic shells containing flowing fluid

    International Nuclear Information System (INIS)

    Lakis, A.A.

    1983-01-01

    A general theory for the dynamic analysis of anisotropic thin cylindrical shells containing flowing fluid is presented. The shell may be uniform or non-uniform, provided it is geometrically axially symmetric. This is a finite-element theory, using cylindrical finite elements, but the displacement functions are determined by using classical shell theory. A new solution of the wave equation of the liquid finite element leads to an expression of the fluid pressure, p, as a function of the nodal displacements of the element and three operative forces (inertia, centrifugal and Coriolis) of the moving fluid. (Author) [pt

  15. Analysis of logging data from nuclear borehole tools

    International Nuclear Information System (INIS)

    Hovgaard, J.; Oelgaard, P.L.

    1989-12-01

    The processing procedure for logging data from a borehole of the Stenlille project of Dansk Naturgas A/S has been analysed. The tools considered in the analysis were an integral natural-gamma tool, a neutron porosity tool, a gamma density tool and a caliper tool. It is believed that in most cases the processing procedure used by the logging company in the interpretation of the raw data is fully understood. An exception is the epithermal part of the neutron porosity tool, for which not all the data needed for an interpretation were available. The analysis has shown that some parts of the interpretation procedure may not be consistent with the physical principles of the tools. (author)

  16. Nutrition screening tools: an analysis of the evidence.

    Science.gov (United States)

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and four tools (the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and the Malnutrition Universal Screening Tool (MUST)) received a grade II. The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.

  17. Structural analysis of ITER sub-assembly tools

    International Nuclear Information System (INIS)

    Nam, K.O.; Park, H.K.; Kim, D.J.; Ahn, H.J.; Lee, J.H.; Kim, K.K.; Im, K.; Shaw, R.

    2011-01-01

    The ITER Tokamak assembly tools are purpose-built tools for completing the ITER Tokamak machine, which includes the cryostat and the components contained therein. The sector sub-assembly tools described in this paper are the main assembly tools used to assemble the vacuum vessel, thermal shield and toroidal field coils into a complete 40° sector. The 40° sector sub-assembly tools comprise the sector sub-assembly tool itself, including the radial beam, the vacuum vessel supports and the mid-plane brace tools. These tools must have sufficient strength to transport and handle the heavy ITER Tokamak machine components, whose weight reaches several hundred tons. Therefore, the tools were designed and analyzed to confirm both strength and structural stability, even under conservative assumptions. To verify the structural stability of the sector sub-assembly tools in terms of strength and deflection, the ANSYS code was used for linear static analysis. The results of the analysis show that these tools are designed with sufficient strength and stiffness. The conceptual designs of these tools are also briefly described in this paper.

  18. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
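
    The flavor of such a likelihood ratio comparison can be sketched with normal models: test whether lab-vs-field comparison scores and lab-vs-lab comparison scores share a common mean. This is a generic textbook construction with made-up scores, not the statistic developed in the thesis.

```python
import math

def lr_stat(a, b):
    """2*log likelihood ratio for 'separate means' vs 'common mean'
    under normal models with a shared, profiled-out variance."""
    n = len(a) + len(b)
    mu_a, mu_b = sum(a) / len(a), sum(b) / len(b)
    mu0 = (sum(a) + sum(b)) / n
    ss1 = sum((x - mu_a) ** 2 for x in a) + sum((x - mu_b) ** 2 for x in b)
    ss0 = sum((x - mu0) ** 2 for x in a + b)
    return n * math.log(ss0 / ss1)

# Made-up similarity scores for illustration only.
lab_lab = [0.90, 0.85, 0.92, 0.88]    # lab mark vs lab mark
lab_field = [0.30, 0.35, 0.28, 0.40]  # lab mark vs field mark
```

    A large statistic (here compared to the chi-square cutoff with one degree of freedom) indicates the field mark behaves differently from repeated lab marks, i.e. evidence against a match.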

  19. The physics analysis tools project for the ATLAS experiment

    International Nuclear Information System (INIS)

    Lenzi, Bruno

    2012-01-01

    The Large Hadron Collider is expected to start colliding proton beams in 2009. The enormous amount of data produced by the ATLAS experiment (≅1 PB per year) will be used in searches for the Higgs boson and physics beyond the Standard Model. To meet this challenge, a suite of common Physics Analysis Tools has been developed as part of the Physics Analysis software project. These tools run within the ATLAS software framework, ATHENA, and cover a wide range of applications. There are tools responsible for event selection based on analysed data and detector quality information, tools responsible for specific physics analysis operations including data quality monitoring and physics validation, and complete analysis tool-kits (frameworks) whose goal is to help physicists perform their analyses while hiding the details of the ATHENA framework. (authors)

  20. Rule-Based Multidisciplinary Tool for Unsteady Reacting Real-Fluid Flows, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Loci-STREAM is a CFD-based, multidisciplinary, high-fidelity design and analysis tool resulting from Phase I work whose objectives were: (a) to demonstrate the...

  1. Graphical Acoustic Liner Design and Analysis Tool

    Science.gov (United States)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
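
    A minimal version of such a transmission line calculation is the normalized surface impedance of a hard-backed, lossless air channel, which goes resonant (zero reactance) at the quarter-wavelength depth. This is illustrative only; the tool described adds losses and handles irregular channel geometry.

```python
import math

def cavity_reactance(freq_hz, depth_m, c=343.0):
    """Normalized surface reactance of a lossless, hard-backed channel
    from the 1-D transmission line model: z = -j*cot(k*L).
    The lossless resistance is zero, so only the imaginary part is
    returned. k = 2*pi*f/c is the free-space wavenumber."""
    k = 2.0 * math.pi * freq_hz / c
    return -1.0 / math.tan(k * depth_m)
```

    At the quarter-wave depth L = c/(4f), the reactance crosses zero and the liner absorbs most strongly; shallower channels look stiffness-like (negative reactance).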

  2. A biological tool to assess flow connectivity in reference temporary streams from the Mediterranean Basin.

    Science.gov (United States)

    Cid, N; Verkaik, I; García-Roger, E M; Rieradevall, M; Bonada, N; Sánchez-Montoya, M M; Gómez, R; Suárez, M L; Vidal-Abarca, M R; Demartini, D; Buffagni, A; Erba, S; Karaouzas, I; Skoulikidis, N; Prat, N

    2016-01-01

    Many streams in the Mediterranean Basin have temporary flow regimes. While the timing of seasonal drought is predictable, they undergo strong inter-annual variability in flow intensity. This high hydrological variability and the associated ecological responses challenge the ecological status assessment of temporary streams, particularly when setting reference conditions. This study examined the effects of flow connectivity on aquatic macroinvertebrates from seven reference temporary streams across the Mediterranean Basin where hydrological variability and flow conditions are well studied. We tested for the effect of flow cessation on two streamflow indices and on community composition, and, by performing random forest and classification tree analyses, we identified important biological predictors for classifying the aquatic state as either flowing or disconnected pools. Flow cessation was critical for one of the streamflow indices studied and for community composition. Macroinvertebrate families found to be important for classifying the aquatic state were Hydrophilidae, Simuliidae, Hydropsychidae, Planorbiidae, Heptageniidae and Gerridae. For biological traits, trait categories associated with feeding habits, food, locomotion and substrate relation were the most important and provided more accurate predictions than taxonomy. A combination of selected metrics and associated thresholds based on the most important biological predictors (i.e. the Bio-AS Tool) was proposed for assessing the aquatic state in reference temporary streams, especially in the absence of hydrological data. Although further development is needed, the tool can be of particular interest for monitoring, restoration and conservation purposes, representing an important step towards adequate management of temporary rivers, not only in the Mediterranean Basin but also in other regions vulnerable to the effects of climate change. Copyright © 2015 Elsevier B.V. All rights reserved.
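
    The kind of rule a classification tree analysis produces can be sketched as follows. The families echo those named in the abstract, but the thresholds and branch order here are invented for illustration; the actual Bio-AS Tool metrics and thresholds are given in the paper.

```python
def aquatic_state(sample):
    """Toy decision tree classifying a macroinvertebrate sample
    (family -> abundance mapping) as 'flowing' or 'disconnected pools'.
    All thresholds are hypothetical."""
    # Rheophilic (current-dwelling) filter feeders indicate flow.
    if sample.get("Simuliidae", 0) > 0 or sample.get("Hydropsychidae", 0) > 2:
        return "flowing"
    # Surface skaters together with lentic snails suggest standing water.
    if sample.get("Gerridae", 0) > 0 and sample.get("Planorbiidae", 0) > 0:
        return "disconnected pools"
    return "flowing" if sample.get("Heptageniidae", 0) > 0 else "disconnected pools"
```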

  3. Lean production tools and decision latitude enable conditions for innovative learning in organizations: a multilevel analysis.

    Science.gov (United States)

    Fagerlind Ståhl, Anna-Carin; Gustavsson, Maria; Karlsson, Nadine; Johansson, Gun; Ekberg, Kerstin

    2015-03-01

    The effect of lean production on conditions for learning is debated. This study aimed to investigate how tools inspired by lean production (standardization, resource reduction, visual monitoring, housekeeping, value flow analysis) were associated with an innovative learning climate and with collective dispersion of ideas in organizations, and whether decision latitude contributed to these associations. A questionnaire was sent out to employees in public, private, production and service organizations (n = 4442). Multilevel linear regression analyses were used. Use of lean tools and decision latitude were positively associated with an innovative learning climate and collective dispersion of ideas. A low degree of decision latitude was a modifier in the association with collective dispersion of ideas. Lean tools can enable shared understanding and collective spreading of ideas, needed for the development of work processes, especially when decision latitude is low. Value flow analysis played a pivotal role in the associations. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. A study of grout flow pattern analysis

    International Nuclear Information System (INIS)

    Lee, S. Y.; Hyun, S.

    2013-01-01

    A new disposal unit, designated Salt Disposal Unit no. 6 (SDU6), is being designed in support of site accelerated closure goals and the salt nuclear waste projections identified in the new Liquid Waste System plan. The unit is a cylindrical disposal vault, 380 ft in diameter and 43 ft in height, with a capacity of about 30 million gallons. The primary objective was to develop a computational model and evaluate the flow patterns of the grout material in SDU6 as a function of the elevation of the grout discharge port and the slurry rheology. A Bingham plastic model was used to represent the grout flow behavior. A two-phase modeling approach was taken, which assumes that the air-grout interface determines the shape of the accumulation mound. The results of this study were used to develop design guidelines for the discharge ports of the Saltstone feed materials in the SDU6 facility. The modeling study focused on estimating the domain size of the grout materials spreading radially on the facility floor under the baseline modeling conditions, performing a sensitivity analysis with respect to the baseline design and operating conditions (elevation of the discharge port, discharge pipe diameter, and grout properties), and determining the changes in grout density as related to grout drop height. An axisymmetric two-phase modeling method was used for computational efficiency. Based on the nominal design and operating conditions, a transient computational approach was taken to compute flow fields driven mainly by pumping inertia and gravity. The detailed solution methodology and analysis results are discussed here.
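
    The Bingham plastic constitutive law used for the grout can be written with a Papanastasiou-style regularization so that it remains well behaved at zero shear rate, as is common in CFD implementations. The parameter values below are placeholders, not the Saltstone grout properties.

```python
import math

def bingham_stress(gamma_dot, tau_y=20.0, mu_p=0.1, m=1000.0):
    """Shear stress (Pa) of a regularized Bingham plastic:
    tau = tau_y*(1 - exp(-m*gamma_dot)) + mu_p*gamma_dot,
    where tau_y is the yield stress (Pa), mu_p the plastic viscosity
    (Pa*s), and m (s) the regularization parameter."""
    return tau_y * (1.0 - math.exp(-m * gamma_dot)) + mu_p * gamma_dot
```

    As the shear rate goes to zero the stress vanishes smoothly instead of jumping to the yield stress, while at high shear rates the ideal Bingham line tau_y + mu_p*gamma_dot is recovered.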

  5. DECSERVIS-2: A tool for natural decay series mass flow simulation

    International Nuclear Information System (INIS)

    Azzam, Saad; Suksi, Juhani; Ammann, Michael

    2009-01-01

    Since the publication of 'DECSERVIS: a tool for radioactive decay series visualisation', we have further developed the DECSERVIS software. With the new tool, DECSERVIS-2, one can simulate radioactive decay chains in open systems, i.e. when nuclide concentrations also change due to mass flows. Decay chains can be simulated under continuous and successive nuclide mass flow events into and out of the system, over freely chosen time intervals. Simulation output for the entire decay chain (nuclide activity, mass, number of nuclides, nuclide ratios) can be presented as a function of time in various graphical forms, such as solid curve and column diagrams or animation. In this paper we introduce DECSERVIS-2 and demonstrate its use with simulation examples. DECSERVIS-2 is easy to use and has been designed with the demands of teaching in mind.
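
    The kind of open-system balance DECSERVIS-2 simulates can be sketched with a two-member chain in which the parent receives a constant inflow and both members leave the system by a first-order outflow. The rates below are arbitrary, and this is a sketch of the governing equations, not the DECSERVIS-2 algorithm.

```python
def simulate(t_end, dt=0.01, lam1=0.1, lam2=0.05, inflow=1.0, k_out=0.02):
    """Forward-Euler integration of a two-member decay chain in an open
    system:  dN1/dt = inflow - (lam1 + k_out)*N1
             dN2/dt = lam1*N1 - (lam2 + k_out)*N2
    Returns (N1, N2) at time t_end, starting from an empty system."""
    n1 = n2 = 0.0
    for _ in range(int(t_end / dt)):
        dn1 = inflow - (lam1 + k_out) * n1
        dn2 = lam1 * n1 - (lam2 + k_out) * n2
        n1 += dn1 * dt
        n2 += dn2 * dt
    return n1, n2
```

    At steady state the parent approaches inflow/(lam1 + k_out) and the daughter lam1*N1/(lam2 + k_out), which is a useful check on any such simulation.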

  6. Applying Fuzzy and Probabilistic Uncertainty Concepts to the Material Flow Analysis of Palladium in Austria

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2015-01-01

    Material flow analysis (MFA) is a widely applied tool to investigate resource and recycling systems of metals and minerals. Owing to data limitations and restricted system understanding, MFA results are inherently uncertain. To demonstrate the systematic implementation of uncertainty analysis in ...

  7. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model, together with testability analysis output from the TEAMS Designer software, to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation (also provided with the ETA Tool software release package) that were used to generate the reports presented in the manual.

  8. Unsaturated Zone Flow Patterns and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    C. Ahlers

    2001-10-17

    This Analysis/Model Report (AMR) documents the development of an expected-case model for unsaturated zone (UZ) flow and transport that will be described in terms of the representativeness of models of the natural system. The expected-case model will provide an evaluation of the effectiveness of the natural barriers, assess the impact of conservatism in the Total System Performance Assessment (TSPA), and support the development of further models and analyses for public confidence building. The present models used in ''Total System Performance Assessment for the Site Recommendation'' (Civilian Radioactive Waste Management System Management and Operating Contractor (CRWMS M&O) 2000 [1532461]) underestimate the natural-barrier performance because of conservative assumptions and parameters and do not adequately address uncertainty and alternative models. The development of an expected case model for the UZ natural barrier addresses issues regarding flow-pattern analysis and modeling that had previously been treated conservatively. This is in line with the Repository Safety Strategy (RSS) philosophy of treating conservatively those aspects of the UZ flow and transport system that are not important for achieving regulatory dose (CRWMS M&O 2000 [153246], Section 1.1.1). The development of an expected case model for the UZ also provides defense-in-depth in areas requiring further analysis of uncertainty and alternative models. In general, the value of the conservative case is to provide a more easily defensible TSPA for behavior of UZ flow and transport processes at Yucca Mountain. This AMR has been prepared in accordance with the ''Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report'' (Bechtel SAIC Company (BSC) 2001 [155051], Section 1.3 - Work Package 4301213UMG). The work scope is to examine the data and current models of flow and transport in the Yucca Mountain UZ to identify models and analyses

  9. Unsaturated Zone Flow Patterns and Analysis

    International Nuclear Information System (INIS)

    Ahlers, C.

    2001-01-01

    This Analysis/Model Report (AMR) documents the development of an expected-case model for unsaturated zone (UZ) flow and transport that will be described in terms of the representativeness of models of the natural system. The expected-case model will provide an evaluation of the effectiveness of the natural barriers, assess the impact of conservatism in the Total System Performance Assessment (TSPA), and support the development of further models and analyses for public confidence building. The present models used in ''Total System Performance Assessment for the Site Recommendation'' (Civilian Radioactive Waste Management System Management and Operating Contractor (CRWMS M and O) 2000 [1532461]) underestimate the natural-barrier performance because of conservative assumptions and parameters and do not adequately address uncertainty and alternative models. The development of an expected case model for the UZ natural barrier addresses issues regarding flow-pattern analysis and modeling that had previously been treated conservatively. This is in line with the Repository Safety Strategy (RSS) philosophy of treating conservatively those aspects of the UZ flow and transport system that are not important for achieving regulatory dose (CRWMS M and O 2000 [153246], Section 1.1.1). The development of an expected case model for the UZ also provides defense-in-depth in areas requiring further analysis of uncertainty and alternative models. In general, the value of the conservative case is to provide a more easily defensible TSPA for behavior of UZ flow and transport processes at Yucca Mountain. This AMR has been prepared in accordance with the ''Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report'' (Bechtel SAIC Company (BSC) 2001 [155051], Section 1.3 - Work Package 4301213UMG). The work scope is to examine the data and current models of flow and transport in the Yucca Mountain UZ to identify models and analyses where conservatism may be

  10. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.

  11. Load flow analysis using decoupled fuzzy load flow under critical ...

    African Journals Online (AJOL)

    user

    3.1 Maximum range selection of input and output variables: ..... Wong K. P., Li A., and Law M.Y., “ Advanced Constrained Genetic Algorithm Load Flow Method”, IEE Proc. ... Dr. Parimal Acharjee passed B.E.E. from North Bengal University ...

  12. A Lexical Analysis Tool with Ambiguity Support

    OpenAIRE

    Quesada, Luis; Berzal, Fernando; Cortijo, Francisco J.

    2012-01-01

    Lexical ambiguities naturally arise in languages. We present Lamb, a lexical analyzer that produces a lexical analysis graph describing all the possible sequences of tokens that can be found within the input string. Parsers can process such lexical analysis graphs and discard any sequence of tokens that does not produce a valid syntactic sentence, therefore performing, together with Lamb, a context-sensitive lexical analysis in lexically-ambiguous language specifications.
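
    The idea of a lexical analysis graph can be illustrated by enumerating every segmentation of an input over a toy lexicon; each resulting token list is one path through the graph. This re-creation uses an invented lexicon and is not Lamb's actual API.

```python
def tokenizations(text, lexicon):
    """All ways to segment text into tokens drawn from the lexicon;
    each result is one path through the lexical analysis graph."""
    if not text:
        return [[]]
    results = []
    for tok in lexicon:
        if text.startswith(tok):
            for rest in tokenizations(text[len(tok):], lexicon):
                results.append([tok] + rest)
    return results
```

    For example, tokenizations("newint", ["new", "int", "newint"]) yields both ["new", "int"] and ["newint"]; a parser consuming the graph would then discard any path that does not form a valid syntactic sentence.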

  13. Computer program for compressible flow network analysis

    Science.gov (United States)

    Wilton, M. E.; Murtaugh, J. P.

    1973-01-01

    The program solves the problem of an arbitrarily connected one-dimensional compressible flow network with pumping in the channels and momentum balancing at flow junctions. The program includes pressure drop calculations for impingement flow and flow through pin fin arrangements, as currently found in many air-cooled turbine bucket and vane cooling configurations.

  14. Time Analysis: Still an Important Accountability Tool.

    Science.gov (United States)

    Fairchild, Thomas N.; Seeley, Tracey J.

    1994-01-01

    Reviews benefits to school counselors of conducting a time analysis. Describes time analysis system that authors have used, including case illustration of how authors used data to effect counseling program changes. System described followed process outlined by Fairchild: identifying services, devising coding system, keeping records, synthesizing…

  15. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of computer-aided reliability analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a computer-aided reliability analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst can become aware of why and how results are obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved.

  16. Substance flow analysis in Finland - Four case studies on N and P flows

    Energy Technology Data Exchange (ETDEWEB)

    Antikainen, R.

    2007-07-01

    Nitrogen (N) and phosphorus (P) are essential elements for all living organisms. However, in excess, they contribute to such environmental problems as aquatic and terrestrial eutrophication (N, P), acidification (N), global warming (N), groundwater pollution (N), depletion of stratospheric ozone (N), formation of tropospheric ozone (N) and poor urban air quality (N). Globally, human action has multiplied the volume of N and P cycling since the onset of industrialization. The multiplication is a result of intensified agriculture, increased energy consumption and population growth. Industrial ecology (IE) is a discipline in which human interaction with the ecosystems is investigated using a systems analytical approach. The main idea behind IE is that industrial systems resemble ecosystems, and, like them, industrial systems can then be described using material, energy and information flows and stocks. Industrial systems are dependent on the resources provided by the biosphere, and these two cannot be separated from each other. When studying substance flows, the aims of the research from the viewpoint of IE can be, for instance, to elucidate how the cycles of a certain substance could be made more closed and how the flows of a certain substance could be decreased per unit of production (= dematerialization). IE uses analytical research tools such as material and substance flow analysis (MFA, SFA), energy flow analysis (EFA), life cycle assessment (LCA) and material input per service unit (MIPS). In Finland, N and P are studied widely in different ecosystems and environmental emissions. A holistic picture comparing different societal systems is, however, lacking. In this thesis, flows of N and P were examined in Finland using SFA in the following four subsystems: (I) forest industry and use of wood fuels, II) food production and consumption, III) energy, and IV) municipal waste. A detailed analysis at the end of the 1990s was performed. Furthermore, historical

  17. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.

    2004-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  18. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike

    2001-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  19. CyNC - towards a General Tool for Performance Analysis of Complex Distributed Real Time Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens F. Dalsgaard

    2005-01-01

    The paper addresses the current state and the ongoing activities of a tool for performance analysis of complex real time systems. The tool named CyNC is based on network calculus allowing for the computation of backlogs and delays in a system from specified lower and upper bounds of external...... workflow and computational resources. The current version of the tool implements an extension to previous work in that it allows for general workflow and resource bounds and provides optimal solutions even to systems with cyclic dependencies. Despite the virtues of the current tool, improvements...... and extensions still remain, which are in focus of ongoing activities. Improvements include accounting for phase information to improve bounds, whereas the tool awaits extension to include flow control models, which both depend on the possibility of accounting for propagation delay. Since the current version...
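
    The core bounds that network calculus provides can be stated in closed form for a single node: a token-bucket-constrained flow (burst b, sustained rate r) crossing a rate-latency server (rate R, latency T) has worst-case delay and backlog given by the horizontal and vertical deviations of the two curves. This is the standard single-node result, not CyNC's fixed-point computation for cyclic systems.

```python
def delay_bound(b, r, R, T):
    """Worst-case delay: horizontal deviation between the arrival curve
    alpha(t) = b + r*t and the service curve beta(t) = R*max(t - T, 0)."""
    assert r <= R, "stability requires the flow rate not exceed the service rate"
    return T + b / R

def backlog_bound(b, r, R, T):
    """Worst-case backlog: vertical deviation between the same curves."""
    assert r <= R
    return b + r * T
```

    Multi-node systems concatenate service curves with min-plus convolution; handling cyclic dependencies, as the abstract notes, requires solving for a fixed point rather than a single pass.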

  20. Flow status of three transboundary rivers in Northern Greece as a tool for hydro-diplomacy

    Science.gov (United States)

    Hatzigiannakis, Eyaggelos; Hatzispiroglou, Ioannis; Arampatzis, Georgios; Ilia, Andreas; Pantelakis, Dimitrios; Filintas, Agathos; Panagopoulos, Andreas

    2015-04-01

    The aim of this paper is to examine how river flow monitoring can serve as a tool for hydro-diplomacy. Management of transboundary catchments and demand for common water resources are often causes of conflict and tension threatening the peaceful coexistence of nations. The Water Framework Directive 2000/60/EC sets a base for water management, contributing common approaches, goals and principles as well as providing new definitions and measures for Europe's water resources. In northern Greece the main renewable resources are "imported" (over 25% of its water reserves), and for this reason the implementation of continuous flow measurements throughout the year is necessary, even though difficult to achieve. This paper focuses on the three largest transboundary rivers in Northern Greece. The Axios and Strymonas rivers flow across the region of Central Macedonia in Northern Greece: the Axios flows from FYROM to Greece, and the Strymonas from Bulgaria to Greece. The Nestos river flows from Bulgaria to Greece; its Greek part is in the region of Eastern Macedonia and Thrace. Significant productive agricultural areas around these rivers are irrigated from them, so they are very important for local society. Measurements of the river flow velocity and the flow depth have been made at bridges. The frequency of the measurements is roughly monthly, because significant changes in flow depth and discharge are expected. A series of continuous flow measurements was performed during 2013 and 2014 using flowmeters (Valeport and OTT type). The cross-section characteristics, the flow velocity of segments, and the mean flow velocity and discharge of the total profile were measured and calculated, respectively. Measurements are conducted in the framework of the national water resources monitoring network, which is realised in compliance with the Water Framework Directive under the supervision and coordination of the Hellenic Ministry for the
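
    Discharge at such a bridge section is typically computed from the point measurements with the mid-section method: each vertical contributes its velocity times its depth times a width spanning half the gap to each neighbouring vertical. A minimal sketch with made-up station data (the paper does not specify its computation method, so this is the generic velocity-area approach):

```python
def discharge(stations):
    """Mid-section method. stations: (distance_m, depth_m, velocity_m_s)
    tuples ordered across the section. Returns discharge in m^3/s."""
    q = 0.0
    for i, (x, depth, vel) in enumerate(stations):
        left = stations[i - 1][0] if i > 0 else x
        right = stations[i + 1][0] if i < len(stations) - 1 else x
        width = (right - left) / 2.0
        q += vel * depth * width
    return q
```

    For a uniform rectangular section (constant depth and velocity), the result reduces to velocity times depth times the covered width, which gives a quick sanity check on field spreadsheets.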

  1. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were car...

  2. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    Science.gov (United States)

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  3. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Science.gov (United States)

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and easily extensible. A Python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.
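
    The division of labor LOOS aims for — the library handles frame iteration and atom selection while the tool supplies only the per-frame science — can be sketched in miniature. The sketch below is illustrative Python, not the actual LOOS or PyLOOS API; the atom records and the `select`/`analyze` helpers are invented for the example.

```python
# Illustrative sketch (NOT the LOOS API): factoring trajectory iteration and
# atom selection out of an analysis tool, so the tool only supplies the
# per-frame computation.

# Hypothetical atom record: (name, x, y, z). Real LOOS uses C++ classes.
def select(frame, predicate):
    """Selection step factored out of the analysis (cf. LOOS selection language)."""
    return [a for a in frame if predicate(a)]

def centroid(atoms):
    """Example per-frame 'science': the geometric center of the selection."""
    n = len(atoms)
    return tuple(sum(a[i] for a in atoms) / n for i in (1, 2, 3))

def analyze(trajectory, predicate, per_frame):
    """Generic driver: iterate frames, select atoms, apply the computation."""
    return [per_frame(select(frame, predicate)) for frame in trajectory]

# Toy two-frame 'trajectory' with two CA atoms and one water oxygen.
traj = [
    [("CA", 0.0, 0.0, 0.0), ("CA", 2.0, 0.0, 0.0), ("OW", 9.0, 9.0, 9.0)],
    [("CA", 1.0, 1.0, 0.0), ("CA", 3.0, 1.0, 0.0), ("OW", 9.0, 9.0, 9.0)],
]
paths = analyze(traj, lambda a: a[0] == "CA", centroid)
print(paths)  # centroid of the CA selection, one tuple per frame
```

    A new tool then only needs to replace `centroid` with its own per-frame function; the iteration and selection machinery is reused unchanged.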

  4. Nonlinear analysis of river flow time sequences

    Science.gov (United States)

    Porporato, Amilcare; Ridolfi, Luca

    1997-06-01

    Within the field of chaos theory, several methods for the analysis of complex dynamical systems have recently been proposed. In light of these ideas we study the dynamics that control the behavior of river flow over time, investigating the existence of a low-dimensional deterministic component. The present article follows the research undertaken by Porporato and Ridolfi [1996a], in which some clues as to the existence of chaos were collected. Particular emphasis is given here to the problem of noise and to nonlinear prediction. With regard to the latter, the benefits obtainable by means of interpolation of the available time series are reported, and the remarkable predictive results attained with this nonlinear method are shown.
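
    Nonlinear prediction of this kind typically rests on delay embedding: reconstruct state vectors from lagged observations, then forecast by following the closest past state. A minimal one-step "method of analogues" predictor (an illustration of the technique, not the authors' exact implementation):

```python
import math

def embed(x, m):
    """Delay-embed a scalar series into m-dimensional state vectors (lag = 1)."""
    return [tuple(x[i + j] for j in range(m)) for i in range(len(x) - m + 1)]

def predict_next(x, m=3):
    """One-step method-of-analogues forecast: find the past state closest to
    the current one and return that state's observed successor."""
    vecs = embed(x, m)
    target = vecs[-1]
    best_i, best_d = 0, float("inf")
    for i in range(len(vecs) - 1):          # only states with a known successor
        d = math.dist(vecs[i], target)
        if d < best_d:
            best_i, best_d = i, d
    return x[best_i + m]

series = [0.0, 1.0, 2.0] * 5                # toy periodic 'flow' record
print(predict_next(series))                  # → 0.0
```

    For a deterministic low-dimensional system the nearest past analogue evolves like the present state, which is why prediction skill itself serves as evidence of determinism.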

  5. Flow boiling in microgap channels experiment, visualization and analysis

    CERN Document Server

    Alam, Tamanna; Jin, Li-Wen

    2013-01-01

    Flow Boiling in Microgap Channels: Experiment, Visualization and Analysis presents an up-to-date summary of the confined-to-unconfined flow boiling transition criteria, flow boiling heat transfer and pressure drop characteristics, instability characteristics, two-phase flow patterns and flow regime maps, and a parametric study of microgap dimension. Advantages of flow boiling in microgaps over microchannels are also highlighted. The objective of this Brief is to obtain a better fundamental understanding of the flow boiling processes, compare the performance between microgap and c

  6. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. Change in a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports parallel work by groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of artifacts that bridge software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the "big-picture model". Flow graph analysis helped to avoid a large number of defects whose repair cost, in extreme cases, could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the "big-picture model" improves control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.

  7. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game.This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  8. Bayesian data analysis tools for atomic physics

    Science.gov (United States)

    Trassinelli, Martino

    2017-10-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from the basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows probabilities to be assigned to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or absence of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine the spectrum modeling uniquely. For these two studies, we implement the program Nested_fit to calculate the different probability distributions and other related quantities. Nested_fit is a Fortran90/Python code developed during recent years for the analysis of atomic spectra. As indicated by the name, it is based on the nested sampling algorithm, which is presented in detail together with the program itself.
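
    The Bayesian evidence can be made concrete with a toy model-comparison problem in which the evidence integrals have closed forms; this is a hedged illustration of the concept only, not the nested sampling procedure used by Nested_fit. M0 is a fair coin (no free parameter) and M1 a coin with unknown bias under a flat prior.

```python
from math import factorial

# Toy Bayesian model comparison with exact evidences.
def evidence_fair(heads, tails):
    """Z0: likelihood of the data under the parameter-free 'fair coin' model."""
    return 0.5 ** (heads + tails)

def evidence_biased(heads, tails):
    """Z1 = integral_0^1 p^h (1-p)^t dp = h! t! / (h+t+1)!  (flat prior on p)."""
    return factorial(heads) * factorial(tails) / factorial(heads + tails + 1)

h, t = 8, 2
z0, z1 = evidence_fair(h, t), evidence_biased(h, t)
bayes_factor = z1 / z0
posterior_m1 = z1 / (z0 + z1)     # equal prior odds between the two models
print(round(bayes_factor, 3), round(posterior_m1, 3))
```

    With 8 heads in 10 tosses the biased-coin model is favored by a Bayes factor of roughly 2, i.e. only weak evidence: the evidence integral automatically penalizes M1 for its extra parameter.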

  9. ResStock Analysis Tool | Buildings | NREL

    Science.gov (United States)

    ResStock is an analysis tool for estimating energy and cost savings for U.S. homes. It supports large-scale residential energy analysis by combining large public and private data sources; analysis with ResStock uncovered $49 billion in potential annual utility bill savings through cost-effective energy efficiency. Contact Eric Wilson to learn how ResStock can benefit your approach.

  10. LEAP2000: tools for sustainable energy analysis

    Energy Technology Data Exchange (ETDEWEB)

    Heaps, C.; Lazarus, M.; Raskin, P. [SEU-Boston, Boston, MA (USA)

    2000-09-01

    LEAP2000 is a collaborative initiative, led by the Boston Center for the Stockholm Environment Institute, to create a new suite of analytical software and databases for integrated energy-environment analysis. The LEAP2000 software and the Technology and Environmental Database (TED) are described. 5 refs., 5 figs.

  11. Spreadsheet as a tool of engineering analysis

    International Nuclear Information System (INIS)

    Becker, M.

    1985-01-01

    In engineering analysis, problems tend to be categorized into those that can be done by hand and those that require a computer for solution. The advent of personal computers, and in particular of spreadsheet software, blurs this distinction, creating an intermediate category of problems appropriate for interactive personal computing.

  12. A tool to estimate bar patterns and flow conditions in estuaries when limited data is available

    Science.gov (United States)

    Leuven, J.; Verhoeve, S.; Bruijns, A. J.; Selakovic, S.; van Dijk, W. M.; Kleinhans, M. G.

    2017-12-01

    The effects of human interventions, natural evolution of estuaries, and sea-level rise on food security and flood safety are largely unknown. In addition, ecologists require quantified habitat area to study the future evolution of estuaries, but they lack predictive capability for bathymetry and hydrodynamics. For example, crucial inputs required for ecological models are values of intertidal area, inundation time, peak flow velocities and salinity. While numerical models can reproduce these spatial patterns, their computational times are long and a new model must be developed for each case. Therefore, we developed a comprehensive set of relations that accurately predict the hydrodynamics and the patterns of channels and bars, using a combination of empirical relations derived from approximately 50 estuaries and theory for bars and estuaries. The first step is to predict the local tidal prism, i.e. the tidal volume that flows through a given cross-section. Second, the channel geometry is predicted from the tidal prism using hydraulic geometry relations. Subsequently, typical flow velocities can be estimated from the channel geometry and tidal prism. Then, an ideal estuary shape is fitted to the measured planform: the deviation from the ideal shape, defined as the excess width, gives a measure of the locations where tidal bars form and their summed width (Leuven et al., 2017). From excess width, typical hypsometries can be predicted per cross-section. In the last step, flow velocities are calculated for the full range of occurring depths, and salinity is calculated based on the estuary shape. Here, we present a prototype tool that predicts equilibrium bar patterns and typical flow conditions. The tool is easy to use because the only input required is the estuary outline and tidal amplitude. It can therefore be used by policy makers and researchers from multiple disciplines, such as ecologists, geologists and hydrologists, for example for paleogeographic
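
    The first steps of such a predictor chain can be sketched as follows. The coefficients and the crude prism estimate below are illustrative placeholders, not the relations fitted by the authors; the area relation uses the commonly quoted metric form of the O'Brien-type hydraulic geometry law as a stand-in.

```python
import math

# Hedged sketch of a tidal-prism -> geometry -> velocity predictor chain.
# All constants are illustrative assumptions, not fitted values.
def tidal_prism(width, length, tidal_range):
    """Crude local prism: surface area upstream of the section times the range."""
    return width * length * tidal_range            # m^3

def channel_area(prism, coeff=4.69e-4, exponent=0.85):
    """O'Brien-type hydraulic geometry relation A = c * P^n (placeholder c, n)."""
    return coeff * prism ** exponent               # m^2

def peak_velocity(prism, area, period=44700.0):
    """Sinusoidal tide of period T: peak discharge pi*P/T through the section."""
    return math.pi * prism / (period * area)       # m/s

P = tidal_prism(width=500.0, length=20000.0, tidal_range=2.0)   # 2e7 m^3
A = channel_area(P)
print(round(peak_velocity(P, A), 2))               # order-1 m/s, as in estuaries
```

    The appeal of such chained empirical relations, as the abstract notes, is that an outline and a tidal amplitude suffice as input, with no numerical model run.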

  13. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of its Matlab (R)-based implementation is presented, and special features motivated by industrial users are introduced. Salient features of the tool are presented, including the ability to specify the behavior of a complex system at a high level of functional abstraction, analyze single and multiple fault scenarios, and automatically generate parity relations for diagnosis of the system in normal and impaired conditions. User interface and algorithmic details are presented.

  14. Generalized Aliasing as a Basis for Program Analysis Tools

    National Research Council Canada - National Science Library

    O'Callahan, Robert

    2000-01-01

    .... This dissertation describes the design of a system, Ajax, that addresses this problem by using semantics-based program analysis as the basis for a number of different tools to aid Java programmers...

  15. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Logical Framework Analysis (LFA): An Essential Tool for Designing Agricultural Project ... overview of the process and the structure of the Logical Framework Matrix or Logframe, derivable from it, ..... System Approach to Managing The Project.

  16. Surface Operations Data Analysis and Adaptation Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort undertook the creation of a Surface Operations Data Analysis and Adaptation (SODAA) tool to store data relevant to airport surface research and...

  17. Mean streamline analysis for performance prediction of cross-flow fans

    International Nuclear Information System (INIS)

    Kim, Jae Won; Oh, Hyoung Woo

    2004-01-01

    This paper presents the mean streamline analysis using the empirical loss correlations for performance prediction of cross-flow fans. Comparison of overall performance predictions with test data of a cross-flow fan system with a simplified vortex wall scroll casing and with the published experimental characteristics for a cross-flow fan has been carried out to demonstrate the accuracy of the proposed method. Predicted performance curves by the present mean streamline analysis agree well with experimental data for two different cross-flow fans over the normal operating conditions. The prediction method presented herein can be used efficiently as a tool for the preliminary design and performance analysis of general-purpose cross-flow fans

  18. Technical discussions II - Flow cytometric analysis

    NARCIS (Netherlands)

    Cunningham, A; Cid, A; Buma, AGJ

    In this paper the potential of flow cytometry as applied to the aquatic life sciences is discussed. The use of flow cytometry for studying the ecotoxicology of phytoplankton is introduced. In addition, the new flow cytometer EUROPA is presented: a multilaser machine which has been

  19. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  20. SBOAT: A Stochastic BPMN Analysis and Optimisation Tool

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    In this paper we present a description of a tool development framework, called SBOAT, for the quantitative analysis of graph-based process modelling languages based upon the Business Process Modelling and Notation (BPMN) language, extended with intention-preserving stochastic branching and parame...

  1. Effect of pin tool design on the material flow of dissimilar AA7075-AA6061 friction stir welds

    Science.gov (United States)

    Hasan, Mohammed M.; Ishak, M.; Rejab, M. R. M.

    2017-10-01

    Tool design is the most influential aspect of friction stir welding (FSW) technology. The influence of pin tool geometry on material flow patterns is studied in this work during FSW of dissimilar AA7075 and AA6061 aluminium alloys. Three truncated pin tool profiles (threaded, threaded with a single flat, and unthreaded with a single flat) were used to prepare the weldments. The workpieces were joined using a custom-made clamping system at a spindle speed of 1100 rpm, a traverse rate of 300 mm/min, and a tilt angle of 3°. The metallographic analysis showed that defect-free welds can be produced with all three pin tools, with significant changes in the structure of the mixed stir zone. The results show that introducing a flat on the cone of the probe deviates the onion-ring pattern without changing the chemical composition of the created layers, which in turn improves the hardness distribution and tensile strength of the welded joint. It was also noted that both the heat affected zone (HAZ) and the thermo-mechanically affected zone (TMAZ) are similar in composition to their corresponding base materials (BM).

  2. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, were presented in the first part of the paper. In this part, they are applied to testing modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools were first used to identify the parts of the model that can actually be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving model behaviour was finally obtained by optimisation techniques. This example of application shows how analysis of the model parameter space is a powerful tool for empirical validation. In particular, diagnosis possibilities are greatly increased in comparison with residual analysis techniques. (author)

  3. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  4. SWToolbox: A surface-water tool-box for statistical analysis of streamflow time series

    Science.gov (United States)

    Kiang, Julie E.; Flynn, Kate; Zhai, Tong; Hummel, Paul; Granato, Gregory

    2018-03-07

    This report is a user guide for the low-flow analysis methods provided with version 1.0 of the Surface Water Toolbox (SWToolbox) computer program. The software combines functionality from two software programs—U.S. Geological Survey (USGS) SWSTAT and U.S. Environmental Protection Agency (EPA) DFLOW. Both of these programs have been used primarily for computation of critical low-flow statistics. The main analysis methods are the computation of hydrologic frequency statistics such as the 7-day minimum flow that occurs on average only once every 10 years (7Q10), computation of design flows including biologically based flows, and computation of flow-duration curves and duration hydrographs. Other annual, monthly, and seasonal statistics can also be computed. The interface facilitates retrieval of streamflow discharge data from the USGS National Water Information System and outputs text reports for a record of the analysis. Tools for graphing data and screening tests are available to assist the analyst in conducting the analysis.
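
    The core 7Q10 computation — annual minima of the 7-day moving-average flow, then a 10-year low-flow quantile from a fitted distribution — can be sketched as below. The log-normal fit is one common choice, used here for brevity; SWToolbox supports other frequency distributions and estimation methods.

```python
import math
from statistics import NormalDist, mean, stdev

def seven_day_min(daily):
    """Annual minimum of the 7-day moving-average flow for one year of data."""
    return min(sum(daily[i:i + 7]) / 7 for i in range(len(daily) - 6))

def q7_10(annual_minima):
    """7Q10 from a log-normal fit to annual 7-day minima: the flow with a 10%
    chance of not being exceeded in any year (10-year recurrence interval)."""
    logs = [math.log(v) for v in annual_minima]
    return math.exp(mean(logs) + NormalDist().inv_cdf(0.1) * stdev(logs))

# Toy record of annual 7-day minimum flows, m^3/s.
minima = [12.0, 9.0, 15.0, 7.0, 11.0, 10.0, 8.0, 14.0, 13.0, 6.0]
low_flow = q7_10(minima)
print(round(low_flow, 2))
```

    The result sits below most of the observed annual minima, as expected for a 10-year low-flow statistic.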

  5. MINIMUM QUANTITY LUBRICANT FLOW ANALYSIS IN END MILLING PROCESSES: A COMPUTATIONAL FLUID DYNAMICS APPROACH

    Directory of Open Access Journals (Sweden)

    M. S. Najiha

    2012-12-01

    This paper presents a two-dimensional, steady-state, incompressible analysis of minimum quantity lubrication (MQL) flow in milling operations using a computational fluid dynamics (CFD) approach. The analysis of flow and heat transfer in a four-tooth milling cutter operation was undertaken. The domain of the rotating cutter along with the spray nozzle is defined; operating cutting and boundary conditions are taken from the literature. A steady-state, pressure-based, planar analysis was performed with a viscous, realizable k-ε model. A mixture of oil and air was sprayed on the tool, which is considered to be rotating and at a temperature near the melting temperature of the workpiece. Flow fields are obtained from the study. The vector plot of the flow field shows that the flow is unevenly distributed over the cutter surface, as well as in the direction of cutter rotation. It can be seen that the cutting fluid does not completely penetrate the tool edges. The turbulence created by the cutter rotation in the proximity of the tool throws oil drops out of the cutting zone. The nozzle position in relation to the feed direction is therefore very important in order to obtain the optimum effect of the MQL flow.

  6. Advanced tools for in vivo skin analysis.

    Science.gov (United States)

    Cal, Krzysztof; Zakowiecki, Daniel; Stefanowska, Justyna

    2010-05-01

    A thorough examination of the skin is essential for accurate disease diagnostics, evaluation of the effectiveness of topically applied drugs and the assessment of the results of dermatologic surgeries such as skin grafts. Knowledge of skin parameters is also important in the cosmetics industry, where the effects of skin care products are evaluated. Due to significant progress in the electronics and computer industries, sophisticated analytic devices are increasingly available for day-to-day diagnostics. The aim of this article is to review several advanced methods for in vivo skin analysis in humans: magnetic resonance imaging, electron paramagnetic resonance, laser Doppler flowmetry and time domain reflectometry. The molecular bases of these techniques are presented, and several interesting applications in the field are discussed. Methods for in vivo assessment of the biomechanical properties of human skin are also reviewed.

  7. NMR spectroscopy: a tool for conformational analysis

    International Nuclear Information System (INIS)

    Tormena, Claudio F.; Cormanich, Rodrigo A.; Rittner, Roberto; Freitas, Matheus P.

    2011-01-01

    The present review deals with the application of NMR data to the conformational analysis of simple organic compounds, together with other experimental methods such as infrared spectroscopy and with theoretical calculations. Each sub-section describes the results for a group of compounds belonging to a given organic function, such as ketones, esters, etc. Studies of a single compound, even of special relevance, were excluded, since the main goal of this review is to compare the results for a given function, where different substituents were used or small structural changes were introduced in the substrate, in an attempt to disclose their effects on the conformational equilibrium. Moreover, the huge amount of data available in the literature in this research field imposed some limitations, which are detailed in the Introduction; it can be noted in advance that these limitations mostly concern the period in which the results were published. (author)

  8. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks such as refactoring or code navigation have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language is an obstacle to their development. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor. In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how they can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete programs, and checking of type

  9. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient indepth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  10. OPR1000 RCP Flow Coastdown Analysis using SPACE Code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong-Hyuk; Kim, Seyun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    The Korean nuclear industry developed a thermal-hydraulic analysis code for the safety analysis of PWRs, named SPACE (Safety and Performance Analysis Code for Nuclear Power Plant). Current loss-of-flow transient analysis of OPR1000 uses the COAST code to calculate the transient RCS (Reactor Coolant System) flow. The COAST code calculates RCS loop flow using pump performance curves and RCP (Reactor Coolant Pump) inertia. In this paper, the SPACE code is used to reproduce the RCS flow rates calculated by the COAST code. A loss-of-flow transient is a transient initiated by reduction of forced reactor coolant circulation; typical examples are complete loss of flow (CLOF) and locked rotor (LR). OPR1000 RCP flow coastdown analysis was performed with SPACE using a simplified nodalization, and complete loss of flow (4-RCP trip) was analyzed. The results show good agreement with those from the COAST code, the CE code for calculating RCS flow during loss-of-flow transients. Through this study, we confirmed that the SPACE code can be used instead of the COAST code for RCP flow coastdown analysis.
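
    The physics behind a coastdown calculation of this type can be illustrated with a single pump whose hydraulic torque scales with speed squared, I dω/dt = -k ω². The constants below are arbitrary, and loop flow is crudely taken proportional to pump speed; this is a sketch of the idea, not the COAST or SPACE plant models.

```python
# Hedged single-pump coastdown sketch: explicit Euler integration of
# I * domega/dt = -k * omega^2, with flow fraction = omega / omega0.
def coastdown(omega0, inertia, k, dt, t_end):
    """Return a list of (time, flow fraction) samples."""
    omega, t, hist = omega0, 0.0, [(0.0, 1.0)]
    while t < t_end:
        omega += -k * omega**2 / inertia * dt
        t += dt
        hist.append((round(t, 6), omega / omega0))
    return hist

# With these constants the analytic solution is omega(t) = omega0 / (1 + 0.2 t),
# so the flow fraction at t = 5 s should be close to 0.5.
hist = coastdown(omega0=120.0, inertia=3000.0, k=5.0, dt=0.001, t_end=5.0)
print(round(hist[-1][1], 3))
```

    The quadratic-torque assumption gives the characteristic hyperbolic coastdown shape; real analyses replace it with measured pump performance curves.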

  11. Power flow analysis for DC voltage droop controlled DC microgrids

    DEFF Research Database (Denmark)

    Li, Chendan; Chaudhary, Sanjay; Dragicevic, Tomislav

    2014-01-01

    This paper proposes a new algorithm for power flow analysis in droop-controlled DC microgrids. By considering the droop control in the power flow analysis for the DC microgrid, more accurate analysis results can be obtained than with traditional methods. The algorithm verification is ca...
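
    The idea of including droop in the power flow can be sketched on a toy network (this is an illustration of the concept, not the paper's algorithm). Each droop source presents V_i = V_ref - R_d I_i, so with line resistance R_l its injection into a common load bus is I = (V_ref - V_load)/(R_d + R_l); a constant-power load then fixes V_load through a scalar balance equation, solved here by bisection.

```python
# Hedged sketch: two droop-controlled DC sources feeding one constant-power load.
def solve_vload(v_ref, branches, p_load, lo, hi=None):
    """Bisect V_load * sum_i (v_ref - V_load)/(Rd_i + Rl_i) = p_load on [lo, hi],
    converging to the upper (stable) root."""
    hi = hi if hi is not None else v_ref
    g = sum(1.0 / (rd + rl) for rd, rl in branches)
    f = lambda v: v * (v_ref - v) * g - p_load   # power balance residual
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

branches = [(0.5, 0.1), (1.0, 0.1)]      # (droop gain, line resistance), ohms
v = solve_vload(400.0, branches, p_load=2000.0, lo=200.0)
currents = [(400.0 - v) / (rd + rl) for rd, rl in branches]
print(round(v, 2), [round(i, 2) for i in currents])
```

    Note how the droop gains set the load sharing: the source with the smaller total resistance supplies proportionally more current, an effect a fixed-voltage power flow would miss.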

  12. Abnormal traffic flow data detection based on wavelet analysis

    Directory of Open Access Journals (Sweden)

    Xiao Qian

    2016-01-01

    Because traffic flow data are non-stationary, abnormal data are difficult to detect. This paper proposes abnormal traffic flow data detection based on wavelet analysis and the least squares method. First, wavelet analysis is used to separate the traffic flow data into high-frequency and low-frequency components; then, the least squares method is combined with this decomposition to find abnormal points in the reconstructed signal. Simulation results show that abnormal traffic flow data detection using wavelet analysis effectively reduces the false positive and false negative rates of the detection results.
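
    The scheme — low-pass the series with a wavelet-style decomposition, then use least squares to locate outliers — can be sketched with a one-level Haar-like averaging step standing in for the full wavelet decomposition. This is an illustration of the approach, not the authors' algorithm.

```python
def haar_smooth(x):
    """One-level Haar-style low-frequency component: pairwise averages,
    upsampled back to the original length (assumes even-length input)."""
    approx = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    out = []
    for a in approx:
        out += [a, a]
    return out

def fit_line(y):
    """Ordinary least squares fit y = a + b*i; returns the fitted values."""
    n = len(y)
    xbar, ybar = (n - 1) / 2, sum(y) / n
    b = (sum((i - xbar) * (v - ybar) for i, v in enumerate(y))
         / sum((i - xbar) ** 2 for i in range(n)))
    a = ybar - b * xbar
    return [a + b * i for i in range(n)]

def anomalies(x, k=3.0):
    """Flag points deviating more than k sigma from the least-squares trend
    of the smoothed (low-frequency) signal."""
    trend = fit_line(haar_smooth(x))
    resid = [v - t for v, t in zip(x, trend)]
    sigma = (sum(r * r for r in resid) / len(resid)) ** 0.5
    return [i for i, r in enumerate(resid) if abs(r) > k * sigma]

flow = [10 + 2 * i for i in range(16)]   # smoothly rising traffic count
flow[8] += 60                            # injected sensor fault
print(anomalies(flow))                   # → [8]
```

    Smoothing first keeps the trend estimate from being dominated by high-frequency fluctuation, which is the role the wavelet decomposition plays in the paper.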

  13. Multivariate analysis of flow cytometric data using decision trees.

    Science.gov (United States)

    Simon, Svenja; Guthke, Reinhard; Kamradt, Thomas; Frey, Oliver

    2012-01-01

    Characterization of the response of the host immune system is important in understanding the bidirectional interactions between the host and microbial pathogens. For research on the host side, flow cytometry has become one of the major tools in immunology. Advances in technology and reagents now allow the simultaneous assessment of multiple markers at the single-cell level, generating multidimensional data sets that require multivariate statistical analysis. We explored the explanatory power of the supervised machine learning method called "induction of decision trees" on flow cytometric data. To examine whether the production of a certain cytokine depends on other cytokines, datasets from intracellular staining for six cytokines with complex patterns of co-expression were analyzed by induction of decision trees. After weighting the data according to their class probabilities, we created a total of 13,392 different decision trees for each given cytokine with different parameter settings. For a more realistic estimation of the decision trees' quality, we used stratified fivefold cross-validation and chose the "best" tree according to a combination of different quality criteria. While some of the decision trees reflected previously known co-expression patterns, we found that the expression of some cytokines depended not only on the co-expression of others per se, but also on the intensity of that expression. Thus, for the first time, we successfully used induction of decision trees for the analysis of high-dimensional flow cytometric data and demonstrated the feasibility of this method to reveal structural patterns in such data sets.

  14. Numerical Tools for Multicomponent, Multiphase, Reactive Processes: Flow of CO{sub 2} in Porous Medium

    Energy Technology Data Exchange (ETDEWEB)

    Khattri, Sanjay Kumar

    2006-07-01

    The thesis is concerned with numerically simulating multicomponent, multiphase, reactive transport in heterogeneous porous media. Such processes are ubiquitous; examples include deposition of greenhouse gases, flow of hydrocarbons, and groundwater remediation. Understanding such processes is important from a social and economic point of view. For the success of geological sequestration, an accurate estimation of migration patterns of greenhouse gases is essential. Due to ever-increasing computer power, computational mathematics has become an important tool for predicting the dynamics of porous media fluids. Numerical and mathematical modelling of processes in a domain requires grid generation in the domain, discretization of the continuum equations on the generated grid, solution of the resulting linear or nonlinear system of discrete equations, and finally visualization of the results. The thesis is composed of three chapters and eight papers. Chapter 2 presents two techniques for generating structured quadrilateral and hexahedral meshes, called the algebraic and elliptic methods. Algebraic techniques are by far the simplest and most computationally efficient methods for grid generation; transfinite interpolation operators are one kind of algebraic grid-generation technique. In this chapter, many transfinite interpolation operators for grid generation are derived from 1D projection operators. Some important properties of hexahedral elements are also discussed; these properties are useful in the discretization of partial differential equations on hexahedral meshes, in improving mesh quality, and in mesh generation and visualization. Chapter 3 is about CO{sub 2} flow in porous media. In this chapter, we present the mathematical models and their discretization for capturing the major physical processes associated with CO{sub 2} deposition in geological formations. Some important simulations of practical applications in 2D and 3D are presented
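
    Transfinite (algebraic) interpolation of the kind derived in Chapter 2 can be illustrated in 2D with the classic Coons patch, which blends four boundary curves into an interior grid; the boundary curves below are arbitrary examples chosen for the sketch.

```python
import math

# 2D Coons-patch transfinite interpolation:
#   P(u,v) = (1-v)B(u) + v T(u) + (1-u)L(v) + u R(v) - bilinear corner blend,
# where B, T, L, R are the bottom, top, left and right boundary curves.
def coons(u, v, B, T, L, R):
    c00, c10, c01, c11 = B(0), B(1), T(0), T(1)      # the four corners
    return tuple(
        (1 - v) * B(u)[k] + v * T(u)[k] + (1 - u) * L(v)[k] + u * R(v)[k]
        - ((1 - u) * (1 - v) * c00[k] + u * (1 - v) * c10[k]
           + (1 - u) * v * c01[k] + u * v * c11[k])
        for k in (0, 1))

# Unit square with a bulged bottom edge (arbitrary example boundaries).
B = lambda u: (u, 0.2 * math.sin(math.pi * u))
T = lambda u: (u, 1.0)
L = lambda v: (0.0, v)
R = lambda v: (1.0, v)

grid = [[coons(i / 4, j / 4, B, T, L, R) for i in range(5)] for j in range(5)]
print(grid[0][2])   # boundary node reproduces the curved bottom, near (0.5, 0.2)
```

    By construction the patch reproduces all four boundary curves exactly, which is the defining property that makes transfinite interpolation attractive for boundary-fitted grid generation.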

  15. Numerical Tools for Multicomponent, Multiphase, Reactive Processes: Flow of CO{sub 2} in Porous Medium

    Energy Technology Data Exchange (ETDEWEB)

    Khattri, Sanjay Kumar

    2006-07-01

    The thesis is concerned with numerically simulating multicomponent, multiphase, reactive transport in heterogeneous porous media. Such processes are ubiquitous; examples include deposition of greenhouse gases, flow of hydrocarbons and groundwater remediation. Understanding such processes is important from a social and economic point of view. For the success of geological sequestration, an accurate estimation of the migration patterns of greenhouse gases is essential. Due to ever-increasing computer power, computational mathematics has become an important tool for predicting the dynamics of porous-media fluids. Numerical and mathematical modelling of processes in a domain requires grid generation in the domain, discretization of the continuum equations on the generated grid, solution of the resulting linear or nonlinear system of discrete equations and, finally, visualization of the results. The thesis is composed of three chapters and eight papers. Chapter 2 presents two techniques for generating structured quadrilateral and hexahedral meshes, called algebraic and elliptic methods. Algebraic techniques are by far the simplest and most computationally efficient methods for grid generation, and transfinite interpolation operators are a kind of algebraic grid generation technique. In this chapter, many transfinite interpolation operators for grid generation are derived from 1D projection operators. Some important properties of hexahedral elements are also mentioned; these properties are useful in discretizing partial differential equations on hexahedral meshes, improving the quality of a hexahedral mesh, mesh generation and visualization. Chapter 3 is about CO{sub 2} flow in porous media. In this chapter, we present the mathematical models and their discretization for capturing the major physical processes associated with CO{sub 2} deposition in geological formations. Some important simulations of practical applications in 2D and 3D are presented
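
    The kind of transfinite interpolation the abstract mentions can be sketched in a few lines: a 2D structured grid built as the Boolean sum of two 1D linear projection operators applied to four boundary curves (a Coons patch). The boundary curves below are illustrative assumptions, not geometry from the thesis.

    ```python
    # Minimal transfinite interpolation (Coons patch) sketch: the interior
    # grid is the Boolean sum Px + Py - PxPy of two 1D linear projectors
    # acting on four boundary curves of a (here hypothetical) domain.

    def transfinite_grid(bottom, top, left, right, nx, ny):
        """Return an (nx+1) x (ny+1) structured grid of (x, y) points.

        bottom/top/left/right map a parameter in [0, 1] to an (x, y)
        boundary point; the four curves must share corner points.
        """
        grid = []
        for i in range(nx + 1):
            xi = i / nx
            row = []
            for j in range(ny + 1):
                eta = j / ny
                point = []
                for k in range(2):  # interpolate x and y components alike
                    px = (1 - eta) * bottom(xi)[k] + eta * top(xi)[k]
                    py = (1 - xi) * left(eta)[k] + xi * right(eta)[k]
                    pxy = ((1 - xi) * (1 - eta) * bottom(0)[k]
                           + xi * (1 - eta) * bottom(1)[k]
                           + (1 - xi) * eta * top(0)[k]
                           + xi * eta * top(1)[k])
                    point.append(px + py - pxy)  # Boolean sum of projectors
                row.append(tuple(point))
            grid.append(row)
        return grid

    # Usage: a unit square whose top boundary bulges parabolically upward.
    grid = transfinite_grid(
        bottom=lambda s: (s, 0.0),
        top=lambda s: (s, 1.0 + 0.2 * s * (1 - s)),
        left=lambda t: (0.0, t),
        right=lambda t: (1.0, t),
        nx=4, ny=4,
    )
    ```

    The interior points inherit the curvature of the top boundary while the three straight edges are reproduced exactly, which is the property that makes the algebraic method so cheap.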

  16. Power flow as a complement to statistical energy analysis and finite element analysis

    Science.gov (United States)

    Cuschieri, J. M.

    1987-01-01

    Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly, while SEA methods can only predict an average response level. In this mid-frequency range a possible alternative is to use power flow techniques, in which the input and flow of vibrational energy to excited and coupled structural components are expressed in terms of input and transfer mobilities. The power flow technique can be extended from low to high frequencies, and it can be integrated with established FE models at low frequencies and SEA models at high frequencies to verify the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.
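
    The mobility bookkeeping behind a power flow analysis reduces to simple complex arithmetic: the time-averaged power injected by a harmonic point force F into a structure with complex input mobility Y (velocity per unit force) is P = 0.5 |F|² Re(Y). The numeric mobility values below are hypothetical placeholders, not data from the paper.

    ```python
    # Sketch of power-flow quantities from mobilities.  Input mobility Y
    # relates velocity to force at the drive point; its real part carries
    # the net injected power.  A transfer mobility gives the velocity
    # response of a coupled component.

    def input_power(force, mobility):
        """Time-averaged power (W) injected by a harmonic point force."""
        return 0.5 * abs(force) ** 2 * mobility.real

    def transmitted_velocity(force, transfer_mobility):
        """Complex velocity at a coupled component via a transfer mobility."""
        return transfer_mobility * force

    # Usage with assumed values: a 1 N force, input mobility (2 + 1j) mm/(N*s).
    P_in = input_power(1.0, 2e-3 + 1e-3j)          # 0.001 W injected
    v2 = transmitted_velocity(1.0, 5e-4 - 2e-4j)   # velocity on second beam
    ```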

  17. Boolean logic analysis for flow regime recognition of gas–liquid horizontal flow

    International Nuclear Information System (INIS)

    Ramskill, Nicholas P; Wang, Mi

    2011-01-01

    In order to develop a flowmeter for the accurate measurement of multiphase flows, it is of the utmost importance to correctly identify the flow regime present, to enable selection of the optimal metering method. In this study, the horizontal flow of air and water in a pipeline was studied under a multitude of conditions using electrical resistance tomography, but the flow regimes presented in this paper have been limited to plug and bubble air–water flows. This study proposes a novel method for recognition of the prevalent flow regime using only a fraction of the data, thus rendering the analysis more efficient. By considering the average conductivity of five zones along the central axis of the tomogram, key features can be identified, enabling recognition of the prevalent flow regime. Boolean logic and frequency spectrum analysis have been applied for flow regime recognition. Visualization of the flow using the reconstructed images provides a qualitative comparison between different flow regimes. Application of the Boolean logic scheme enables a quantitative comparison of the flow patterns, thus reducing the subjectivity in the identification of the prevalent flow regime
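
    The zone-averaged Boolean idea can be sketched as follows: five zone-average conductivities along the tomogram's central axis are thresholded into booleans ("gas-rich"), and a simple rule separates plug from bubble flow. The threshold value and the decision rule here are illustrative assumptions, not the paper's calibrated scheme.

    ```python
    # Hypothetical Boolean classifier over five zone-average conductivities.
    # Gas is non-conducting, so a zone is flagged gas-rich when its
    # conductivity falls well below that of water.

    def classify_regime(zone_conductivities, water_conductivity, threshold=0.7):
        """Label a tomogram frame 'plug' or 'bubble' (illustrative rule)."""
        gas_rich = [c < threshold * water_conductivity
                    for c in zone_conductivities]
        # Assumed rule: a plug depresses conductivity in several zones at
        # once, while dispersed bubbles affect at most one zone.
        return "plug" if sum(gas_rich) >= 2 else "bubble"

    # Usage: two upper zones dominated by air -> plug flow.
    label = classify_regime([0.2, 0.3, 0.9, 1.0, 1.0], water_conductivity=1.0)
    print(label)  # -> plug
    ```

    A real scheme would combine such zone booleans with the frequency-spectrum features the abstract mentions; this sketch only shows the logic-table step.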

  18. SECIMTools: a suite of metabolomics data analysis tools.

    Science.gov (United States)

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open-access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.

  19. Analysis of the three dimensional flow in a turbine scroll

    Science.gov (United States)

    Hamed, A.; Baskharone, E.

    1979-01-01

    The present analysis describes the three-dimensional compressible inviscid flow in the scroll and the vaneless nozzle of a radial inflow turbine. The solution to this flow field, which is further complicated by the geometrical shape of the boundaries, is obtained using the finite element method. Symmetric and nonsymmetric scroll cross sectional geometries are investigated to determine their effect on the general flow field and on the exit flow conditions.

  20. Network Analysis Tools: from biological networks to clusters and pathways.

    Science.gov (United States)

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques

    2008-01-01

    Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h.
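
    One of the NeAT tasks listed above, path finding in an interaction network, reduces to breadth-first search on an undirected graph. The toy protein-interaction edges below are invented for the example and carry no biological meaning.

    ```python
    # Shortest-path finding on a small undirected interaction graph via
    # breadth-first search, illustrating the path-finding step of a
    # network-analysis workflow.
    from collections import deque

    def shortest_path(edges, source, target):
        """Return one shortest path from source to target, or None."""
        adjacency = {}
        for a, b in edges:
            adjacency.setdefault(a, set()).add(b)
            adjacency.setdefault(b, set()).add(a)
        parent = {source: None}
        queue = deque([source])
        while queue:
            node = queue.popleft()
            if node == target:            # reconstruct path by backtracking
                path = []
                while node is not None:
                    path.append(node)
                    node = parent[node]
                return path[::-1]
            for neighbour in adjacency.get(node, ()):
                if neighbour not in parent:
                    parent[neighbour] = node
                    queue.append(neighbour)
        return None

    # Usage: hypothetical interactions between five proteins.
    edges = [("A", "B"), ("B", "C"), ("A", "D"), ("D", "C"), ("C", "E")]
    print(shortest_path(edges, "A", "E"))  # a 4-node path, e.g. A-B-C-E
    ```

    NeAT's k-shortest-paths tool generalizes this by enumerating alternative routes and weighting nodes by degree; the sketch shows only the unweighted core.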

  1. Tool for efficient intermodulation analysis using conventional HB packages

    OpenAIRE

    Vannini, G.; Filicori, F.; Traverso, P.

    1999-01-01

    A simple and efficient approach is proposed for the intermodulation analysis of nonlinear microwave circuits. The algorithm, which is based on a very mild assumption about the frequency response of the linear part of the circuit, allows for a reduction in computing time and memory requirements. Moreover, it can be easily implemented using any conventional tool for harmonic-balance circuit analysis.

  2. Gender analysis of use of participatory tools among extension workers

    African Journals Online (AJOL)

    (χ² = 0.833, p = 0.361; t = 0.737, p = 0.737, CC = 0.396) Participatory tools used by both male and female extension personnel include the resource map, mobility map, transect map, focus group discussion, Venn diagram, seasonal calendar, SWOT analysis, semi-structured interview, daily activity schedule, resource analysis, ...

  3. ANALYSIS OF TRANSONIC FLOW PAST CUSPED AIRFOILS

    Directory of Open Access Journals (Sweden)

    Jiří Stodůlka

    2015-06-01

    Full Text Available Transonic flow past two cusped airfoils is numerically solved, and the results are analyzed in terms of flow behavior and oblique shock formation. Regions around the sharp trailing edges are studied in detail, and the parameters of the shock waves are computed and compared using the classical shock polar approach and verified by reduction parameters for symmetric configurations.
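
    The shock-polar bookkeeping rests on the classical theta-beta-M relation for an oblique shock: given the upstream Mach number M and the wave angle beta, the flow deflection theta follows in closed form. The sketch below evaluates that textbook relation; the numeric case (M = 2, beta = 40°) is an illustrative example, not a result from the paper.

    ```python
    import math

    # Classical oblique-shock theta-beta-M relation:
    #   tan(theta) = 2 cot(beta) (M^2 sin^2 beta - 1)
    #                / (M^2 (gamma + cos 2 beta) + 2)

    def deflection_angle(M, beta, gamma=1.4):
        """Flow deflection theta (radians) behind an oblique shock of wave
        angle beta (radians) at upstream Mach number M."""
        numerator = 2.0 / math.tan(beta) * (M ** 2 * math.sin(beta) ** 2 - 1.0)
        denominator = M ** 2 * (gamma + math.cos(2.0 * beta)) + 2.0
        return math.atan(numerator / denominator)

    # Usage: M = 2 and a 40-degree wave angle give roughly a 10.6-degree
    # deflection; at the Mach angle the deflection vanishes (a Mach wave).
    theta = math.degrees(deflection_angle(2.0, math.radians(40.0)))
    ```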

  4. Development of Visualization Tools for ZPPR-15 Analysis

    International Nuclear Information System (INIS)

    Lee, Min Jae; Kim, Sang Ji

    2014-01-01

    ZPPR-15 cores consist of various drawer masters that have great heterogeneity. In order to build a proper homogenization strategy, the geometry of the drawer masters should be carefully analyzed with a visualization. Additionally, visualization of the drawer masters and the core configuration is necessary for minimizing human error during input processing. For this purpose, visualization tools for the ZPPR-15 analysis have been developed based on a Perl script. In the following section, the implementation of the visualization tools will be described, and various visualization samples for both drawer masters and ZPPR-15 cores will be demonstrated. Visualization tools for drawer masters and the core configuration were successfully developed for the ZPPR-15 analysis. The visualization tools are expected to be useful for understanding the ZPPR-15 experiments and for finding deterministic models of ZPPR-15. It turned out that generating VTK files is handy, and the application of VTK files is powerful with the aid of the VISIT program

  5. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process, which opens up new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or non-existent link between the digital design tools used by architects and designers and the analysis tools developed by and for engineers is considered. The aim of this research is to look into integrated digital design and analysis tools in order to find out whether they are suited for use by architects and designers or only by specialists and technicians, and if not, then to look at what can be done to make them more available to architects and designers.

  6. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  7. Analysis of seawater flow through optical fiber

    Science.gov (United States)

    Fernández López, Sheila; Carrera Ramírez, Jesús; Rodriguez Sinobar, Leonor; Benitez, Javier; Rossi, Riccardo; Laresse de Tetto, Antonia

    2015-04-01

    The relation between the sea and coastal aquifers is very important to the human populations living in coastal areas. The interrelation involves the submarine groundwater discharge of relatively fresh water to the sea and the intrusion of seawater into the aquifer, which impairs the quality of the groundwater. The main process in seawater intrusion is governed by fluid-density effects, which control the displacement of the saline water. The underlying salinity acts as the restoring force, while hydrodynamic dispersion and convection lead to mixing and vertical displacement of the brine. Because of this, a good definition of the saltwater-freshwater interface is needed, which is intimately linked to the study of the movements (velocity fields) of the fresh and salt water. As is well known, the flow of salt water studied in seawater intrusion at steady state is nearly null or very low. In the remaining cases, however, this flux can be very important, so its study is necessary for a better comprehension of the process. One possible way to carry out this analysis is through data from optical fiber. Thus, to investigate the distribution and velocity of the fresh water and saltwater in the aquifer, a fiber-optic (OF) system has been installed in Argentona (Baix Maresme, Catalonia). The main objective is to obtain distributed temperature measurements (OF-DTS) and to make progress in the interpretation of the dynamic processes of the water. For some applications the optical fiber acts as a passive temperature sensor, but in our case the heated active fiber-optic technique is used. This is based on the thermal response of the ground as a heat emission source is introduced. The thermal properties of the soil, which depend on the soil water content, produce a specific temperature distribution around the cable. From the analyzed data we will deduce the velocity field, the real objective of our problem. To simulate this phenomenon and the coupled transport and flow problem

  8. MTpy - Python Tools for Magnetotelluric Data Processing and Analysis

    Science.gov (United States)

    Krieger, Lars; Peacock, Jared; Thiel, Stephan; Inverarity, Kent; Kirkby, Alison; Robertson, Kate; Soeffky, Paul; Didana, Yohannes

    2014-05-01

    We present the Python package MTpy, which provides functions for the processing, analysis, and handling of magnetotelluric (MT) data sets. MT is a relatively immature and not widely applied geophysical method in comparison to other geophysical techniques such as seismology. As a result, data processing within the academic MT community is not thoroughly standardised and is often based on a loose collection of software adapted to the respective local specifications. We have developed MTpy to overcome problems that arise from missing standards, and to simplify the general handling of MT data. MTpy is written in Python, and the open-source code is freely available from a GitHub repository. The setup follows the modular approach of successful geoscience software packages such as GMT or ObsPy. It contains sub-packages and modules for the various tasks within the standard workflow of MT data processing and interpretation. In order to allow the inclusion of already existing and well-established software, MTpy provides not only pure Python classes and functions, but also wrapping command-line scripts to run standalone tools, e.g. modelling and inversion codes. Our aim is to provide a flexible framework which is open for future dynamic extensions. MTpy has the potential to promote the standardisation of processing procedures and at the same time be a versatile supplement for existing algorithms. Here, we introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing, interpretation, and visualisation utilising MTpy on example data sets collected over different regions of Australia and the USA.

  9. Tool Efficiency Analysis model research in SEMI industry

    Directory of Open Access Journals (Sweden)

    Lei Ma

    2018-01-01

    Full Text Available One of the key goals in the SEMI industry is to improve equipment throughput and ensure the maximization of equipment production efficiency. This paper is based on SEMI standards for semiconductor equipment control; it defines the transition rules between different tool states and presents a TEA system model that analyzes tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was verified successfully, yielding the parameter values used to measure equipment performance, together with advice for improvement.
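
    The finite-state-machine bookkeeping behind such a tool efficiency analysis can be sketched as follows: accumulate time per equipment state from a log of (timestamp, new_state) transitions, rejecting transitions the state model forbids, and report utilization. The state names follow the SEMI E10 categories, but the transition table and the log below are invented for illustration.

    ```python
    # Illustrative FSM for equipment-state accounting.  ALLOWED encodes
    # which state changes the (assumed) model permits.

    ALLOWED = {
        "STANDBY": {"PRODUCTIVE", "SCHEDULED_DOWN", "UNSCHEDULED_DOWN"},
        "PRODUCTIVE": {"STANDBY", "UNSCHEDULED_DOWN"},
        "SCHEDULED_DOWN": {"STANDBY"},
        "UNSCHEDULED_DOWN": {"STANDBY"},
    }

    def time_in_states(transitions, end_time):
        """transitions: time-sorted (t, state) pairs; returns {state: seconds}."""
        totals = {}
        for (t0, state), (t1, nxt) in zip(transitions, transitions[1:]):
            if nxt not in ALLOWED.get(state, set()):
                raise ValueError(f"illegal transition {state} -> {nxt}")
            totals[state] = totals.get(state, 0) + (t1 - t0)
        last_t, last_state = transitions[-1]          # close the final interval
        totals[last_state] = totals.get(last_state, 0) + (end_time - last_t)
        return totals

    # Usage: a 1000 s shift with one unscheduled stoppage (times in seconds).
    log = [(0, "STANDBY"), (100, "PRODUCTIVE"), (700, "UNSCHEDULED_DOWN"),
           (800, "STANDBY"), (900, "PRODUCTIVE")]
    totals = time_in_states(log, end_time=1000)
    utilization = totals["PRODUCTIVE"] / 1000         # 0.7
    ```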

  10. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were...... carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  11. Development of a climate data analysis tool (CDAT)

    Energy Technology Data Exchange (ETDEWEB)

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations and graphically displaying the results. This computer software will meet the demanding needs of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

  12. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X; Martins, N; Bianco, A; Pinto, H J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions, for a given system. (author) 43 refs., 8 figs., 8 tabs.

  13. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC), in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers
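
    The "highly reduced data" tool's correlation step amounts to computing a correlation coefficient between two per-shot summary values. A minimal pure-Python sketch, with invented shot values standing in for the diagnostic signals:

    ```python
    # Pearson correlation between two reduced diagnostic quantities across
    # shots (the numbers below are hypothetical, not TMX-U data).

    def pearson(xs, ys):
        """Sample Pearson correlation coefficient of two equal-length series."""
        n = len(xs)
        mean_x, mean_y = sum(xs) / n, sum(ys) / n
        sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        sxx = sum((x - mean_x) ** 2 for x in xs)
        syy = sum((y - mean_y) ** 2 for y in ys)
        return sxy / (sxx * syy) ** 0.5

    # Usage: two nearly proportional per-shot summaries correlate strongly.
    density = [1.0, 1.2, 1.5, 1.9, 2.4]
    temperature = [2.1, 2.5, 3.0, 3.9, 4.8]
    r = pearson(density, temperature)
    ```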

  15. Cryogenic recovery analysis of forced flow supercritical helium cooled superconductors

    International Nuclear Information System (INIS)

    Lee, A.Y.

    1977-08-01

    A coupled heat conduction and fluid flow method of solution was presented for cryogenic stability analysis of cabled composite superconductors of large scale magnetic coils. The coils are cooled by forced flow supercritical helium in parallel flow channels. The coolant flow reduction in one of the channels during the spontaneous recovery transient, after the conductor undergoes a transition from superconducting to resistive, necessitates a parallel channel analysis. A way to simulate the parallel channel analysis is described to calculate the initial channel inlet flow rate required for recovery after a given amount of heat is deposited. The recovery capability of a NbTi plus copper composite superconductor design is analyzed and the results presented. If the hydraulics of the coolant flow is neglected in the recovery analysis, the recovery capability of the superconductor will be over-predicted

  16. Analysis Tool Web Services from the EMBL-EBI.

    Science.gov (United States)

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.

  17. International Trade Modelling Using Open Flow Networks: A Flow-Distance Based Analysis.

    Science.gov (United States)

    Shen, Bin; Zhang, Jiang; Li, Yixiao; Zheng, Qiuhua; Li, Xingsen

    2015-01-01

    This paper models and analyzes international trade flows using open flow networks (OFNs) with the approaches of flow distances, which provide a novel perspective and effective tools for the study of international trade. We discuss the establishment of OFNs of international trade from two coupled viewpoints: the viewpoint of trading commodity flow and that of money flow. Based on the novel model with flow distance approaches, meaningful insights are gained. First, by introducing the concepts of trade trophic levels and niches, countries' roles and positions in the global supply chains (or value-added chains) can be evaluated quantitatively. We find that the distributions of trading "trophic levels" have the similar clustering pattern for different types of commodities, and summarize some regularities between money flow and commodity flow viewpoints. Second, we find that active and competitive countries trade a wide spectrum of products, while inactive and underdeveloped countries trade a limited variety of products. Besides, some abnormal countries import many types of goods, which the vast majority of countries do not need to import. Third, harmonic node centrality is proposed and we find the phenomenon of centrality stratification. All the results illustrate the usefulness of the model of OFNs with its network approaches for investigating international trade flows.
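
    The "trophic level" notion on an open flow network can be sketched directly: each country's level is one plus the inflow-weighted average level of its suppliers, with the external source fixed at level zero, solved by fixed-point iteration. The three-country flow matrix below is invented for the example; "SRC" is an assumed label for the external source.

    ```python
    # Trophic levels on a toy open flow network (fixed-point iteration).
    # inflows[i] maps each supplier of i to the flow received from it;
    # the special node "SRC" is the external source at level 0.

    def trophic_levels(inflows, n_iter=200):
        levels = {node: 1.0 for node in inflows}
        levels["SRC"] = 0.0
        for _ in range(n_iter):
            for node, suppliers in inflows.items():
                total = sum(suppliers.values())
                levels[node] = 1.0 + sum(
                    flow * levels[s] for s, flow in suppliers.items()) / total
        return levels

    # Usage: A exports raw goods, B mixes raw input with A's output,
    # C sits at the top of the chain.
    flows = {
        "A": {"SRC": 10.0},
        "B": {"A": 6.0, "SRC": 2.0},
        "C": {"B": 5.0},
    }
    levels = trophic_levels(flows)   # A: 1.0, B: 1.75, C: 2.75
    ```

    The levels quantify each node's position in the supply chain, which is the role the abstract assigns to trade trophic levels and niches.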

  18. Low flow and drought spatial analysis

    International Nuclear Information System (INIS)

    Dakova, Snejana

    2004-01-01

    The hydrological characteristics of Bulgarian rivers reflect the climate variability. Nearly all precipitation is received during the spring and/or winter months, with negligible precipitation in summer. Thus, peak flows occur in spring and/or winter, while during the summer the flow is significantly lower, with many rivers being ephemeral. Therefore, 2210 reservoirs have been constructed during the last sixty years to satisfy water needs. In spite of that, Bulgaria is facing a new insufficiency of water. Recent climate change investigations and climate scenarios identify the Balkan Peninsula as a territory with decreasing rainfall and increasing air temperature. In view of that, research on low flow in the light of climate change, together with water management, is required. In this study the definitions of low flow and drought are developed using the data available for the Bulgarian territory, which has semiarid-zone conditions. The difference between the terms drought and low flow is also described and clarified. The low flow and drought variables are investigated on two levels: first the long-term variability, using annual data, and then monthly and seasonal data series, to enable the within-year effects to be determined. The relationship between the probability of a river drying up and the mean annual and seasonal rainfall is quantified using multiple regression applied to logarithmically transformed data. This paper also presents analyses of minimum flow series with zero values. The exceedance probability above which streamflow is zero and the conditional probability of non-zero flow (non-zero duration curve) are obtained from the principles of total probability. Different kinds of adjusted duration curves are proposed, depending on the number of zero values in the series. (Author)
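
    The total-probability treatment of zero-flow records can be sketched in a few lines: P(Q > q) = P(Q > 0) · P(Q > q | Q > 0), so the duration curve is built from the non-zero subsample and rescaled by the fraction of non-zero records. The sample series below is invented to illustrate an ephemeral-river record.

    ```python
    # Exceedance probability for a flow series containing zeros, via the
    # law of total probability (the conditional "non-zero duration curve"
    # rescaled by the probability of non-zero flow).

    def exceedance_with_zeros(flows, q):
        """Empirical P(Q > q) for a record that includes zero flows."""
        nonzero = [x for x in flows if x > 0]
        p_nonzero = len(nonzero) / len(flows)                 # P(Q > 0)
        p_conditional = sum(x > q for x in nonzero) / len(nonzero)
        return p_nonzero * p_conditional                      # P(Q > q)

    # Usage: a 10-value record with three dry observations.
    flows = [0, 0, 0, 1, 2, 2, 3, 5, 8, 12]
    p = exceedance_with_zeros(flows, 2)   # 0.7 * (4/7) = 0.4
    ```

    For an empirical sample the result coincides with the direct count, but the factored form lets the non-zero part be fitted with a standard distribution while the zero mass is handled separately, which is the point of the adjusted duration curves.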

  19. Physics analysis tools for beauty physics in ATLAS

    International Nuclear Information System (INIS)

    Anastopoulos, C; Bouhova-Thacker, E; Catmore, J; Mora, L de; Dallison, S; Derue, F; Epp, B; Jussel, P; Kaczmarska, A; Radziewski, H v; Stahl, T; Reznicek, P

    2008-01-01

    The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of the complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools and, finally, Monte Carlo truth-information association, used with simulated data in the software validation that is an important part of the development of the physics analysis tools

  20. Radiometric flow injection analysis with an ASIA (Ismatec) analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Myint, U; Win, N; San, K; Han, B; Myoe, K M [Yangon Univ. (Myanmar). Dept. of Chemistry; Toelgyessy, J [Slovak Technical Univ., Bratislava (Slovakia). Dept. of Environmental Science

    1994-07-01

    Radiometric Flow Injection Analysis of a radioactive ([sup 131]I) sample is described. For analysis an ASIA (Ismatec) analyzer with a NaI(Tl) scintillation detector was used. (author) 5 refs.; 3 figs.

  1. Application of Recurrence Analysis to the period doubling cascade of a confined buoyant flow

    International Nuclear Information System (INIS)

    Angeli, D; Corticelli, M A; Fichera, A; Pagano, A

    2017-01-01

    Recurrence Analysis (RA) is a promising and flexible tool for identifying the behaviour of nonlinear dynamical systems. The potential of this technique is explored in the present work for the study of transitions to chaos of buoyant flow in enclosures. The case of a hot cylindrical source centred in a square enclosure is considered here, for which an extensive database of results has been collected in recent years. For a specific value of the system aspect ratio, a sequence of period doublings has been identified, leading to the onset of chaos. RA is applied here to analyse the different flow regimes along the route to chaos. The qualitative visual identification of patterns and the statistics given by the quantitative analysis suggest that this kind of tool is well suited to the study of transitional flows in thermo-fluid dynamics. (paper)
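    The core of RA is the recurrence matrix R[i, j] = Θ(ε - |x_i - x_j|): a point is "recurrent" when the state at time i returns close to the state at time j. A minimal sketch, assuming a scalar time series and an invented signal and threshold (the paper's own signals and embedding choices are not reproduced):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix: R[i, j] = 1 when |x[i] - x[j]| <= eps."""
    d = np.abs(x[:, None] - x[None, :])   # pairwise distances via broadcasting
    return (d <= eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points - the simplest recurrence quantification measure."""
    return R.mean()

t = np.linspace(0, 8 * np.pi, 400)
x = np.sin(t)                             # a periodic signal, standing in for a flow probe
R = recurrence_matrix(x, eps=0.1)
print(round(recurrence_rate(R), 3))
```

For a periodic signal the plot shows uninterrupted diagonal lines; chaotic regimes break these lines up, which is what the quantitative statistics mentioned above measure.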

  2. Analysis of flow coefficient in chair manufacture

    OpenAIRE

    Ivković Dragoljub; Živković Slaven

    2005-01-01

    Delivery on time is not possible without good-quality planning of deadlines, i.e. planning of the manufacturing process duration. The study of the flow coefficient enables realistic forecasting of the manufacturing process duration. This paper points to the significance of studying the flow coefficient on a scientific basis so as to determine the completion dates of the manufacture of chairs made of sawn timber. Chairs are products of complex construction, often almost completely ma...

  3. Analysis of design tool attributes with regards to sustainability benefits

    Science.gov (United States)

    Zain, S.; Ismail, A. F.; Ahmad, Z.; Adesta, E. Y. T.

    2018-01-01

    The trend of global manufacturing competitiveness has shown a significant shift from profit- and customer-driven business to a more harmonious sustainability paradigm. This new direction, which emphasises the interests of the three pillars of sustainability, i.e., the social, economic and environmental dimensions, has changed the way products are designed. As a result, the roles of design tools in the product development stage of manufacturing in adapting to the new strategy are vital and increasingly challenging. The aim of this paper is to review the literature on the attributes of design tools with regard to the sustainability perspective. Four well-established design tools are selected, namely Quality Function Deployment (QFD), Failure Mode and Effects Analysis (FMEA), Design for Six Sigma (DFSS) and Design for Environment (DfE). By analysing previous studies, the main attributes of each design tool and its benefits with respect to each sustainability dimension throughout four stages of the product lifecycle are discussed. From this study, it is learnt that each of the design tools contributes to the three pillars of sustainability either directly or indirectly, but they are unbalanced and not holistic. Therefore, the prospects for improving and optimising the design tools are projected, and the possibility of collaboration between the different tools is discussed.
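    Of the four tools reviewed, FMEA has the most mechanical scoring step: each failure mode receives a Risk Priority Number, RPN = severity × occurrence × detection, with each factor conventionally rated 1-10. The failure modes and ratings below are invented examples, not data from the paper:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number for one failure mode (each rating conventionally 1-10)."""
    for r in (severity, occurrence, detection):
        if not 1 <= r <= 10:
            raise ValueError("FMEA ratings are conventionally 1-10")
    return severity * occurrence * detection

# Hypothetical failure modes with (severity, occurrence, detection) ratings;
# the second one illustrates an environmental-dimension concern.
failure_modes = {
    "joint cracks under load": (8, 3, 4),
    "coating contains VOCs":   (5, 6, 2),
}
ranked = sorted(failure_modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
print(ranked[0][0])   # highest-RPN mode gets design attention first
```

Ranking by RPN is how FMEA directs design effort, which is the mechanism through which it feeds the sustainability dimensions discussed above.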

  4. Stress Analysis of Fuel Rod under Axial Coolant Flow

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Hai Lan; Lee, Young Shin; Lee, Hyun Seung [Chungnam National University, Daejeon (Korea, Republic of); Park, Num Kyu; Jeon, Kyung Rok [Kerea Nuclear Fuel., Daejeon (Korea, Republic of)

    2010-05-15

    A pressurized water reactor (PWR) fuel assembly is a typical bundle structure, which uses light water as a coolant in most commercial nuclear power plants. Fuel rods, which have very slender and long cladding, are supported by the fuel assembly, which consists of several spacer grids. A coolant is a fluid that flows through a device to prevent its overheating, transferring the heat produced by the device to other devices that use or dissipate it. At the same time, however, the coolant flow brings about fluid-induced vibration (FIV) of the fuel rods and can even damage them. This study was conducted to investigate the flow characteristics and nuclear reactor fuel rod stress under the effect of the coolant. Fluid-structure interaction (FSI) analysis of a nuclear reactor fuel rod was performed. Fluid analysis of the coolant flowing along the axial direction and structural analysis under the effect of flow velocity were carried out under different outlet flow velocity conditions

  5. Stress Analysis of Fuel Rod under Axial Coolant Flow

    International Nuclear Information System (INIS)

    Jin, Hai Lan; Lee, Young Shin; Lee, Hyun Seung; Park, Num Kyu; Jeon, Kyung Rok

    2010-01-01

    A pressurized water reactor (PWR) fuel assembly is a typical bundle structure, which uses light water as a coolant in most commercial nuclear power plants. Fuel rods, which have very slender and long cladding, are supported by the fuel assembly, which consists of several spacer grids. A coolant is a fluid that flows through a device to prevent its overheating, transferring the heat produced by the device to other devices that use or dissipate it. At the same time, however, the coolant flow brings about fluid-induced vibration (FIV) of the fuel rods and can even damage them. This study was conducted to investigate the flow characteristics and nuclear reactor fuel rod stress under the effect of the coolant. Fluid-structure interaction (FSI) analysis of a nuclear reactor fuel rod was performed. Fluid analysis of the coolant flowing along the axial direction and structural analysis under the effect of flow velocity were carried out under different outlet flow velocity conditions

  6. Analysis and design of flow limiter used in steam generator

    International Nuclear Information System (INIS)

    Liu Shixun; Gao Yongjun

    1995-10-01

    The flow limiter is an important safety component of the PWR steam generator. It can limit the blowdown rate of the steam generator inventory in case the main steam pipeline breaks, so that the rate of primary coolant temperature reduction can be slowed in order to prevent fuel elements from burnout. The venturi-type flow limiter is analysed: its flow characteristics are delineated, physical and mathematical models are defined, and a detailed mathematical derivation is provided. The research lays down a theoretical basis for flow limiter design. The governing equations and formulas given can be directly applied to computer analysis of the flow limiter. (3 refs., 3 figs.)
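    As background to the venturi analysis, the classical single-phase venturi relation limits mass flow to ṁ = C_d·A_t·√(2Δpρ/(1-β⁴)), where β is the throat-to-pipe diameter ratio. This is only the textbook incompressible formula, not the paper's model (a real blowdown is two-phase and can choke at the throat); all numbers below are illustrative:

```python
import math

def venturi_mass_flow(cd, d_throat, d_pipe, dp, rho):
    """Mass flow rate [kg/s] through a venturi throat (incompressible, single phase)."""
    beta = d_throat / d_pipe                    # diameter ratio
    area = math.pi * d_throat**2 / 4.0          # throat area [m^2]
    return cd * area * math.sqrt(2.0 * dp * rho / (1.0 - beta**4))

# Hypothetical limiter: 80 mm throat in a 200 mm line, 2 bar differential,
# hot water density ~740 kg/m^3.
m_dot = venturi_mass_flow(cd=0.98, d_throat=0.08, d_pipe=0.2, dp=2.0e5, rho=740.0)
print(round(m_dot, 1))
```

The key design insight survives even in this simplification: the throat area, not the pipe area, caps the blowdown rate.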

  7. Mechanistic multidimensional analysis of horizontal two-phase flows

    International Nuclear Information System (INIS)

    Tselishcheva, Elena A.; Antal, Steven P.; Podowski, Michael Z.

    2010-01-01

    The purpose of this paper is to discuss the results of analysis of two-phase flow in horizontal tubes. Two flow situations have been considered: gas/liquid flow in a long straight pipe, and similar flow conditions in a pipe with 90 deg. elbow. The theoretical approach utilizes a multifield modeling concept. A complete three-dimensional two-phase flow model has been implemented in a state-of-the-art computational multiphase fluid dynamics (CMFD) computer code, NPHASE. The overall model has been tested parametrically. Also, the results of NPHASE simulations have been compared against experimental data for a pipe with 90 deg. elbow.

  8. Stereo Scene Flow for 3D Motion Analysis

    CERN Document Server

    Wedel, Andreas

    2011-01-01

    This book presents methods for estimating optical flow and scene flow motion with high accuracy, focusing on the practical application of these methods in camera-based driver assistance systems. Clearly and logically structured, the book builds from basic themes to more advanced concepts, culminating in the development of a novel, accurate and robust optic flow method. Features: reviews the major advances in motion estimation and motion analysis, and the latest progress of dense optical flow algorithms; investigates the use of residual images for optical flow; examines methods for deriving mot

  9. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work...... is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from...... abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components....

  10. Application of PSAT to Load Flow Analysis with STATCOM under Load Increase Scenario and Line Contingencies

    Science.gov (United States)

    Telang, Aparna S.; Bedekar, P. P.

    2017-09-01

    Load flow analysis is the initial and essential step for any power system computation. It is required for choosing better options for power system expansion to meet ever-increasing load demand. Implementation in the load flow of a Flexible AC Transmission System (FACTS) device like the STATCOM, which has fast and very flexible control, is one of the important tasks for power system researchers. This paper presents a simple and systematic approach for steady-state power flow calculations with the FACTS controller static synchronous compensator (STATCOM), using command-line usage of the MATLAB tool Power System Analysis Toolbox (PSAT). The complexity of MATLAB language programming increases when a STATCOM is incorporated into an existing Newton-Raphson load flow algorithm. Thus, the main contribution of this paper is to show how command-line usage of the user-friendly MATLAB tool PSAT can be used extensively for quicker and wider interpretation of the results of load flow with a STATCOM. The novelty of this paper lies in the method of applying the load increase pattern, where the active and reactive loads are changed simultaneously at all the load buses under consideration to create stressed conditions for load flow analysis with the STATCOM. The performance has been evaluated on many standard IEEE test systems, and the results for the standard IEEE 30-bus, IEEE 57-bus and IEEE 118-bus systems are presented.
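    The Newton-Raphson load flow that PSAT extends can be illustrated on a toy two-bus system: a slack bus held at 1.0 pu feeding a PQ load bus over one line. This is only a sketch of the bare algorithm (no STATCOM model); line data and loads are invented, and the Jacobian is formed numerically to keep the code short:

```python
import numpy as np

z_line = 0.01 + 0.05j                       # line impedance [pu], assumed
Y = np.array([[1 / z_line, -1 / z_line],
              [-1 / z_line, 1 / z_line]])   # bus admittance matrix
s_load = 0.8 + 0.4j                         # load at bus 2 [pu], assumed

def mismatch(x):
    """Real/imaginary power mismatch at the PQ bus for state x = (theta2, v2)."""
    theta2, v2 = x
    V = np.array([1.0, v2 * np.exp(1j * theta2)])
    s_inj = V[1] * np.conj(Y[1] @ V)        # complex power injected at bus 2
    err = s_inj + s_load                    # injection should equal minus the load
    return np.array([err.real, err.imag])

x = np.array([0.0, 1.0])                    # flat start
for _ in range(10):                         # Newton-Raphson iterations
    f = mismatch(x)
    if np.max(np.abs(f)) < 1e-10:
        break
    J = np.empty((2, 2))                    # finite-difference Jacobian
    for j in range(2):
        dx = np.zeros(2)
        dx[j] = 1e-7
        J[:, j] = (mismatch(x + dx) - f) / 1e-7
    x = x - np.linalg.solve(J, f)           # Newton update

theta2, v2 = x
print(round(v2, 4))                         # converged voltage magnitude at the load bus
```

Raising s_load at the PQ bus mimics the paper's stressed-loading pattern: the solved voltage sags until the iteration eventually fails to converge near the loadability limit.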

  11. ON THE ANALYSIS OF IMPEDANCE-DRIVEN REVERSE FLOW DYNAMICS

    Directory of Open Access Journals (Sweden)

    LEE V. C.-C.

    2017-02-01

    Full Text Available An impedance pump is a simple valveless pumping mechanism in which an elastic tube is joined to a more rigid tube at both ends. Inducing a periodic asymmetrical compression on the elastic tube produces a unidirectional flow within the system. This pumping concept offers a low-energy, low-noise alternative, which makes it an effective driving mechanism, especially for micro-fluidic systems. In addition, the wave-based mechanism through which pumping occurs confers many benefits in terms of simplicity of design and manufacturing. Adjusting simple parameters such as the excitation frequency or compression location will reverse the direction of flow, providing a very versatile range of flow outputs. This paper describes the experimental analysis of such impedance-driven flow, with emphasis on the dynamical study of the reverse flow in an open-loop environment. In this study, tapered sections with converging steps are introduced at both ends of the elastic tube to amplify the magnitude of reverse flow. The study shows that the reverse peak flow is significant, estimated at 23% lower than the forward peak flow. The flow dynamics, on the other hand, exhibit different characteristics from those of the forward peak flow. The flow characteristics were then studied, showing that the tapered sections altered the impedance within the system and hence induced a higher flow in the reverse direction.

  12. Developing tools to link environmental flows science and its practice in Sri Lanka

    Directory of Open Access Journals (Sweden)

    N. Eriyagma

    2014-09-01

    Full Text Available The term "Environmental Flows (EF" may be defined as "the quantity, timing and quality of water flows required to sustain freshwater and estuarine ecosystems and the human livelihoods and well-being that depend on these ecosystems". It may be regarded as "water for nature" or "environmental demand" similar to crop water requirements, industrial or domestic water demand. The practice of EF is still limited to a few developed countries such as Australia, South Africa and the UK. In many developing countries EF is rarely considered in water resources planning and is often deemed "unimportant". Sri Lanka, being a developing country, is no exception to this general rule. Although the country underwent an extensive irrigation/water resources development phase during the 1960s through to the 1980s, the concept of EF was hardly considered. However, as Sri Lanka's water resources are being exploited more and more for human usage, ecologists, water practitioners and policymakers alike have realized the importance of EF in sustaining not only freshwater and estuarine ecosystems, but also their services to humans. Hence estimation of EF has been made mandatory in environmental impact assessments (EIAs of all large development projects involving river regulation/water abstraction. Considering EF is especially vital under the rapid urbanization and infrastructure development phase that dawned after the end of the war in the North and the East of the country in 2009. This paper details simple tools (including a software package which is under development and methods that may be used for coarse scale estimation of EF at/near monitored locations on major rivers of Sri Lanka, along with example applications to two locations on River Mahaweli. It is hoped that these tools will help bridge the gap between EF science and its practice in Sri Lanka and other developing countries.

  13. Interactive exploratory data analysis tool in Alzheimer’s disease

    Directory of Open Access Journals (Sweden)

    Diana Furcila

    2015-04-01

    Thus, MorExAn provides us with the possibility of relating histopathological data to neuropsychological and clinical variables. The aid of this interactive visualization tool offers the possibility of finding unexpected conclusions beyond the insight provided by simple statistical analysis, as well as of improving neuroscientists’ productivity.

  14. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if it is suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and design...

  15. Assessment of Available Numerical Tools for Dynamic Mooring Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Eskilsson, Claes; Ferri, Francesco

    This report covers a preliminary assessment of available numerical tools to be used in the upcoming full dynamic analysis of the mooring systems assessed in the project _Mooring Solutions for Large Wave Energy Converters_. The assessments tend to cover potential candidate software and subsequently c...

  16. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  17. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  18. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  19. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a reference guide in ...

  20. Computational Analysis of Material Flow During Friction Stir Welding of AA5059 Aluminum Alloys

    Science.gov (United States)

    Grujicic, M.; Arakere, G.; Pandurangan, B.; Ochterbeck, J. M.; Yen, C.-F.; Cheeseman, B. A.; Reynolds, A. P.; Sutton, M. A.

    2012-09-01

    Workpiece material flow and stirring/mixing during the friction stir welding (FSW) process are investigated computationally. Within the numerical model of the FSW process, the FSW tool is treated as a Lagrangian component while the workpiece material is treated as an Eulerian component. The employed coupled Eulerian/Lagrangian computational analysis of the welding process was of a two-way thermo-mechanical character (i.e., frictional-sliding/plastic-work dissipation is taken to act as a heat source in the thermal-energy balance equation) while temperature is allowed to affect mechanical aspects of the model through temperature-dependent material properties. The workpiece material (AA5059, a solid-solution strengthened and strain-hardened aluminum alloy) is represented using a modified version of the classical Johnson-Cook model (within which the strain-hardening term is augmented to take into account the effect of dynamic recrystallization) while the FSW tool material (AISI H13 tool steel) is modeled as an isotropic linear-elastic material. Within the analysis, the effects of some of the key FSW process parameters are investigated (e.g., weld pitch, tool tilt-angle, and tool pin-size). The results pertaining to the material flow during FSW are compared with their experimental counterparts. It is found that, for the most part, experimentally observed material-flow characteristics are reproduced within the current FSW-process model.
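    The classical Johnson-Cook form that the paper's modified model builds on multiplies a strain-hardening term, a strain-rate term and a thermal-softening term: σ = (A + Bε^n)(1 + C·ln(ε̇/ε̇₀))(1 - T*^m). A minimal sketch follows; the constants are illustrative placeholders, not calibrated AA5059 values, and the paper's recrystallization modification of the hardening term is not reproduced:

```python
import math

def johnson_cook_stress(strain, strain_rate, temp,
                        A=167e6, B=596e6, n=0.551, C=0.001,
                        m=1.0, ref_rate=1.0, t_room=293.0, t_melt=893.0):
    """Classical Johnson-Cook flow stress [Pa]:
    (A + B*eps^n) * (1 + C*ln(rate/ref_rate)) * (1 - T*^m)."""
    hardening = A + B * strain**n
    rate_term = 1.0 + C * math.log(max(strain_rate / ref_rate, 1e-12))
    t_star = (temp - t_room) / (t_melt - t_room)   # homologous temperature
    softening = 1.0 - max(t_star, 0.0)**m
    return hardening * rate_term * softening

cold = johnson_cook_stress(0.2, 1.0, 293.0)
hot = johnson_cook_stress(0.2, 1.0, 700.0)
print(hot < cold)   # True: thermal softening dominates near the weld nugget
```

The thermal-softening factor is what couples the mechanical model back to the frictional heat source in the two-way analysis described above.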

  1. Models as Tools of Analysis of a Network Organisation

    Directory of Open Access Journals (Sweden)

    Wojciech Pająk

    2013-06-01

    Full Text Available The paper presents models which may be applied as tools for the analysis of a network organisation. The starting point of the discussion is defining the following terms: supply chain and network organisation. Further parts of the paper present the basic assumptions of the analysis of a network organisation. The study then characterises the best-known models utilised in the analysis of a network organisation. The purpose of the article is to define the notion and the essence of network organisations and to present the models used for their analysis.

  2. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented in extension to the latent semantic in...... were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for visualization of search engines results....

  3. To Examine effect of Flow Zone Generation Techniques for Numerical Flow Analysis in Hydraulic Turbine

    International Nuclear Information System (INIS)

    Hussain, M.; Khan, J.A.

    2004-01-01

    A numerical study of the flow in the distributor of a Francis turbine is carried out using two different techniques of flow zone generation. The distributor of the GAMM Francis turbine is used for the present calculation. In the present work, the flow is assumed to be periodic around the distributor under steady-state conditions; therefore the computational domain consists of only one blade channel (one stay vane and one guide vane). The distributor computational domain is bounded upstream by cylindrical and downstream by conical patches. The first corresponds to the spiral casing outflow section, while the second is considered to be the distributor outlet or runner inlet. The upper and lower surfaces are generated by the revolution of the hub and shroud edges. Single-connected and multiple-connected techniques are considered to generate the distributor flow zone for numerical flow analysis of the GAMM Francis turbine. Tetrahedral meshes are generated in both flow zones. The same boundary conditions are applied to both equivalent flow zones. The three-dimensional laminar flow analysis for both distributor flow zones of the GAMM Francis turbine operating at the best efficiency point is performed. Gambit and G-Turbo are used as preprocessors, while the calculations are done using Fluent. Finally, numerical results obtained at the distributor outlet are compared with the available experimental data to validate the two methodologies and examine their accuracy. (author)

  4. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    International Nuclear Information System (INIS)

    Clough, D; Fletcher, S; Longstaff, A P; Willoughby, P

    2012-01-01

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increase the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high-value parts, for manufacturers to obtain a thermal profile of their machine, to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting the practical implementation of condition monitoring of thermal errors. In particular, there is the requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various testing methods, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.
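    The temperature-to-error link described above is often captured, in its simplest form, as a regression model: calibrate displacement against temperature once, then predict the error from temperature alone during production. A minimal least-squares sketch with invented readings (real models typically use several sensors and account for thermal lag):

```python
import numpy as np

# One-off calibration data: spindle temperature rise vs measured growth (invented).
temp_rise = np.array([0.0, 2.1, 4.0, 6.2, 8.1, 10.0])     # deg C above ambient
displacement = np.array([0.0, 2.3, 4.1, 6.0, 8.2, 10.1])  # microns

slope, intercept = np.polyfit(temp_rise, displacement, 1)  # least-squares line

def predicted_error(t):
    """Estimate thermal error [um] from a temperature reading alone."""
    return slope * t + intercept

print(round(predicted_error(5.0), 1))   # expected spindle growth at +5 deg C
```

During monitoring, only the cheap temperature measurement is needed; the model supplies the error estimate that cannot be measured in-process.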

  5. LAMINAR STABILITY ANALYSIS IN BOUNDARY LAYER FLOW

    Directory of Open Access Journals (Sweden)

    Mihaela CALUDESCU

    2009-09-01

    Full Text Available This study presents a numerical investigation of flow control by suction and injection. The case studied is a symmetrical airfoil with suction and injection slots. The angle of attack is 3 degrees and the Mach number is 0.12.

  6. LTE uplink scheduling - flow level analysis

    NARCIS (Netherlands)

    Dimitrova, D.C.; van den Berg, J.L.; Heijenk, G.; Litjens, R.; Sacchi, Claudio; Bellalta, Boris; Vinel, Alexey; Schlegel, Christian; Granelli, Fabrizio; Zhang, Yan

    Long Term Evolution (LTE) is a cellular technology foreseen to extend the capacity and improve the performance of current 3G cellular networks. A key mechanism in the LTE traffic handling is the packet scheduler, which is in charge of allocating resources to active flows in both the frequency and

  7. LTE uplink scheduling - Flow level analysis

    NARCIS (Netherlands)

    Dimitrova, D.C.; Berg, J.L. van den; Heijenk, G.; Litjens, R.

    2011-01-01

    Long Term Evolution (LTE) is a cellular technology foreseen to extend the capacity and improve the performance of current 3G cellular networks. A key mechanism in the LTE traffic handling is the packet scheduler, which is in charge of allocating resources to active flows in both the frequency and

  8. Migration Flows: Measurement, Analysis and Modeling

    NARCIS (Netherlands)

    Willekens, F.J.; White, Michael J.

    2016-01-01

    This chapter is an introduction to the study of migration flows. It starts with a review of major definition and measurement issues. Comparative studies of migration are particularly difficult because different countries define migration differently and measurement methods are not harmonized.

  9. Mathematical simulation of fluid flow and analysis of flow pattern in the flow path of low-head Kaplan turbine

    Directory of Open Access Journals (Sweden)

    A. V. Rusanov

    2016-12-01

    Full Text Available The results of a numerical investigation of the spatial flow of a viscous incompressible fluid in the flow path of the Kaplan turbine PL20 of the Kremenchug HPP, at the optimum runner blade setting angle φb = 15° and at the maximum setting angle φb = 35°, are shown. The flow simulation has been carried out on the basis of numerical integration of the Reynolds equations with an additional term containing artificial compressibility. The differential two-parameter Menter (SST) model has been applied to take turbulent effects into account. Numerical integration of the equations is carried out using an implicit quasi-monotone Godunov-type scheme of second-order accuracy in space and time. The calculations have been conducted with the help of the software system IPMFlow. The analysis of the fluid flow in the flow-path elements is shown, and the values of the hydraulic losses and the local cavitation coefficient have been obtained. A comparison of calculated and experimental results has been carried out.

  10. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool’s interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  11. Porcupine: A visual pipeline tool for neuroimaging analysis.

    Directory of Open Access Journals (Sweden)

    Tim van Mourik

    2018-05-01

    Full Text Available The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one's analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one's analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0.

  12. Integrated Design and Analysis Tools for Reduced Weight, Affordable Fiber Steered Composites

    National Research Council Canada - National Science Library

    Hale, Richard

    2004-01-01

    This report describes geometric design tools which encompass the overall process flow for fiber placed and fiber steered structures, to allow parallel considerations for manufacturability and mechanical performance...

  13. Modeling of the flow stress for AISI H13 Tool Steel during Hard Machining Processes

    Science.gov (United States)

    Umbrello, Domenico; Rizzuti, Stefania; Outeiro, José C.; Shivpuri, Rajiv

    2007-04-01

In general, the flow stress models used in computer simulation of machining processes are functions of the effective strain, effective strain rate and temperature developed during the cutting process. However, these models do not adequately describe the material behavior in hard machining, where material hardness ranges between 45 and 60 HRC. Thus, depending on the specific material hardness, different material models must be used in modeling the cutting process. This paper describes the development of hardness-based flow stress and fracture models for AISI H13 tool steel, which can be applied over the range of hardness mentioned above. These models were implemented in a non-isothermal viscoplastic numerical model to simulate the machining of AISI H13 with various hardness values and different cutting regime parameters. Predicted results are validated by comparing them with experimental results found in the literature. They are found to predict reasonably well both the cutting forces and the change in chip morphology from continuous to segmented chip as the material hardness changes.
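The record names the model's inputs (effective strain, strain rate, temperature) and a hardness dependence, but not its functional form. As an illustration only, the sketch below uses a Johnson-Cook-type law whose yield term is scaled linearly with Rockwell hardness; every constant here is a hypothetical placeholder, not one of the paper's fitted values.

```python
import math

def flow_stress(strain, strain_rate, T_c, hrc,
                B=250.0, n=0.3, C=0.02, m=1.0,
                eps0=1.0, T_room=20.0, T_melt=1427.0):
    """Johnson-Cook-type flow stress (MPa) with a hypothetical
    hardness-dependent yield term A(HRC). All constants are illustrative."""
    A = 8.0 * hrc + 400.0                       # hypothetical linear hardness scaling
    strain_term = A + B * strain ** n           # strain hardening
    rate_term = 1.0 + C * math.log(max(strain_rate / eps0, 1e-12))  # rate sensitivity
    T_star = (T_c - T_room) / (T_melt - T_room)
    temp_term = 1.0 - max(T_star, 0.0) ** m     # thermal softening
    return strain_term * rate_term * temp_term
```

With such a form, raising the workpiece hardness raises the predicted flow stress at fixed cutting conditions, while raising the temperature lowers it, which is the qualitative behavior the abstract describes.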

  14. Modeling of the flow stress for AISI H13 Tool Steel during Hard Machining Processes

    International Nuclear Information System (INIS)

    Umbrello, Domenico; Rizzuti, Stefania; Outeiro, Jose C.; Shivpuri, Rajiv

    2007-01-01

In general, the flow stress models used in computer simulation of machining processes are functions of the effective strain, effective strain rate and temperature developed during the cutting process. However, these models do not adequately describe the material behavior in hard machining, where material hardness ranges between 45 and 60 HRC. Thus, depending on the specific material hardness, different material models must be used in modeling the cutting process. This paper describes the development of hardness-based flow stress and fracture models for AISI H13 tool steel, which can be applied over the range of hardness mentioned above. These models were implemented in a non-isothermal viscoplastic numerical model to simulate the machining of AISI H13 with various hardness values and different cutting regime parameters. Predicted results are validated by comparing them with experimental results found in the literature. They are found to predict reasonably well both the cutting forces and the change in chip morphology from continuous to segmented chip as the material hardness changes.

  15. Acetylene Flow Rate as a Crucial Parameter of Vacuum Carburizing Process of Modern Tool Steels

    Directory of Open Access Journals (Sweden)

    Rokicki P.

    2016-12-01

Full Text Available Carburizing is one of the most popular and widely used thermo-chemical treatments for surface modification of tool steels. It is based on diffusive carbon enrichment of the surface material and is applied to elements that must exhibit higher hardness and wear resistance while retaining core ductility. Typical carburized elements are gears, shafts, pins and bearing components. In recent years vacuum carburizing has become increasingly popular, especially in the highly advanced treatment procedures used in the aerospace industry. It is carried out at reduced pressure and provides much higher uniformity of the carburized layer, lower process cost and a much smaller environmental impact compared with conventional carburizing methods such as gas carburizing in an Endo atmosphere. However, the aerospace industry requires a much more detailed description of the phenomena linked to this process, and the literature lacks tests that could confirm fulfilment of all requirements or explain the process in greater depth. In the presented paper, the authors focus their research on the impact of the acetylene flow rate on the characteristics of the carburized layer. This is one of the most crucial parameters governing the homogeneity and uniformity of the carburized layer properties. A specific process methodology was therefore planned, based on different acetylene flow values, and the surface layer of the steel gears was investigated for any resulting change in the properties of the final product.

  16. multiUQ: An intrusive uncertainty quantification tool for gas-liquid multiphase flows

    Science.gov (United States)

    Turnquist, Brian; Owkes, Mark

    2017-11-01

    Uncertainty quantification (UQ) can improve our understanding of the sensitivity of gas-liquid multiphase flows to variability about inflow conditions and fluid properties, creating a valuable tool for engineers. While non-intrusive UQ methods (e.g., Monte Carlo) are simple and robust, the cost associated with these techniques can render them unrealistic. In contrast, intrusive UQ techniques modify the governing equations by replacing deterministic variables with stochastic variables, adding complexity, but making UQ cost effective. Our numerical framework, called multiUQ, introduces an intrusive UQ approach for gas-liquid flows, leveraging a polynomial chaos expansion of the stochastic variables: density, momentum, pressure, viscosity, and surface tension. The gas-liquid interface is captured using a conservative level set approach, including a modified reinitialization equation which is robust and quadrature free. A least-squares method is leveraged to compute the stochastic interface normal and curvature needed in the continuum surface force method for surface tension. The solver is tested by applying uncertainty to one or two variables and verifying results against the Monte Carlo approach. NSF Grant #1511325.
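A toy illustration of the building block behind the intrusive approach the abstract describes: representing a stochastic quantity by a polynomial chaos expansion in a standard-normal variable. The sketch below is an assumption of this note, not multiUQ code; it projects a function of one Gaussian input onto probabilists' Hermite polynomials by Gauss-Hermite quadrature and recovers the mean and variance directly from the coefficients, with no Monte Carlo sampling.

```python
import numpy as np
from numpy.polynomial import hermite_e as He

def pce_coeffs(f, degree, nquad=20):
    """Project f(xi), xi ~ N(0,1), onto probabilists' Hermite polynomials He_k."""
    # hermegauss integrates against exp(-x^2/2); rescale to the N(0,1) measure.
    x, w = He.hermegauss(nquad)
    w = w / np.sqrt(2.0 * np.pi)
    fac = 1.0
    coeffs = []
    for k in range(degree + 1):
        if k > 0:
            fac *= k                            # E[He_k^2] = k!
        Hk = He.hermeval(x, [0.0] * k + [1.0])  # evaluate He_k at the nodes
        coeffs.append(np.sum(w * f(x) * Hk) / fac)
    return np.array(coeffs)

def pce_mean_var(coeffs):
    """Mean and variance follow from orthogonality of the He_k basis."""
    mean, fac, var = coeffs[0], 1.0, 0.0
    for k in range(1, len(coeffs)):
        fac *= k
        var += fac * coeffs[k] ** 2
    return mean, var
```

For f(xi) = (xi + 1)^2 the degree-2 expansion is exact, giving mean 2 and variance 6; this is the kind of moment information the intrusive solver carries along with each stochastic field.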

  17. Towards an integrated petrophysical tool for multiphase flow properties of core samples

    Energy Technology Data Exchange (ETDEWEB)

    Lenormand, R. [Institut Francais du Petrole, Rueil Malmaison (France)

    1997-08-01

This paper describes the first use of an Integrated Petrophysical Tool (IPT) on reservoir rock samples. The IPT simultaneously measures the following petrophysical properties: (1) the complete capillary pressure cycle: primary drainage, spontaneous and forced imbibition, and secondary drainage (the cycle yields the wettability of the core via the USBM index); (2) end-points and parts of the relative permeability curves; (3) formation factor and resistivity index. The IPT is based on steady-state injection of one fluid through the sample placed in a Hassler cell. The experiment leading to the whole Pc cycle on two reservoir sandstones consists of about 30 steps at various oil or water flow rates. It takes about four weeks and is operated at room conditions. Relative permeabilities are in line with standard steady-state measurements. Capillary pressures are in accordance with standard centrifuge measurements. There is no direct comparison for the resistivity index, but the results are in agreement with literature data. However, the accurate determination of saturation remains the main difficulty, and some improvements are proposed. In conclusion, the Integrated Petrophysical Tool is as accurate as standard methods and has the advantage of providing the various parameters on the same sample during a single experiment. The IPT is easy to use and can be automated. In addition, it can be operated at reservoir conditions.

  18. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L.; Santos, João L.M.

    2012-01-01

Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals, or quantum dots (QDs), are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding important new fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis, and food quality control. Despite the enormous variety of applications that have been developed, the automation of QDs-based analytical methodologies through tools such as continuous flow analysis and related techniques is hitherto very limited. Such automation would allow exploitation of particular features of the nanocrystals: their versatile surface chemistry and ligand-binding ability, their aptitude to generate reactive species, and the possibility of encapsulation in different materials while retaining native luminescence, providing the means for implementing renewable chemosensors or even for using more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects for their use in automated flow-based and flow-related approaches and on the future outlook of QDs applications in chemical analysis.

  19. Thermohydrodynamic analysis of cryogenic liquid turbulent flow fluid film bearings

    Science.gov (United States)

    Andres, Luis San

    1993-01-01

    A thermohydrodynamic analysis is presented and a computer code developed for prediction of the static and dynamic force response of hydrostatic journal bearings (HJB's), annular seals or damper bearing seals, and fixed arc pad bearings for cryogenic liquid applications. The study includes the most important flow characteristics found in cryogenic fluid film bearings such as flow turbulence, fluid inertia, liquid compressibility and thermal effects. The analysis and computational model devised allow the determination of the flow field in cryogenic fluid film bearings along with the dynamic force coefficients for rotor-bearing stability analysis.

  20. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  1. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  2. Campaign effects and self-analysis Internet tool

    Energy Technology Data Exchange (ETDEWEB)

    Brange, Birgitte [Danish Electricity Saving Trust (Denmark); Fjordbak Larsen, Troels [IT Energy ApS (Denmark); Wilke, Goeran [Danish Electricity Saving Trust (Denmark)

    2007-07-01

    In October 2006, the Danish Electricity Saving Trust launched a large TV campaign targeting domestic electricity consumption. The campaign was based on the central message '1000 kWh/year per person is enough'. The campaign was accompanied by a new internet portal with updated information about numerous household appliances, and by analysis tools for bringing down electricity consumption to 1000 kWh/year per person. The effects of the campaign are monitored through repeated surveys and analysed in relation to usage of internet tools.

  3. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
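A minimal sketch of the kind of graph-analysis bound the abstract mentions: the critical (longest) path of a dataflow DAG lower-bounds the schedule length, and total work divided by that path lower-bounds the number of identical processors needed. The task names and execution times below are invented for illustration; this is not the tool's actual algorithm.

```python
import math
from collections import defaultdict

def critical_path(times, edges):
    """Longest path (by task execution time) through a dataflow DAG.
    times: {task: exec_time}; edges: [(pred, succ), ...]."""
    succ, indeg = defaultdict(list), {t: 0 for t in times}
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    ready = [t for t in times if indeg[t] == 0]
    finish = {}
    while ready:                                # Kahn's topological traversal
        u = ready.pop()
        # all predecessors are already finished when u becomes ready
        start = max((finish[p] for p, q in edges if q == u), default=0)
        finish[u] = start + times[u]
        for v in succ[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return max(finish.values())

def min_processors(times, edges):
    """Lower bound on identical processors for one graph iteration."""
    return math.ceil(sum(times.values()) / critical_path(times, edges))
```

For times {A: 2, B: 3, C: 1} with C depending on A and B, the critical path is B then C (length 4), and at least two processors are needed to reach that bound.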

  4. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited paper and two short tool descriptions were carefully reviewed and selected from a total of 107 submissions. The papers are organized in topical sections on software and formal methods, formal methods, timed and hybrid systems, infinite and parameterized systems, diagnostic and test generation, efficient...

  5. CFD Analysis for Predicting Flow Resistance of the Cross Flow Gap in Prismatic VHTR Core

    International Nuclear Information System (INIS)

    Lee, Jeong Hun; Yoon, Su Jong; Park, Goon Cherl; Park, Jong Woon

    2011-01-01

The core of a Very High Temperature Reactor (VHTR) consists of assemblies of hexagonal graphite blocks; their height and across-flats width are 800 mm and 360 mm, respectively, and each is equipped with 108 coolant holes 16 mm in diameter. Up to ten fuel blocks stacked vertically form a fuel element column, and the neutron flux varies over the cross section of the core. This produces differential axial shrinkage of the fuel elements, which creates wedge-shaped gaps between the base and top surfaces of stacked blocks. Cross flow is defined as the core flow that passes through these gaps. Cross flow complicates the flow distribution of the reactor core. Moreover, it can lead to uneven coolant distribution and consequently to superheating of individual fuel element zones with increased fission product release. Since core cross flow has a negative impact on the safety and efficiency of the VHTR, cross flow phenomena have to be investigated to improve the core thermal margin. In particular, obtaining an accurate flow loss coefficient is important for predicting the amount of flow through the cross flow gap; nevertheless, little effort has been devoted to this domestically. A cross flow experiment was carried out by H. G. Groehn in Germany in 1981. For the study of cross flow, the applicability of CFD codes should be validated. In this paper, the commercial CFD code CFX-12 is validated against this cross flow experiment. The validated data can then be used for validation of other thermal-hydraulic analysis codes.

  6. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.

  7. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    Science.gov (United States)

    Min, James B.

    2005-01-01

The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  8. Application of effective discharge analysis to environmental flow decision-making

    Science.gov (United States)

    McKay, S. Kyle; Freeman, Mary C.; Covich, A.P.

    2016-01-01

    Well-informed river management decisions rely on an explicit statement of objectives, repeatable analyses, and a transparent system for assessing trade-offs. These components may then be applied to compare alternative operational regimes for water resource infrastructure (e.g., diversions, locks, and dams). Intra- and inter-annual hydrologic variability further complicates these already complex environmental flow decisions. Effective discharge analysis (developed in studies of geomorphology) is a powerful tool for integrating temporal variability of flow magnitude and associated ecological consequences. Here, we adapt the effectiveness framework to include multiple elements of the natural flow regime (i.e., timing, duration, and rate-of-change) as well as two flow variables. We demonstrate this analytical approach using a case study of environmental flow management based on long-term (60 years) daily discharge records in the Middle Oconee River near Athens, GA, USA. Specifically, we apply an existing model for estimating young-of-year fish recruitment based on flow-dependent metrics to an effective discharge analysis that incorporates hydrologic variability and multiple focal taxa. We then compare three alternative methods of environmental flow provision. Percentage-based withdrawal schemes outcompete other environmental flow methods across all levels of water withdrawal and ecological outcomes.
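In its classical geomorphic form, effective discharge is the flow class that accomplishes the most cumulative work: the product of how often each discharge class occurs and how much transport (or ecological effect) it produces. The sketch below is a generic illustration of that frequency-times-magnitude calculation, assuming a simple power-law rating Qs = a*Q^b; the rating exponent, bin count, and flow record are all hypothetical, not the case-study values from the Middle Oconee River analysis.

```python
import numpy as np

def effective_discharge(daily_q, a=1.0, b=1.5, nbins=25):
    """Discharge class doing the most cumulative 'work' under Qs = a*Q**b.
    daily_q: array of positive daily discharges."""
    edges = np.logspace(np.log10(daily_q.min()),
                        np.log10(daily_q.max()), nbins + 1)
    counts, _ = np.histogram(daily_q, bins=edges)   # frequency of each class
    mids = np.sqrt(edges[:-1] * edges[1:])          # geometric class mid-points
    work = counts * a * mids ** b                   # frequency x magnitude
    return mids[np.argmax(work)]
```

The same skeleton extends to the paper's generalization by swapping the rating curve for a flow-dependent ecological response (e.g., modeled fish recruitment) and adding further flow-regime variables.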

  9. Substance Flow Analysis of Mercury in China

    Science.gov (United States)

    Hui, L. M.; Wang, S.; Zhang, L.; Wang, F. Y.; Wu, Q. R.

    2015-12-01

In previous studies, anthropogenic atmospheric Hg emissions in China, both nationwide and for single sectors, have been examined extensively. However, more Hg may have been released in solid wastes than to air. Hg stored in solid wastes may be released to air again when the wastes undergo high-temperature processing, or may cause local pollution if the wastes are stacked casually for a long time. To trace the fate of Hg in China, this study developed the substance flow of Hg in 2010 covering all the sectors summarized in Table 1. As shown in Figure 1, the total Hg input is 2825 t. Unintentional Hg input, mined Hg, and recycled Hg account for 57%, 32% and 11%, respectively. Figure 2 provides detailed information on the substance flow of Hg. Byproducts from one sector may be used as raw materials of another, causing cross flows of Hg between sectors. The Hg input to cement production is 303 t, of which 34% comes from coal and limestone, 33% from non-ferrous smelting, 23% from coal combustion, 7% from iron and steel production and 3% from mercury mining. The Hg flowing to recycled-Hg production is 639 t, mainly from Hg contained in waste activated carbon and mercuric chloride catalyst from VCM production and in acid sludge from non-ferrous smelting. There are 20 t of mercury flowing from spent mercury-added products to incineration. Figures 1 and 2 also show that 46% of the output Hg belongs to "lagged release", meaning this part of the mercury might be released later. The lagged-release Hg includes 809 t contained in stacked byproducts from coal combustion, non-ferrous smelting, iron and steel production, Al production, cement production and mercury mining; 161 t stored in the pipelines of VCM production; 10 t in fluorescent lamps that are in use; and 314 t stored in materials waiting to be handled in recycled-mercury plants. There are 112 t of Hg stored in landfills and 129 t of Hg exported abroad with the export of mercury-added products.
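Substance flow analysis rests on sector-level mass balances: each sector's total inflow is split across its sources, and the shares must close to 100%. The sketch below replays the cement-production figures quoted in the record (303 t total, split 34/33/23/7/3 percent); the helper function itself is a generic illustration, not the study's accounting code.

```python
# Sector inflow shares for cement production as reported in the record.
CEMENT_IN = {
    "coal and limestone": 0.34,
    "non-ferrous smelting": 0.33,
    "coal combustion": 0.23,
    "iron and steel production": 0.07,
    "mercury mining": 0.03,
}

def sector_inflows(total_t, shares):
    """Split a sector's total Hg inflow (tonnes) across its sources,
    checking that the shares form a closed balance."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return {src: total_t * f for src, f in shares.items()}
```

Chaining such per-sector balances, with byproduct outflows of one sector feeding the inflows of another, reproduces the cross-sector flow structure shown in Figure 2.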

  10. Parallel Factor Analysis as an exploratory tool for wavelet transformed event-related EEG

    DEFF Research Database (Denmark)

    Mørup, Morten; Hansen, Lars Kai; Hermann, Cristoph S.

    2006-01-01

by the inter-trial phase coherence (ITPC) encompassing ANOVA analysis of differences between conditions and 5-way analysis of channel x frequency x time x subject x condition. A flow chart is presented on how to perform data exploration using the PARAFAC decomposition on multi-way arrays. This includes (A) channel x frequency x time 3-way arrays of F test values from a repeated measures analysis of variance (ANOVA) between two stimulus conditions; (B) subject-specific 3-way analyses; and (C) an overall 5-way analysis of channel x frequency x time x subject x condition. The PARAFAC decompositions were able... of the 3-way array of ANOVA F test values clearly showed the difference of regions of interest across modalities, while the 5-way analysis enabled visualization of both quantitative and qualitative differences. Consequently, PARAFAC is a promising data exploratory tool in the analysis of the wavelets...

  11. Anaphe - OO Libraries and Tools for Data Analysis

    CERN Document Server

    Couet, O; Molnar, Z; Moscicki, J T; Pfeiffer, A; Sang, M

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitio...

  12. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

Full Text Available System integration is an important element in the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated against the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is required. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, presenting the functional definition, architecture and effectiveness of the DERAT through test results.

  13. A reliability analysis tool for SpaceWire network

    Science.gov (United States)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft; therefore, it is very important to analyze and improve the reliability performance of the SpaceWire network. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. Following the functional division of a distributed network, a task-based reliability analysis method is proposed: the reliability analysis of every task yields a system reliability matrix, and the reliability of the network system is deduced by integrating all the reliability indexes in the matrix. With this method, we developed a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and the multi-path-task reliability are also implemented. Using this tool, we analyzed several cases on typical architectures, and the analytic results indicate that a redundant architecture has better reliability performance than a basic one. In practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. This reliability analysis tool will thus have a direct influence on both task division and topology selection in the design phase of a SpaceWire network system.
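The abstract's finding, that a redundant architecture outperforms a basic one, follows from elementary series/parallel reliability composition, which the sketch below illustrates. This is a generic textbook model, not the paper's matrix-based method: a task path through nodes and links is a series system, and a dual-redundant path is two such paths in parallel.

```python
def series(*rs):
    """Reliability of components that must all work (series composition)."""
    r = 1.0
    for x in rs:
        r *= x
    return r

def parallel(*rs):
    """Reliability of redundant components (at least one must work)."""
    q = 1.0
    for x in rs:
        q *= 1.0 - x      # probability that every branch fails
    return 1.0 - q
```

For example, a task routed over three elements of reliability 0.99, 0.98 and 0.99 has series reliability about 0.96, while duplicating that whole path raises the task reliability above 0.998.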

  14. Computational Analysis of Multi-Rotor Flows

    Science.gov (United States)

    Yoon, Seokkwan; Lee, Henry C.; Pulliam, Thomas H.

    2016-01-01

Interactional aerodynamics of multi-rotor flows has been studied for a quadcopter representing a generic quad tilt-rotor aircraft in hover. The objective of the present study is to investigate the effects of the separation distances between rotors, and of the fuselage and wings, on the performance and efficiency of multi-rotor systems. Three-dimensional unsteady Navier-Stokes equations are solved using a spatially 5th-order-accurate scheme, dual-time stepping, and the Detached Eddy Simulation turbulence model. The results show that the separation distances as well as the wings have significant effects on the vertical forces of quadrotor systems in hover. Understanding interactions in multi-rotor flows would help improve the design of next-generation multi-rotor drones.

  15. A spatial analysis of China's coal flow

    International Nuclear Information System (INIS)

    Mou Dunguo; Li Zhi

    2012-01-01

The characteristics of China's energy structure and the distribution of its coal resources make coal transportation a very important component of the energy system; moreover, coal transportation acts as a bottleneck for the Chinese economy. To ensure the security of the coal supply, China has begun to build regional strategic coal reserves at some locations, but transportation is still the fundamental way to guarantee supply security. Here, we study China's coal transportation quantitatively with a linear programming method that analyses the direction and volume of China's coal flows under the prerequisite that each province's supply and demand balance is maintained. First, we analyse the optimal coal transportation for the status quo coal supply and demand, given the bottleneck effects that the Daqin Railway has on China's coal flow; second, we analyse the influence of future shifts in the coal supply zone, finding that China's coal flows will also change, which will put pressure on China to construct railways and ports; and finally, we analyse the possibility of exploiting Yangtze River capacity for coal transportation. We conclude the paper with suggestions for enhancing China's coal transportation security. - Highlights: ► We use linear programming to study China's coal transportation. ► First, analyse the optimal coal flow under the status quo condition. ► Second, analyse influences of coal supply zone shifts to Neimeng and Xinjiang. ► Third, analyse the influence of using Yangtze River for coal transportation. ► Finally, we give suggestions about infrastructure construction to guarantee China's long-run coal supply security.
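The linear program behind such a study is the classical balanced transportation problem: minimize total shipping cost subject to each supplier's output and each consumer's demand. As a self-contained illustration (invented toy numbers, not the paper's provincial data), the 2-supplier/2-consumer case reduces to a one-dimensional search, since fixing one flow determines the other three, and a linear objective attains its minimum at an end-point of the feasible interval.

```python
def solve_2x2_transport(supply, demand, cost):
    """Minimal balanced 2-supplier / 2-consumer transportation problem.
    Returns (min_cost, optimal x11). cost[i][j] = unit cost supplier i -> consumer j."""
    s1, s2 = supply
    d1, d2 = demand
    assert s1 + s2 == d1 + d2, "problem must be balanced"
    lo = max(0, d1 - s2)          # keep x21 = d1 - x11 within supplier 2's capacity
    hi = min(s1, d1)              # keep x11 within supplier 1's capacity and demand 1

    def total(x11):
        x12 = s1 - x11            # remainder of supplier 1 goes to consumer 2
        x21 = d1 - x11            # remainder of demand 1 comes from supplier 2
        x22 = s2 - x21
        return (cost[0][0] * x11 + cost[0][1] * x12 +
                cost[1][0] * x21 + cost[1][1] * x22)

    # cost is linear in x11, so the optimum sits at an interval end-point
    return min((total(lo), lo), (total(hi), hi))
```

Real provincial-scale instances have many suppliers, consumers, and route-capacity constraints (e.g., the Daqin Railway bottleneck) and are handed to a general LP solver, but the structure is the same.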

  16. State space analysis of minimal channel flow

    Energy Technology Data Exchange (ETDEWEB)

    Neelavara, Shreyas Acharya; Duguet, Yohann; Lusseyran, François, E-mail: acharya@limsi.fr [LIMSI-CNRS, Campus Universitaire d’Orsay, Université Paris-Saclay, F-91405 Orsay (France)

    2017-06-15

Turbulence and edge states are investigated numerically in a plane Poiseuille flow driven by a fixed pressure gradient. Simulations are carried out within the minimal flow unit, a concept introduced by Jiménez and Moin (1991 J. Fluid Mech. 225 213–40) to unravel the dynamics of near-wall structures in the absence of outer large-scale motions. For both turbulent and edge regimes the activity appears to be localised near only one wall at a time, and the long term dynamics features abrupt reversals. The dynamics along one reversal is structured around the transient visit to a subspace of symmetric flow fields. An exact travelling wave solution is found to exist very close to this subspace. Additionally the self-similarity of the asymmetric states is addressed. Contrary to most studies focusing on symmetric solutions, the present study suggests that edge states, when localised near one wall, do not scale in outer units. The current study suggests a composite scaling. (paper)

  17. Analysis of flow coefficient in chair manufacture

    Directory of Open Access Journals (Sweden)

    Ivković Dragoljub

    2005-01-01

    Full Text Available Delivery on time is not possible without good-quality planning of deadlines, i.e. planning of the manufacturing process duration. The study of the flow coefficient enables realistic forecasting of the manufacturing process duration. This paper points to the significance of studying the flow coefficient on a scientific basis so as to determine the completion dates of the manufacture of chairs made of sawn timber. Chairs are products of complex construction, often almost completely made of sawn timber as the basic material. They belong to the group of export products, so it is especially significant to analyze the duration of the production cycle, and the type and degree of stoppages in this type of production. A parallel method of production is applied in chair manufacture. The study shows that the value of the flow coefficient is close to one or higher in most cases. The results indicate that the percentage of interoperational stoppage is unjustifiably high, so measures are proposed to decrease the percentage of stoppages in the manufacturing process.
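    The flow coefficient discussed above is essentially the ratio of the actual production-cycle duration to the sum of operation times; a minimal sketch with invented hours (not the study's measurements):

    ```python
    # Hypothetical flow-coefficient calculation for a chair production
    # cycle. k = total cycle duration / sum of operation times; values
    # above 1 indicate interoperational stoppages.
    operation_hours = [4.0, 2.5, 3.5, 2.0]   # machining, assembly, finishing, packing
    stoppage_hours = [1.0, 0.5, 1.5]         # waits between consecutive operations

    cycle = sum(operation_hours) + sum(stoppage_hours)
    k = cycle / sum(operation_hours)                 # flow coefficient
    stoppage_share = sum(stoppage_hours) / cycle     # share of stoppage time

    print(round(k, 2), round(100 * stoppage_share, 1))  # 1.25 20.0
    ```
    
    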

  18. Status of CONRAD, a nuclear reaction analysis tool

    International Nuclear Information System (INIS)

    Saint Jean, C. de; Habert, B.; Litaize, O.; Noguere, G.; Suteau, C.

    2008-01-01

    The development of a software tool (CONRAD) was initiated at CEA/Cadarache to answer various problems arising in the data analysis of nuclear reactions. The tool is characterized by the handling of uncertainties from experimental values to covariance matrices for multi-group cross sections. An object-oriented design was chosen, allowing an easy interface with graphical tools for input/output data and providing a natural framework for innovative nuclear models (fission). The major achieved developments are: a data model for describing channels, nuclear reactions, nuclear models and processes, with interfaces to classical data formats; theoretical calculations for the resolved resonance range (Reich-Moore) and unresolved resonance range (Hauser-Feshbach, Gilbert-Cameron,...) with nuclear model parameters adjusted on experimental data sets; and a Monte Carlo method based on conditional probabilities, developed to properly calculate covariance matrices. The on-going developments deal with the experimental data description (covariance matrices) and the graphical user interface. (authors)
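    The Monte Carlo route to a covariance matrix can be illustrated by propagating an assumed parameter covariance through a toy resonance-like model; the model shape and all numbers are placeholders, not CONRAD's physics:

    ```python
    # Sketch: sample model parameters, evaluate the model at a few
    # energies, and estimate the output (multigroup-style) covariance.
    import numpy as np

    rng = np.random.default_rng(0)

    def model(p):
        # toy Lorentzian "cross section" at 3 energies from 2 parameters
        e = np.array([1.0, 2.0, 3.0])
        return p[0] / (1.0 + (e - p[1]) ** 2)

    p_mean = np.array([10.0, 2.0])
    p_cov = np.diag([0.5, 0.01])          # assumed parameter covariance

    samples = rng.multivariate_normal(p_mean, p_cov, size=5000)
    outputs = np.array([model(p) for p in samples])
    cov = np.cov(outputs, rowvar=False)   # estimated output covariance
    print(cov.shape)                      # (3, 3)
    ```
    
    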

  19. Flow injection analysis in inductively coupled plasma spectrometry

    International Nuclear Information System (INIS)

    Rosias, Maria F.G.G.

    1995-10-01

    The main features of flow injection analysis (FIA) as a contribution to inductively coupled plasma (Icp) spectrometry are described. A systematic review of research using the combined FIA-Icp and the benefits of this association is presented. Flow systems were proposed to perform on-line Icp solution management for multielemental determination by atomic emission spectrometry (Icp-AES) or mass spectrometry. The inclusion of on-line ion exchangers in flow systems for matrix separation and/or analyte preconcentration is presented. Together with those applications, the advent of new instruments with facilities for multielement detection on flow injection signals is described. (author). 75 refs., 19 figs

  20. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study a plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match pre-defined requirements. Further development of computers in recent years has opened the way for the implementation of new features in the existing tools and also for the development of new tools for specific applications, like thermodynamic and economic optimization, prediction of the remaining component lifetime, and fault diagnostics, resulting in improvement of the plant's performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts, generated by the heat and mass balance programs, can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed by using tools like condition monitoring systems and artificial neural networks. The increased number of tools and their various construction and application areas make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems are briefly reviewed. This thesis also contains programming techniques and calculation methods concerning part-load calculations using local linearization, which has been implemented in an in-house heat and mass balance program developed by the author.

  1. Microscopy image segmentation tool: Robust image data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Valmianski, Ilya, E-mail: ivalmian@ucsd.edu; Monton, Carlos; Schuller, Ivan K. [Department of Physics and Center for Advanced Nanoscience, University of California San Diego, 9500 Gilman Drive, La Jolla, California 92093 (United States)

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  2. Microscopy image segmentation tool: Robust image data analysis

    Science.gov (United States)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  3. Microscopy image segmentation tool: Robust image data analysis

    International Nuclear Information System (INIS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-01-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy

  4. Development and Validation of A Nuclear Fuel Cycle Analysis Tool: A FUTURE Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. K.; Ko, W. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lee, Yoon Hee [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2013-10-15

    This paper presents the development and validation methods of the FUTURE (FUel cycle analysis Tool for nUcleaR Energy) code, which was developed for a dynamic material flow evaluation and economic analysis of the nuclear fuel cycle. This code enables an evaluation of a nuclear material flow and its economy for diverse nuclear fuel cycles based on a predictable scenario. The most notable virtue of this FUTURE code, which was developed using C# and MICROSOFT SQL DBMS, is that a program user can design a nuclear fuel cycle process easily using a standard process on the canvas screen through a drag-and-drop method. From the user's point of view, this code is very easy to use thanks to its high flexibility. In addition, the new code also enables the maintenance of data integrity by constructing a database environment of the results of the nuclear fuel cycle analyses.

  5. DEVELOPMENT AND VALIDATION OF A NUCLEAR FUEL CYCLE ANALYSIS TOOL: A FUTURE CODE

    Directory of Open Access Journals (Sweden)

    S.K. KIM

    2013-10-01

    Full Text Available This paper presents the development and validation methods of the FUTURE (FUel cycle analysis Tool for nUcleaR Energy) code, which was developed for a dynamic material flow evaluation and economic analysis of the nuclear fuel cycle. This code enables an evaluation of a nuclear material flow and its economy for diverse nuclear fuel cycles based on a predictable scenario. The most notable virtue of this FUTURE code, which was developed using C# and MICROSOFT SQL DBMS, is that a program user can design a nuclear fuel cycle process easily using a standard process on the canvas screen through a drag-and-drop method. From the user's point of view, this code is very easy to use thanks to its high flexibility. In addition, the new code also enables the maintenance of data integrity by constructing a database environment of the results of the nuclear fuel cycle analyses.
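    One front-end step of the kind such a fuel-cycle material-flow evaluation must balance is the enrichment feed requirement; a minimal sketch using the standard three-assay mass balance, with typical (assumed) assay values rather than FUTURE-code data:

    ```python
    # Feed of natural uranium per unit of enriched product:
    # F = P * (x_p - x_t) / (x_f - x_t), from the U-235 mass balance
    # across an enrichment plant (product, feed, tails assays).
    def feed_per_product(x_p, x_f=0.00711, x_t=0.0025):
        """Tonnes of feed U per tonne of product at assay x_p."""
        return (x_p - x_t) / (x_f - x_t)

    F = feed_per_product(0.045)   # feed per tonne of 4.5% enriched U
    print(round(F, 2))            # ~9.22 t feed per t product
    ```
    
    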

  6. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    Science.gov (United States)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lightly damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time-consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
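    At the back-of-the-envelope level, the vortex-shedding concern described above reduces to comparing the shedding frequency f_s = St·U/D with a structural mode frequency; the Strouhal number, dimensions, and threshold below are assumed illustrative values, not the NASA tool's method:

    ```python
    # Simple lock-in screening check for a cylindrical launch vehicle.
    St = 0.2          # Strouhal number, circular cylinder (subcritical)
    U = 15.0          # ground wind speed, m/s
    D = 3.0           # vehicle diameter, m
    f_struct = 1.1    # first bending-mode frequency, Hz (assumed)

    f_s = St * U / D  # vortex-shedding frequency, Hz
    # flag proximity to the structural mode (15% band, assumed)
    lock_in_risk = abs(f_s - f_struct) / f_struct < 0.15
    print(f_s, lock_in_risk)
    ```
    
    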

  7. Visual Analysis of Inclusion Dynamics in Two-Phase Flow.

    Science.gov (United States)

    Karch, Grzegorz Karol; Beck, Fabian; Ertl, Moritz; Meister, Christian; Schulte, Kathrin; Weigand, Bernhard; Ertl, Thomas; Sadlo, Filip

    2018-05-01

    In single-phase flow visualization, research focuses on the analysis of vector field properties. In two-phase flow, in contrast, analysis of the phase components is typically of major interest. So far, visualization research of two-phase flow concentrated on proper interface reconstruction and the analysis thereof. In this paper, we present a novel visualization technique that enables the investigation of complex two-phase flow phenomena with respect to the physics of breakup and coalescence of inclusions. On the one hand, we adapt dimensionless quantities for a localized analysis of phase instability and breakup, and provide detailed inspection of breakup dynamics with emphasis on oscillation and its interplay with rotational motion. On the other hand, we present a parametric tightly linked space-time visualization approach for an effective interactive representation of the overall dynamics. We demonstrate the utility of our approach using several two-phase CFD datasets.

  8. Spaceborne Differential SAR Interferometry: Data Analysis Tools for Deformation Measurement

    Directory of Open Access Journals (Sweden)

    Michele Crosetto

    2011-02-01

    Full Text Available This paper is focused on spaceborne Differential Interferometric SAR (DInSAR for land deformation measurement and monitoring. In the last two decades several DInSAR data analysis procedures have been proposed. The objective of this paper is to describe the DInSAR data processing and analysis tools developed at the Institute of Geomatics in almost ten years of research activities. Four main DInSAR analysis procedures are described, which range from the standard DInSAR analysis based on a single interferogram to more advanced Persistent Scatterer Interferometry (PSI approaches. These different procedures guarantee sufficient flexibility in DInSAR data processing. In order to provide a technical insight into these analysis procedures, a whole section discusses their main data processing and analysis steps, especially those needed in PSI analyses. A specific section is devoted to the core of our PSI analysis tools: the so-called 2+1D phase unwrapping procedure, which couples a 2D phase unwrapping, performed interferogram-wise, with a kind of 1D phase unwrapping along time, performed pixel-wise. In the last part of the paper, some examples of DInSAR results are discussed, which were derived by standard DInSAR or PSI analyses. Most of these results were derived from X-band SAR data coming from the TerraSAR-X and COSMO-SkyMed sensors.
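    The temporal (1D) half of the 2+1D unwrapping step can be illustrated per pixel with numpy.unwrap on a synthetic phase history; this is a conceptual sketch, not real InSAR data or the Institute's actual implementation:

    ```python
    # Per-pixel temporal phase unwrapping: recover a steadily growing
    # deformation phase from its wrapped (-pi, pi] observations.
    import numpy as np

    t = np.linspace(0.0, 1.0, 40)
    true_phase = 6.0 * np.pi * t                   # steadily deforming pixel
    wrapped = np.angle(np.exp(1j * true_phase))    # wrapped observations

    unwrapped = np.unwrap(wrapped)                 # 1D unwrap along time
    print(np.max(np.abs(unwrapped - true_phase)))  # ~0: history recovered
    ```

    This works because successive phase increments stay below π; with sparser sampling the temporal unwrapping becomes ambiguous, which is why it is coupled with the interferogram-wise 2D step.
    
    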

  9. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), which is an area of constant intraplate seismicity and non-orogenic active tectonics and exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
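    The underlying idea, locating abrupt slope breaks along a longitudinal drainage profile, can be sketched on a synthetic profile; this illustrates the concept only and is not the Knickpoint Finder code:

    ```python
    # Flag knickpoints where the downstream channel slope steepens
    # sharply on a synthetic profile (slope break placed at km 5).
    dist = [0, 1, 2, 3, 4, 5, 6, 7, 8]                   # km downstream
    elev = [500, 495, 490, 485, 480, 475, 455, 435, 415]  # m (5 then 20 m/km)

    slopes = [(elev[i] - elev[i + 1]) / (dist[i + 1] - dist[i])
              for i in range(len(dist) - 1)]
    # knickpoint = location where slope more than doubles (threshold assumed)
    knickpoints = [dist[i + 1] for i in range(len(slopes) - 1)
                   if slopes[i + 1] > 2.0 * slopes[i]]
    print(knickpoints)   # [5]
    ```
    
    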

  10. Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.

    Science.gov (United States)

    The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...

  11. The cash-flow analysis of the firm

    OpenAIRE

    Mariana Man

    2001-01-01

    The analysis of economic and financial indicators of the firm regards the profit and loss account analysis and the balance sheet analysis. The cash-flow from operating activities represents the amount of cash obtained by a firm from selling goods and services after deducting the costs of raw materials, materials, and processing operations.

  12. Applying AI tools to operational space environmental analysis

    Science.gov (United States)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  13. Numerical Analysis of Dusty-Gas Flows

    Science.gov (United States)

    Saito, T.

    2002-02-01

    This paper presents the development of a numerical code for simulating unsteady dusty-gas flows including shock and rarefaction waves. The numerical results obtained for a shock tube problem are used for validating the accuracy and performance of the code. The code is then extended for simulating two-dimensional problems. Since the interactions between the gas and particle phases are calculated with the operator splitting technique, we can choose numerical schemes independently for the different phases. A semi-analytical method is developed for the dust phase, while the TVD scheme of Harten and Yee is chosen for the gas phase. Throughout this study, computations are carried out on SGI Origin2000, a parallel computer with multiple RISC-based processors. The efficient use of the parallel computer system is an important issue and the code implementation on Origin2000 is also described. Flow profiles of both the gas and solid particles behind the steady shock wave are calculated by integrating the steady conservation equations. The good agreement between the pseudo-stationary solutions and those from the current numerical code validates the numerical approach and the actual coding. The pseudo-stationary shock profiles can also be used as initial conditions of unsteady multidimensional simulations.
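    The operator-splitting treatment of the gas-particle interaction can be illustrated with a semi-analytical velocity-relaxation substep of the kind that makes splitting attractive; the linear-drag model and all parameters below are assumptions for illustration, not the paper's scheme:

    ```python
    # Split-step idea: advance each phase with its own scheme, then
    # apply the interphase drag exactly over the step via the closed-form
    # solution of dv/dt = (u_g - v) / tau.
    import math

    def relax_substep(v_p, u_g, tau, dt):
        """Exact particle-velocity update for linear drag over dt."""
        return u_g + (v_p - u_g) * math.exp(-dt / tau)

    v = 0.0        # particle velocity behind a shock, m/s (assumed)
    u = 100.0      # local gas velocity, m/s (assumed)
    tau = 1e-3     # velocity relaxation time, s (assumed)
    for _ in range(10):
        v = relax_substep(v, u, tau, dt=5e-4)
    print(v)       # particles approach the gas velocity
    ```

    Because the substep is exact for any dt, it stays stable even when tau is much smaller than the gas-phase time step, which is the usual motivation for splitting stiff interphase terms.
    
    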

  14. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    Science.gov (United States)

    Battaglieri, M.; Briscoe, B. J.; Celentano, A.; Chung, S.-U.; D'Angelo, A.; De Vita, R.; Döring, M.; Dudek, J.; Eidelman, S.; Fegan, S.; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, D. I.; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, D. G.; Ketzer, B.; Klein, F. J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, V.; McKinnon, B.; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, A.; Salgado, C.; Santopinto, E.; Sarantsev, A. V.; Sato, T.; Schlüter, T.; da Silva, M. L. L.; Stankovic, I.; Strakovsky, I.; Szczepaniak, A.; Vassallo, A.; Walford, N. K.; Watts, D. P.; Zana, L.

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  15. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    International Nuclear Information System (INIS)

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; Garcia-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; Da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K.; Watts, Daniel P.

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  16. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    Magill, J.; Dreher, R.; Soti, Z.

    2014-01-01

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  17. Analysis of functionality free CASE-tools databases design

    Directory of Open Access Journals (Sweden)

    A. V. Gavrilov

    2016-01-01

    Full Text Available The introduction of database-design CASE technologies into the educational process requires significant costs for the purchase of software. A possible solution could be the use of free software peers. At the same time, this kind of substitution should be based on an even-handed comparison of the functional characteristics and features of operation of these programs. The purpose of the article – a review of the free and non-profit CASE-tools for database design, as well as their classification on the basis of an analysis of their functionality. When writing this article, materials from the official websites of the tool developers were used. Evaluation of the functional characteristics of CASE-tools for database design was made exclusively empirically, through direct work with the software products. Analysis of tool functionality allows two categories of CASE-tools for database design to be distinguished. The first category includes systems with a basic set of features and tools. The most important basic functions of these systems are: management of connections to database servers, visual tools to create and modify database objects (tables, views, triggers, procedures), the ability to enter and edit data in table mode, user and privilege management tools, an SQL-code editor, and means of data export/import. CASE-systems of the first category can be used to design and develop simple databases and manage data, as well as a means of database server administration. A distinctive feature of the second category of CASE-tools for database design (full-featured systems) is the presence of a visual designer, allowing the construction of the database model and automatic creation of the database on the server based on this model. CASE-systems of this category can be used for the design and development of databases of any structural complexity, as well as a database server administration tool. The article concluded that the

  18. CyNC - a method for Real Time Analysis of Systems with Cyclic Data Flows

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Nielsen, Jens F. Dalsgaard; Larsen, Kim Guldstrand

    2005-01-01

    The paper addresses a novel method for real-time analysis of systems with cyclic data flows. The presented method is based on Network Calculus principles, where upper and lower flow and service constraints are used to bound data flows and processing resources. In acyclic systems flow constraints may … in a space of constraint functions. In this paper a method denoted CyNC for obtaining a well-defined solution to that problem is presented along with a theoretical justification of the method as well as comparative results for CyNC and alternative methods on a relevant example. The method is implemented in a prototype tool, also denoted CyNC, providing a graphical user interface for model specification based on the MATLAB/SimuLink framework.

  19. Analysis and Prediction of Micromilling Stability with Variable Tool Geometry

    Directory of Open Access Journals (Sweden)

    Ziyang Cao

    2014-11-01

    Full Text Available Micromilling can fabricate miniaturized components using a micro-end mill at high rotational speeds. The analysis of machining stability in micromilling plays an important role in characterizing the cutting process, estimating the tool life, and optimizing the process. A numerical analysis and experimental method are presented to investigate the chatter stability of the micro-end milling process with variable milling tool geometry. The schematic model of the micromilling process is constructed and the calculation formula to predict cutting forces and displacements is derived. This is followed by a detailed numerical analysis of micromilling forces for helical ball and square end mills using time-domain and frequency-domain methods, and the results are compared. Furthermore, a detailed time-domain simulation of micro-end milling with straight-tooth and helical-tooth end mills is conducted based on the machine-tool system frequency response function obtained through a modal experiment. The forces and displacements are predicted, and the simulation results for the different cutter geometries are compared in detail. The simulation results have important significance for the actual milling process.

  20. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
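    The "reduced form" idea, replacing many complex model runs with a single regression estimated on their outputs, can be sketched with synthetic data; the explanatory variables, the hidden "true" relation, and the noise level are all invented for illustration:

    ```python
    # Fit a rapid-turnaround surrogate regression to synthetic
    # CGE-style simulation outputs.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    magnitude = rng.uniform(1.0, 10.0, n)    # threat characteristic
    resilience = rng.uniform(0.0, 1.0, n)    # background condition
    # stand-in for the CGE model's output: GDP loss in $B
    loss = 2.0 * magnitude - 5.0 * resilience + rng.normal(0.0, 0.1, n)

    # single regression equation usable without the complex model
    X = np.column_stack([magnitude, resilience, np.ones(n)])
    coef, *_ = np.linalg.lstsq(X, loss, rcond=None)
    print(coef)   # recovers roughly [2, -5, 0]
    ```
    
    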

  1. Climate Informed Low Flow Frequency Analysis Using Nonstationary Modeling

    Science.gov (United States)

    Liu, D.; Guo, S.; Lian, Y.

    2014-12-01

    Stationarity is often assumed for frequency analysis of low flows in water resources management and planning. However, many studies have shown that flow characteristics, particularly the frequency spectrum of extreme hydrologic events, were modified by climate change and human activities, and that conventional frequency analysis that ignores these non-stationary characteristics may lead to costly designs. The analysis presented in this paper was based on more than 100 years of daily flow data from the Yichang gaging station, 44 kilometers downstream of the Three Gorges Dam. The Mann-Kendall trend test under the scaling hypothesis showed that the annual low flows had a significant monotonic trend, whereas an abrupt change point was identified in 1936 by the Pettitt test. Climate-informed low flow frequency analysis and the divided and combined method were employed to account for the impacts of related climate variables and the nonstationarities in annual low flows. Without prior knowledge of the probability density function for the gaging station, six distribution functions, including the Generalized Extreme Value (GEV), Pearson Type III, Gumbel, Gamma, Lognormal, and Weibull distributions, were tested to find the best fit, with the local likelihood method used to estimate the parameters. Analyses show that the GEV had the best fit for the observed low flows. This study has also shown that climate-informed low flow frequency analysis is able to exploit the link between climate indices and low flows, which captures the dynamic behavior needed for reservoir management and provides more accurate and reliable designs for infrastructure and water supply.
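The basic machinery of such a frequency analysis can be sketched with the Gumbel distribution, one of the six candidates tested above. This is only an illustration: it uses a simple method-of-moments fit on synthetic data, not the local likelihood estimator or the Yichang record from the study:

```python
import math
import statistics

EULER_GAMMA = 0.5772156649015329

def gumbel_mom_fit(sample):
    """Method-of-moments fit of a Gumbel (EV1) distribution.

    Returns (location mu, scale beta): beta = s*sqrt(6)/pi and
    mu = mean - gamma*beta, with gamma the Euler-Mascheroni constant."""
    mean = statistics.fmean(sample)
    std = statistics.stdev(sample)
    beta = std * math.sqrt(6) / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta

def gumbel_quantile(mu, beta, p):
    """Quantile x with non-exceedance probability p, i.e. F(x) = p."""
    return mu - beta * math.log(-math.log(p))

# Illustrative annual low-flow sample (synthetic, in m^3/s)
flows = [4200, 3900, 4500, 4100, 3800, 4600, 4300, 4000, 3700, 4400]
mu, beta = gumbel_mom_fit(flows)
q10 = gumbel_quantile(mu, beta, 0.1)   # design low flow exceeded 9 years in 10
```

A climate-informed, nonstationary version would let mu and beta vary with covariates such as climate indices rather than treating them as constants.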

  2. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    Science.gov (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, that creates and displays high signal-to-noise vegetation index imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. TSPT uses MODIS metadata to remove and/or correct bad and suspect data. Bad-pixel removal, multiple-satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating.
The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify
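A standard building block for the temporal processing described above is maximum-value compositing, which suppresses cloud-depressed vegetation index values by keeping the per-window maximum of clear observations. The sketch below illustrates the principle on synthetic data; TSPT's actual filtering, driven by MODIS QA metadata, is more involved:

```python
def max_value_composite(series, window):
    """Maximum-value compositing of an NDVI time series.

    `series` is a list of (ndvi, is_cloudy) daily observations; cloudy
    values are discarded and the per-window maximum of the survivors is
    kept. Returns one composite per window (None if all were cloudy)."""
    out = []
    for start in range(0, len(series), window):
        clear = [v for v, cloudy in series[start:start + window] if not cloudy]
        out.append(max(clear) if clear else None)
    return out

# Synthetic daily observations: clouds depress NDVI sharply
daily = [(0.61, False), (0.23, True), (0.65, False), (0.18, True),
         (0.60, False), (0.12, True), (0.58, False), (0.55, False)]
composite = max_value_composite(daily, 4)   # -> [0.65, 0.60]
```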

  3. Theory, methods and tools for determining environmental flows for riparian vegetation: Riparian vegetation-flow response guilds

    Science.gov (United States)

    Merritt, D.M.; Scott, M.L.; Leroy, Poff N.; Auble, G.T.; Lytle, D.A.

    2010-01-01

    Riparian vegetation composition, structure and abundance are governed to a large degree by river flow regime and flow-mediated fluvial processes. Streamflow regime exerts selective pressures on riparian vegetation, resulting in adaptations (trait syndromes) to specific flow attributes. Widespread modification of flow regimes by humans has resulted in extensive alteration of riparian vegetation communities. Some of the negative effects of altered flow regimes on vegetation may be reversed by restoring components of the natural flow regime. 2. Models have been developed that quantitatively relate components of the flow regime to attributes of riparian vegetation at the individual, population and community levels. Predictive models range from simple statistical relationships, to more complex stochastic matrix population models and dynamic simulation models. Of the dozens of predictive models reviewed here, most treat one or a few species, have many simplifying assumptions such as stable channel form, and do not specify the time-scale of response. In many cases, these models are very effective in developing alternative streamflow management plans for specific river reaches or segments but are not directly transferable to other rivers or other regions. 3. A primary goal in riparian ecology is to develop general frameworks for prediction of vegetation response to changing environmental conditions. The development of riparian vegetation-flow response guilds offers a framework for transferring information from rivers where flow standards have been developed to maintain desirable vegetation attributes, to rivers with little or no existing information. 4. We propose to organise riparian plants into non-phylogenetic groupings of species with shared traits that are related to components of hydrologic regime: life history, reproductive strategy, morphology, adaptations to fluvial disturbance and adaptations to water availability. 
Plants from any river or region may be grouped

  4. Linear stability analysis of laminar flow near a stagnation point in the slip flow regime

    Science.gov (United States)

    Essaghir, E.; Oubarra, A.; Lahjomri, J.

    2017-12-01

    The aim of the present contribution is to analyze the effect of the slip parameter on the stability of a laminar incompressible flow near a stagnation point in the slip flow regime. The analysis is based on the traditional normal mode approach and assumes the parallel flow approximation. The Orr-Sommerfeld equation that governs the infinitesimal disturbance of the stream function imposed on the steady main flow, which is an exact solution of the Navier-Stokes equation satisfying slip boundary conditions, is solved by using the powerful spectral Chebyshev collocation method. The results of the effect of the slip parameter K on the hydrodynamic characteristics of the base flow, namely the velocity profile, the shear-stress profile, and the boundary-layer, displacement, and momentum thicknesses, are illustrated and discussed. The numerical data for these characteristics, as well as those of the eigenvalues and the corresponding wave numbers, recover the results of the special case of no-slip boundary conditions. They are found to be in good agreement with previous numerical calculations. The effects of the slip parameter on the neutral stability curves, for two-dimensional disturbances in the Reynolds-wave number plane, are then obtained for the first time in the slip flow regime for stagnation point flow. Furthermore, the evolution of the critical Reynolds number with the slip parameter is established. The results show that the critical Reynolds number for instability is significantly increased with the slip parameter and the flow turns out to be more stable when the effect of rarefaction becomes important.
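The spectral Chebyshev collocation method mentioned above hinges on the Chebyshev differentiation matrix, from whose powers the discrete Orr-Sommerfeld operator is assembled. Below is the standard construction (as popularized by Trefethen), offered as a generic sketch rather than the authors' code:

```python
import numpy as np

def cheb(n):
    """Chebyshev collocation points and differentiation matrix on [-1, 1].

    Standard construction: off-diagonal entries c_i/c_j / (x_i - x_j),
    diagonal set by the negative-sum trick so that D annihilates constants."""
    if n == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(n + 1) / n)       # Chebyshev points
    c = np.ones(n + 1)
    c[0] = c[-1] = 2.0
    c *= (-1.0) ** np.arange(n + 1)
    X = np.tile(x, (n + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(n + 1))
    D -= np.diag(D.sum(axis=1))                    # negative-sum trick
    return D, x

D, x = cheb(16)
# Spectral differentiation is exact for polynomials of degree <= n:
err = np.max(np.abs(D @ x**3 - 3 * x**2))
```

Higher derivatives in the Orr-Sommerfeld operator are obtained from matrix powers of D, after which the stability problem reduces to a generalized eigenvalue problem.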

  5. Dynamic MLD analysis with flow graphs

    International Nuclear Information System (INIS)

    Jenab, K.; Sarfaraz, A.; Dhillon, B.S.; Seyed Hosseini, S.M.

    2012-01-01

    A Master Logic Diagram (MLD) depicts the interrelationships among independent functions and dependent support functions. Using an MLD, the manner in which all functions and sub-functions interact to achieve the overall system objective can be investigated. This paper reports a probabilistic model that analyzes an MLD by translating the interrelationships into a graph model. The proposed model uses the flow-graph concept and the Moment Generating Function (MGF) to analyze the dependency matrix representing the MLD with embedded self-healing functions/sub-functions. The functions/sub-functions feature failure detection and recovery mechanisms. The newly developed model provides the probability of system failure, and the mean and standard deviation of the system time to failure in the MLD. An illustrative example demonstrates the application of the model.
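The MGF mechanics underlying such flow-graph models can be illustrated on the simplest case, a series path: the MGF of a sum of independent stage times is the product of the stage MGFs, and the first moment is the derivative at zero. The failure rates below are hypothetical, and the full model's flow-graph reduction over a dependency matrix is not shown:

```python
def exp_mgf(lam):
    """Moment generating function of an Exp(lam) time to failure."""
    return lambda s: lam / (lam - s)

def series_mgf(mgfs):
    """MGF of a sum of independent stage times (a series path in the
    flow graph): the product of the stage MGFs."""
    def m(s):
        out = 1.0
        for f in mgfs:
            out *= f(s)
        return out
    return m

def mean_from_mgf(m, h=1e-6):
    """First moment E[T] = M'(0), estimated by central difference."""
    return (m(h) - m(-h)) / (2 * h)

# Two functions in series with hypothetical failure rates (per year)
path = series_mgf([exp_mgf(0.5), exp_mgf(0.25)])
mttf = mean_from_mgf(path)   # approx 1/0.5 + 1/0.25 = 6 years
```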

  6. ISAC: A tool for aeroservoelastic modeling and analysis

    Science.gov (United States)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules are discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. The resulting linear time-invariant state-space equations of motion are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle, illustrating some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  7. Federal metering data analysis needs and existing tools

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately, there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses Federal metering data analysis needs and some existing tools.

  8. Hybrid Information Flow Analysis for Programs with Arrays

    Directory of Open Access Journals (Sweden)

    Gergö Barany

    2016-07-01

    Full Text Available Information flow analysis checks whether certain pieces of (confidential) data may affect the results of computations in unwanted ways and thus leak information. Dynamic information flow analysis adds instrumentation code to the target software to track flows at run time and raise alarms if a flow policy is violated; hybrid analyses combine this with preliminary static analysis. Using a subset of C as the target language, we extend previous work on hybrid information flow analysis that handled pointers to scalars. Our extended formulation handles arrays, pointers to array elements, and pointer arithmetic. Information flow through arrays of pointers is tracked precisely while arrays of non-pointer types are summarized efficiently. A prototype of our approach is implemented using the Frama-C program analysis and transformation framework. Work on a full machine-checked proof of the correctness of our approach using Isabelle/HOL is well underway; we present the existing parts and sketch the rest of the correctness argument.
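The core idea of dynamic information flow tracking, and the precise-versus-summarized array policy described above, can be illustrated with a toy value-level taint tracker. This is a deliberately simplified Python analogy (all names hypothetical); the actual tool instruments C programs via Frama-C:

```python
class Tainted:
    """A value carrying a taint label: the essence of dynamic information
    flow tracking is that operations on tainted data yield tainted results."""
    def __init__(self, value, tainted=True):
        self.value = value
        self.tainted = tainted

    def __add__(self, other):
        other_value = other.value if isinstance(other, Tainted) else other
        other_taint = other.tainted if isinstance(other, Tainted) else False
        return Tainted(self.value + other_value, self.tainted or other_taint)

def summarize_array(arr):
    """Summary taint for a non-pointer array, mirroring the paper's design
    choice: one taint bit for the whole array, not per-element labels."""
    return any(isinstance(v, Tainted) and v.tainted for v in arr)

secret = Tainted(42)                 # confidential input
public = Tainted(7, tainted=False)
buf = [public, secret + 1, public + 3]
leaks = summarize_array(buf)         # True: a tainted value flowed in
```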

  9. Riparian trees as common denominators across the river flow spectrum: are ecophysiological methods useful tools in environmental flow assessments?

    CSIR Research Space (South Africa)

    Schachtschneider, K

    2014-04-01

    Full Text Available physiological differences for trees occurred along rivers of the drier flow regime spectrum (seasonal and ephemeral). As such, this physiological measurement may be a valuable indicator for water stress, while the other measurements might provide more conclusive...

  10. ADVANCED AND RAPID DEVELOPMENT OF DYNAMIC ANALYSIS TOOLS FOR JAVA

    Directory of Open Access Journals (Sweden)

    Alex Villazón

    2012-01-01

    Full Text Available Low-level bytecode instrumentation techniques are widely used in many software-engineering tools for the Java Virtual Machine (JVM) that perform some form of dynamic program analysis, such as profilers or debuggers. While program manipulation at the bytecode level is very flexible, because the possible bytecode transformations are not restricted, tool development based on this technique is tedious and error-prone. As a promising alternative, the specification of bytecode instrumentation at a higher level using aspect-oriented programming (AOP) can reduce tool development time and cost. Unfortunately, prevailing AOP frameworks lack some features that are essential for certain dynamic analyses. In this article, we focus on three common shortcomings in AOP frameworks with respect to the development of aspect-based tools: (1) the lack of mechanisms for passing data between woven advices in local variables, (2) the support for user-defined static analyses at weaving time, and (3) the absence of pointcuts at the level of individual basic blocks of code. We propose @J, an annotation-based AOP language and weaver that integrates support for these three features. The benefits of the proposed features are illustrated with concrete examples.

  11. Modeling and simulation of the fluid flow in wire electrochemical machining with rotating tool (wire ECM)

    Science.gov (United States)

    Klocke, F.; Herrig, T.; Zeis, M.; Klink, A.

    2017-10-01

    Combining the working principle of electrochemical machining (ECM) with a universal rotating tool, like a wire, could address many challenges of the classical ECM sinking process. Such a wire-ECM process would be able to machine flexible and efficient 2.5-dimensional geometries like fir-tree slots in turbine discs. Nowadays, the established manufacturing technologies for slotting turbine discs are broaching and wire electrical discharge machining (wire EDM). Nevertheless, the high requirements on the surface integrity of turbine parts demand cost-intensive process development and, in the case of wire EDM, trim cuts to reduce the heat-affected rim zone. Due to its process-specific advantages, ECM is an attractive alternative manufacturing technology and has become more and more relevant for sinking applications within the last few years. But ECM also entails high costs for process development and complex electrolyte flow devices. In the past, a few studies dealt with the development of a wire-ECM process to meet these challenges. However, previous concepts of wire ECM were only suitable for micro-machining applications: due to insufficient flushing concepts, the application of the process to machining macro geometries failed. Therefore, this paper presents the modeling and simulation of a new flushing approach for process assessment. The suitability of a rotating structured wire electrode in combination with axial flushing for electrodes with high aspect ratios is investigated and discussed.

  12. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    International Nuclear Information System (INIS)

    Plott, B.

    2006-01-01

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  13. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    Energy Technology Data Exchange (ETDEWEB)

    Plott, B. [Alion Science and Technology, MA and D Operation, 4949 Pearl E. Circle, 300, Boulder, CO 80301 (United States)

    2006-07-01

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  14. Sensitivity analysis of time-dependent laminar flows

    International Nuclear Information System (INIS)

    Hristova, H.; Etienne, S.; Pelletier, D.; Borggaard, J.

    2004-01-01

    This paper presents a general sensitivity equation method (SEM) for time-dependent incompressible laminar flows. The SEM accounts for complex parameter dependence and is suitable for a wide range of problems. The formulation is verified on a problem with a closed-form solution obtained by the method of manufactured solutions. Systematic grid convergence studies confirm the theoretical rates of convergence in both space and time. The methodology is then applied to pulsatile flow around a square cylinder. Computations show that the flow starts with symmetrical vortex shedding followed by a transition to the traditional von Karman street (alternate vortex shedding). Simulations show that the transition phase manifests itself earlier in the sensitivity fields than in the flow field itself. Sensitivities are then demonstrated for fast evaluation of nearby flows and uncertainty analysis. (author)
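The central idea of a sensitivity equation method, differentiating the governing equation with respect to a parameter and marching the resulting sensitivity system alongside the flow solution, can be shown on a scalar model problem. The ODE u' = -p*u below is an assumed toy stand-in for the Navier-Stokes equations of the paper:

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k1)])
    k3 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h*ki for yi, ki in zip(y, k3)])
    return [yi + h/6*(a + 2*b + 2*c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

p = 2.0   # model parameter

def augmented(t, y):
    """Model ODE u' = -p*u augmented with its sensitivity equation.

    Differentiating the model with respect to p gives, for s = du/dp,
    s' = (df/du)*s + df/dp = -p*s - u, which is integrated together
    with the state, exactly the structure exploited by the SEM."""
    u, s = y
    return [-p * u, -p * s - u]

t, y, h = 0.0, [1.0, 0.0], 1e-3
for _ in range(1000):          # integrate to t = 1
    y = rk4_step(augmented, t, y, h)
    t += h
u_num, s_num = y
# Exact solution: u = exp(-p*t), s = -t*exp(-p*t)
```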

  15. Uncertainty analysis of power monitoring transit time ultrasonic flow meters

    International Nuclear Information System (INIS)

    Orosz, A.; Miller, D. W.; Christensen, R. N.; Arndt, S.

    2006-01-01

    A general uncertainty analysis is applied to chordal, transit time ultrasonic flow meters that are used in nuclear power plant feedwater loops. This investigation focuses on relationships between the major parameters of the flow measurement. For this study, mass flow rate is divided into three components, profile factor, density, and a form of volumetric flow rate. All system parameters are used to calculate values for these three components. Uncertainty is analyzed using a perturbation method. Sensitivity coefficients for major system parameters are shown, and these coefficients are applicable to a range of ultrasonic flow meters used in similar applications. Also shown is the uncertainty to be expected for density along with its relationship to other system uncertainties. One other conclusion is that pipe diameter sensitivity coefficients may be a function of the calibration technique used. (authors)
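The perturbation method described above ultimately combines component relative uncertainties, weighted by sensitivity coefficients, in root-sum-square fashion. A minimal sketch follows, with the mass flow decomposed as in the abstract into profile factor, density, and a volumetric-flow term; the numerical uncertainty values are hypothetical, not the paper's results:

```python
import math

def combined_uncertainty(rel_uncertainties, sensitivities):
    """Root-sum-square combination of relative uncertainties weighted by
    their dimensionless sensitivity coefficients, the standard result of
    a perturbation-based general uncertainty analysis."""
    return math.sqrt(sum((c * u) ** 2
                         for u, c in zip(rel_uncertainties, sensitivities)))

# m_dot = PF * rho * Q: for a pure product each relative-uncertainty
# sensitivity coefficient is 1. Hypothetical 1-sigma relative values:
u_pf, u_rho, u_q = 0.002, 0.001, 0.003
u_mdot = combined_uncertainty([u_pf, u_rho, u_q], [1.0, 1.0, 1.0])
```

For parameters entering nonlinearly (e.g. pipe diameter), the sensitivity coefficients differ from unity and, as the abstract notes, may depend on the calibration technique.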

  16. The Montaguto earth flow: nine years of observation and analysis

    Science.gov (United States)

    Guerriero, L.; Revellino, R.; Grelle, G.; Diodato, N.; Guadagno, F.M.; Coe, Jeffrey A.

    2016-01-01

    This paper summarizes the methods, results, and interpretation of analyses carried out between 2006 and 2015 at the Montaguto earth flow in southern Italy. We conducted a multi-temporal analysis of earth-flow activity to reconstruct the morphological and structural evolution of the flow. Data from field mapping were combined with a geometric reconstruction of the basal slip surface in order to investigate relations between basal-slip surface geometry and deformation styles of earth-flow material. Moreover, we reconstructed the long-term pattern of earth-flow movement using both historical observations and modeled hydrologic and climatic data. Hydrologic and climatic data were used to develop a Landslide Hydrological Climatological (LHC) indicator model.

  17. Space shuttle booster multi-engine base flow analysis

    Science.gov (United States)

    Tang, H. H.; Gardiner, C. R.; Anderson, W. A.; Navickas, J.

    1972-01-01

    A comprehensive review of currently available techniques pertinent to several prominent aspects of the base thermal problem of the space shuttle booster is given along with a brief review of experimental results. A tractable engineering analysis, capable of predicting the power-on base pressure, base heating, and other base thermal environmental conditions, such as base gas temperature, is presented and used for an analysis of various space shuttle booster configurations. The analysis consists of a rational combination of theoretical treatments of the prominent flow interaction phenomena in the base region. These theories consider jet mixing, plume flow, axisymmetric flow effects, base injection, recirculating flow dynamics, and various modes of heat transfer. Such effects as initial boundary layer expansion at the nozzle lip, reattachment, recompression, choked vent flow, and nonisoenergetic mixing processes are included in the analysis. A unified method was developed and programmed to numerically obtain compatible solutions for the various flow field components in both flight and ground test conditions. A preliminary prediction for a 12-engine space shuttle booster base thermal environment was obtained for a typical trajectory history. Theoretical predictions were also obtained for some clustered-engine experimental conditions. Results indicate good agreement between the data and theoretical predictions.

  18. Cash-Flow Analysis Base of the Company's Performance Evaluation

    OpenAIRE

    Radu Riana Iren; Mihalcea Lucean; Negoescu Gheorghe

    2013-01-01

    Analyses based on the study of financial flows make it possible to examine the firm's financial equilibrium and its performance together in a coherent way. Static analysis assesses the financial imbalance at a given point in time but does not explain its evolution; dynamic analysis, in contrast, highlights the evolution of the financial imbalance but does not indicate its extent. It follows that the two kinds of analysis are complementary and should be pursued simultaneously. Dynamic analysis is based on the concept of st...

  19. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. The tool model includes a new direct method for calculating the first-order sensitivity coefficients of chemical kinetics using sparse-matrix technology; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity-coefficient equations. The FORTRAN subroutines for the model equation, the sensitivity-coefficient equations, and their analytical Jacobian expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity-coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines with which the program runs in conjunction, such as SLODE, a modified MA28, and the Gear package, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
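Why a Gear-type (implicit, BDF) integrator with an analytic Jacobian matters for atmospheric kinetics can be shown on a tiny stiff chain. Backward Euler, the first-order member of the BDF family, is used here as a bare-bones stand-in; the mechanism A -> B -> C and its rate constants are hypothetical:

```python
def be_step(y, h, k1, k2):
    """One backward-Euler step for the stiff linear chain A -> B -> C.

    Because the kinetics are linear, the implicit system
    (I - h*J) y_new = y_old is solved exactly by forward substitution on
    the lower-triangular analytic Jacobian, the same ingredient (a
    triangularized Jacobian) that the tool model exploits."""
    a, b, c = y
    a_new = a / (1 + h * k1)
    b_new = (b + h * k1 * a_new) / (1 + h * k2)
    c_new = c + h * k2 * b_new
    return a_new, b_new, c_new

k1, k2 = 1.0e4, 1.0     # widely separated rates: a stiff system
y = (1.0, 0.0, 0.0)
h = 0.01                # far larger than 1/k1; explicit Euler would blow up
for _ in range(1000):   # integrate to t = 10
    y = be_step(y, h, k1, k2)
total = sum(y)          # mass is conserved by the implicit step
```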

  20. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    Science.gov (United States)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine the safety, ergonomic, and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. High-level three-dimensional graphics software, incorporating various ergonomic analysis algorithms, was utilized to determine whether the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  1. ASSESSMENT OF PLASTIC FLOWS AND STOCKS IN SERBIA USING MATERIAL FLOW ANALYSIS

    Directory of Open Access Journals (Sweden)

    Goran Vujić

    2010-01-01

    Full Text Available Material flow analysis (MFA) was used to assess the amounts of plastic material flows and stocks that are annually produced, consumed, imported, exported, collected, recycled, and disposed of in landfills in Serbia. The analysis revealed that approximately 269,000 tons of plastic materials are directly disposed of in uncontrolled landfills in Serbia without any pretreatment, and that significant amounts of these materials have already accumulated in the landfills. The substantial amounts of landfilled plastics represent not only a loss of valuable resources, but also pose a serious threat to the environment and human health, and if the trend of direct plastic landfilling continues, Serbia will face grave consequences.
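The accounting at the heart of an MFA is mass conservation over each process: the change in stock equals total inflows minus total outflows. A minimal sketch follows; the flow names and figures are purely illustrative, not the Serbian data from the study:

```python
def stock_change(inflows, outflows):
    """Core MFA balance for one process: the annual change in stock
    equals total inflows minus total outflows (mass conservation)."""
    return sum(inflows.values()) - sum(outflows.values())

# Hypothetical annual plastic flows for a country, in kilotonnes
inflows = {"production": 200, "imports": 150}
outflows = {"exports": 30, "recycling": 20, "landfill": 269}
delta = stock_change(inflows, outflows)   # kt accumulating in the use stock
```

A full MFA links many such processes (consumption, collection, recycling, disposal) into a flow network and checks that every node balances.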

  2. A Calculus for Control Flow Analysis of Security Protocols

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Nielson, Hanne Riis; Nielson, Flemming

    2004-01-01

    The design of a process calculus for analysing security protocols is governed by three factors: how to express the security protocol in a precise and faithful manner, how to accommodate the variety of attack scenarios, and how to utilise the strengths (and limit the weaknesses) of the underlying analysis methodology. We pursue an analysis methodology based on control flow analysis in flow logic style and we have previously shown its ability to analyse a variety of security protocols. This paper develops a calculus, LysaNS, that allows for much greater control and clarity in the description...

  3. Substance Flow Analysis of Wastes Containing Polybrominated Diphenyl Ethers

    DEFF Research Database (Denmark)

    Vyzinkarova, Dana; Brunner, Paul H.

    2013-01-01

    The present article examines flows and stocks of Stockholm Convention regulated pollutants, commercial penta- and octabrominated diphenyl ether (cPentaBDE, cOctaBDE), on a city level. The goals are to (1) identify sources, pathways, and sinks of these compounds in the city of Vienna, (2) determine the fractions that reach final sinks, and (3) develop recommendations for waste management to ensure their minimum recycling and maximum transfer to appropriate final sinks. By means of substance flow analysis (SFA) and scenario analysis, it was found that the key flows of cPentaBDE stem from construction materials. Therefore, end-of-life (EOL) plastic materials used for construction must be separated and properly treated, for example, in a state-of-the-art municipal solid waste (MSW) incinerator. In the case of cOctaBDE, the main flows are waste electrical and electronic equipment (WEEE) and, possibly...

  4. Application of the load flow and random flow models for the analysis of power transmission networks

    International Nuclear Information System (INIS)

    Zio, Enrico; Piccinelli, Roberta; Delfanti, Maurizio; Olivieri, Valeria; Pozzi, Mauro

    2012-01-01

    In this paper, the classical load flow (LF) model and the random flow (RF) model are considered for analyzing the performance of power transmission networks. The analysis concerns both the system performance and the importance of the different system elements; the latter is computed by power flow and random walk betweenness centrality measures. A network system from the literature is analyzed, representing a simple electrical power transmission network. The results obtained highlight the differences between the LF “global approach” to flow dispatch and the RF local approach of randomized node-to-node load transfer. Furthermore, the LF model is computationally less demanding than the RF model, but problems of convergence may arise in the LF calculation.
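The "global" character of a load flow model can be seen in its simplest linearization, the DC load flow, where all angles are obtained at once from a single linear solve. The sketch below uses a hypothetical 3-bus network, not the literature case study, and does not show the RF model:

```python
import numpy as np

# Hypothetical 3-bus network: bus 0 is the slack bus; values are
# per-unit line susceptances. The DC load flow, the standard
# linearization of the classical (AC) load flow, solves B' theta = P.
lines = {(0, 1): 10.0, (1, 2): 10.0, (0, 2): 5.0}
injections = np.array([-0.8, -0.5])   # net power at buses 1 and 2 (loads)

n = 3
B = np.zeros((n, n))
for (i, j), b in lines.items():
    B[i, i] += b
    B[j, j] += b
    B[i, j] -= b
    B[j, i] -= b

# Drop the slack bus row/column and solve for the remaining angles
theta = np.linalg.solve(B[1:, 1:], injections)

# Line flows out of the slack bus; together they must supply all load
flow_01 = lines[(0, 1)] * (0.0 - theta[0])
flow_02 = lines[(0, 2)] * (0.0 - theta[1])
```

By contrast, the RF model would dispatch load through repeated randomized node-to-node transfers, which is what makes it local and more expensive.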

  5. FLOW TESTING AND ANALYSIS OF THE FSP-1 EXPERIMENT

    Energy Technology Data Exchange (ETDEWEB)

    Hawkes, Grant L.; Jones, Warren F.; Marcum, Wade; Weiss, Aaron; Howard, Trevor

    2017-06-01

    The U.S. High Performance Research Reactor Conversions fuel development team is focused on developing and qualifying the uranium-molybdenum (U-Mo) alloy monolithic fuel to support conversion of domestic research reactors to low enriched uranium. Several previous irradiations have demonstrated the favorable behavior of the monolithic fuel. The Full Scale Plate 1 (FSP-1) fuel plate experiment will be irradiated in the northeast (NE) flux trap of the Advanced Test Reactor (ATR) at the Idaho National Laboratory (INL). This fueled experiment contains six aluminum-clad fuel plates consisting of monolithic U-Mo fuel meat. Flow testing experimentation and hydraulic analysis have been performed on the FSP-1 experiment, with a flow test mockup of the experiment completed at Oregon State University. Results of several flow test experiments are compared with analyses; this paper shows that the hydraulic analyses agree closely with the flow test results. A water velocity of 14.0 meters per second is targeted between the fuel plates, and comparisons between the FSP-1 measurements and this target are discussed. This flow rate dominates the flow characteristics of the experiment and model; separate branch flows have minimal effect on the overall experiment. A square flow orifice was placed to control the flow rate through the experiment. Four different orifices were tested, and a flow versus delta-P curve for each orifice is reported herein. Fuel plates with depleted uranium in the fuel meat zone were used in one of the flow tests; this test was performed to evaluate flow-test vibration with actual fuel meat densities. Fuel plate deformation tests were also performed and are reported.
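The physics behind an orifice's flow-versus-delta-P curve is the standard square-edged orifice relation Q = Cd * A * sqrt(2*dP/rho). The sketch below links a target channel velocity to the pressure drop an orifice would need; the discharge coefficient and areas are assumed illustration values, not FSP-1 design data:

```python
import math

def orifice_flow(delta_p, area, rho, cd=0.62):
    """Volumetric flow through a square-edged orifice from the measured
    pressure drop: Q = Cd * A * sqrt(2*dP/rho)."""
    return cd * area * math.sqrt(2.0 * delta_p / rho)

rho = 998.0              # water density, kg/m^3
channel_area = 2.5e-4    # hypothetical coolant-channel flow area, m^2
target_velocity = 14.0   # m/s between the fuel plates (from the abstract)
q_needed = target_velocity * channel_area   # required flow, m^3/s

# Pressure drop a given orifice would need to pass that flow
a_orifice = 4.0e-4       # hypothetical orifice area, m^2
dp = rho / 2.0 * (q_needed / (0.62 * a_orifice)) ** 2
q_check = orifice_flow(dp, a_orifice, rho)   # round-trip consistency check
```

Sweeping dp through orifice_flow for each orifice area reproduces the kind of flow-versus-delta-P curve reported in the paper.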

  6. Voltage stability analysis using a modified continuation load flow ...

    African Journals Online (AJOL)

    This paper addresses the problem of identifying the voltage stability limits of load buses in a power system and of optimally placing capacitor banks for voltage stability improvement. It applies the concept of continuation power flow analysis used in voltage stability studies. It uses the modified ...

  7. A Flow-Sensitive Analysis of Privacy Properties

    DEFF Research Database (Denmark)

    Nielson, Hanne Riis; Nielson, Flemming

    2007-01-01

    that information I send to some service never is leaked to another service? - unless I give my permission? We shall develop a static program analysis for the pi- calculus and show how it can be used to give privacy guarantees like the ones requested above. The analysis records the explicit information flow...

  8. Geometrical analysis of suspension flows near jamming

    Science.gov (United States)

    Wyart, Matthieu

    2012-02-01

    The viscosity of suspensions was computed early on by Einstein and Batchelor in the dilute regime. At high density, however, their rheology remains mystifying. As the packing fraction increases, steric hindrance becomes dominant and particles move under stress in a more and more coordinated way. Eventually, the viscosity diverges as the suspension jams into an amorphous solid. Such a jamming transition is reminiscent of critical points: the rheology displays scaling and a diverging length scale. Jamming bears similarities to the glass transition, where steric hindrance is enhanced under cooling, and where the dynamics is also observed to become more and more collective as it slows down. In all these examples, understanding the nature of the collective dynamics and the associated rheology remains a challenge. However, recent progress has been made on a related problem: the unjamming transition, where a solid made of repulsive soft particles is isotropically decompressed toward vanishing pressure. In this situation, various properties of the amorphous solid, such as elasticity, transport or force propagation, display scaling with the distance to threshold. Theoretically, these observations can be shown to stem from the presence of soft modes in the vibrational spectrum, a result that can be extended to thermal colloidal glasses as well. Here we focus on particles driven by shear at zero temperature. We show that if hydrodynamical interactions are neglected, an analogy can be made between the rheology of such a suspension and the elasticity of simple networks, building a link between the jamming and the unjamming transition. This analogy enables us to unify in a common framework key aspects of the elasticity of amorphous solids with the rheology of dense suspensions, and to relate features of the latter to the geometry of configurations visited under flow.

  9. Precessing rotating flows with additional shear: stability analysis.

    Science.gov (United States)

    Salhi, A; Cambon, C

    2009-03-01

    We consider unbounded precessing rotating flows in which vertical or horizontal shear is induced by the interaction between the solid-body rotation (with angular velocity Omega(0)) and the additional "precessing" Coriolis force (with angular velocity -epsilonOmega(0)), normal to it. A "weak" shear flow, with rate 2epsilon of the same order as the Poincaré "small" ratio epsilon, is needed for balancing the gyroscopic torque, so that the whole flow satisfies Euler's equations in the precessing frame (the so-called admissibility conditions). The base flow case with vertical shear (its cross-gradient direction is aligned with the main angular velocity) corresponds to Mahalov's [Phys. Fluids A 5, 891 (1993)] precessing infinite cylinder base flow (ignoring boundary conditions), while the base flow case with horizontal shear (its cross-gradient direction is normal to both main and precessing angular velocities) corresponds to the unbounded precessing rotating shear flow considered by Kerswell [Geophys. Astrophys. Fluid Dyn. 72, 107 (1993)]. We show that both these base flows satisfy the admissibility conditions and can support disturbances in terms of advected Fourier modes. Because the admissibility conditions cannot select one case with respect to the other, a more physical derivation is sought: both flows are deduced from Poincaré's [Bull. Astron. 27, 321 (1910)] basic state of a precessing spheroidal container, in the limit of small epsilon. A rapid distortion theory (RDT) type of stability analysis is then performed for the previously mentioned disturbances, for both base flows. The stability analysis of the Kerswell base flow, using Floquet's theory, is recovered, and its counterpart for the Mahalov base flow is presented. Typical growth rates are found to be the same for both flows at very small epsilon, but significant differences are obtained regarding growth rates and widths of instability bands, if larger epsilon values, up to 0.2, are considered.
Finally

  10. Blood flow analysis with considering nanofluid effects in vertical channel

    Science.gov (United States)

    Noreen, S.; Rashidi, M. M.; Qasim, M.

    2017-06-01

    The peristaltic heat convection of copper particles in blood has been considered. A two-phase flow model is used in a channel with insulating walls. The flow analysis has been carried out by assuming a small Reynolds number and an infinite wavelength. The coupled equations are solved, and numerical solutions are computed for the pressure gradient, axial velocity and temperature. The influence of the relevant parameters on the flow quantities has been analyzed. This study can be considered a mathematical representation of the dynamics of physiological systems/tissues/organs supplied with medicine.

  11. The reliability analysis of cutting tools in the HSM processes

    OpenAIRE

    W.S. Lin

    2008-01-01

    Purpose: This article mainly describes the reliability of cutting tools in high speed turning by a normal distribution model.Design/methodology/approach: A series of experimental tests has been done to evaluate the reliability variation of the cutting tools. From the experimental results, the tool wear distribution and the tool life are determined, and the tool life distribution and the reliability function of cutting tools are derived. Further, the reliability of cutting tools at any time for h...
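    The normal-distribution reliability model described in the abstract can be sketched with Python's standard library: if tool life T is normally distributed, the reliability function is R(t) = P(T > t) = 1 - F(t). The tool-life mean and standard deviation below are illustrative, not the article's measured values.

```python
from statistics import NormalDist

# Hedged sketch: reliability of a cutting tool with normally distributed life.
tool_life = NormalDist(mu=30.0, sigma=5.0)   # tool life in minutes (assumed)

def reliability(t):
    """Probability that the tool survives beyond cutting time t."""
    return 1.0 - tool_life.cdf(t)

for t in (20, 30, 40):
    print(f"R({t} min) = {reliability(t):.3f}")
```

By symmetry of the normal distribution, reliability at the mean life is exactly 0.5, and it falls off rapidly beyond one standard deviation.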

  12. Load Flow Analysis of a 15Mva Injection Substation | Oshevire ...

    African Journals Online (AJOL)

    This load flow helps to determine the state of the power system for a given load and generation distribution. This paper presents the computer-aided power flow analysis of the existing Otovwodo 33/11 kV distribution network using the ETAP 7.0 software. The result showed that out of 91 load feeders, of which 6 are out of service, ...
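    A load flow of the kind ETAP performs can be illustrated at toy scale with a Gauss-Seidel iteration on a two-bus system; the line impedance and load values below are hypothetical, not Otovwodo network data.

```python
import cmath

# Hedged sketch: a slack bus feeding one load bus over a single line (per unit).
z_line = 0.02 + 0.08j            # line impedance (assumed)
y = 1.0 / z_line
Y = [[y, -y], [-y, y]]           # 2x2 bus admittance matrix

v_slack = 1.0 + 0.0j             # slack bus voltage
s_load = 0.8 + 0.4j              # load bus demand P + jQ (assumed)

v2 = 1.0 + 0.0j                  # flat start
for _ in range(50):
    # Gauss-Seidel update: V2 = (S2*/V2* - Y21*V1) / Y22, with S2 = -s_load.
    v2 = ((-s_load).conjugate() / v2.conjugate() - Y[1][0] * v_slack) / Y[1][1]

print(f"|V2| = {abs(v2):.4f} pu, angle = {cmath.phase(v2):.4f} rad")
```

The converged bus voltage shows the familiar load-flow result: voltage magnitude sags below 1.0 pu at the load bus, which is the quantity a planning study checks against statutory limits.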

  13. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to otherwise meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article tries to acquaint the reader with the basic research tools that are utilised while conducting various studies. It covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of sample size estimation, power analysis and statistical errors is given. Finally, there is a summary of the parametric and non-parametric tests used for data analysis.
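    One of the parametric tests the article summarizes, the two-sample t-test, can be sketched with the standard library; the measurements below are made up, and the p-value uses a normal approximation rather than the exact t distribution.

```python
from statistics import NormalDist, mean, stdev

# Hedged sketch: comparing two groups of illustrative measurements.
group_a = [5.1, 4.9, 5.4, 5.0, 5.2, 4.8, 5.3]
group_b = [5.6, 5.8, 5.5, 5.9, 5.7, 5.6, 5.8]

na, nb = len(group_a), len(group_b)
se = (stdev(group_a) ** 2 / na + stdev(group_b) ** 2 / nb) ** 0.5
t_stat = (mean(group_a) - mean(group_b)) / se

# Two-sided p-value from the standard normal (adequate for a sketch).
p_value = 2.0 * NormalDist().cdf(-abs(t_stat))
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A small p-value leads to rejecting the null hypothesis of equal means; the non-parametric alternative mentioned in the article (e.g. a rank-based test) would be preferred when normality cannot be assumed.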

  14. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.
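    The hypergeometric test that underlies most GO enrichment scores (on top of which GOMA builds its modules) can be sketched as follows; the gene counts are illustrative.

```python
from math import comb

# Hedged sketch: given a genome of N genes, K annotated to a GO term, and a
# study set of n genes containing k hits, compute the upper-tail probability
# P(X >= k) of seeing at least k hits by chance.
def enrichment_pvalue(N, K, n, k):
    """Upper-tail hypergeometric probability P(X >= k)."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

p = enrichment_pvalue(N=20000, K=100, n=200, k=8)
print(f"p = {p:.2e}")
```

Here the expected number of hits is n*K/N = 1, so observing 8 hits yields a very small p-value, the kind of result that enrichment tools report for each term.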

  15. Accounting and Financial Data Analysis Data Mining Tools

    Directory of Open Access Journals (Sweden)

    Diana Elena Codreanu

    2011-05-01

    Full Text Available Computerized accounting systems have seen an increase in complexity in recent years due to the competitive economic environment, but with the help of data analysis solutions such as OLAP and Data Mining, a multidimensional data analysis can be performed, fraud can be detected and knowledge hidden in data can be discovered, ensuring such information is useful for decision making within the organization. In the literature there are many definitions of data mining, but all boil down to the same idea: the process of extracting new information from large data collections, information that would be very difficult to obtain without the aid of data mining tools. Information obtained by the data mining process has the advantage that it not only responds to the question of what is happening but at the same time argues and shows why certain things are happening. In this paper we wish to present advanced techniques for the analysis and exploitation of data stored in a multidimensional database.

  16. Finite element analysis of nonlinear creeping flows

    International Nuclear Information System (INIS)

    Loula, A.F.D.; Guerreiro, J.N.C.

    1988-12-01

    Steady-state creep problems with monotone constitutive laws are studied. Finite element approximations are constructed based on mixed Petrov-Galerkin formulations for constrained problems. Stability, convergence and a priori error estimates are proved for equal-order discontinuous stress and continuous velocity interpolations. Numerical results are presented confirming the rates of convergence predicted in the analysis and the good performance of this formulation. (author) [pt

  17. Preliminary Borehole Disposal In Medium Flow Hydrogeological Condition Using IAEA Screening Tools

    International Nuclear Information System (INIS)

    Nazran Harun; Mohd Abd Wahab Yusof; Norasalwa Zakaria; Mohd Zaidi Ibrahim; Muhammad Fathi Sujan

    2014-01-01

    A screening tool developed by the International Atomic Energy Agency (IAEA) has been used to improve the capacity of the Malaysian Nuclear Agency (Nuclear Malaysia) to assess potential sites for borehole disposal of disused sealed radioactive sources. It allows the isolation provided by the capsule and disposal container to be evaluated. In addition, it uses a conservative model of radionuclide transport with no retardation of radionuclides. Hence, rapid decisions can be made, as it provides an early indication of the potential suitability of sites based on their hydro-chemical characteristics. The objective of this paper is to identify and determine the types and activities of the radionuclide inventory that can be disposed of in the borehole. The results of the analysis show the gas doses arising from the disposal and the time taken for the cement to be corroded. (author)

  18. Low flow analysis of the lower Drava River

    International Nuclear Information System (INIS)

    Mijuskovic-Svetinovic, T; Maricic, S

    2008-01-01

    Understanding the regime and the characteristics of low streamflows is of vital importance in several respects. It is essential for the effective planning, designing, constructing, maintaining, using and managing of different water management systems and structures. In addition, frequent running and assessing of estimates of low streamflow statistics is especially important when different aspects of water quality are considered. This paper presents the results of a stochastic analysis of the River Drava low flow from the gauging station Donji Miholjac [located at rkm 77+700]. Currently, almost all specialists apply the truncation method in low-flow analysis. Taking this into consideration, it is possible to define a low streamflow as a period when the analysed characteristic is equal to or lower than the truncation level of drought. The same method has been applied in this analysis. The calculation method applied takes into account all the essential components of the afore-mentioned process. This includes a number of elements, such as the deficit, the duration and time of occurrence of low flows, the number of occurrences, the maximum deficit and the maximum duration of low flows in the analysed period. Moreover, this paper determines computational values for deficits and for the duration of low flow in different return periods.
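    The truncation method described above can be sketched as a scan over a flow series: every run of days at or below the threshold is one low-flow event with a duration and a deficit volume. The series and threshold below are illustrative, not Donji Miholjac data.

```python
# Hedged sketch of the truncation-level method on a daily-flow series.
flows = [310, 290, 270, 240, 250, 260, 300, 320, 280, 230, 220, 260, 310]
threshold = 275.0   # truncation level, m^3/s (assumed)

events = []          # (duration_days, deficit_volume) per low-flow event
duration, deficit = 0, 0.0
for q in flows + [threshold + 1]:        # sentinel closes a trailing event
    if q <= threshold:
        duration += 1
        deficit += threshold - q
    elif duration:
        events.append((duration, deficit))
        duration, deficit = 0, 0.0

for d, v in events:
    print(f"event: {d} days, deficit {v:.0f} (m^3/s)*day")
```

From the collected events, statistics such as the maximum deficit and the maximum duration mentioned in the abstract follow directly (e.g. `max(v for _, v in events)`).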

  19. Active Flow Control and Global Stability Analysis of Separated Flow Over a NACA 0012 Airfoil

    Science.gov (United States)

    Munday, Phillip M.

    definition of the coefficient of momentum, which successfully characterizes suppression of separation and lift enhancement. The effect of angular momentum is incorporated into the modified coefficient of momentum by introducing a characteristic swirling jet velocity based on the non-dimensional swirl number. With the modified coefficient of momentum, this single value is able to categorize controlled flows into separated, transitional, and attached flows. With inadequate control input (separated flow regime), lift decreased compared to the baseline flow. Increasing the modified coefficient of momentum, flow transitions from separated to attached and accordingly results in improved aerodynamic forces. Modifying the spanwise spacing, it is shown that the minimum modified coefficient of momentum input required to begin transitioning the flow is dependent on actuator spacing. The growth (or decay) of perturbations can facilitate or inhibit the influence of flow control inputs. Biglobal stability analysis is considered to further analyze the behavior of control inputs on separated flow over a symmetric airfoil. Assuming a spanwise periodic waveform for the perturbations, the eigenvalues and eigenvectors about a base flow are solved to understand the influence of spanwise variation on the development of the flow. Two algorithms are developed and validated to solve for the eigenvalues of the flow: an algebraic eigenvalue solver (matrix based) and a time-stepping algorithm. The matrix based approach is formulated without ever storing the matrices, creating a computationally memory efficient algorithm. Increasing the Reynolds number to Re = 23,000 over a NACA 0012 airfoil, the time-stepper method is implemented due to rising computational cost of the matrix-based method. 
Stability analysis about the time-averaged flow is performed for spanwise wavenumbers of beta = 1/c, 10pi/c and 20pi/c, where the latter two wavenumbers are representative of the spanwise spacing between the

  20. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thompson Valve, Turbo expander and Compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives clear idea in deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given configuration plant which leads to the design of an efficient productive plant.

  1. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thompson Valve, Turbo expander and Compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives clear idea in deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given configuration plant which leads to the design of an efficient productive plant

  2. Anaphe - OO libraries and tools for data analysis

    International Nuclear Information System (INIS)

    Couet, O.; Ferrero-Merlino, B.; Molnar, Z.; Moscicki, J.T.; Pfeiffer, A.; Sang, M.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, the authors have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create 'shadow classes' for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. The authors will give an overview of the architecture and design choices and will present the current status and future developments of the project

  3. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  4. Principles and tools for collaborative entity-based intelligence analysis.

    Science.gov (United States)

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.

  5. Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Stregy, Seth [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Dasilva, Ana [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Yilmaz, Serkan [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Saha, Pradip [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States); Loewen, Eric [GE Hitachi Nuclear Energy Americas LLC, Wilmington, NC (United States)

    2015-10-29

    This report provides the broad historical review of EM Pump development and details of MATRIX development under this project. This report summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. This report provides information on Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model with Task 4, the analysis code was updated, and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: Status of analysis model development; Improvements made to older simulations; and Comparison to experimental data.

  6. Modelling the Solid Waste Flow into Sungai Ikan Landfill Sites by Material Flow Analysis Method

    Science.gov (United States)

    Ghani, Latifah A.; Ali, Nora'aini; Hassan, Nur Syafiqah A.

    2017-12-01

    The purpose of this paper is to model the flow of solid waste at Kuala Terengganu using the Material Flow Analysis (MFA) method, implemented with the STAN analysis software. The Sungai Ikan Landfill has been in operation for about 10 years and receives, on average, around 260 tons of solid waste per day. Because the solid waste comes from a variety of sources, the leachate that accumulates has been tested and measured. The highest pH reading of the leachate is 8.29, which is still within the standard range of 8.0-9.0 required before discharging leachate to open water. The percentages of the solid waste have been calculated and seven different types of solid waste have been segregated: plastics, organic waste, paper, polystyrene, wood, fabric and cans. It is estimated that around 244 tons per day of solid waste will end up as residue.
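    A minimal STAN-style mass balance for such a system can be sketched as follows; the composition fractions are hypothetical placeholders, while the 260 t/day input comes from the abstract.

```python
# Hedged sketch of a steady-state MFA mass balance: split the daily input by
# composition fractions and check that the flows conserve mass.
daily_input = 260.0   # tonnes/day entering the landfill (from the abstract)

# Composition fractions are hypothetical placeholders, not the study's data.
fractions = {
    "organic waste": 0.45, "plastics": 0.20, "paper": 0.15,
    "polystyrene": 0.05, "wood": 0.06, "fabric": 0.05, "can": 0.04,
}

flows = {name: f * daily_input for name, f in fractions.items()}
assert abs(sum(flows.values()) - daily_input) < 1e-9   # mass conservation

for name, t in flows.items():
    print(f"{name:>13}: {t:5.1f} t/day")
```

In a full MFA model the same balance is written for every process node (collection, segregation, disposal), which is what the STAN software automates and reconciles.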

  7. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi eFukushima

    2014-11-01

    Full Text Available One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1 network visualization tools, (2 pathway analyses, (3 genome-scale metabolic reconstruction, and (4 the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome and mathematical models are expected to integrate and expand our knowledge of complex plant metabolisms.

  8. Modeling energy technology choices. Which investment analysis tools are appropriate?

    International Nuclear Information System (INIS)

    Johnson, B.E.

    1994-01-01

    A variety of tools from modern investment theory appear to hold promise for unraveling observed energy technology investment behavior that often appears anomalous when analyzed using traditional investment analysis methods. This paper reviews the assumptions and important insights of the investment theories most commonly suggested as candidates for explaining the apparent "energy technology investment paradox". The applicability of each theory is considered in the light of important aspects of energy technology investment problems, such as sunk costs, uncertainty and imperfect information. The theories addressed include the capital asset pricing model, the arbitrage pricing theory, and the theory of irreversible investment. Enhanced net present value methods are also considered. (author)
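    The baseline that these theories refine, a net present value calculation, can be sketched as follows; the cash flows and discount rates are illustrative, chosen so that the project's attractiveness flips with the hurdle rate, which is the kind of behavior the "paradox" literature examines.

```python
# Hedged sketch: NPV of an energy-efficiency investment with made-up figures.
def npv(rate, cash_flows):
    """Discount a list of yearly cash flows (year 0 first) to present value."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Year 0: equipment cost; years 1-10: energy savings (hypothetical figures).
cash_flows = [-1000.0] + [180.0] * 10

for rate in (0.05, 0.15, 0.30):
    print(f"NPV at {rate:.0%}: {npv(rate, cash_flows):8.1f}")
```

The same project looks attractive at a 5% discount rate but not at the high implicit hurdle rates often observed for energy technology investments; theories such as irreversible investment explain why those implicit rates can rationally exceed the cost of capital.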

  9. Schema for the LANL infrasound analysis tool, infrapy

    Energy Technology Data Exchange (ETDEWEB)

    Dannemann, Fransiska Kate [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Marcillo, Omar Eduardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-14

    The purpose of this document is to define the schema used for the operation of the infrasound analysis tool, infrapy. The tables described by this document extend the CSS3.0 or KB core schema to include information required for the operation of infrapy. This document is divided into three sections, the first being this introduction. Section two defines eight new, infrasonic data processing-specific database tables. Both internal (ORACLE) and external formats for the attributes are defined, along with a short description of each attribute. Section three of the document shows the relationships between the different tables by using entity-relationship diagrams.

  10. Predicting SPE Fluxes: Coupled Simulations and Analysis Tools

    Science.gov (United States)

    Gorby, M.; Schwadron, N.; Linker, J.; Caplan, R. M.; Wijaya, J.; Downs, C.; Lionello, R.

    2017-12-01

    Presented here is a nuts-and-bolts look at the coupled framework of Predictive Science Inc.'s Magnetohydrodynamics Around a Sphere (MAS) code and the Energetic Particle Radiation Environment Module (EPREM). MAS-simulated coronal mass ejection output from a variety of events can be selected as the MHD input to EPREM, and a variety of parameters can be set to run against: background seed particle spectra, mean free path, perpendicular diffusion efficiency, etc. A standard set of visualizations is produced, as well as a library of analysis tools for deeper inquiries. All steps will be covered end to end, as well as the framework's user interface and availability.

  11. A design and performance analysis tool for superconducting RF systems

    International Nuclear Information System (INIS)

    Schilcher, T.; Simrock, S.N.; Merminga, L.; Wang, D.X.

    1997-01-01

    Superconducting rf systems are usually operated with continuous rf power or with rf pulse lengths exceeding 1 ms to maximize the overall wall plug power efficiency. Typical examples are CEBAF at the Thomas Jefferson National Accelerator Facility (Jefferson Lab) and the TESLA Test Facility at DESY. The long pulses allow for effective application of feedback to stabilize the accelerating field in the presence of microphonics, Lorentz force detuning, and fluctuations of the beam current. In this paper the authors describe a set of tools to be used with MATLAB and SIMULINK, which allow analysis of the quality of field regulation for a given design. The tools include models for the cavities, the rf power source, the beam, sources of field perturbations, and the rf feedback system. The rf control relevant electrical and mechanical characteristics of the cavity are described in the form of time-varying state space models. The power source is modeled as a current generator and includes saturation characteristics and noise. An arbitrary time structure can be imposed on the beam current to reflect a macro-pulse structure and bunch charge fluctuations. For rf feedback several schemes can be selected: traditional amplitude and phase control as well as I/Q control. The choices for the feedback controller include analog or digital approaches and various choices of frequency response. Feed forward can be added to further suppress repetitive errors. The results of a performance analysis of the CEBAF and the TESLA linac rf systems using these tools are presented
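    The flavor of such a simulation can be sketched with a drastically simplified model: the cavity field envelope as a first-order system, beam loading as a constant disturbance, and a proportional controller. All parameters below are illustrative, not CEBAF or TESLA values.

```python
# Hedged sketch: proportional feedback on a first-order cavity envelope model.
w12 = 2 * 3.141592653589793 * 200.0   # cavity half-bandwidth, rad/s (assumed)
dt = 1e-5                              # simulation step, s
setpoint = 1.0                         # desired field amplitude, a.u.
beam_loading = -0.2                    # constant disturbance (assumed)

def run(kp, steps=20000):
    """Simulate the closed loop and return the final field amplitude."""
    v = 0.0
    for _ in range(steps):
        drive = kp * (setpoint - v)                   # proportional feedback
        v += dt * w12 * (drive + beam_loading - v)    # first-order cavity
    return v

for kp in (1.0, 10.0, 100.0):
    print(f"Kp = {kp:5.1f}: field = {run(kp):.4f}")
```

Raising the proportional gain shrinks the steady-state error caused by beam loading, the basic trade-off that the full MATLAB/SIMULINK toolset quantifies with far more detailed cavity, klystron, and noise models.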

  12. Analysis on flow characteristic of nuclear heating reactor

    International Nuclear Information System (INIS)

    Jiang Shengyao; Wu Xinxin

    1997-06-01

    The experiment was carried out on the test loop HRTL-5, which simulates the geometry and system design of a 5 MW nuclear heating reactor. The analysis was based on a one-dimensional two-phase flow drift model with conservation equations for mass, steam mass, energy and momentum. The Clausius-Clapeyron equation was used for the calculation of the flashing front in the riser. A set of ordinary differential equations, which describes the behavior of two-phase flow in the natural circulation system, was derived through integration of the above conservation equations in the subcooled boiling region and the bulk boiling region in the heated section and in the riser. A time-domain method was used for the calculation. Both static and dynamic results are presented. System pressure, inlet subcooling and heat flux were varied as input parameters. The results show that, firstly, subcooled boiling in the heated section and void flashing in the riser have a significant influence on the distribution of the void fraction, the mass flow rate and the stability of the system, especially at lower pressure. Secondly, over a wide range of two-phase flow conditions, only subcooled boiling occurs in the heated section; for the designed two-phase regime operation of the 5 MW nuclear heating reactor, the temperature at the core exit does not reach its saturation value. Thirdly, the mechanism of two-phase flow oscillation, namely 'zero-pressure-drop', is described. In the wide range of inlet subcooling (0 K<ΔT<28 K) there exist three regions of system flow conditions, namely (1) stable two-phase flow, (2) bulk and subcooled boiling unstable flow, and (3) subcooled boiling and single-phase stable flow. The response of the mass flow rate after a small disturbance in the heat flux is shown over the above inlet subcooling range, and based on it the instability map of the system is given through experiment and calculation. (3 refs., 9 figs.)

  13. A Novel Tool for High-Throughput Screening of Granulocyte-Specific Antibodies Using the Automated Flow Cytometric Granulocyte Immunofluorescence Test (Flow-GIFT)

    Directory of Open Access Journals (Sweden)

    Xuan Duc Nguyen

    2011-01-01

    Full Text Available Transfusion-related acute lung injury (TRALI) is a severe complication of blood transfusion. TRALI has usually been associated with antibodies against leukocytes. The flow cytometric granulocyte immunofluorescence test (Flow-GIFT) has been introduced for routine use when investigating patients and healthy blood donors. Here we describe a novel tool in the automation of the Flow-GIFT that enables rapid screening of blood donations. We analyzed 440 sera from healthy female blood donors for the presence of granulocyte antibodies. As positive controls, 12 sera with known antibodies against HNA-1a, -1b, -2a and -3a were additionally investigated. Whole-blood samples from HNA-typed donors were collected and the test cells isolated using cell sedimentation in a Ficoll density gradient. Subsequently, leukocytes were incubated with the respective serum, and binding of antibodies was detected using a FITC-conjugated antihuman antibody. 7-AAD was used to exclude dead cells. Pipetting steps were automated using the Biomek NXp Multichannel Automation Workstation. All samples were prepared in 96-deep-well plates and analyzed by flow cytometry. The standard granulocyte immunofluorescence test (GIFT) and granulocyte agglutination test (GAT) were also performed as reference methods. Sixteen sera were positive in the automated Flow-GIFT, while five of these sera were negative in the standard GIFT (anti-HNA-3a, n = 3; anti-HNA-1b, n = 1) and GAT (anti-HNA-2a, n = 1). The automated Flow-GIFT was able to detect all granulocyte antibodies that could only be detected in GIFT in combination with GAT. In serial dilution tests, the automated Flow-GIFT detected the antibodies at higher dilutions than the reference methods GIFT and GAT. The Flow-GIFT proved to be feasible for automation. This novel high-throughput system allows effective antigranulocyte antibody detection in a large donor population in order to prevent TRALI due to transfusion of blood products.

  14. A novel tool for high-throughput screening of granulocyte-specific antibodies using the automated flow cytometric granulocyte immunofluorescence test (Flow-GIFT).

    Science.gov (United States)

    Nguyen, Xuan Duc; Dengler, Thomas; Schulz-Linkholt, Monika; Klüter, Harald

    2011-02-03

    Transfusion-related acute lung injury (TRALI) is a severe complication of blood transfusion. TRALI has usually been associated with antibodies against leukocytes. The flow cytometric granulocyte immunofluorescence test (Flow-GIFT) has been introduced for routine use when investigating patients and healthy blood donors. Here we describe a novel tool in the automation of the Flow-GIFT that enables rapid screening of blood donations. We analyzed 440 sera from healthy female blood donors for the presence of granulocyte antibodies. As positive controls, 12 sera with known antibodies against HNA-1a, -1b, -2a and -3a were additionally investigated. Whole-blood samples from HNA-typed donors were collected and the test cells isolated using cell sedimentation in a Ficoll density gradient. Subsequently, leukocytes were incubated with the respective serum, and binding of antibodies was detected using a FITC-conjugated antihuman antibody. 7-AAD was used to exclude dead cells. Pipetting steps were automated using the Biomek NXp Multichannel Automation Workstation. All samples were prepared in 96-deep-well plates and analyzed by flow cytometry. The standard granulocyte immunofluorescence test (GIFT) and granulocyte agglutination test (GAT) were also performed as reference methods. Sixteen sera were positive in the automated Flow-GIFT, while five of these sera were negative in the standard GIFT (anti-HNA-3a, n = 3; anti-HNA-1b, n = 1) and GAT (anti-HNA-2a, n = 1). The automated Flow-GIFT was able to detect all granulocyte antibodies that could only be detected in GIFT in combination with GAT. In serial dilution tests, the automated Flow-GIFT detected the antibodies at higher dilutions than the reference methods GIFT and GAT. The Flow-GIFT proved to be feasible for automation. This novel high-throughput system allows effective antigranulocyte antibody detection in a large donor population in order to prevent TRALI due to transfusion of blood products.

  15. Guiding Inspiratory Flow: Development of the In-Check DIAL G16, a Tool for Improving Inhaler Technique

    Directory of Open Access Journals (Sweden)

    Mark Jeremy Sanders

    2017-01-01

    Portable inhalers are divisible into those that deliver medication by patient triggering (pMDIs: a gentle, slow inhalation) and those that use the patient's inspiratory effort as the force for deaggregation and delivery (DPIs: a stronger, deeper inspiratory effort). Patient confusion and poor technique are commonplace. The use of training tools has become standard practice, and unique amongst these is an inspiratory flow meter (In-Check) which is able to simulate the resistance characteristics of different inhalers and thereby guide the patient to the correct effort. In-Check's origins lie in the peak expiratory flow meters of the 1960s, the development of the Mini-Wright peak flow meter, and inspiratory flow assessment via the nose during the 1970s–1980s. The current device (In-Check DIAL G16) is the third iteration of the original 1998 training tool, with detailed and ongoing assessments of all common inhaler resistances (including combination and breath-actuated inhaler types) summarised into resistance ranges that are preset within the device. The device works by reading the inspiratory effort against one of six preset resistance ranges. Use of the tool has been shown to contribute to significant improvements in asthma care and control, and it is being advocated for assessment and training in irreversible lung disease.

  16. Adapting the capacities and vulnerabilities approach: a gender analysis tool.

    Science.gov (United States)

    Birks, Lauren; Powell, Christopher; Hatfield, Jennifer

    2017-12-01

    Gender analysis methodology is increasingly being considered as essential to health research because 'women's social, economic and political status undermine their ability to protect and promote their own physical, emotional and mental health, including their effective use of health information and services' {World Health Organization [Gender Analysis in Health: a review of selected tools. 2003; www.who.int/gender/documents/en/Gender.pdf (20 February 2008, date last accessed)]}. By examining gendered roles, responsibilities and norms through the lens of gender analysis, we can develop an in-depth understanding of social power differentials, and be better able to address gender inequalities and inequities within institutions and between men and women. When conducting gender analysis, tools and frameworks may help to aid community engagement and to provide a framework to ensure that relevant gendered nuances are assessed. The capacities and vulnerabilities approach (CVA) is one such gender analysis framework that critically considers gender and its associated roles, responsibilities and power dynamics in a particular community and seeks to meet a social need of that particular community. Although the original intent of the CVA was to guide humanitarian intervention and disaster preparedness, we adapted this framework to a different context, which focuses on identifying and addressing emerging problems and social issues in a particular community or area that affect their specific needs, such as an infectious disease outbreak or difficulty accessing health information and resources. We provide an example of our CVA adaptation, which served to facilitate a better understanding of how health-related disparities affect Maasai women in a remote, resource-poor setting in Northern Tanzania. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Bistable flow spectral analysis. Repercussions on jet pumps

    International Nuclear Information System (INIS)

    Gavilan Moreno, C.J.

    2011-01-01

    Highlights: → The most important contribution of this paper is the spectral characterization of bistable flow in a nuclear power plant. → The paper examines in depth the effect of bistable flow on the jet pumps and the induced vibrations. → The bistable-flow frequencies are very close to the natural jet pump frequencies in the 3rd and 6th modes. - Abstract: There have been many attempts at characterizing and predicting bistable flow in boiling water reactors (BWRs). Nevertheless, in most cases the results have only produced models that analytically reproduce the phenomenon (). Modeling has been forensic in all cases, with the models focused on determining the exclusion areas on the recirculation flow map. The bistability process is known by its effects, as there is no clear definition of its causal process. In the 1980s, Hitachi technicians () managed to reproduce bistable flow in the laboratory by means of pipe geometry similar to that found in recirculation loops. The result was that the low-flow pattern is formed by the appearance of a quasi-stationary helicoidal vortex in the recirculation collector's branches. This vortex creates greater frictional losses than regions without vortices at the same discharge pressure. Neither the behavior nor the dynamics of these vortices were characterized in that work. The aim of this paper is to characterize these vortices in such a way as to obtain their own frequencies and their subsequent effect on the jet pumps. The methodology used in this study is similar to the one used previously when analyzing bistable flow in tube arrays with cross flow (). The method employed makes use of the power spectral density function. What differs is the field of application. We analyze a loop with bistable flow (Loop B) and compare the high- and low-flow situations. The same analysis is also carried out on the loop that has not developed bistable flow (Loop A) at the same moments

  18. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Background: Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results: The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion: The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines.
UCSF Chimera is free for non-commercial use and is

  19. ANALYSIS OF FINANCIAL FLOWS IN FOOD INDURSTRY ENTERPRISES

    OpenAIRE

    Iurie SPIVACENCO

    2015-01-01

    In the present study, the food industry and the financial flows generated by its enterprises were analyzed. The analysis was based on information from the financial statements of these entities and on a study of the evolution of food industry output, the number of enterprises and employees in the food industry, and the import and export of food products. The analysis highlights some shortcomings and makes some concrete proposals that need to be considered for the sustainable development of the f...

  20. Evolution of Crust- and Core-Dominated Lava Flows Using Scaling Analysis

    Science.gov (United States)

    Castruccio, A.; Rust, A.; Sparks, R. S.

    2010-12-01

    We investigated the front evolution of simple lava flows on a slope using scaling arguments. For the retarding force acting against gravity, we analyzed three different cases: a flow controlled by a Newtonian viscosity, a flow controlled by the yield strength of a diffusively growing crust, and a flow controlled by its core yield strength. These models were tested using previously published data on front evolution and volume discharge from 10 lava flow eruptions at 6 different volcanoes. Our analysis suggests that for basaltic eruptions with high effusion rate and low crystal content (Hawaiian eruptions), the best fit of the data is with a Newtonian viscosity. For basaltic eruptions with lower effusion rates (Etna eruptions) or long-duration andesitic eruptions (Lonquimay eruption, Chile), the flow is controlled by the yield strength of a growing crust. Finally, for very crystalline lavas (Colima, Santiaguito) the flow is controlled by its core yield strength. The order of magnitude of the viscosities from our analysis is in the same range as in previous studies using field measurements on the same lavas. The yield strength values for the growing crust and the core of the flow are similar, with an order of magnitude of 10^5 Pa. This number is similar to yield strength values found in lava domes by different authors. The consistency of yield strengths of ~10^5 Pa arises because larger stresses cause fracturing of very crystalline magma, which drastically reduces its effective strength. Furthermore, we used a 2-D analysis of a Bingham fluid flowing on a slope to conclude that, for lower yield strength values, the flow is controlled mainly by its plastic viscosity and the lava can be effectively modelled as Newtonian.
Our analysis provides a simple tool to evaluate the main controlling forces in the evolution of a lava flow, as well as the magnitude of its rheological properties, for eruptions of different compositions and conditions, and may be useful to predict the evolution of
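The record's distinction between viscosity-controlled and yield-strength-controlled flow can be summarized by a Bingham number, the ratio of yield stress to viscous stress. This sketch uses hypothetical, order-of-magnitude inputs, not the paper's field data:

```python
def bingham_number(yield_strength, thickness, viscosity, velocity):
    """Ratio of yield stress (tau_y) to a characteristic viscous stress
    (mu * U / H). Bi << 1 suggests effectively Newtonian behaviour;
    Bi >> 1 suggests yield-strength control."""
    return yield_strength * thickness / (viscosity * velocity)

# Hypothetical order-of-magnitude inputs (illustrative, not measured values):
crystal_rich = bingham_number(1e5, 10.0, 1e6, 0.01)  # tau_y ~ 1e5 Pa, slow thick flow
hawaiian = bingham_number(1e2, 2.0, 1e3, 1.0)        # low yield strength, fast flow
```

With these assumed numbers the crystal-rich case gives Bi well above 1 (yield-strength controlled) and the Hawaiian-style case gives Bi well below 1 (effectively Newtonian), mirroring the paper's qualitative conclusion.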

  1. Lagrangian Flow Network: a new tool to evaluate connectivity and understand the structural complexity of marine populations

    Science.gov (United States)

    Rossi, V.; Dubois, M.; Ser-Giacomi, E.; Monroy, P.; Lopez, C.; Hernandez-Garcia, E.

    2016-02-01

    Assessing the spatial structure and dynamics of marine populations is still a major challenge for ecologists. The necessity to manage marine resources from a large-scale perspective and considering the whole ecosystem is now recognized, but the absence of appropriate tools to address these objectives limits the implementation of globally pertinent conservation planning. Inspired by network theory, we present a new methodological framework called the Lagrangian Flow Network which allows a systematic characterization of multi-scale dispersal and connectivity of the early life history stages of marine organisms. The network is constructed by subdividing the basin into an ensemble of equal-area subregions which are interconnected through the transport of propagules by ocean currents. The present version allows the identification of hydrodynamical provinces and the computation of various connectivity proxies measuring retention and exchange of larvae. Due to our spatial discretization and subsequent network representation, as well as our Lagrangian approach, further methodological improvements are readily accessible. These future developments include a parametrization of habitat patchiness, the implementation of realistic larval traits and the consideration of abiotic variables (e.g. temperature, salinity, planktonic resources...) and their effects on larval production and survival. While the model is potentially tunable to any species whose biological traits and ecological preferences are precisely known, it can also be used in a more generic configuration by efficient computing and analysis of a large number of experiments with relevant ecological parameters. It permits a better characterization of population connectivity at multiple scales and informs ecological and managerial interpretations.
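The building block of a Lagrangian Flow Network is a transport matrix between the equal-area subregions, estimated by counting where released particles end up; connectivity proxies such as larval retention then follow directly. This is a minimal sketch under simplified assumptions (boxes indexed 0..n-1, a single advection interval), not the authors' implementation:

```python
import numpy as np

def transport_matrix(start_box, end_box, n_boxes):
    """Row-stochastic matrix P[i, j]: fraction of particles released in
    box i that end up in box j after the advection interval."""
    counts = np.zeros((n_boxes, n_boxes))
    for i, j in zip(start_box, end_box):
        counts[i, j] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    # Avoid division by zero for boxes with no released particles
    return np.divide(counts, row_sums,
                     out=np.zeros_like(counts), where=row_sums > 0)

def retention(P):
    """Local retention proxy: the fraction of particles that stay in
    their release box, i.e. the diagonal of the transport matrix."""
    return np.diag(P)
```

In a real application the start/end boxes come from binning Lagrangian trajectories driven by an ocean-circulation model; exchange proxies and hydrodynamical provinces are derived from the same matrix.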

  2. Analysis of the brazilian scientific production about information flows

    Directory of Open Access Journals (Sweden)

    Danielly Oliveira Inomata

    2015-07-01

    Objective. This paper presents and discusses the concepts, contexts and applications involving information flows in organizations. Method. Systematic review, followed by a bibliometric analysis and system analysis. The systematic review aimed to search for, evaluate and review evidence about the research topic, and comprised the following steps: (1) definition of keywords, (2) systematic review, (3) exploration and analysis of articles and (4) comparison and consolidation of results. Results. The bibliometric analysis aimed to establish the relevance of the articles, identifying the authors, dates of publication, citation indices and the keywords with the highest occurrence. Conclusions. The survey results confirm the emphasis on information within the knowledge management process and, in recent years, on networks; that is, studies are turning to the operationalization and analysis of information flows in networks. The literature demonstrates the relationship of information flow with its management, applied to different organizational contexts, and points to new trends in information science such as the study and analysis of information flows in networks.

  3. Using FlowLab, an educational computational fluid dynamics tool, to perform a comparative study of turbulence models

    International Nuclear Information System (INIS)

    Parihar, A.; Kulkarni, A.; Stern, F.; Xing, T.; Moeykens, S.

    2005-01-01

    Flow over an Ahmed body is a key benchmark case for validating the complex turbulent flow field around vehicles. In spite of the simple geometry, the flow field around an Ahmed body retains critical features of real, external vehicular flow. The present study is an attempt to implement such a real-life example into the course curriculum for undergraduate engineers. FlowLab, a Computational Fluid Dynamics (CFD) tool developed by Fluent Inc. for use in engineering education, allows students to conduct interactive application studies. This paper presents a synopsis of FlowLab, a description of one FlowLab exercise, and an overview of the educational experience gained by students through using FlowLab, as understood through student surveys and examinations. FlowLab-based CFD exercises were implemented into the 57:020 Mechanics of Fluids and Transport Processes and 58:160 Intermediate Mechanics of Fluids courses at the University of Iowa in the fall of 2004, although this report focuses only on experiences with the Ahmed body exercise, which was used only in the intermediate-level fluids class, 58:160. This exercise was developed under National Science Foundation funding by the authors of this paper. The focus of this study does not include validating the various turbulence models used for the Ahmed body simulation, because a two-dimensional simplification was applied. With the two-dimensional simplification, students may set up, run, and post-process this model in a 50-minute class period using a single-CPU PC, as required for the 58:160 class at the University of Iowa. It is educational for students to understand the implications of a two-dimensional approximation of an essentially three-dimensional flow field, along with the consequent variation in both qualitative and quantitative results. Additionally, through this exercise, students may realize that the choice of turbulence model affects the simulation predictions. (author)

  4. Screening of Gas-Cooled Reactor Thermal-Hydraulic and Safety Analysis Tools and Experimental Database

    International Nuclear Information System (INIS)

    Lee, Won Jae; Kim, Min Hwan; Lee, Seung Wook

    2007-08-01

    This report is the final report of the I-NERI Project 'Screening of Gas-cooled Reactor Thermal Hydraulic and Safety Analysis Tools and Experimental Database', jointly carried out by KAERI, ANL and INL. In this study, we developed the basic technologies required to develop and validate the VHTR TH/safety analysis tools and evaluated the TH/safety database information. The research tasks consist of: 1) code qualification methodology (INL), 2) high-level PIRTs for a major nucleus set of events (KAERI, ANL, INL), 3) initial scaling and scoping analysis (ANL, KAERI, INL), 4) filtering of TH/safety tools (KAERI, INL), 5) evaluation of TH/safety database information (KAERI, INL, ANL) and 6) key scoping analysis (KAERI). The code qualification methodology identifies the role of PIRTs in the R and D process and the bottom-up and top-down code validation methods. Since the design of the VHTR is still evolving, we generated the high-level PIRTs referencing the 600 MWth block-type GT-MHR and the 400 MWth pebble-type PBMR. The nucleus set of events that represents the VHTR safety and operational transients consists of the enveloping scenarios of HPCC (high-pressure conduction cooling: loss of primary flow), LPCC/Air-Ingress (low-pressure conduction cooling: loss of coolant), LC (load changes: power maneuvering), ATWS (anticipated transients without scram: reactivity insertion), WS (water ingress: water-interfacing system break) and HU (hydrogen-side upset: loss of heat sink). The initial scaling analysis defines dimensionless parameters that need to be reflected in mixed convection modeling, and the initial scoping analysis provided the reference system transients used in the PIRT generation. For the PIRT phenomena, we evaluated the modeling capability of the candidate TH/safety tools and derived model improvement needs. By surveying and evaluating the TH/safety database information, a tools V and V matrix has been developed. Through the key scoping analysis using the available database, the modeling

  5. Screening of Gas-Cooled Reactor Thermal-Hydraulic and Safety Analysis Tools and Experimental Database

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Kim, Min Hwan; Lee, Seung Wook (and others)

    2007-08-15

    This report is the final report of the I-NERI Project 'Screening of Gas-cooled Reactor Thermal Hydraulic and Safety Analysis Tools and Experimental Database', jointly carried out by KAERI, ANL and INL. In this study, we developed the basic technologies required to develop and validate the VHTR TH/safety analysis tools and evaluated the TH/safety database information. The research tasks consist of: 1) code qualification methodology (INL), 2) high-level PIRTs for a major nucleus set of events (KAERI, ANL, INL), 3) initial scaling and scoping analysis (ANL, KAERI, INL), 4) filtering of TH/safety tools (KAERI, INL), 5) evaluation of TH/safety database information (KAERI, INL, ANL) and 6) key scoping analysis (KAERI). The code qualification methodology identifies the role of PIRTs in the R and D process and the bottom-up and top-down code validation methods. Since the design of the VHTR is still evolving, we generated the high-level PIRTs referencing the 600 MWth block-type GT-MHR and the 400 MWth pebble-type PBMR. The nucleus set of events that represents the VHTR safety and operational transients consists of the enveloping scenarios of HPCC (high-pressure conduction cooling: loss of primary flow), LPCC/Air-Ingress (low-pressure conduction cooling: loss of coolant), LC (load changes: power maneuvering), ATWS (anticipated transients without scram: reactivity insertion), WS (water ingress: water-interfacing system break) and HU (hydrogen-side upset: loss of heat sink). The initial scaling analysis defines dimensionless parameters that need to be reflected in mixed convection modeling, and the initial scoping analysis provided the reference system transients used in the PIRT generation. For the PIRT phenomena, we evaluated the modeling capability of the candidate TH/safety tools and derived model improvement needs. By surveying and evaluating the TH/safety database information, a tools V and V matrix has been developed. Through the key scoping analysis using the available database, the

  6. A survey of tools for the analysis of quantitative PCR (qPCR) data

    Directory of Open Access Journals (Sweden)

    Stephan Pabinger

    2014-09-01

    Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.

  7. Typing Local Control and State Using Flow Analysis

    Science.gov (United States)

    Guha, Arjun; Saftoiu, Claudiu; Krishnamurthi, Shriram

    Programs written in scripting languages employ idioms that confound conventional type systems. In this paper, we highlight one important set of related idioms: the use of local control and state to reason informally about types. To address these idioms, we formalize run-time tags and their relationship to types, and use these to present a novel strategy to integrate typing with flow analysis in a modular way. We demonstrate that in our separation of typing and flow analysis, each component remains conventional, their composition is simple, but the result can handle these idioms better than either one alone.
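The idiom the paper targets, using local control flow and run-time tag tests to reason informally about types, looks like this in a dynamically typed language (Python here, purely for illustration). A flow-aware type checker can narrow the type of `v` inside the guarded branch, which a conventional type system alone cannot:

```python
def first_upper(v):
    """Scripting idiom: a run-time tag test guards the use of v as a string.
    Inside the branch, flow analysis knows v is a non-empty str."""
    if isinstance(v, str) and len(v) > 0:
        return v[0].upper()   # safe: v narrowed to non-empty str on this path
    return None               # any other tag falls through harmlessly

# Local state whose type depends on the control-flow path taken:
state = None                  # starts as None...
if first_upper("hello") == "H":
    state = "ok"              # ...and is a str only along this branch
```

The paper's contribution is to make exactly this kind of reasoning sound by composing a conventional type system with a conventional flow analysis.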

  8. Analysis tools for the interplay between genome layout and regulation.

    Science.gov (United States)

    Bouyioukos, Costas; Elati, Mohamed; Képès, François

    2016-06-06

    Genome layout and gene regulation appear to be interdependent. Understanding this interdependence is key to exploring the dynamic nature of chromosome conformation and to engineering functional genomes. Evidence for non-random genome layout, defined as the relative positioning of either co-functional or co-regulated genes, stems from two main approaches. Firstly, the analysis of contiguous genome segments across species has highlighted the conservation of gene arrangement (synteny) along chromosomal regions. Secondly, the study of long-range interactions along a chromosome has emphasised regularities in the positioning of microbial genes that are co-regulated, co-expressed or evolutionarily correlated. While one-dimensional pattern analysis is a mature field, it is often powerless on biological datasets, which tend to be incomplete and partly incorrect. Moreover, there is a lack of comprehensive, user-friendly tools to systematically analyse, visualise, integrate and exploit regularities along genomes. Here we present the Genome REgulatory and Architecture Tools SCAN (GREAT:SCAN) software for the systematic study of the interplay between genome layout and gene expression regulation. SCAN is a collection of related and interconnected applications currently able to perform systematic analyses of genome regularities as well as to improve transcription factor binding site (TFBS) and gene regulatory network predictions based on gene positional information. We demonstrate the capabilities of these tools by studying, on the one hand, the regular patterns of genome layout in the major regulons of the bacterium Escherichia coli and, on the other hand, the improvement of TFBS prediction in microbes. Finally, we highlight, through visualisation with multivariate techniques, the interplay between position and sequence information for effective transcription regulation.
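One regularity of the kind GREAT:SCAN analyses is the periodic positioning of co-regulated genes along a chromosome. A toy periodicity score, not the published algorithm, is the circular vector strength of gene positions folded at a candidate period:

```python
import numpy as np

def period_score(positions, period):
    """Vector strength of gene positions folded at a candidate period:
    1.0 for perfectly periodic placement, near 0 for no positional bias."""
    phases = 2 * np.pi * (np.asarray(positions, dtype=float) % period) / period
    return np.hypot(np.cos(phases).mean(), np.sin(phases).mean())

def best_period(positions, periods):
    """Candidate period (e.g. in base pairs) with the highest score."""
    scores = [period_score(positions, p) for p in periods]
    return periods[int(np.argmax(scores))]
```

Real analyses must additionally handle circular chromosomes, missing annotations, and the statistical significance of a score, which is where tools like GREAT:SCAN go far beyond this sketch.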

  9. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Background: Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides; after correction for various intervening variables, loss or gain in the test DNA can be inferred from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results: We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone Java application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the
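The first steps of such an array-CGH analysis, median-centred log2 test/reference ratios followed by breakpoint detection, can be sketched as follows. The single change-point finder below is a deliberately naive stand-in for the Circular Binary Segmentation used by CGHPRO, shown only to illustrate the idea:

```python
import numpy as np

def log2_ratios(test_intensity, ref_intensity):
    """Per-clone log2(test/reference) ratios, median-centred so that the
    bulk of (presumed copy-neutral) clones sits at 0."""
    r = np.log2(np.asarray(test_intensity, dtype=float) /
                np.asarray(ref_intensity, dtype=float))
    return r - np.median(r)

def single_breakpoint(ratios):
    """Naive single change-point: the split index that maximizes the
    difference between the left and right mean ratios (a toy stand-in
    for Circular Binary Segmentation, which handles multiple breakpoints
    and significance testing)."""
    r = np.asarray(ratios, dtype=float)
    best, best_gap = None, -1.0
    for k in range(1, len(r)):
        gap = abs(r[:k].mean() - r[k:].mean())
        if gap > best_gap:
            best, best_gap = k, gap
    return best
```

A production pipeline would add intensity-dependent normalization, quality filtering, and recursive segmentation with a significance criterion.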

  10. A new tool for accelerator system modeling and analysis

    International Nuclear Information System (INIS)

    Gillespie, G.H.; Hill, B.W.; Jameson, R.A.

    1994-01-01

    A novel computer code is being developed to generate system-level designs of radiofrequency ion accelerators. The goal of the Accelerator System Model (ASM) code is to create a modeling and analysis tool that is easy to use, automates many of the initial design calculations, supports trade studies used in assessing alternate designs, and yet is flexible enough to incorporate new technology concepts as they emerge. Hardware engineering parameters and beam dynamics are modeled at comparable levels of fidelity. Existing scaling models of accelerator subsystems were used to produce a prototype of ASM (version 1.0) working within the Shell for Particle Accelerator Related Codes (SPARC) graphical user interface. A small user group has been testing and evaluating the prototype for about a year. Several enhancements and improvements are now being developed. The current version (1.1) of ASM is briefly described and an example of the modeling and analysis capabilities is illustrated

  11. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed
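GRESS and EXAP instrument FORTRAN source to compute derivatives automatically; the quantity they deliver, the sensitivity of a model output to each input parameter, can be illustrated with a plain central-difference sketch. The toy "release" model and its parameter values below are invented for illustration.

```python
from math import exp

def sensitivities(model, params, rel_step=1e-6):
    """Central-difference estimates of d(output)/d(param) for a scalar model."""
    sens = {}
    for name, value in params.items():
        h = abs(value) * rel_step or rel_step
        up, lo = dict(params), dict(params)
        up[name] = value + h
        lo[name] = value - h
        sens[name] = (model(up) - model(lo)) / (2 * h)
    return sens

# toy performance-assessment model: release scales with inventory, decays in time
def release(p):
    return p["inventory"] * exp(-p["decay"] * p["time"])

s = sensitivities(release, {"inventory": 100.0, "decay": 0.01, "time": 50.0})
```

Automated differentiation gives these derivatives exactly and at far lower cost than rerunning the model per parameter, which is precisely the drawback of conventional approaches the record describes.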

  12. Message Correlation Analysis Tool for NOvA

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  13. Message Correlation Analysis Tool for NOvA

    International Nuclear Information System (INIS)

    Lu Qiming; Biery, Kurt A; Kowalkowski, James B

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  14. Net energy analysis - powerful tool for selecting electric power options

    Energy Technology Data Exchange (ETDEWEB)

    Baron, S. [Brookhaven National Laboratory, Upton, NY (United States)

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners and the net energy analysis technique is an excellent accounting system on the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool to the energy planners considering their electric power options in the future.

  15. Message correlation analysis tool for NOvA

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Qiming [Fermilab; Biery, Kurt A. [Fermilab; Kowalkowski, James B. [Fermilab

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
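The filtering, pattern-recognition and triggering-rule pipeline described in these records can be sketched as a small rule engine. The rule name, regex and thresholds below are invented; this is an illustration of the concept, not the Message Analyzer's actual DSL or implementation.

```python
import re
from collections import deque

class Rule:
    """Trigger when `pattern` matches `count` messages within `window` seconds."""
    def __init__(self, name, pattern, count, window):
        self.name = name
        self.pattern = re.compile(pattern)
        self.count, self.window = count, window
        self.hits = deque()

    def feed(self, message, now):
        if not self.pattern.search(message):
            return False
        self.hits.append(now)
        while self.hits and now - self.hits[0] > self.window:
            self.hits.popleft()  # forget matches outside the time window
        return len(self.hits) >= self.count

class MessageAnalyzer:
    def __init__(self, rules):
        self.rules = rules

    def process(self, message, now):
        """Return the names of all rules triggered by this log message."""
        return [r.name for r in self.rules if r.feed(message, now)]

# hypothetical rule: three buffer errors within ten seconds indicate a failure
analyzer = MessageAnalyzer(
    [Rule("buffer-error-storm", r"buffer (overflow|underrun)", count=3, window=10.0)]
)
```

A production engine would add message parsing, cross-component correlation and the plugin hooks the paper mentions, but the core loop, match then evaluate triggering rules, has this shape.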

  16. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    Science.gov (United States)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and the data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design stage of the product development process.

  17. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    International Nuclear Information System (INIS)

    Shamir, Lior

    2011-01-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ∼10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  18. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    Science.gov (United States)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
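The first steps Ganalyzer performs, locating the galaxy centre and building an intensity profile as a function of radial distance, can be sketched as below. This is a simplified stand-in, not Ganalyzer's code; the tiny 3x3 "image" is invented.

```python
from math import hypot

def radial_profile(image, cx, cy, nbins):
    """Mean pixel intensity per radial-distance bin, measured from (cx, cy)."""
    height, width = len(image), len(image[0])
    rmax = max(hypot(x - cx, y - cy) for y in range(height) for x in range(width))
    sums, counts = [0.0] * nbins, [0] * nbins
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            b = min(int(hypot(x - cx, y - cy) / rmax * nbins), nbins - 1)
            sums[b] += value
            counts[b] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

# toy image: a bright central pixel on a dark background
profile = radial_profile([[0, 0, 0], [0, 9, 0], [0, 0, 0]], cx=1, cy=1, nbins=2)
```

Ganalyzer's subsequent step, measuring the slopes of peaks in the radial intensity plot to quantify spirality, operates on profiles of this kind taken as a function of angle at successive radii.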

  19. The Climate Data Analysis Tools (CDAT): Scientific Discovery Made Easy

    Science.gov (United States)

    Doutriaux, C. M.; Williams, D. N.; Drach, R. S.; McCoy, R. B.; Mlaker, V.

    2008-12-01

    In recent years, the amount of data available to climate scientists has grown exponentially. Whether we're looking at the increasing number of organizations providing data, the finer resolutions of climate models, or the escalating number of experiments and realizations for those experiments, every aspect of climate research leads to an unprecedented growth of the volume of data to analyze. The recent success and visibility of the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) is boosting the demand to unprecedented levels and keeping the numbers increasing. Meanwhile, the technology available for scientists to analyze the data has remained largely unchanged since the early days. One tool, however, has proven itself flexible enough not only to follow the trend of escalating demand, but also to be ahead of the game: the Climate Data Analysis Tools (CDAT) from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). While providing the cutting-edge technology necessary to distribute the IPCC AR4 data via the Earth System Grid, PCMDI has continuously evolved CDAT to handle new grids and higher resolutions, and to provide new diagnostics. In the near future, in time for AR5, PCMDI will use CDAT for state-of-the-art remote data analysis in a grid computing environment.

  20. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  1. XQCAT eXtra Quark Combined Analysis Tool

    CERN Document Server

    Barducci, D; Buchkremer, M; Marrouche, J; Moretti, S; Panizzi, L

    2015-01-01

    XQCAT (eXtra Quark Combined Analysis Tool) is a tool designed to determine exclusion Confidence Levels (eCLs) for scenarios of new physics characterised by the presence of one or multiple heavy extra quarks (XQ) which interact through Yukawa couplings with any of the Standard Model (SM) quarks. The code uses a database of efficiencies for pre-simulated processes of Quantum Chromo-Dynamics (QCD) pair production and on-shell decays of extra quarks. In version 1.0 of XQCAT the efficiencies have been computed for a set of seven publicly available search results by the CMS experiment, and the package will be updated to include further searches by both the ATLAS and CMS collaborations. The input for the code is a text file in which the masses, branching ratios (BRs) and dominant chirality of the couplings of the new quarks are provided. The output of the code is the eCL of the test point for each implemented experimental analysis, considered individually and, when possible, in statistical combination.

  2. Simple Strategic Analysis Tools at SMEs in Ecuador

    Directory of Open Access Journals (Sweden)

    Diego H. Álvarez Peralta

    2015-06-01

    Full Text Available This article explores the possible applications of Strategic Analysis Tools (SAT) in SMEs located in emerging countries such as Ecuador (where there are no formal studies on the subject). It is intended to analyze whether it is feasible to effectively apply a set of proposed tools to guide the mental-map decisions of executives when decisions on strategy have to be made. Through an in-depth review of the state of the art regarding SAT, and interviews with main participants such as chambers and executives of different firms, the feasibility of their application is shown. This analysis is complemented with specialists' interviews to deepen our insights and obtain valid conclusions. Our conclusion is that SMEs can smoothly develop and apply an appropriate set of SAT when opting for very relevant choices. However, there are some inconveniences to be solved, connected with resources (such as people's abilities and technology), behavioral (cultural) factors and methodological processes. Once these barriers are knocked down, it would be more likely that current approaches could be enriched to make strategic decisions even more effective. This is a qualitative investigation and the research design is not experimental (among other characteristics, it is transversal, as it relates to a specific moment in time).

  3. Basic models in transitory analysis in biphasic flows

    International Nuclear Information System (INIS)

    Gonzalez S, J.M.

    1992-02-01

    The most studied two-phase flow, and possibly the most complex, is that formed by gas-liquid mixtures. These flows are frequently found inside systems and equipment related to the chemical and petroleum industries and to electric energy generation; within the latter, in particular the nuclear and geothermal areas, they have motivated the most detailed and complete analyses of two-phase flow behavior. The present report analyzes, within the nuclear reactor area, the emergence of some abnormal operating situations related exclusively to two-phase gas-liquid flow. (Author)

  4. Tools for Genomic and Transcriptomic Analysis of Microbes at Single-Cell Level

    Directory of Open Access Journals (Sweden)

    Zixi Chen

    2017-09-01

    Full Text Available Microbiologists traditionally study populations rather than individual cells, as it is generally assumed that the status of individual cells will be similar to that observed in the population. However, recent studies have shown that the individual behavior of each single cell can be quite different from that of the whole population, suggesting the importance of extending traditional microbiology studies to the single-cell level. With recent technological advances, such as flow cytometry, next-generation sequencing (NGS), and microspectroscopy, single-cell microbiology has greatly enhanced the understanding of the individuality and heterogeneity of microbes in many biological systems. Notably, the application of multiple ‘omics’ approaches in single-cell analysis has shed light on how individual cells perceive, respond, and adapt to the environment, how heterogeneity arises under external stress and finally determines the fate of the whole population, and how microbes survive under natural conditions. As single-cell analysis involves no axenic cultivation of the target microorganism, it has also been demonstrated to be a valuable tool for dissecting the microbial ‘dark matter.’ In this review, current state-of-the-art tools and methods for genomic and transcriptomic analysis of microbes at the single-cell level were critically summarized, including single-cell isolation methods and experimental strategies of single-cell analysis with NGS. In addition, perspectives on the future trends of technology development in the field of single-cell analysis were also presented.

  5. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    Science.gov (United States)

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula was modified, and the breast symmetry index (BSI) of 129 women after breast surgery was calculated by the first author with this new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers other than the first author to assess inter-rater reliability. In a second phase, the new BAT® formula was validated and correlated with subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992, with almost no difference between observers. All subjective scores of the 50 patients correlated with the modified BSI score with a high Pearson correlation coefficient of 0.909 (p …). The BAT® software improves the correlation between subjective and objective BSI values, and may be a new standard for trials evaluating breast symmetry.

  6. Process Measurement Deviation Analysis for Flow Rate due to Miscalibration

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Eunsuk; Kim, Byung Rae; Jeong, Seog Hwan; Choi, Ji Hye; Shin, Yong Chul; Yun, Jae Hee [KEPCO Engineering and Construction Co., Deajeon (Korea, Republic of)

    2016-10-15

    An analysis was initiated to identify the root cause, and the exemption of the high static line pressure correction for differential pressure (DP) transmitters was found to be one of the major deviation factors. The miscalibrated DP transmitter range was identified as another major deviation factor. This paper presents considerations to be incorporated in the calibration of process flow measurement instrumentation. The analysis identified that the DP flow transmitter electrical output decreased by 3%; thereafter, the flow rate indication decreased by 1.9% as a result of the exemption of the high static line pressure correction and the measurement range miscalibration. After re-calibration, the flow rate indication increased by 1.9%, which is consistent with the analysis result. This paper presents the brief calibration procedure for the Rosemount DP flow transmitter and analyzes three possible cases of measurement deviation, including the error and its cause. Generally, a DP transmitter is required to be calibrated with the precise process input range according to the calibration procedure provided for the specific DP transmitter. Especially for a DP transmitter installed in a high static line pressure application, it is important to correct for the high static line pressure effect to avoid the inherent systematic error of the Rosemount DP transmitter. Otherwise, failure to apply the correction may lead to the indication deviating from the actual value.
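For an orifice-type flow element, flow is proportional to the square root of the differential pressure, so an error in the transmitter's electrical output does not translate one-to-one into an indicated-flow error. A hedged sketch of that relationship follows; the 4-20 mA span, the 3% bias and the 100-unit full-scale flow are illustrative, not the plant's actual values.

```python
from math import sqrt

def indicated_flow(output_mA, flow_max, out_min=4.0, out_max=20.0):
    """Indicated flow from a 4-20 mA DP transmitter with square-root extraction."""
    dp_fraction = (output_mA - out_min) / (out_max - out_min)
    return flow_max * sqrt(max(dp_fraction, 0.0))

# illustrative: electrical output reads 3% of span low, near full scale
true_output = 20.0
biased_output = true_output - 0.03 * 16.0  # 3% of the 16 mA span
flow_true = indicated_flow(true_output, flow_max=100.0)
flow_biased = indicated_flow(biased_output, flow_max=100.0)
deviation_pct = 100.0 * (flow_true - flow_biased) / flow_true
```

Near full scale the square-root extraction roughly halves the percentage error (about 1.5% here); the 1.9% figure reported in the record depends on the actual operating point and range settings.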

  7. Methods and Models for Capacity and Patient Flow Analysis in Hospital Sector

    DEFF Research Database (Denmark)

    Kozlowski, Dawid

    This thesis is concerned with novel applications of operations research methods for capacity and patient flow analysis within the hospital sector. The first part of the thesis presents a detailed Discrete-Event Simulation (DES) model that has been developed as an analytical tool designed to facilitate... at a private hospital in Denmark or at a hospital abroad if the public healthcare system is unable to provide treatment within the stated maximum waiting time guarantee. A queue modelling approach is used to analyse the potential negative consequences of the policy on the utilization of public hospital... by an improved patient flow. The specially developed structure of the model facilitates its reuse at different units, with no advanced modelling skills required in day-to-day use. This feature amplifies the usefulness of DES in conducting comprehensive patient flow analyses at any department with emergency...
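The queue-modelling step mentioned above can be illustrated with the classic Erlang C formula for an M/M/c queue, a deliberately simple stand-in for the thesis's models; the staffing and load numbers are invented.

```python
from math import factorial

def erlang_c(servers, offered_load):
    """P(an arriving patient must wait) in an M/M/c queue (Erlang C formula).
    offered_load = arrival rate / service rate; requires offered_load < servers."""
    a, c = offered_load, servers
    rho = a / c
    idle_terms = sum(a**k / factorial(k) for k in range(c))
    wait_term = a**c / (factorial(c) * (1 - rho))
    return wait_term / (idle_terms + wait_term)

# illustrative: 3 treatment teams facing 2 teams' worth of offered load
p_wait = erlang_c(servers=3, offered_load=2.0)
```

Analytical queueing formulas like this give quick capacity estimates; DES models such as the one in the thesis relax the Markovian assumptions and capture unit-specific detail.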

  8. A study on heat-flow analysis of friction stir welding on a rotation affected zone

    International Nuclear Information System (INIS)

    Kang, Sung Wook; Jang, Beom Seon; Kim, Jae Woong

    2014-01-01

    In recent years, as interest in environmental protection and energy conservation has risen, technological developments for the weight reduction of transport equipment, such as aircraft, railcars, automobiles and vessels, have been proceeding briskly. This has led to an expansion of the application of lightweight alloys such as aluminum and magnesium. For the welding of these lightweight alloys, friction stir welding has been under development by many researchers, and heat-flow analysis of friction stir welding is one such area of research. The flow and energy equations are solved using the commercial computational fluid dynamics program Fluent. In this study, a rotation affected zone concept is imposed. The rotation affected zone is a constant volume; in this volume, the flow rotates at the same speed as the tool, and so plastic dissipation occurs. Through this simulation, the temperature distribution is calculated and the simulation results are compared with the experimental results.

  9. ELECTRA © Launch and Re-Entry Safety Analysis Tool

    Science.gov (United States)

    Lazare, B.; Arnal, M. H.; Aussilhou, C.; Blazquez, A.; Chemama, F.

    2010-09-01

    The French Space Operation Act gives National Technical Regulations the prime objective of protecting people, property, public health and the environment. In this framework, an independent technical assessment of French space operations is delegated to CNES. To perform this task, and also for its own operations, CNES needs efficient state-of-the-art tools for evaluating risks. The development of the ELECTRA© tool, undertaken in 2007, meets the requirement for precise quantification of the risks involved in the launch and re-entry of spacecraft. The ELECTRA© project draws on the proven expertise of CNES technical centers in the fields of flight analysis and safety, spaceflight dynamics and the design of spacecraft. The ELECTRA© tool was specifically designed to evaluate the risks involved in the re-entry and return to Earth of all or part of a spacecraft. It will also be used for locating and visualizing nominal or accidental re-entry zones while comparing them with suitable geographic data such as population density, urban areas, and shipping lanes, among others. The method chosen for ELECTRA© consists of two main steps: calculating the possible re-entry trajectories for each fragment after the spacecraft breaks up, and calculating the risks while taking into account the energy of the fragments, the population density and the protection afforded by buildings. For launch operations and active re-entry, the risk calculation will be weighted by the probability of instantaneous failure of the spacecraft and integrated over the whole trajectory. ELECTRA©'s development is today at the end of the validation phase, the last step before delivery to users. The validation process has been performed in different ways: numerical verification of the risk formulation; benchmarking of the casualty area, fragment energy levels and the level of protection of housing modules; best practices in the space transportation industry concerning dependability evaluation; and a benchmarking process for
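The risk-integration step, combining fragment impact probabilities, casualty areas and population data, is commonly expressed as a casualty expectation. The sketch below is a generic version of that calculation, not ELECTRA's actual formulation, and all numbers are invented.

```python
def casualty_expectation(fragments, population_density, sheltering_factor=1.0):
    """Expected casualties: sum over fragments of
    P(impact in zone) * casualty area (m^2) * population density (people/m^2),
    scaled by a sheltering factor in [0, 1] for protection afforded by buildings."""
    return sheltering_factor * sum(
        p_impact * casualty_area * population_density
        for p_impact, casualty_area in fragments
    )

# illustrative fragments: (impact probability over the zone, casualty area in m^2)
risk = casualty_expectation(
    fragments=[(1e-4, 8.0), (5e-5, 20.0)],
    population_density=2e-3,  # 2000 people per km^2
    sheltering_factor=0.5,
)
```

In a launch-safety context the per-fragment probabilities would themselves be integrated over the trajectory and weighted by the instantaneous failure probability, as the record describes.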

  10. Flow Rates in Liquid Chromatography, Gas Chromatography and Supercritical Fluid Chromatography: A Tool for Optimization

    Directory of Open Access Journals (Sweden)

    Joris Meurs

    2016-08-01

    Full Text Available This paper aimed to develop a standalone application for optimizing flow rates in liquid chromatography (LC), gas chromatography (GC) and supercritical fluid chromatography (SFC). To do so, the Van Deemter equation, the Knox equation and the Golay equation were implemented in a MATLAB script, and subsequently a graphical user interface (GUI) was created. The application shows the optimal flow rate or linear velocity and the corresponding plate height for the chosen input parameters. Furthermore, a plot is shown in which the plate height is plotted against the linear flow velocity. Hence, this application gives optimized flow rates for any set of conditions with minimal effort.
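The Van Deemter optimization the application performs has a closed form: H(u) = A + B/u + C*u is minimized at u_opt = sqrt(B/C), giving H_min = A + 2*sqrt(B*C). A sketch with invented coefficients (the record does not give any):

```python
from math import sqrt

def van_deemter(u, A, B, C):
    """Plate height H(u) = A + B/u + C*u at linear velocity u."""
    return A + B / u + C * u

def optimal_velocity(B, C):
    """Setting dH/du = -B/u**2 + C = 0 gives u_opt = sqrt(B/C)."""
    return sqrt(B / C)

# illustrative coefficients for a packed LC column
A, B, C = 1.0, 2.0, 0.05
u_opt = optimal_velocity(B, C)
h_min = van_deemter(u_opt, A, B, C)
```

The Knox and Golay equations handled by the application have analogous minima, found the same way by differentiating the plate-height expression with respect to velocity.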

  11. Slip analysis of squeezing flow using doubly stratified fluid

    Science.gov (United States)

    Ahmad, S.; Farooq, M.; Javed, M.; Anjum, Aisha

    2018-06-01

    The non-isothermal flow of a squeezed fluid is modeled and explored. The influence of velocity, thermal and solutal slip effects on the transport features of the squeezed fluid is analyzed in a Darcy porous channel, where the fluid moves due to the squeezing of the upper plate towards the stretchable lower plate. Dual stratification effects are included in the transport equations. A similarity analysis is performed and the reduced governing flow equations are solved using an efficient convergent approach, i.e., the homotopy technique. The significant effects of the emerging physical parameters on flow velocity, temperature and fluid concentration are reported through various plots. Graphical results for the drag force, Nusselt number and Sherwood number are presented and examined. The results reveal that the velocity field is minimum near the plate and increases far away from the plate for a strong velocity slip parameter. Furthermore, temperature and fluid concentration decrease significantly with increased slip effects. The current analysis is applicable to some advanced technological processes and industrial fluid mechanics.

  12. The flow analysis of supercavitating cascade by linear theory

    Energy Technology Data Exchange (ETDEWEB)

    Park, E.T. [Sung Kyun Kwan Univ., Seoul (Korea, Republic of); Hwang, Y. [Seoul National Univ., Seoul (Korea, Republic of)

    1996-06-01

    In order to reduce damage due to cavitation and to improve the performance of fluid machinery, supercavitation around the cascade and the hydraulic characteristics of a supercavitating cascade must be analyzed accurately. The study of the effects of cavitation on fluid machinery, and the analysis of the performance of supercavitating hydrofoils through the various elements governing the flow field, are critically important. In this study, the computed results of linear theory using the singularity method were compared with experimental results. Specifically, singularities such as sources and vortices were distributed on the hydrofoil and free streamline to analyze the two-dimensional flow field of a supercavitating cascade; the governing equations of the flow field were derived, and the hydraulic characteristics of the cascade were calculated by numerical analysis of the governing equations. 7 refs., 6 figs.
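The singularity method mentioned above superposes elementary solutions; in two dimensions the velocity induced at a point follows from the complex potential, with dw/dz = u - i*v. A minimal sketch of that superposition (unit strengths and positions are invented, and this is the textbook building block, not the paper's full cascade solution):

```python
from math import pi

def induced_velocity(z, singularities):
    """Velocity u + i*v at complex point z induced by 2-D point singularities.
    Each singularity is (z0, strength, kind): a 'source' of strength m has
    dw/dz = m / (2*pi*(z - z0)); a 'vortex' of circulation gamma has
    dw/dz = -1j*gamma / (2*pi*(z - z0)). Since dw/dz = u - i*v, the velocity
    vector is the conjugate of the summed dw/dz."""
    dwdz = 0j
    for z0, strength, kind in singularities:
        if kind == "source":
            dwdz += strength / (2 * pi * (z - z0))
        else:  # vortex
            dwdz += -1j * strength / (2 * pi * (z - z0))
    return dwdz.conjugate()

# a unit source at the origin induces a purely radial velocity at z = 1
v_source = induced_velocity(1 + 0j, [(0j, 1.0, "source")])
# a unit counter-clockwise vortex at the origin induces an upward velocity there
v_vortex = induced_velocity(1 + 0j, [(0j, 1.0, "vortex")])
```

A cascade solution distributes many such singularities along the hydrofoil and free streamline and solves for their strengths from the boundary conditions.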

  13. Construction of estimated flow- and load-duration curves for Kentucky using the Water Availability Tool for Environmental Resources (WATER)

    Science.gov (United States)

    Unthank, Michael D.; Newson, Jeremy K.; Williamson, Tanja N.; Nelson, Hugh L.

    2012-01-01

    Flow- and load-duration curves were constructed from the model outputs of the U.S. Geological Survey's Water Availability Tool for Environmental Resources (WATER) application for streams in Kentucky. The WATER application was designed to access multiple geospatial datasets to generate more than 60 years of statistically based streamflow data for Kentucky. The WATER application enables a user to graphically select a site on a stream and generate an estimated hydrograph and flow-duration curve for the watershed upstream of that point. The flow-duration curves are constructed by calculating the exceedance probability of the modeled daily streamflows. User-defined water-quality criteria and (or) sampling results can be loaded into the WATER application to construct load-duration curves that are based on the modeled streamflow results. Estimates of flow and streamflow statistics were derived from TOPographically Based Hydrological MODEL (TOPMODEL) simulations in the WATER application. A modified TOPMODEL code, SDP-TOPMODEL (Sinkhole Drainage Process-TOPMODEL), was used to simulate daily mean discharges over the period of record for 5 karst and 5 non-karst watersheds in Kentucky in order to verify the calibrated model. A statistical evaluation of the model's verification simulations shows that the calibration criteria established by previous WATER application reports were met, thus ensuring the model's ability to provide acceptably accurate estimates of discharge at gaged and ungaged sites throughout Kentucky. Flow-duration curves are constructed in the WATER application by calculating the exceedance probability of the modeled daily flow values. The flow-duration intervals are expressed as percentages, with zero corresponding to the highest stream discharge in the streamflow record. Load-duration curves are constructed by applying the loading equation (Load = Flow * Water-quality criterion) at each flow interval.
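The two constructions described in this record, exceedance probabilities of the daily flows and Load = Flow * criterion at each interval, can be sketched directly. The five daily flows and the criterion are invented, and the plotting-position convention used here (rank over n-1, so the extremes land at 0% and 100%) is one of several in use.

```python
def flow_duration_curve(daily_flows):
    """(exceedance probability in %, flow) pairs from modeled daily streamflows;
    zero corresponds to the highest discharge in the record."""
    flows = sorted(daily_flows, reverse=True)
    n = len(flows)
    return [(100.0 * i / (n - 1), q) for i, q in enumerate(flows)]

def load_duration_curve(fdc, criterion):
    """Allowable load at each flow interval: Load = Flow * water-quality criterion."""
    return [(p, q * criterion) for p, q in fdc]

fdc = flow_duration_curve([12.0, 3.0, 7.0, 1.0, 20.0])
ldc = load_duration_curve(fdc, criterion=0.5)
```

Plotting sampled loads against the load-duration curve then shows at which flow regimes water-quality exceedances occur.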

  14. Chimera Grid Tools

    Science.gov (United States)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  15. The Test for Flow Characteristics of Tubular Fuel Assembly(II) - Experimental results and CFD analysis

    International Nuclear Information System (INIS)

    Park, Jong Hark; Chae, H. T.; Park, C.; Kim, H.

    2006-12-01

    A test facility had been established for experiments on the velocity distribution and pressure drop in a tubular fuel. A basic test had been conducted to examine the performance of the test loop and to verify the accuracy of pitot-tube measurements. In this report, test results and a CFD analysis of the hydraulic characteristics of a tubular fuel, following the previous tests, are described. Coolant velocities in all channels were measured using a pitot tube, and the effect of flow rate changes on the velocity distribution was also examined. The pressure drop through the tubular fuel was measured at various flow rates in the range of 1 kg/s to 21 kg/s to obtain a correlation of pressure drop with flow rate. In addition, a CFD (Computational Fluid Dynamics) analysis was performed to determine the hydraulic characteristics of the tubular fuel, such as the velocity distribution and pressure drop. Because CFD analysis gives detailed insight into the coolant flow in the tubular fuel, the CFD method is a very useful tool for understanding the flow structure and the phenomena induced by fluid flow. CFX-10, a commercial CFD code, was used in this study. The results of the experiment and the CFD analysis were compared with each other. The overall trend of the velocity distribution predicted by the CFD analysis was somewhat different from that of the experiment, but the agreement is reasonable considering the measurement uncertainties. The CFD prediction of the pressure drop of the tubular fuel agrees tolerably well with the experiment, to within an 8% difference.
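
    A pressure-drop correlation of the kind the report derives can be illustrated by fitting dP = a * mdot^b by least squares in log-log space. The data points below are synthetic stand-ins for the report's measurements, chosen to be roughly quadratic in flow rate:

```python
import numpy as np

# Hypothetical (mass flow rate, pressure drop) pairs over the tested 1-21 kg/s range.
m_dot = np.array([1.0, 5.0, 10.0, 15.0, 21.0])   # kg/s
dp    = np.array([0.4, 9.8, 39.1, 88.2, 172.0])  # kPa (synthetic, ~quadratic)

# Fit dp = a * m_dot**b: taking logs turns this into a straight line,
# log(dp) = b*log(m_dot) + log(a), fitted with a degree-1 polynomial.
b, log_a = np.polyfit(np.log(m_dot), np.log(dp), 1)
a = np.exp(log_a)
```

    For fully turbulent flow, an exponent `b` near 2 is expected, which is what this synthetic data yields.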

  16. Flow cytometric analysis of microbial contamination in food industry technological lines--initial study.

    Science.gov (United States)

    Józwa, Wojciech; Czaczyk, Katarzyna

    2012-04-02

    Flow cytometry constitutes an alternative to traditional methods of microorganism identification and analysis, including methods requiring a cultivation step. It enables the detection of pathogens and other microbial contaminants without the need to culture microbial cells, meaning that the sample (water, waste, or food, e.g. milk, wine, beer) may be analysed directly. This leads to a significant reduction of the time required for analysis, allowing monitoring of production processes and immediate reaction if contamination or any disruption occurs. Apart from the analysis of raw materials or products at different stages of the manufacturing process, flow cytometry seems to constitute an ideal tool for the assessment of microbial contamination on the surfaces of technological lines. In the present work, samples comprising smears from 3 different surfaces of technological lines from a fruit and vegetable processing company in Greater Poland were analysed directly with a flow cytometer. The measured parameters were forward and side scatter of laser light signals, allowing the estimation of the microbial cell content of each sample. Flow cytometric analysis of the surfaces of food industry production lines enables the preliminary evaluation of microbial contamination within a few minutes of sample arrival, without the need for sample pretreatment. The presented method of flow cytometric initial evaluation of the microbial state of food industry technological lines demonstrated its potential for developing a robust, routine method for the rapid and labor-saving detection of microbial contamination in the food industry.

  17. Electric capacitance tomography and two-phase flow for the nuclear reactor safety analysis

    International Nuclear Information System (INIS)

    Lee, Jae Young

    2008-01-01

    Recently, electric capacitance tomography (ECT) has been developed for use in the analysis of two-phase flow. Although its electric field is not as focused as in hard-ray tomography such as X-ray or gamma-ray tomography, its easy access to the system and easy maintenance (no radiation shielding is required) make it attractive for two-phase flow studies, an important area in nuclear safety analysis. In the present paper, the practical technologies of electric capacitance tomography are presented, covering both hardware and software. In the software part, the forward and inverse problems are discussed, along with the method of regularization. In the hardware part, the electronic circuits, which provide femtofarad resolution at a reasonable speed (150 frames/s for 16 electrodes), are briefly discussed. Some representative ideal cases are studied to demonstrate the method's potential capability for two-phase flow analysis. Some variations of the tomography, such as axial tomography and three-dimensional tomography, are also discussed. It was found that the present ECT is expected to become a useful tool for understanding complicated three-dimensional two-phase flow, which may be an important feature for the safety analysis codes. (author)
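
    The regularized inverse problem mentioned in the software part can be sketched as a linearized Tikhonov inversion: measured capacitances c are modeled as J s, where J is a sensitivity matrix and s the pixel-wise permittivity. Everything below (the random sensitivity matrix, pixel count, noise level, regularization weight) is a toy assumption, not the paper's system:

```python
import numpy as np

# Toy linearized ECT problem: 16 electrodes give 16*15/2 = 120 independent
# electrode pairs; reconstruct a 64-pixel permittivity distribution.
rng = np.random.default_rng(0)
n_meas, n_pix = 120, 64
J = rng.normal(size=(n_meas, n_pix))       # assumed sensitivity matrix
s_true = np.zeros(n_pix)
s_true[20:28] = 1.0                        # a "bubble" of contrasting permittivity
c = J @ s_true + 0.01 * rng.normal(size=n_meas)   # noisy capacitance data

# Tikhonov regularization: minimize ||J s - c||^2 + lam * ||s||^2,
# whose closed-form solution is (J^T J + lam I)^-1 J^T c.
lam = 1.0
s_hat = np.linalg.solve(J.T @ J + lam * np.eye(n_pix), J.T @ c)
```

    The regularization weight `lam` trades noise amplification against smoothing; real ECT inversions typically also use a physics-based `J` and spatial smoothness penalties rather than the identity.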

  18. Mathematical annuity models application in cash flow analysis ...

    African Journals Online (AJOL)

    Mathematical annuity models application in cash flow analysis. ... We also compare the cost efficiency between Amortisation and Sinking fund loan repayment as prevalent in financial institutions. Keywords: Annuity, Amortisation, Sinking Fund, Present and Future Value Annuity, Maturity date and Redemption value.
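
    The amortisation-versus-sinking-fund comparison the abstract mentions can be illustrated with the standard annuity formulas. The principal, loan rate, and sinking-fund deposit rate below are invented for illustration:

```python
# Compare annual outlay: amortised loan vs interest-only loan plus sinking fund.
P, i, n = 1_000_000.0, 0.10, 10   # principal, annual loan rate, years (assumed)
j = 0.07                          # sinking-fund deposit rate (assumed, below i)

# Amortisation: level payment A with present value P:  P = A * (1-(1+i)^-n)/i
A = P * i / (1 - (1 + i) ** -n)

# Sinking fund: pay interest P*i each year and deposit D so the fund
# accumulates to P at maturity:  P = D * ((1+j)^n - 1)/j
D = P * j / ((1 + j) ** n - 1)
annual_sinking = P * i + D
```

    When the sinking-fund rate `j` is below the loan rate `i`, the amortised loan requires the smaller annual outlay; the two coincide when `j == i`.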

  19. Automated injection of slurry samples in flow-injection analysis

    NARCIS (Netherlands)

    Hulsman, M.H.F.M.; Hulsman, M.; Bos, M.; van der Linden, W.E.

    1996-01-01

    Two types of injectors are described for introducing solid samples as slurries in flow analysis systems. A time-based and a volume-based injector based on multitube solenoid pinch valves were built; both can be characterized as hydrodynamic injectors. Reproducibility of the injections of dispersed

  20. Discretizations in isogeometric analysis of Navier-Stokes flow

    DEFF Research Database (Denmark)

    Nielsen, Peter Nørtoft; Gersborg, Allan Roulund; Gravesen, Jens

    2011-01-01

    This paper deals with isogeometric analysis of 2-dimensional, steady state, incompressible Navier-Stokes flow subjected to Dirichlet boundary conditions. We present a detailed description of the numerical method used to solve the boundary value problem. Numerical inf-sup stability tests...

  1. Flow Injection Analysis: A Revolution in Modern Analytical Chemistry

    DEFF Research Database (Denmark)

    Hansen, Elo Harald

    1996-01-01

    A review is made of the fundamentals of Flow Injection Analysis (FIA), and the versatility and applicability of this analytical concept is demonstrated by a series of examples, comprising the use of different types of FIA-manifolds and various detection devices (optical and electrochemical...

  2. FACTORIAL CORRESPONDENCES ANALYSIS – A TOOL IN TOURISM MOTIVATION RESEARCH

    Directory of Open Access Journals (Sweden)

    Ion Danut I. JUGANARU

    2016-05-01

    Full Text Available This study aims at analyzing the distribution of tourist flows in 2014 from 25 European countries across three main categories of trip purpose, and assumes that there are differences or similarities between the tourists' countries of residence and their trip purposes. "Purpose" is a multidimensional concept used in marketing research, most often for understanding consumer behavior and for identifying market segments or customer target groups defined by similar characteristics. Since the choice/purchase decision is based on purposes, knowledge of them proves useful in designing strategies to increase the level of satisfaction provided to the customer. The statistical method used in this paper is factorial correspondences analysis. In our opinion, the identification by this method of differences or similarities between the tourists' countries of residence and their trip purposes can represent a useful step in studying the tourism market and in choosing or reformulating strategies.
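
    The factorial correspondences analysis itself can be sketched via the singular value decomposition of the standardized residuals of a contingency table. The table below (countries by trip purposes) is invented for illustration, not the 2014 data:

```python
import numpy as np

# Toy contingency table: rows = countries of residence, cols = trip purposes
# (personal, business, other). Counts are illustrative only.
N = np.array([[520, 130,  50],
              [300, 260,  40],
              [410, 150, 140],
              [220, 310,  70]], dtype=float)

P = N / N.sum()                        # correspondence matrix
r = P.sum(axis=1)                      # row masses
c = P.sum(axis=0)                      # column masses

# Standardized residuals S = D_r^{-1/2} (P - r c^T) D_c^{-1/2}
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

inertia = sv**2                        # principal inertias per axis
row_coords = (U * sv) / np.sqrt(r)[:, None]   # principal row coordinates
```

    Plotting the first two columns of `row_coords` (and the analogous column coordinates) gives the usual correspondence map on which similar countries and purposes appear close together.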

  3. Kinetic analysis of thermally relativistic flow with dissipation

    International Nuclear Information System (INIS)

    Yano, Ryosuke; Suzuki, Kojiro

    2011-01-01

    Nonequilibrium flow of thermally relativistic matter with dissipation is considered in the framework of relativistic kinetic theory. As the object of the analysis, the supersonic rarefied flow of thermally relativistic matter around a triangular prism is analyzed using the Anderson-Witting model. The obtained numerical results indicate that the flow field changes in accordance with the flow velocity and temperature of the uniform flow, owing to both Lorentz-contraction effects and thermally relativistic effects, even when the Mach number of the uniform flow is fixed. The profiles of the heat flux along the stagnation streamline can be approximated on the basis of the relativistic Navier-Stokes-Fourier (NSF) law, except in a strongly nonequilibrium regime such as the middle of the shock wave and the vicinity of the wall, whereas the profile of the heat flux behind the triangular prism cannot be approximated on the basis of the relativistic NSF law, owing to rarefied effects via the expansion behind the prism. Additionally, the heat flux via the gradient of the static pressure is non-negligible owing to thermally relativistic effects. The profile of the dynamic pressure is different from that approximated on the basis of the NSF law, which is obtained by the Eckart decomposition. Finally, variations in the convection of mass and momentum owing to Lorentz-contraction and thermally relativistic effects are numerically confirmed.

  4. Analysis of flow induced vibration in heat exchangers

    Energy Technology Data Exchange (ETDEWEB)

    Beek, A.W. van [Institute for Mechanical Constructions TNO, Delft (Netherlands)

    1977-12-01

    A description will be given of three different types of heat exchangers developed by the Dutch Nuclear Industry Group "Neratoom" in cooperation with TNO for the sodium-cooled fast breeder reactor SNR-300 at Kalkar. Moreover, the research on flow-induced vibrations carried out by TNO (Organization for Applied Scientific Research) will be presented. The flow-induced forces on the tubes of the straight-tube steam generators were measured at the inlet and outlet sections, where partial crossflow occurs. With the measured flow-induced forces, the response of a tube was calculated as a function of the tube-to-support-bush clearances, taking into account the non-linear damping effects of the sodium. The theoretical results showed that for this particular design no tube impact damage is to be expected, which was confirmed later by a full-scale experiment. Special attention will be devoted to the steam generator with helical-coil tube bundles, where the sodium flows in counter crossflow over the tube bundle. Extensive measurements of the power spectra of the flow-induced forces were carried out, since no information could be found in the literature. The vibration analysis will be presented, and vibration modes of the entire bundle will be compared with experimentally obtained results. Finally, a description of the vibration tests to be carried out on the intermediate heat exchanger (IHX) will be presented. (author)

  5. Analysis of flow induced vibration in heat exchangers

    International Nuclear Information System (INIS)

    Beek, A.W. van

    1977-01-01

    A description will be given of three different types of heat exchangers developed by the Dutch Nuclear Industry Group "Neratoom" in cooperation with TNO for the sodium-cooled fast breeder reactor SNR-300 at Kalkar. Moreover, the research on flow-induced vibrations carried out by TNO (Organization for Applied Scientific Research) will be presented. The flow-induced forces on the tubes of the straight-tube steam generators were measured at the inlet and outlet sections, where partial crossflow occurs. With the measured flow-induced forces, the response of a tube was calculated as a function of the tube-to-support-bush clearances, taking into account the non-linear damping effects of the sodium. The theoretical results showed that for this particular design no tube impact damage is to be expected, which was confirmed later by a full-scale experiment. Special attention will be devoted to the steam generator with helical-coil tube bundles, where the sodium flows in counter crossflow over the tube bundle. Extensive measurements of the power spectra of the flow-induced forces were carried out, since no information could be found in the literature. The vibration analysis will be presented, and vibration modes of the entire bundle will be compared with experimentally obtained results. Finally, a description of the vibration tests to be carried out on the intermediate heat exchanger (IHX) will be presented. (author)

  6. Stability Analysis of Reactive Multiphase Slug Flows in Microchannels

    Directory of Open Access Journals (Sweden)

    Alejandro A. Munera Parra

    2014-05-01

    Full Text Available Conducting multiphase reactions in micro-reactors is a promising strategy for intensifying chemical and biochemical processes. A major unresolved challenge is to exploit the considerable benefits offered by micro-scale operation for industrial scale throughputs by numbering-up whilst retaining the underlying advantageous flow characteristics of the single channel system in multiple parallel channels. Fabrication and installation tolerances in the individual micro-channels result in different pressure losses and thus fluid maldistribution. In this work, an additional source of maldistribution, namely the flow multiplicities which can arise in a multiphase reactive or extractive flow in otherwise identical micro-channels, was investigated. A detailed experimental and theoretical analysis of the flow stability with and without reaction for both gas-liquid and liquid-liquid slug flow has been developed. The model has been validated using the extraction of acetic acid from n-heptane with the ionic liquid 1-Ethyl-3-methylimidazolium ethyl sulfate. The results clearly demonstrate that the coupling between flow structure, the extent of reaction/extraction and pressure drop can result in multiple operating states, thus necessitating an active measurement and control concept to ensure uniform behavior and optimal performance.

  7. Analysis of Urine Flow in Three Different Ureter Models

    Directory of Open Access Journals (Sweden)

    Kyung-Wuk Kim

    2017-01-01

    Full Text Available The ureter provides a way for urine to flow from the kidney to the bladder. Peristalsis in the ureter partially forces the urine flow, along with hydrostatic pressure. Ureteral diseases and a double J stent, which is commonly inserted in a ureteral stenosis or occlusion, disturb normal peristalsis. Ineffective or absent peristalsis can make the contour of the ureter a tube, a funnel, or a combination of the two. In this study, we investigated urine flow in these abnormal situations. We made three different ureter models (curved tubular, funnel-shaped, and undulated) based on human anatomy. A numerical analysis of the urine flow rate and pattern in the ureter was performed for combinations of the three different ureters, with and without a ureteral stenosis and with four different types of double J stents. The three ureters showed differences in urine flow rate and pattern. The luminal flow rate was affected by ureter shape. The side holes of a double J stent played different roles in diverting flow, depending on ureter geometry.
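
    A rough sense of why a tubular versus funnel-shaped lumen changes the luminal flow rate can be had from a Hagen-Poiseuille estimate (steady laminar flow, no peristalsis). This is far simpler than the paper's numerical model, and the viscosity, length, and driving pressure below are assumed values:

```python
import numpy as np

mu = 8.9e-4   # dynamic viscosity, Pa*s (assumed, near water)
L  = 0.25     # lumen length, m (assumed)
dP = 500.0    # hypothetical driving pressure, Pa

def q_tube(radius):
    """Hagen-Poiseuille volumetric flow rate through a uniform tube."""
    return np.pi * radius**4 * dP / (8.0 * mu * L)

def q_funnel(r_in, r_out, n=10000):
    """Flow through a linearly tapering (funnel) lumen: integrate the local
    Poiseuille resistance 8*mu/(pi*r^4) along the axis, then Q = dP / R."""
    dx = L / n
    xm = (np.arange(n) + 0.5) * dx        # midpoints of axial slices
    r = r_in + (r_out - r_in) * xm / L    # linear taper
    resistance = np.sum(8.0 * mu / (np.pi * r**4) * dx)
    return dP / resistance
```

    Because resistance scales as 1/r^4, a funnel tapering from 2 mm to 1 mm passes a flow between that of the two uniform tubes but much closer to the narrow end, which dominates the total resistance.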

  8. The Role of Flow Experience and CAD Tools in Facilitating Creative Behaviours for Architecture Design Students

    Science.gov (United States)

    Dawoud, Husameddin M.; Al-Samarraie, Hosam; Zaqout, Fahed

    2015-01-01

    This study examined the role of flow experience in intellectual activity with an emphasis on the relationship between flow experience and creative behaviour in design using CAD. The study used confluence and psychometric approaches because of their unique abilities to depict a clear image of creative behaviour. A cross-sectional study…

  9. Implementation of Models for Building Envelope Air Flow Fields in a Whole Building Hygrothermal Simulation Tool

    DEFF Research Database (Denmark)

    Sørensen, Karl Grau; Rode, Carsten

    2009-01-01

    cavity such as behind the exterior cladding of a building envelope, i.e. a flow which is parallel to the construction plane. (2) Infiltration/exfiltration of air through the building envelope, i.e. a flow which is perpendicular to the construction plane. The paper presents the models and how they have

  10. New Tools for Sea Ice Data Analysis and Visualization: NSIDC's Arctic Sea Ice News and Analysis

    Science.gov (United States)

    Vizcarra, N.; Stroeve, J.; Beam, K.; Beitler, J.; Brandt, M.; Kovarik, J.; Savoie, M. H.; Skaug, M.; Stafford, T.

    2017-12-01

    Arctic sea ice has long been recognized as a sensitive climate indicator and has undergone a dramatic decline over the past thirty years. Antarctic sea ice continues to be an intriguing and active field of research. The National Snow and Ice Data Center's Arctic Sea Ice News & Analysis (ASINA) offers researchers and the public a transparent view of sea ice data and analysis. We have released a new set of tools for sea ice analysis and visualization. In addition to Charctic, our interactive sea ice extent graph, the new Sea Ice Data and Analysis Tools page provides access to Arctic and Antarctic sea ice data organized in seven different data workbooks, updated daily or monthly. An interactive tool lets scientists, or the public, quickly compare changes in ice extent and location. Another tool allows users to map trends, anomalies, and means for user-defined time periods. Animations of September Arctic and Antarctic monthly average sea ice extent and concentration may also be accessed from this page. Our tools help the NSIDC scientists monitor and understand sea ice conditions in near real time. They also allow the public to easily interact with and explore sea ice data. Technical innovations in our data center helped NSIDC quickly build these tools and more easily maintain them. The tools were made publicly accessible to meet the desire from the public and members of the media to access the numbers and calculations that power our visualizations and analysis. This poster explores these tools and how other researchers, the media, and the general public are using them.

  11. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) comprises a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce a large amount of data, presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step in extracting biologically relevant information. The filtering can be enriched by merging previous data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application intended to bring big data analysis to proteomics. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl, or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers, regardless of programming experience, to take advantage of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.
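
    The kind of annotation-based filtering described (selecting MS hits by their Gene Ontology annotation) can be sketched as follows. The protein IDs and GO lookup table below are hypothetical examples, not the tool's actual database or API:

```python
# Hypothetical MS hit list (UniProt IDs) and an assumed GO annotation lookup.
ms_hits = ["P69905", "P68871", "P02768", "Q9Y261"]
go_annotations = {
    "P69905": {"GO:0005344", "GO:0019825"},   # assumed annotations
    "P68871": {"GO:0005344"},
    "P02768": {"GO:0005576"},
}

def filter_by_go(hits, annotations, go_term):
    """Keep only proteins whose annotation set contains the requested GO term."""
    return [p for p in hits if go_term in annotations.get(p, set())]

selected = filter_by_go(ms_hits, go_annotations, "GO:0005344")
```

    In the real tool this lookup would be a database query against public GO/UniProt data rather than an in-memory dictionary.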

  12. Analysis and specification tools in relation to the APSE

    Science.gov (United States)

    Hendricks, John W.

    1986-01-01

    Ada and the Ada Programming Support Environment (APSE) specifically address the phases of the system/software life cycle which follow after the user's problem has been translated into system and software development specifications. The waterfall model of the life cycle identifies the analysis and requirements-definition phases as preceding program design and coding. Since Ada is a programming language and the APSE is a programming support environment, they are primarily targeted to support program (code) development, testing, and maintenance. The use of Ada-based or Ada-related specification languages (SLs) and program design languages (PDLs) can extend the use of Ada back into the software design phases of the life cycle. Recall that the standardization of the APSE as a programming support environment is only now happening, after many years of evolutionary experience with diverse sets of programming support tools. Restricting consideration to one, or even a few, chosen specification and design tools could be a real mistake for an organization or a major project such as the Space Station, which will need to deal with an increasingly complex level of system problems. To require that everything be Ada-like, be implemented in Ada, run directly under the APSE, and fit into a rigid waterfall model of the life cycle would turn a promising support environment into a straitjacket for progress.

  13. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-06-01

    Full Text Available Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being that software for corpus analysis has been developed with the linguist in mind: it is generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; and it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.
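
    A concordancer of the kind TranslatorBank includes can be sketched as a keyword-in-context (KWIC) search: each match of a query term is shown with a fixed window of surrounding text. This is an illustrative stand-in, not TranslatorBank's implementation:

```python
import re

def concordance(corpus, query, width=30):
    """Return keyword-in-context lines for every whole-word match of `query`."""
    lines = []
    for m in re.finditer(r"\b" + re.escape(query) + r"\b", corpus, re.IGNORECASE):
        left = corpus[max(0, m.start() - width):m.start()]
        right = corpus[m.end():m.end() + width]
        # Right-align the left context so the keyword column lines up.
        lines.append(f"{left:>{width}}[{m.group(0)}]{right:<{width}}")
    return lines

text = ("The translator consulted a corpus of specialized texts. "
        "A corpus lets the translator verify collocations in context.")
hits = concordance(text, "corpus")
```

    Real concordancers index the corpus up front so queries run in sub-linear time; a linear `re.finditer` scan is enough to show the output format.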

  14. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    Science.gov (United States)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem, including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types, including heliocentric orbiters, planetary orbiters, and surface operations. Being parametrically driven, along with its user-programmable features, can reduce or even eliminate any need for software modifications when configuring it for a particular spacecraft. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users: power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.

  15. Revisiting corpus creation and analysis tools for translation tasks

    Directory of Open Access Journals (Sweden)

    Claudio Fantinuoli

    2016-04-01

    Many translation scholars have proposed the use of corpora to allow professional translators to produce high-quality texts which read like originals. Yet the diffusion of this methodology has been modest, one reason being that software for corpus analysis has been developed with the linguist in mind: it is generally complex and cumbersome, offering many advanced features but lacking the level of usability and the specific features that meet translators' needs. To overcome this shortcoming, we have developed TranslatorBank, a free corpus creation and analysis tool designed for translation tasks. TranslatorBank supports the creation of specialized monolingual corpora from the web; it includes a concordancer with a query system similar to a search engine; it uses basic statistical measures to indicate the reliability of results; it accesses the original documents directly for more contextual information; and it includes a statistical and linguistic terminology extraction utility to extract the relevant terminology of the domain and the typical collocations of a given term. Designed to be easy and intuitive to use, the tool may help translation students as well as professionals to increase their translation quality by adhering to the specific linguistic variety of the target text corpus.

  16. Flood-flow analysis for Kabul river at Warsak on the basis of flow-records of Kabul river at Nowshera

    International Nuclear Information System (INIS)

    Khan, B.

    2007-01-01

    High flows and stream discharge have long been measured and used by engineers in the design of hydraulic structures and flood-protection works and in planning for flood-plain use. Probability analysis is the basis for the engineering design of many projects and for advance flood-forecasting information. High-flow analysis, or flood-frequency studies, interpret a past record of events to predict the future probability of occurrence. In many countries, including the author's country, the long-term flow data required for the design of hydraulic structures and flood-protection works are not available. In such cases, the only tool available to hydrologists is to extend the short-term flow data available at some other site in the region. The present study was made to obtain a reliable estimate of the maximum instantaneous flood at higher frequencies for the Kabul River at Warsak weir. The Kabul River at the Nowshera gaging station is used for this purpose, and regression analysis is performed to extend the instantaneous peak-flow record at Warsak to 29 years. The frequency curves of high flows are plotted on normal probability paper using different probability distributions. The Gumbel distribution seemed to be the best fit for the observed data points and is used here for the estimation of floods for different return periods. (author)
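
    The Gumbel-based estimation of floods for different return periods can be sketched as follows. The 29 annual peaks below are synthetic, not the Warsak record, and a method-of-moments fit is assumed rather than the paper's graphical fit:

```python
import numpy as np

# Hypothetical 29-year record of annual instantaneous peak flows (m^3/s).
rng = np.random.default_rng(7)
peaks = 2000.0 + 900.0 * rng.gumbel(size=29)

# Method-of-moments Gumbel fit: scale alpha and location u from sample moments.
alpha = np.sqrt(6.0) * peaks.std(ddof=1) / np.pi
u = peaks.mean() - 0.5772 * alpha          # 0.5772 = Euler-Mascheroni constant

def flood_quantile(T):
    """Flood with return period T years: x_T = u - alpha * ln(-ln(1 - 1/T))."""
    return u - alpha * np.log(-np.log(1.0 - 1.0 / T))

q100 = flood_quantile(100.0)               # e.g. the 100-year design flood
```

    Design floods grow roughly linearly in the reduced variate -ln(-ln(1-1/T)), so doubling the return period adds a nearly constant increment `alpha * ln 2`.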

  17. A CLIPS expert system for clinical flow cytometry data analysis

    Science.gov (United States)

    Salzman, G. C.; Duque, R. E.; Braylan, R. C.; Stewart, C. C.

    1990-01-01

    An expert system is being developed using CLIPS to assist clinicians in the analysis of multivariate flow cytometry data from cancer patients. Cluster analysis is used to find subpopulations representing various cell types in multiple datasets, each consisting of four to five measurements on each of 5000 cells. CLIPS facts are derived from the results of the clustering. CLIPS rules are based on the expertise of Drs. Stewart, Duque, and Braylan. The rules incorporate certainty factors based on case histories.

  18. Analysis of bubbly flow using particle image velocimetry

    Energy Technology Data Exchange (ETDEWEB)

    Todd, D.R.; Ortiz-Villafuerte, J.; Schmidl, W.D.; Hassan, Y.A. [Texas A and M University, Nuclear Engineering Dept., College Station, TX (United States); Sanchez-Silva, F. [ESIME, INP (Mexico)

    2001-07-01

    The local phasic velocities can be determined in two-phase flows if the phases can be separated during analysis. The continuous liquid velocity field can be captured using standard Particle Image Velocimetry (PIV) techniques in two-phase flows. PIV is now a well-established, standard flow measurement technique, which provides instantaneous velocity fields in a two-dimensional plane of finite thickness. PIV can be extended to three dimensions within the plane with special considerations. A three-dimensional shadow PIV (SPIV) measurement apparatus can be used to capture the dispersed phase flow parameters such as velocity and interfacial area. The SPIV images contain only the bubble images, and can be easily analyzed and the results used to separate the dispersed phase from the continuous phase in PIV data. An experimental system that combines the traditional PIV technique with SPIV will be described and sample data will be analyzed to demonstrate an advanced turbulence measurement method in a two-phase bubbly flow system. Also, a qualitative error analysis method that allows users to reduce the number of erroneous vectors obtained from the PIV measurements will be discussed. (authors)

  19. Analysis of bubbly flow using particle image velocimetry

    International Nuclear Information System (INIS)

    Todd, D.R.; Ortiz-Villafuerte, J.; Schmidl, W.D.; Hassan, Y.A.; Sanchez-Silva, F.

    2001-01-01

    The local phasic velocities can be determined in two-phase flows if the phases can be separated during analysis. The continuous liquid velocity field can be captured using standard Particle Image Velocimetry (PIV) techniques in two-phase flows. PIV is now a well-established, standard flow measurement technique, which provides instantaneous velocity fields in a two-dimensional plane of finite thickness. PIV can be extended to three dimensions within the plane with special considerations. A three-dimensional shadow PIV (SPIV) measurement apparatus can be used to capture the dispersed phase flow parameters such as velocity and interfacial area. The SPIV images contain only the bubble images, and can be easily analyzed and the results used to separate the dispersed phase from the continuous phase in PIV data. An experimental system that combines the traditional PIV technique with SPIV will be described and sample data will be analyzed to demonstrate an advanced turbulence measurement method in a two-phase bubbly flow system. Also, a qualitative error analysis method that allows users to reduce the number of erroneous vectors obtained from the PIV measurements will be discussed. (authors)
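
    The core of standard PIV processing, estimating the displacement of an interrogation window between two frames by cross-correlation, can be sketched as follows. This is a minimal integer-pixel version with a synthetic image pair, not the authors' SPIV system:

```python
import numpy as np

def window_displacement(win_a, win_b):
    """Estimate the integer-pixel displacement between two interrogation
    windows via FFT-based circular cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    # Cross-correlation theorem: corr = IFFT( conj(FFT(a)) * FFT(b) ).
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices above the Nyquist point to signed shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# Synthetic "particle images": frame B is frame A shifted by (3, -2) pixels.
rng = np.random.default_rng(1)
frame_a = rng.random((32, 32))
frame_b = np.roll(frame_a, shift=(3, -2), axis=(0, 1))
dy, dx = window_displacement(frame_a, frame_b)
```

    Production PIV codes refine the correlation peak to sub-pixel accuracy (e.g. with a Gaussian fit) and apply validation filters to reject the erroneous vectors the abstract mentions.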

  20. Numerical analysis of flow fields generated by accelerating flames

    Energy Technology Data Exchange (ETDEWEB)

    Kurylo, J.

    1977-12-01

    Presented here is a numerical technique for the analysis of non-steady flow fields generated by accelerating flames in gaseous media. Of particular interest in the study is the evaluation of the non-steady effects on the flow field and the possible transition of the combustion process to detonation caused by an abrupt change in the burning speed of an initially steady flame propagating in an unconfined combustible gas mixture. Optically recorded observations of accelerating flames established that the flow field can be considered to consist of non-steady flow fields associated with an assembly of interacting shock waves, contact discontinuities, deflagration and detonation fronts. In the analysis, these flow fields are treated as spatially one-dimensional, the influence of transport phenomena is considered to be negligible, and unburned and burned substances are assumed to behave as perfect gases with constant, but different, specific heats. The basis of the numerical technique is an explicit, two step, second order accurate, finite difference scheme employed to integrate the flow field equations expressed in divergence form. The burning speed, governing the motion of the deflagration, is expressed in the form of a power law dependence on pressure and temperature immediately ahead of its front. The steady wave solution is obtained by the vector polar interaction technique, that is, by determining the point of intersection between the loci of end states in the plane of the two interaction invariants, pressure and particle velocity. The technique is illustrated by a numerical example in which a steady flame experiences an abrupt change in its burning speed. Solutions correspond either to the eventual reestablishment of a steady state flow field commensurate with the burning speed or to the transition to detonation. The results are in satisfactory agreement with experimental observations.
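    The integration scheme described above (explicit, two-step, second-order accurate, applied to flow equations in divergence form) matches the general shape of a MacCormack-type predictor-corrector; this is an assumption, since the record does not name the scheme. A minimal sketch for a 1D conservation law, with linear advection standing in for the reactive gas dynamics and made-up grid parameters:

```python
import numpy as np

# Illustrative two-step, second-order scheme (MacCormack predictor-corrector)
# for a 1D conservation law u_t + f(u)_x = 0 with f(u) = a*u (linear advection),
# integrated in divergence (conservative) form on a periodic grid.

a = 1.0                          # advection speed
nx, nt = 200, 100
dx = 1.0 / nx
dt = 0.4 * dx / a                # CFL number 0.4 < 1 for stability

x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.exp(-200.0 * (x - 0.3) ** 2)   # smooth initial pulse

def flux(u):
    return a * u

for _ in range(nt):
    # predictor: forward difference of the flux
    up = u - dt / dx * (flux(np.roll(u, -1)) - flux(u))
    # corrector: backward difference, averaged with the predictor
    u = 0.5 * (u + up - dt / dx * (flux(up) - flux(np.roll(up, 1))))

# The pulse should have advected right by a*nt*dt = 0.2 with little distortion,
# and the conservative form should preserve the total "mass" sum(u)*dx.
print(x[np.argmax(u)])
```

Because the scheme updates the flux differences in conservative form, the integral of the solution is preserved to round-off, which is the property that makes divergence-form integration attractive for flows with shocks and deflagration fronts.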

  1. Performance Analysis of a Fluidic Axial Oscillation Tool for Friction Reduction with the Absence of a Throttling Plate

    Directory of Open Access Journals (Sweden)

    Xinxin Zhang

    2017-04-01

    Full Text Available An axial oscillation tool has proved effective in solving problems associated with high friction and torque in the sliding drilling of complex wells. The fluidic axial oscillation tool, based on an output-fed bistable fluidic oscillator, is a type of axial oscillation tool that has become increasingly popular in recent years. The aim of this paper is to analyze the dynamic flow behavior of a fluidic axial oscillation tool without a throttling plate in order to evaluate its overall performance. In particular, the differences between the original design with a throttling plate and the current default design are analyzed in depth, and an improvement is expected for the latter. A commercial computational fluid dynamics code, Fluent, was used to predict the pressure drop and oscillation frequency of the fluidic axial oscillation tool. The results of the numerical simulations agree well with the corresponding experimental results. A sufficient pressure pulse amplitude together with a low pressure drop is desired in this study; therefore, relative pulse amplitudes of pressure drop and displacement are introduced. A comparative analysis of the designs with and without a throttling plate indicates that when the supply flow rate is relatively low, or higher than a certain value, the fluidic axial oscillation tool with a throttling plate performs better; otherwise, the tool without a throttling plate is the preferred alternative. Under most operating conditions, in terms of supply flow rate and pressure drop, the tool without a throttling plate performs better than the original design.

  2. Abstract Interfaces for Data Analysis Component Architecture for Data Analysis Tools

    CERN Document Server

    Barrand, G; Dönszelmann, M; Johnson, A; Pfeiffer, A

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software-tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Beside promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and i...

  3. Program ELM: A tool for rapid thermal-hydraulic analysis of solid-core nuclear rocket fuel elements

    International Nuclear Information System (INIS)

    Walton, J.T.

    1992-11-01

    This report reviews the state of the art of thermal-hydraulic analysis codes and presents a new code, Program ELM, for analysis of fuel elements. ELM is a concise computational tool for modeling the steady-state thermal-hydraulics of propellant flow through fuel element coolant channels in a nuclear thermal rocket reactor with axial coolant passages. The program was developed as a tool to swiftly evaluate various heat transfer coefficient and friction factor correlations generated for turbulent pipe flow with heat addition which have been used in previous programs. Thus, a consistent comparison of these correlations was performed, as well as a comparison with data from the NRX reactor experiments from the Nuclear Engine for Rocket Vehicle Applications (NERVA) project. This report describes the ELM Program algorithm, input/output, and validation efforts and provides a listing of the code
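    Program ELM's purpose is to compare heat transfer coefficient and friction factor correlations for turbulent pipe flow with heat addition. As an illustration of the kind of correlations involved, here are the textbook Dittus-Boelter and Blasius forms; ELM's actual correlation set may differ, and the coolant properties below are made up:

```python
# Sketch of turbulent-pipe-flow correlations of the kind Program ELM compares.
# Dittus-Boelter (Nusselt number) and Blasius (friction factor) are standard
# textbook forms, shown here only as representatives of that correlation family.

def reynolds(rho, v, d, mu):
    return rho * v * d / mu

def nusselt_dittus_boelter(re, pr, heating=True):
    # Nu = 0.023 Re^0.8 Pr^n, with n = 0.4 for heating, 0.3 for cooling
    n = 0.4 if heating else 0.3
    return 0.023 * re ** 0.8 * pr ** n

def friction_blasius(re):
    # Darcy friction factor, valid roughly for 4e3 < Re < 1e5
    return 0.316 * re ** -0.25

# Illustrative numbers: a light gaseous coolant in a 2.5 mm axial channel
re = reynolds(rho=1.2, v=150.0, d=0.0025, mu=1.3e-5)
nu = nusselt_dittus_boelter(re, pr=0.7)
f = friction_blasius(re)
print(f"Re = {re:.0f}, Nu = {nu:.1f}, f = {f:.4f}")
```

A code like ELM evaluates several such correlation pairs along the channel and compares the resulting wall temperatures and pressure drops against reactor test data.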

  4. Natural funnel asymmetries. A simulation analysis of the three basic tools of meta analysis

    DEFF Research Database (Denmark)

    Callot, Laurent Abdelkader Francois; Paldam, Martin

    Meta-analysis studies a set of estimates of one parameter with three basic tools: the funnel diagram, which is the distribution of the estimates as a function of their precision; the funnel asymmetry test, FAT; and the meta average, of which PET is an estimate. The FAT-PET MRA is a meta regression analysis...
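    The FAT-PET machinery can be sketched numerically: regress the reported estimates on their standard errors, b_i = beta0 + beta1 * SE_i + e_i, where the FAT is a test on beta1 (funnel asymmetry) and PET reads the corrected meta average off beta0. A synthetic example with weighted least squares; all data here are made up:

```python
import numpy as np

# FAT-PET sketch on synthetic data. Reported estimates b are contaminated
# with a bias proportional to their standard error (slope 0.8) around a true
# effect of 0.3. WLS with weights 1/SE^2 is the conventional MRA estimator.

rng = np.random.default_rng(1)
n = 200
se = rng.uniform(0.05, 0.5, n)           # standard errors of the estimates
true_effect = 0.3
b = true_effect + 0.8 * se + rng.normal(0, se)   # biased reported estimates

w = 1.0 / se ** 2
X = np.column_stack([np.ones(n), se])
# weighted least squares: solve (X' W X) beta = X' W b
beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * b))
print(f"PET (corrected mean) = {beta[0]:.3f}, FAT slope = {beta[1]:.3f}")
```

The recovered intercept sits near the true effect even though the naive average of b is inflated, which is exactly the point of the PET correction.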

  5. Nuclear Fuel Cycle Analysis and Simulation Tool (FAST)

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Won Il; Kwon, Eun Ha; Kim, Ho Dong

    2005-06-15

    This paper describes the Nuclear Fuel Cycle Analysis and Simulation Tool (FAST), which has been developed by the Korea Atomic Energy Research Institute (KAERI). Categorizing various mixes of nuclear reactors and fuel cycles into 11 scenario groups, FAST calculates all the required quantities for each nuclear fuel cycle component, such as mining, conversion, enrichment and fuel fabrication, for each scenario. A major advantage of FAST is that the code employs an MS Excel spreadsheet with Visual Basic for Applications, allowing users to manipulate it with ease. The calculation is also quick enough to compare different options in a considerably short time. This user-friendly simulation code is expected to benefit further studies on the nuclear fuel cycle to find the best options for the future, with proliferation risk, environmental impact and economic costs all considered.
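    One representative quantity a fuel-cycle code like FAST must compute per scenario is the natural-uranium feed and separative work behind a batch of enriched fuel. The mass balance and SWU value function below are the standard textbook forms; the assays and batch size are illustrative and not taken from FAST:

```python
import math

# Enrichment-stage mass balance and separative work (SWU), the standard
# cascade bookkeeping behind the "enrichment" component of a fuel cycle.
# Assays: xp = product, xf = feed (natural U, 0.711 wt%), xt = tails.

def value(x):
    # SWU value function V(x) = (2x - 1) * ln(x / (1 - x))
    return (2.0 * x - 1.0) * math.log(x / (1.0 - x))

def feed_and_swu(product_kg, xp, xf=0.00711, xt=0.0025):
    feed = product_kg * (xp - xt) / (xf - xt)     # feed mass balance
    tails = feed - product_kg
    swu = product_kg * value(xp) + tails * value(xt) - feed * value(xf)
    return feed, swu

# Illustrative batch: 1 tonne of 4.5 wt% enriched uranium
feed, swu = feed_and_swu(product_kg=1000.0, xp=0.045)
print(f"feed = {feed:.0f} kg U, separative work = {swu:.0f} kg-SWU")
```

Roughly 9.2 kg of natural uranium and 6.9 kg-SWU per kg of 4.5% product at 0.25% tails is the familiar order of magnitude, so the sketch can be sanity-checked against any fuel-cycle handbook.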

  6. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that must be solved during software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which in UML is captured by the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors' research examines the ability of modern UML modelling tools to lay out UML sequence diagrams automatically and analyses them against the criteria required for diagram perception.

  7. SINEBase: a database and tool for SINE analysis.

    Science.gov (United States)

    Vassetzky, Nikita S; Kramerov, Dmitri A

    2013-01-01

    SINEBase (http://sines.eimb.ru) integrates the revisited body of knowledge about short interspersed elements (SINEs). A set of formal definitions concerning SINEs was introduced. All available sequence data were screened through these definitions and the genetic elements misidentified as SINEs were discarded. As a result, 175 SINE families have been recognized in animals, flowering plants and green algae. These families were classified by the modular structure of their nucleotide sequences and the frequencies of different patterns were evaluated. These data formed the basis for the database of SINEs. The SINEBase website can be used in two ways: first, to explore the database of SINE families, and second, to analyse candidate SINE sequences using specifically developed tools. This article presents an overview of the database and the process of SINE identification and analysis.

  8. Software Tools for Robust Analysis of High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Valentin Todorov

    2014-06-01

    Full Text Available The present work discusses robust multivariate methods specifically designed for high dimensions. Their implementation in R is presented and their application is illustrated on examples. The first group are algorithms for outlier detection, already introduced elsewhere and implemented in other packages. The value added of the new package is that all methods follow the same design pattern and thus can use the same graphical and diagnostic tools. The next topic covered is sparse principal components, including an object-oriented interface to the standard method proposed by Zou, Hastie, and Tibshirani (2006) and the robust one proposed by Croux, Filzmoser, and Fritz (2013). Robust partial least squares (see Hubert and Vanden Branden 2003) as well as partial least squares for discriminant analysis conclude the scope of the new package.
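    The package discussed implements sophisticated estimators (MCD-based outlier detection, ROBPCA, sparse PCA). The core idea, replacing non-robust location and scale with robust counterparts before flagging outliers, can be sketched with a simple coordinatewise median/MAD screen; this is a deliberately naive stand-in for the real methods, written in Python for illustration:

```python
import numpy as np

# Naive robust outlier screen: standardize each coordinate with the median
# and the MAD (scaled by 1.4826 to be consistent for Gaussian data), then
# flag rows whose robust distance is far above what chi-square statistics
# predict for clean data. Real high-dimensional methods (MCD, ROBPCA)
# account for correlations between coordinates; this sketch does not.

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
X[:3] += 6.0                      # plant three gross outliers in rows 0-2

med = np.median(X, axis=0)
mad = np.median(np.abs(X - med), axis=0) * 1.4826

z = (X - med) / mad                          # robust z-scores per coordinate
score = np.sqrt((z ** 2).sum(axis=1))        # per-row outlyingness
cutoff = np.sqrt(20) + 3.0 * np.sqrt(2)      # loose chi-based threshold
flagged = np.where(score > cutoff)[0]
print(flagged)
```

Because median and MAD tolerate a few contaminated rows, the planted outliers do not distort the standardization that is used to detect them, which is exactly the failure mode of mean/standard-deviation screening.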

  9. In silico tools for the analysis of antibiotic biosynthetic pathways

    DEFF Research Database (Denmark)

    Weber, Tilmann

    2014-01-01

    Natural products of bacteria and fungi are the most important source for antimicrobial drug leads. For decades, such compounds were exclusively found by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies only recently allowed the development of novel...... screening methods based on the genome sequences of potential producing organisms. The basic principle of such genome mining approaches is to identify genes, which are involved in the biosynthesis of such molecules, and to predict the products of the identified pathways. Thus, bioinformatics methods...... and tools are crucial for genome mining. In this review, a comprehensive overview is given on programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data....

  10. Transient flow analysis of integrated valve opening process

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xinming; Qin, Benke; Bo, Hanliang, E-mail: bohl@tsinghua.edu.cn; Xu, Xingxing

    2017-03-15

    Highlights: • The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology, and the integrated valve (IV) is its key control component. • The transient flow experiment induced by the IV is conducted and the test results are analyzed to reveal its working mechanism. • The theoretical model of the IV opening process is established and applied to obtain the changing rule of the transient flow characteristic parameters. - Abstract: The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology, and the IV is its key control component. The working principle of the integrated valve (IV) is analyzed and the IV hydraulic experiment is conducted. There is a transient flow phenomenon in the valve opening process. The theoretical model of the IV opening process is established from the loop system control equations and boundary conditions. The valve opening boundary condition equation is established based on the IV three-dimensional flow field analysis results and the dynamic analysis of the valve core movement. The model calculation results are in good agreement with the experimental results. On this basis, the model is used to analyze the transient flow under high-temperature conditions. The peak pressure head is consistent with the one at room temperature, and the pressure fluctuation period is longer than the one at room temperature. Furthermore, the variation of the pressure transients with the fluid and loop structure parameters is analyzed. The peak pressure increases with the flow rate, and decreases with an increase in the valve opening time. The pressure fluctuation period increases with the loop pipe length, and the fluctuation amplitude remains largely unchanged under different equilibrium pressure conditions. The research results lay the basis for the vibration reduction analysis of the CRHDS.
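    A first-order bound on the peak pressure in a valve transient of this kind comes from the textbook Joukowsky relation, dP = rho * c * dv. This is not the paper's loop model, just a back-of-the-envelope check with illustrative numbers:

```python
# Joukowsky estimate of the pressure surge caused by a step change in flow
# velocity. The wave speed c depends on the fluid and pipe elasticity;
# 1200 m/s is a typical figure for water in a steel pipe, used here only
# as an illustration.

rho = 1000.0      # water density, kg/m3
c = 1200.0        # pressure-wave speed in the pipe, m/s
dv = 0.5          # step change in flow velocity, m/s

dp = rho * c * dv                  # pressure surge, Pa
head = dp / (rho * 9.81)           # equivalent head, m of water
print(f"surge = {dp / 1e3:.0f} kPa (about {head:.0f} m of head)")
```

The full transient analysis in the paper goes further: it couples the valve-core dynamics and the loop boundary conditions, which is what produces the dependence of peak pressure on flow rate and valve opening time described above.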

  11. Flow analysis of an innovative compact heat exchanger channel geometry

    International Nuclear Information System (INIS)

    Vitillo, F.; Cachon, L.; Reulet, F.; Millan, P.

    2016-01-01

    Highlights: • An innovative compact heat transfer technology is proposed. • Experimental measurements are shown to validate the CFD model. • CFD simulations show various flow mechanisms. • Flow analysis is performed to study the physical phenomena enhancing heat transfer. - Abstract: In the framework of the CEA R&D program to develop an industrial prototype of a sodium-cooled fast reactor named ASTRID, the present work aims to propose an innovative compact heat exchanger technology to provide a solid technological basis for the use of a Brayton gas power conversion system, in order to avoid the energetic sodium–water interaction that could occur if a traditional Rankine cycle were used. The aim of the present work is to propose an innovative compact heat exchanger channel geometry to potentially enhance heat transfer in such components. Hence, before studying the innovative channel performance, a solid experimental and numerical database is necessary to perform a preliminary thermal–hydraulic analysis. To that end, two experimental test sections are used: a Laser Doppler Velocimetry (LDV) test section and a Particle Image Velocimetry (PIV) test section. The acquired experimental database is used to validate the Anisotropic Shear Stress Transport (ASST) turbulence model. Results show a good agreement between LDV, PIV and ASST data for the purely aerodynamic flow. Once the numerical model is validated, the innovative channel flow analysis is performed. The principal and secondary flows have been analyzed, showing a highly swirling flow in the bend region and demonstrating that mixing actually occurs in the mixing zone. This work should be considered a step toward the proposition of a reliable high-performance component for application to the ASTRID reactor as well as to any other industrial power plant needing compact heat exchangers.

  12. UNCERT: geostatistics, uncertainty analysis and visualization software applied to groundwater flow and contaminant transport modeling

    International Nuclear Information System (INIS)

    Wingle, W.L.; Poeter, E.P.; McKenna, S.A.

    1999-01-01

    UNCERT is a 2D and 3D geostatistics, uncertainty analysis and visualization software package applied to ground water flow and contaminant transport modeling. It is a collection of modules that provides tools for linear regression, univariate statistics, semivariogram analysis, inverse-distance gridding, trend-surface analysis, simple and ordinary kriging and discrete conditional indicator simulation. Graphical user interfaces for MODFLOW and MT3D, ground water flow and contaminant transport models, are provided for streamlined data input and result analysis. Visualization tools are included for displaying data input and output. These include, but are not limited to, 2D and 3D scatter plots, histograms, box and whisker plots, 2D contour maps, surface renderings of 2D gridded data and 3D views of gridded data. By design, UNCERT's graphical user interface and visualization tools facilitate model design and analysis. There are few built in restrictions on data set sizes and each module (with two exceptions) can be run in either graphical or batch mode. UNCERT is in the public domain and is available from the World Wide Web with complete on-line and printable (PDF) documentation. UNCERT is written in ANSI-C with a small amount of FORTRAN77, for UNIX workstations running X-Windows and Motif (or Lesstif). This article discusses the features of each module and demonstrates how they can be used individually and in combination. The tools are applicable to a wide range of fields and are currently used by researchers in the ground water, mining, mathematics, chemistry and geophysics, to name a few disciplines. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)
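    Among UNCERT's gridding tools is inverse-distance weighting: each grid node receives a weighted average of the scattered observations, with weights proportional to 1/d^p (p = 2 is the common default). A minimal Python sketch, not UNCERT's implementation, with invented well data:

```python
import numpy as np

# Inverse-distance-weighted (IDW) interpolation of scattered observations
# onto grid nodes. A small eps guards against division by zero when a grid
# node coincides with an observation point.

def idw(xy_obs, z_obs, xy_grid, power=2.0, eps=1e-12):
    # pairwise distances between grid nodes and observation points
    d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)
    return (w * z_obs).sum(axis=1) / w.sum(axis=1)

# Illustrative data: hydraulic heads at four wells on a unit square,
# interpolated to the centre node (equidistant, so the result is the mean)
wells = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
heads = np.array([10.0, 12.0, 11.0, 13.0])
node = np.array([[0.5, 0.5]])
print(idw(wells, heads, node))
```

IDW is the simplest of the gridding methods the package offers; kriging and conditional simulation replace the fixed 1/d^p weights with weights derived from the fitted semivariogram.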

  13. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    Science.gov (United States)

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise. This is very common in industrial environments. The PFA Toolbox can be used to face those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available Toolbox that is able to perform Interval and Possibilistic MFA estimations.
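    Classical deterministic MFA, the baseline that the PFA Toolbox extends with interval and possibilistic estimates, reduces to linear algebra: with N @ v = 0 at steady state and some fluxes measured, the remaining fluxes follow by least squares. A Python sketch on a made-up toy network (the Toolbox itself is MATLAB):

```python
import numpy as np

# Classical MFA: partition the stoichiometric matrix N into measured (Nm)
# and unknown (Nu) columns, then solve Nu @ vu = -Nm @ vm in least squares.

# Internal metabolites: B, C. Reactions (toy network, hypothetical):
#   v1: A -> B, v2: B -> C, v3: C -> E (excreted), v4: B -> D (byproduct)
N = np.array([[1.0, -1.0,  0.0, -1.0],   # steady-state balance on B
              [0.0,  1.0, -1.0,  0.0]])  # steady-state balance on C

measured = {0: 10.0, 3: 2.0}             # v1 and v4 measured
unknown = [i for i in range(N.shape[1]) if i not in measured]

Nm = N[:, list(measured)]
Nu = N[:, unknown]
vm = np.array(list(measured.values()))

# N @ v = 0  ->  Nm @ vm + Nu @ vu = 0  ->  least-squares solve for vu
vu, *_ = np.linalg.lstsq(Nu, -Nm @ vm, rcond=None)
print(dict(zip(unknown, vu.round(3))))   # expect v2 = v3 = 8
```

When the measured fluxes are scarce or imprecise, this point estimate can be misleading, which is precisely the scenario the interval and possibilistic formulations in the Toolbox are designed to handle.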

  14. Experimental resource pulses influence social-network dynamics and the potential for information flow in tool-using crows.

    Science.gov (United States)

    St Clair, James J H; Burns, Zackory T; Bettaney, Elaine M; Morrissey, Michael B; Otis, Brian; Ryder, Thomas B; Fleischer, Robert C; James, Richard; Rutz, Christian

    2015-11-03

    Social-network dynamics have profound consequences for biological processes such as information flow, but are notoriously difficult to measure in the wild. We used novel transceiver technology to chart association patterns across 19 days in a wild population of the New Caledonian crow--a tool-using species that may socially learn, and culturally accumulate, tool-related information. To examine the causes and consequences of changing network topology, we manipulated the environmental availability of the crows' preferred tool-extracted prey, and simulated, in silico, the diffusion of information across field-recorded time-ordered networks. Here we show that network structure responds quickly to environmental change and that novel information can potentially spread rapidly within multi-family communities, especially when tool-use opportunities are plentiful. At the same time, we report surprisingly limited social contact between neighbouring crow communities. Such scale dependence in information-flow dynamics is likely to influence the evolution and maintenance of material cultures.
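    Diffusion over a time-ordered contact network, as simulated in silico in this study, differs from diffusion on a static graph because information can only travel forward in time. A minimal sketch with an invented edge list:

```python
# Simple susceptible/informed diffusion over a time-ordered contact list.
# A contact transmits information only if one participant is already
# informed at that moment, so the temporal order of edges matters.

def diffuse(events, seed):
    """events: iterable of (t, a, b) contacts; returns the informed set."""
    informed = {seed}
    for _, a, b in sorted(events):          # process contacts in time order
        if a in informed or b in informed:
            informed |= {a, b}
    return informed

contacts = [(1, "crow1", "crow2"), (2, "crow2", "crow3"),
            (3, "crow3", "crow4"), (0, "crow4", "crow5")]
print(sorted(diffuse(contacts, "crow1")))
```

With "crow1" as the seed, the information reaches crows 2-4 through the time-respecting chain, but never crow5, whose only contact happened before the chain started; on the static projection of the same graph, all five nodes would be reachable.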

  15. T.I.M.S: TaqMan Information Management System, tools to organize data flow in a genotyping laboratory

    Directory of Open Access Journals (Sweden)

    Albion Tim

    2005-10-01

    Full Text Available Abstract Background Single Nucleotide Polymorphism (SNP) genotyping is a major activity in biomedical research. The TaqMan technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand. Laboratories not equipped with a Laboratory Information Management System (LIMS) need tools to organize the data flow. Results We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic, embedded in the Microsoft Office package, and it allows anyone to have access to these tools, without any programming skills and with basic computer requirements. Conclusion We have created useful tools focused on the management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system, such as a LIMS.

  16. T.I.M.S: TaqMan Information Management System, tools to organize data flow in a genotyping laboratory

    Science.gov (United States)

    Monnier, Stéphanie; Cox, David G; Albion, Tim; Canzian, Federico

    2005-01-01

    Background Single Nucleotide Polymorphism (SNP) genotyping is a major activity in biomedical research. The TaqMan technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand. Laboratories not equipped with a Laboratory Information Management System (LIMS) need tools to organize the data flow. Results We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic, embedded in the Microsoft Office package, and it allows anyone to have access to these tools, without any programming skills and with basic computer requirements. Conclusion We have created useful tools focused on the management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system, such as a LIMS. PMID:16221298

  17. Technical requirements document for the waste flow analysis

    International Nuclear Information System (INIS)

    Shropshire, D.E.

    1996-05-01

    The purpose of this Technical Requirements Document is to define the top-level customer requirements for the Waste Flow Analysis task. These requirements, once agreed upon with DOE, will be used to flow down subsequent development requirements to the model specifications. This document is intended to be a "living document" that will be modified over the course of the execution of this work element. Initial concurrence with the technical functional requirements from Environmental Management (EM)-50 is needed before the work plan can be developed

  18. Complex analysis with applications to flows and fields

    CERN Document Server

    Braga da Costa Campos, Luis Manuel

    2012-01-01

    Complex Analysis with Applications to Flows and Fields presents the theory of functions of a complex variable, from the complex plane to the calculus of residues to power series to conformal mapping. The book explores numerous physical and engineering applications concerning potential flows, the gravity field, electro- and magnetostatics, steady heat conduction, and other problems. It provides the mathematical results to sufficiently justify the solution of these problems, eliminating the need to consult external references.The book is conveniently divided into four parts. In each part, the ma

  19. CROSS-FLOW ULTRAFILTRATION OF SECONDARY EFFLUENTS. MEMBRANE FOULING ANALYSIS

    Directory of Open Access Journals (Sweden)

    Luisa Vera

    2014-12-01

    Full Text Available The application of cross-flow ultrafiltration to regenerate secondary effluents is limited by membrane fouling. This work analyzes the influence of the main operational parameters (transmembrane pressure and cross-flow velocity) on the selectivity and fouling observed in a tubular ceramic ultrafiltration membrane. The experimental results have shown a significant retention of the microcolloidal and soluble organic matter (52–54%) in the membrane. The fouling analysis has defined the critical operational conditions under which the fouling resistance is minimized. Such conditions can be described in terms of a dimensionless number known as the shear stress number and its relationship with another dimensionless parameter, the fouling number.
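    The fouling resistance referred to above is commonly obtained from a resistance-in-series decomposition of Darcy's law, J = TMP / (mu * R_total): measure the flux clean and fouled, and attribute the difference in resistance to fouling. A sketch with illustrative numbers, not the paper's data:

```python
# Resistance-in-series fouling analysis based on Darcy's law.
# R_total is inferred from the fouled flux, the intrinsic membrane
# resistance R_m from a clean-water run, and R_f = R_total - R_m.

mu = 1.0e-3          # permeate (water) viscosity, Pa*s
tmp = 1.0e5          # transmembrane pressure, Pa (1 bar)
j_clean = 2.0e-4     # clean-water flux, m3/(m2*s)
j_fouled = 5.0e-5    # flux with secondary effluent, m3/(m2*s)

r_membrane = tmp / (mu * j_clean)     # intrinsic membrane resistance, 1/m
r_total = tmp / (mu * j_fouled)
r_fouling = r_total - r_membrane
print(f"R_m = {r_membrane:.2e} 1/m, R_f = {r_fouling:.2e} 1/m")
```

Locating the operating conditions that minimize r_fouling, expressed through the dimensionless shear stress and fouling numbers, is the optimization described in the abstract.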

  20. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools on measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools to explore their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and subsequently presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, in either the IT or the marketing branch. The paper contributes by highlighting the support that web analytics and web metrics tools available on the market offer to management, based on the growing need to understand and predict global market trends.