WorldWideScience

Sample records for modeling tools based

  1. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools...... of the framework. The issue of commercial simulators or software providing the necessary features for product-process synthesis-design as opposed to their development by the academic PSE community will also be discussed. An example of a successful collaboration between academia-industry for the development...

  2. Port performance evaluation tool based on microsimulation model

    Directory of Open Access Journals (Sweden)

    Tsavalista Burhani Jzolanda

    2017-01-01

    Full Text Available As port performance is becoming correlative to national competitiveness, the issue of port performance evaluation has gained significant attention. Port performance can be indicated simply by port service levels to the ship (e.g., throughput, waiting time for berthing, etc.), as well as by the utilization level of equipment and facilities within a certain period. The performance evaluation can then be used as a tool to develop related policies for making the port more effective and efficient. However, the evaluation is frequently conducted with a deterministic approach, which hardly captures the natural variations of port parameters. Therefore, this paper presents a stochastic microsimulation model for investigating the impacts of port parameter variations on port performance. The variations are derived from actual data in order to provide more realistic results. The model is developed using MATLAB and Simulink based on queuing theory.
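
    The abstract credits queuing theory as the model's backbone without giving algorithmic detail. As a rough conceptual sketch only (plain Python rather than the authors' MATLAB/Simulink implementation, with the arrival rate, service rate, and berth count all assumed for illustration), a Monte Carlo multi-berth queue can be simulated like this:

```python
import random

def simulate_port(n_ships=10000, arrival_rate=0.8, service_rate=1.0,
                  n_berths=2, seed=1):
    """Monte Carlo sketch of ship waiting times at a multi-berth terminal
    (M/M/c queue with exponential inter-arrival and service times)."""
    random.seed(seed)
    t = 0.0
    berth_free = [0.0] * n_berths        # time at which each berth frees up
    waits = []
    for _ in range(n_ships):
        t += random.expovariate(arrival_rate)             # next ship arrives
        k = min(range(n_berths), key=lambda i: berth_free[i])
        start = max(t, berth_free[k])                     # wait if berths busy
        waits.append(start - t)
        berth_free[k] = start + random.expovariate(service_rate)
    return sum(waits) / len(waits)

print("mean waiting time:", simulate_port())
```

    Re-running such simulations with parameters resampled from observed port data is what turns this into the stochastic evaluation the paper describes.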

  3. A Tool for Model-Based Language Specification

    CERN Document Server

    Quesada, Luis; Cubero, Juan-Carlos

    2011-01-01

    Formal languages let us define the textual representation of data with precision. Formal grammars, typically in the form of BNF-like productions, describe the language syntax, which is then annotated for syntax-directed translation and completed with semantic actions. When, apart from the textual representation of data, an explicit representation of the corresponding data structure is required, the language designer has to devise the mapping between the suitable data model and its proper language specification, and then develop the conversion procedure from the parse tree to the data model instance. Unfortunately, whenever the format of the textual representation has to be modified, changes have to be propagated throughout the entire language processor tool chain. These updates are time-consuming, tedious, and error-prone. Besides, when different applications use the same language, several copies of the same language specification have to be maintained. In this paper, we introduce a model-based parser generat...

  4. Development of hydrogeological modelling tools based on NAMMU

    Energy Technology Data Exchange (ETDEWEB)

    Marsic, N. [Kemakta Konsult AB, Stockholm (Sweden); Hartley, L.; Jackson, P.; Poole, M. [AEA Technology, Harwell (United Kingdom); Morvik, A. [Bergen Software Services International AS, Bergen (Norway)

    2001-09-01

    A number of relatively sophisticated hydrogeological models were developed within the SR 97 project to handle issues such as nesting of scales and the effects of salinity. However, these issues and others are considered sufficiently important and general to warrant further development of the hydrogeological methodology. Several such developments based on the NAMMU package are reported here: - Embedded grid: nesting of the regional- and site-scale models within the same numerical model has given greater consistency in the structural model representation and in the flow between scales. Since there is a continuous representation of the regional and site scales, the modelling of pathways from the repository no longer has to be contained wholly by the site-scale region. This allows greater choice in the size of the site-scale. - Implicit Fracture Zones (IFZ): this method of incorporating the structural model is very efficient and allows changes to either the mesh or fracture zones to be implemented quickly. It also supports great flexibility in the properties of the structures and rock mass. - Stochastic fractures: new functionality has been added to IFZ to allow arbitrary combinations of stochastic or deterministic fracture zones with the rock mass. Whether a fracture zone is modelled deterministically or stochastically, its statistical properties can be defined independently. - Stochastic modelling: efficient methods for Monte-Carlo simulation of stochastic permeability fields have been implemented and tested on SKB's computers. - Visualisation: the visualisation tool Avizier for NAMMU has been enhanced such that it is efficient for checking models and presentation. - PROPER interface: NAMMU outputs pathlines in PROPER format so that it can be included in the PA workflow. The developed methods are illustrated by application to stochastic nested modelling of the Beberg site using data from SR 97. The model properties were in accordance with the regional- and site

  5. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes the five dimensions of requirements and three dimensions of characteristics from the SERVQUAL method, and the application methodology from the QFD method. The originality of the SQ model consists in computing a global index that reflects how well the quality characteristics fulfil the customers' requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.
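
    The global index is the core of the SQ model, but the abstract does not give its formula. The Python sketch below shows one plausible weighted-aggregation reading of "requirements accomplishment by quality characteristics"; all weights and scores are invented for illustration and are not the paper's actual computation.

```python
# Illustrative SQ-style global index: SERVQUAL dimension weights combined
# with a QFD-style requirements x characteristics scoring matrix.
dimension_weights = [0.3, 0.25, 0.2, 0.15, 0.1]   # five SERVQUAL dimensions
char_importance = [0.5, 0.3, 0.2]                 # three quality characteristics
scores = [                                        # accomplishment (1-5 scale)
    [4, 3, 5],
    [3, 4, 4],
    [5, 2, 3],
    [4, 4, 4],
    [2, 5, 3],
]

global_index = sum(
    w * sum(c * s for c, s in zip(char_importance, row))
    for w, row in zip(dimension_weights, scores)
)
print(f"global SQ index: {global_index:.2f} / 5")
```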

  6. MQ-2 A Tool for Prolog-based Model Querying

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald

    2012-01-01

    MQ-2 integrates a Prolog console into the MagicDraw1 modeling environment and equips this console with features targeted specifically to the task of querying models. The vision of MQ-2 is to make Prolog-based model querying accessible to both student and expert modelers by offering powerful query...
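
    The abstract is truncated before any query syntax appears, so the sketch below only illustrates the general idea of Prolog-style model querying: model elements become facts, and a query is a pattern with unbound slots. The fact schema and query helper are hypothetical, not MQ-2's actual API.

```python
# Model elements as Prolog-like facts: (kind, name, owner).
facts = [
    ("class", "Order", "model"),
    ("class", "Customer", "model"),
    ("attribute", "total", "Order"),
    ("attribute", "name", "Customer"),
]

def query(kind=None, name=None, owner=None):
    """Match facts against a pattern; None plays the role of an unbound
    Prolog variable."""
    for fact in facts:
        if all(v is None or v == fv
               for v, fv in zip((kind, name, owner), fact)):
            yield fact

# All attributes owned by the Order class:
print(list(query(kind="attribute", owner="Order")))
```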

  7. Ontology-based tools to expedite predictive model construction.

    Science.gov (United States)

    Haug, Peter; Holmen, John; Wu, Xinzi; Mynam, Kumar; Ebert, Matthew; Ferraro, Jeffrey

    2014-01-01

    Large amounts of medical data are collected electronically during the course of caring for patients using modern medical information systems. This data presents an opportunity to develop clinically useful tools through data mining and observational research studies. However, the work necessary to make sense of this data and to integrate it into a research initiative can require substantial effort from medical experts as well as from experts in medical terminology, data extraction, and data analysis. This slows the process of medical research. To reduce the effort required for the construction of computable, diagnostic predictive models, we have developed a system that hybridizes a medical ontology with a large clinical data warehouse. Here we describe components of this system designed to automate the development of preliminary diagnostic models and to provide visual clues that can assist the researcher in planning for further analysis of the data behind these models.

  8. Interactive model evaluation tool based on IPython notebook

    Science.gov (United States)

    Balemans, Sophie; Van Hoey, Stijn; Nopens, Ingmar; Seuntjes, Piet

    2015-04-01

    In hydrological modelling, some kind of parameter optimization is usually performed. This can be the selection of a single best parameter set, a split into behavioural and non-behavioural parameter sets based on a selected threshold, or a posterior parameter distribution derived with a formal Bayesian approach. The selection of the criterion to measure the goodness of fit (likelihood or any objective function) is an essential step in all of these methodologies and will affect the finally selected parameter subset. Moreover, the discriminative power of the objective function also depends on the time period used. In practice, the optimization process is an iterative procedure. As such, in the course of the modelling process, an increasing number of simulations is performed. However, the information carried by these simulation outputs is not always fully exploited. In this respect, we developed and present an interactive environment that enables the user to intuitively evaluate the model performance. The aim is to explore the parameter space graphically and to visualize the impact of the selected objective function on model behaviour. First, a set of model simulation results is loaded along with the corresponding parameter sets and a data set of the same variable as the model outcome (mostly discharge). The ranges of the loaded parameter sets define the parameter space. The user can select which two parameters to visualise. Furthermore, an objective function and a time period of interest need to be selected. Based on this information, a two-dimensional parameter response surface is created, which is essentially a scatter plot of the parameter combinations with a color scale corresponding to the goodness of fit of each parameter combination. Finally, a slider is available to change the color mapping of the points. The slider provides a threshold to exclude non-behavioural parameter sets, and the color scale is only attributed to the
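
    A minimal Python/matplotlib sketch of the described parameter response surface follows: sampled parameter combinations are scattered in 2-D and colored by goodness of fit, with a threshold standing in for the slider that masks non-behavioural sets. The toy objective function and parameter ranges are assumptions, not the authors' setup.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n = 500
p1 = rng.uniform(0, 1, n)                  # two sampled model parameters
p2 = rng.uniform(1, 5, n)
# Toy objective function standing in for e.g. RMSE against discharge data.
rmse = (p1 - 0.4) ** 2 + 0.1 * (p2 - 3) ** 2 + rng.normal(0, 0.01, n)

threshold = 0.1                            # "slider": behavioural cut-off
behavioural = rmse <= threshold

plt.scatter(p1[~behavioural], p2[~behavioural], c="lightgrey", s=10)
sc = plt.scatter(p1[behavioural], p2[behavioural], c=rmse[behavioural], s=15)
plt.colorbar(sc, label="objective function")
plt.xlabel("parameter 1")
plt.ylabel("parameter 2")
plt.show()
```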

  9. Tool-Body Assimilation Model Based on Body Babbling and Neurodynamical System

    Directory of Open Access Journals (Sweden)

    Kuniyuki Takahashi

    2015-01-01

    Full Text Available We propose a new method of tool use based on a tool-body assimilation model that relies on body babbling and a neurodynamical system to let robots use tools. Almost all existing studies of robot tool use require predetermined motions and tool features; the motion patterns are limited and the robots cannot use novel tools. Other studies fully search all available parameters for novel tools, but this leads to massive amounts of calculation. To solve these problems, we took the following approach: we used a humanoid robot model to generate random motions based on human body babbling. These rich motion experiences were used to train recurrent and deep neural networks for modeling a body image. Tool features were self-organized in parametric bias, modulating the body image according to the tool in use. Finally, we designed a neural network for the robot to generate motion only from the target image. Experiments were conducted with multiple tools for manipulating a cylindrical target object. The results show that the tool-body assimilation model is capable of motion generation.

  10. Model-Based Design Tools for Extending COTS Components To Extreme Environments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation in this project is model-based design (MBD) tools for predicting the performance and useful life of commercial-off-the-shelf (COTS) components and...

  11. Model-Based Design Tools for Extending COTS Components To Extreme Environments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation in this Phase I project is to prove the feasibility of using model-based design (MBD) tools to predict the performance and useful life of...

  12. Physics-based Modeling Tools for Life Prediction and Durability Assessment of Advanced Materials Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The technical objectives of this program are: (1) to develop a set of physics-based modeling tools to predict the initiation of hot corrosion and to address pit and...

  13. A NUI Based Multiple Perspective Variability Modelling CASE Tool

    OpenAIRE

    Bashroush, Rabih

    2010-01-01

    With current trends towards moving variability from hardware to software, and given the increasing desire to postpone design decisions as much as is economically feasible, managing the variability from requirements elicitation to implementation is becoming a primary business requirement in the product line engineering process. One of the main challenges in variability management is the visualization and management of industry-size variability models. In this demonstrat...

  14. CPS Modeling of CNC Machine Tool Work Processes Using an Instruction-Domain Based Approach

    Directory of Open Access Journals (Sweden)

    Jihong Chen

    2015-06-01

    Full Text Available Building cyber-physical system (CPS) models of machine tools is a key technology for intelligent manufacturing. The massive electronic data from a computer numerical control (CNC) system during the work processes of a CNC machine tool is the main source of the big data on which a CPS model is established. In this work-process model, a method based on the instruction domain is applied to analyze the electronic big data, and a quantitative description of the numerical control (NC) processes is built according to the G code of the processes. Utilizing the instruction domain, a work-process CPS model is established on the basis of the accurate, real-time mapping of the manufacturing tasks, resources, and status of the CNC machine tool. Using such models, case studies are conducted on intelligent-machining applications, such as the optimization of NC processing parameters and the health assurance of CNC machine tools.

  15. ENISI SDE: A New Web-Based Tool for Modeling Stochastic Processes.

    Science.gov (United States)

    Mei, Yongguo; Carbo, Adria; Hoops, Stefan; Hontecillas, Raquel; Bassaganya-Riera, Josep

    2015-01-01

    Modeling and simulation approaches have been widely used in computational biology, mathematics, bioinformatics and engineering to represent complex existing knowledge and to effectively generate novel hypotheses. While deterministic modeling strategies are widely used in computational biology, stochastic modeling techniques are not as popular due to a lack of user-friendly tools. This paper presents ENISI SDE, a novel web-based modeling tool built on stochastic differential equations. ENISI SDE provides user-friendly web interfaces to facilitate adoption by immunologists and computational biologists. This work provides three major contributions: (1) discussion of SDE as a generic approach for stochastic modeling in computational biology; (2) development of ENISI SDE, a web-based, user-friendly SDE modeling tool that closely resembles regular ODE-based modeling; (3) application of the ENISI SDE modeling tool to a use case studying stochastic sources of cell heterogeneity in the context of CD4+ T cell differentiation. The CD4+ T cell differentiation ODE model has been published [8] and can be downloaded from biomodels.net. The case study reproduces a biological phenomenon that is not captured by the previously published ODE model and shows the effectiveness of SDE as a stochastic modeling approach in biology in general and immunology in particular, and the power of ENISI SDE.
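
    SDE models of the kind ENISI SDE supports are typically integrated with the Euler-Maruyama scheme. The sketch below (plain Python/NumPy, not ENISI SDE's implementation) shows that scheme on a toy production-decay process with multiplicative noise; all rates are invented.

```python
import numpy as np

def euler_maruyama(f, g, x0, t_end, dt, seed=0):
    """Integrate dX = f(X,t) dt + g(X,t) dW with the Euler-Maruyama scheme."""
    rng = np.random.default_rng(seed)
    n = int(t_end / dt)
    xs = np.empty(n + 1)
    xs[0] = x0
    for i in range(n):
        t = i * dt
        dw = rng.normal(0.0, np.sqrt(dt))          # Wiener increment
        xs[i + 1] = xs[i] + f(xs[i], t) * dt + g(xs[i], t) * dw
    return xs

# Toy example: constant production, first-order decay, multiplicative noise.
traj = euler_maruyama(f=lambda x, t: 1.0 - 0.5 * x,
                      g=lambda x, t: 0.2 * x,
                      x0=0.0, t_end=20.0, dt=0.01)
print("final state:", traj[-1])
```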

  16. Force sensor based tool condition monitoring using a heterogeneous ensemble learning model.

    Science.gov (United States)

    Wang, Guofeng; Yang, Yinwei; Li, Zhimeng

    2014-11-14

    Tool condition monitoring (TCM) plays an important role in improving machining efficiency and guaranteeing workpiece quality. In order to realize reliable recognition of the tool condition, a robust classifier needs to be constructed to depict the relationship between tool wear states and sensory information. However, because of the complexity of the machining process and the uncertainty of the tool wear evolution, it is hard for a single classifier to fit all the collected samples without sacrificing generalization ability. In this paper, heterogeneous ensemble learning is proposed to realize tool condition monitoring, in which the support vector machine (SVM), hidden Markov model (HMM) and radial basis function (RBF) are selected as base classifiers and a stacking ensemble strategy is further used to reflect the relationship between the outputs of these base classifiers and the tool wear states. Based on the heterogeneous ensemble learning classifier, an online monitoring system is constructed in which harmonic features are extracted from force signals and a minimal redundancy and maximal relevance (mRMR) algorithm is utilized to select the most prominent features. To verify the effectiveness of the proposed method, a titanium alloy milling experiment was carried out and samples with different tool wear states were collected to build the proposed heterogeneous ensemble learning classifier. Moreover, a homogeneous ensemble learning model and a majority voting strategy were also adopted for comparison. The analysis and comparison results show that the proposed heterogeneous ensemble learning classifier performs better in both classification accuracy and stability.
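
    A compact stacking pipeline in the spirit of the paper can be assembled with scikit-learn, as sketched below. Note the substitutions: an MLP stands in for the HMM/RBF base learners, univariate mutual information stands in for mRMR, and the data are synthetic stand-ins for harmonic force features.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Toy stand-in for harmonic features extracted from force signals,
# labelled by tool wear state (e.g., fresh / worn / severely worn).
X, y = make_classification(n_samples=300, n_features=30, n_informative=8,
                           n_classes=3, random_state=0)

base = [("svm", SVC(probability=True, random_state=0)),
        ("mlp", MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000,
                              random_state=0))]
clf = make_pipeline(
    SelectKBest(mutual_info_classif, k=10),     # stand-in for mRMR selection
    StackingClassifier(estimators=base,
                       final_estimator=LogisticRegression(max_iter=1000)),
)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```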

  17. Visual Basic, Excel-based fish population modeling tool - The pallid sturgeon example

    Science.gov (United States)

    Moran, Edward H.; Wildhaber, Mark L.; Green, Nicholas S.; Albers, Janice L.

    2016-02-10

    The model presented in this report is a spreadsheet-based model using Visual Basic for Applications within Microsoft Excel (http://dx.doi.org/10.5066/F7057D0Z), prepared in cooperation with the U.S. Army Corps of Engineers and U.S. Fish and Wildlife Service. It uses the same model structure and, initially, the same parameters as used by Wildhaber and others (2015) for pallid sturgeon. The difference between the model structure used for this report and that used by Wildhaber and others (2015) is that variance is not partitioned. For the model of this report, all variance is applied at the iteration and time-step levels of the model. Wildhaber and others (2015) partition variance into parameter variance (uncertainty about the value of a parameter itself) applied at the iteration level and temporal variance (uncertainty caused by random environmental fluctuations with time) applied at the time-step level. They included implicit individual variance (uncertainty caused by differences between individuals) within the time-step level. The interface developed for the model of this report is designed to allow the user the flexibility to change population model structure and parameter values and uncertainty separately for every component of the model. This flexibility makes the modeling tool potentially applicable to any fish species; however, the flexibility inherent in this modeling tool also makes it possible for the user to obtain spurious outputs. The value and reliability of the model outputs are only as good as the model inputs. Using this modeling tool with improper or inaccurate parameter values, or for species for which the structure of the model is inappropriate, could lead to untenable management decisions. By facilitating fish population modeling, this modeling tool allows the user to evaluate a range of management options and implications. The goal is to provide a user-friendly tool for developing fish population models useful to natural resource
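
    The report's variance treatment (all uncertainty applied at the iteration and time-step levels) can be illustrated with a tiny Monte Carlo projection, sketched here in Python rather than the report's VBA; the growth-rate distribution and every value are invented for the example.

```python
import numpy as np

def project(n0, years, iterations, mean_growth=1.02, sd=0.15, seed=0):
    """Monte Carlo population projection: one growth-rate draw per time
    step of each iteration, i.e., all variance enters at those two levels."""
    rng = np.random.default_rng(seed)
    pops = np.full(iterations, float(n0))
    for _ in range(years):
        rates = rng.lognormal(np.log(mean_growth), sd, iterations)
        pops *= rates                      # apply this year's stochastic growth
    return pops

out = project(n0=500, years=30, iterations=10000)
print("5th / 50th / 95th percentiles:", np.percentile(out, [5, 50, 95]))
```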

  18. Model-based calculating tool for pollen-mediated gene flow frequencies in plants.

    Science.gov (United States)

    Lei, Wang; Bao-Rong, Lu

    2016-12-30

    The potential socioeconomic and environmental impacts caused by transgene flow from genetically engineered (GE) crops have stimulated worldwide biosafety concerns. Determining the transgene flow frequencies that result from pollination is the first critical step for assessing such impacts, in addition to the determination of transgene expression and fitness in crop-wild hybrid descendants. Two methods are commonly used to estimate pollen-mediated gene flow (PMGF) frequencies: field experiments and mathematical modeling. Field experiments can provide relatively accurate results but are time- and resource-consuming. Modeling offers an effective complement for PMGF experimental assessment. However, many published models describe PMGF by mathematical equations and are not easy to use in practice. To increase the application of PMGF modeling for the estimation of transgene flow, we established a tool to calculate PMGF frequencies based on a quasi-mechanistic PMGF model for wind-pollinated species. This tool includes a calculating program displayed through an easy-to-operate interface. PMGF frequencies of different plant species can be quickly calculated under different environmental conditions by including a number of biological and wind speed parameters that can be measured in the field/laboratory or obtained from published data. The tool is freely available in the public domain (http://ecology.fudan.edu.cn/userfiles/cn/files/Tool_Manual.zip). Case studies including rice, wheat, and maize demonstrated similar results between the frequencies calculated with this tool and those from published PMGF data. This PMGF calculating tool will provide useful information for assessing and monitoring socioeconomic and environmental impacts caused by transgene flow from GE crops. This tool can also be applied to determine the isolation distances between GE and non-GE crops in a coexistence agro-ecosystem, and to ensure the purity of certified seeds by setting proper isolation distances
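
    The calculating program itself is only available from the linked archive, so as a purely illustrative stand-in (not the paper's quasi-mechanistic formula), a PMGF distance curve can be encoded as a simple half-distance decay; the parameters below are invented.

```python
def pmgf_frequency(distance_m, f0=0.01, half_distance_m=10.0):
    """Illustrative distance-decay curve for pollen-mediated gene flow:
    the frequency at the source is f0 and halves every half_distance_m
    metres. A stand-in only, not the published model."""
    return f0 * 0.5 ** (distance_m / half_distance_m)

for d in (1, 10, 50, 100):
    print(f"{d:>3} m -> frequency {pmgf_frequency(d):.2e}")
```

    An isolation distance then follows by inverting the curve for a target frequency, which is the kind of coexistence question the abstract mentions.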

  19. Multiscale Multiphysics-Based Modeling and Analysis on the Tool Wear in Micro Drilling

    Science.gov (United States)

    Niu, Zhichao; Cheng, Kai

    2016-02-01

    In micro-cutting processes, process variables including cutting force, cutting temperature and drill-workpiece interfacing conditions (lubrication and interaction, etc.) significantly affect the tool wear in a dynamic, interactive, in-process manner. The resultant tool life and cutting performance directly affect the component surface roughness, material removal rate and form accuracy control, etc. In this paper, a multiscale multiphysics oriented approach to modeling and analysis is presented, focused particularly on tooling performance in micro drilling processes. Process optimization is also taken into account by establishing the intrinsic relationship between process parameters and cutting performance. The modeling and analysis are evaluated and validated through well-designed machining trials, and further supported by metrology measurements and simulations. The paper concludes with a further discussion on the potential and application of the approach for broad micro manufacturing purposes.

  20. A new web-based modelling tool (Websim-MILQ) aimed at optimisation of thermal treatments in the dairy industry

    NARCIS (Netherlands)

    Schutyser, M.A.I.; Straatsma, J.; Keijzer, P.M.; Verschueren, M.; Jong, de P.

    2008-01-01

    In the framework of a cooperative EU research project (MILQ-QC-TOOL) a web-based modelling tool (Websim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs.


  3. An Innovative Interactive Modeling Tool to Analyze Scenario-Based Physician Workforce Supply and Demand

    Science.gov (United States)

    Gupta, Saurabh; Black-Schaffer, W. Stephen; Crawford, James M.; Gross, David; Karcher, Donald S.; Kaufman, Jill; Knapman, Doug; Prystowsky, Michael B.; Wheeler, Thomas M.; Bean, Sarah; Kumar, Paramhans; Sharma, Raghav; Chamoli, Vaibhav; Ghai, Vikrant; Gogia, Vineet; Weintraub, Sally; Cohen, Michael B.

    2015-01-01

    Effective physician workforce management requires that the various organizations comprising the House of Medicine be able to assess their current and future workforce supply. This information has direct relevance to funding of graduate medical education. We describe a dynamic modeling tool that examines how individual factors and practice variables can be used to measure and forecast the supply and demand for existing and new physician services. The system we describe, while built to analyze the pathologist workforce, is sufficiently broad and robust for use in any medical specialty. Our design provides a computer-based software model populated with data from surveys and best estimates by specialty experts about current and new activities in the scope of practice. The model describes the steps needed and data required for analysis of supply and demand. Our modeling tool allows educators and policy makers, in addition to physician specialty organizations, to assess how various factors may affect demand (and supply) of current and emerging services. Examples of factors evaluated include types of professional services (3 categories with 16 subcategories), service locations, elements related to the Patient Protection and Affordable Care Act, new technologies, aging population, and changing roles in capitated, value-based, and team-based systems of care. The model also helps identify where physicians in a given specialty will likely need to assume new roles, develop new expertise, and become more efficient in practice to accommodate new value-based payment models. PMID:28725751

  4. An Innovative Interactive Modeling Tool to Analyze Scenario-Based Physician Workforce Supply and Demand

    Directory of Open Access Journals (Sweden)

    Saurabh Gupta BPharm

    2015-10-01

    Full Text Available Effective physician workforce management requires that the various organizations comprising the House of Medicine be able to assess their current and future workforce supply. This information has direct relevance to funding of graduate medical education. We describe a dynamic modeling tool that examines how individual factors and practice variables can be used to measure and forecast the supply and demand for existing and new physician services. The system we describe, while built to analyze the pathologist workforce, is sufficiently broad and robust for use in any medical specialty. Our design provides a computer-based software model populated with data from surveys and best estimates by specialty experts about current and new activities in the scope of practice. The model describes the steps needed and data required for analysis of supply and demand. Our modeling tool allows educators and policy makers, in addition to physician specialty organizations, to assess how various factors may affect demand (and supply) of current and emerging services. Examples of factors evaluated include types of professional services (3 categories with 16 subcategories), service locations, elements related to the Patient Protection and Affordable Care Act, new technologies, aging population, and changing roles in capitated, value-based, and team-based systems of care. The model also helps identify where physicians in a given specialty will likely need to assume new roles, develop new expertise, and become more efficient in practice to accommodate new value-based payment models.

  5. Agent-based model of laser hair removal: A treatment optimization and patient education tool

    Directory of Open Access Journals (Sweden)

    Eapen Bell

    2009-01-01

    Full Text Available Background: Tracking the various parameters associated with laser hair removal is tedious and time consuming. The currently available mathematical models are not simple enough for physicians to use as a treatment optimization and patient education tool. Aim: The aim of the study was to develop a mathematical model for laser hair removal using agent-based modeling and to make a user-friendly simulation environment. Methods: The model was created using NetLogo. The hairs were modeled as agents oscillating between anagen and telogen. The variables were assigned based on published data whenever possible, and the various paths the agent could take were coded as conditional statements. The improvement was assessed using an arbitrary index which takes into account the mean diameter and pigmentation along with the number and length of hairs visible above the surface. A few of the commonly encountered scenarios were simulated using the model. Results: The model is made freely available online (http://www.gulfdoctor.net/model/lhr.htm). The limited number of simulations performed indicated that an eight-week gap between laser sessions may be more effective than a four-week gap. Conclusions: The simulation provides a reliable tool for treatment optimization and patient education, as obtaining relevant clinical data is slow and labor-intensive. Its visual interface and online availability make it useful for everyday use.
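
    The abstract describes the agent design precisely enough to caricature: follicle agents oscillate between anagen and telogen, and a laser session mainly destroys follicles in anagen. The Python sketch below mirrors that logic (the actual model is in NetLogo; all probabilities here are assumed, not the published parameters).

```python
import random

class Hair:
    """One follicle agent cycling between growth (anagen) and rest (telogen)."""
    def __init__(self):
        self.anagen = random.random() < 0.85   # assumed fraction in anagen
        self.alive = True

    def step(self):
        if self.alive and random.random() < 0.05:
            self.anagen = not self.anagen      # stochastic phase switch

def laser_session(hairs, kill_prob=0.3):
    # Laser mainly destroys pigmented, actively growing (anagen) follicles.
    for h in hairs:
        if h.alive and h.anagen and random.random() < kill_prob:
            h.alive = False

random.seed(1)
hairs = [Hair() for _ in range(1000)]
for week in range(1, 25):
    for h in hairs:
        h.step()
    if week % 8 == 0:                          # one session every eight weeks
        laser_session(hairs)
print("surviving follicles:", sum(h.alive for h in hairs))
```

    Comparing runs with `week % 8` against `week % 4` reproduces the kind of session-interval question the authors explored.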

  6. Model-based fault diagnosis techniques design schemes, algorithms, and tools

    CERN Document Server

    Ding, Steven

    2008-01-01

    The objective of this book is to introduce basic model-based FDI schemes, advanced analysis and design algorithms, and the needed mathematical and control theory tools at a level for graduate students and researchers as well as for engineers. This is a textbook with extensive examples and references. Most methods are given in the form of an algorithm that enables a direct implementation in a programme. Comparisons among different methods are included when possible.

  7. Visinets: a web-based pathway modeling and dynamic visualization tool.

    Directory of Open Access Journals (Sweden)

    Jozef Spychala

    Full Text Available In this report we describe a novel graphically oriented method for pathway modeling and a software package that allows for both modeling and visualization of biological networks in a user-friendly format. The Visinets mathematical approach is based on causal mapping (CMAP) that has been fully integrated with a graphical interface. Such integration allows for a fully graphical and interactive process of modeling, from building the network to simulation of the finished model. To test the performance of Visinets software we have applied it to: (a) create an executable EGFR-MAPK pathway model using an intuitive graphical way of modeling based on biological data, and (b) translate an existing ordinary differential equation (ODE) based insulin signaling model into CMAP formalism and compare the results. Our testing fully confirmed the potential of the CMAP method for broad application in pathway modeling and visualization and, additionally, showed a significant advantage in computational efficiency. Furthermore, we showed that the Visinets web-based graphical platform, along with its standardized method of pathway analysis, may offer a novel and attractive alternative for dynamic simulation in real time for broader use in biomedical research. Since Visinets uses graphical elements with mathematical formulas hidden from the users, we believe that this tool may be particularly suited for those who are new to pathway modeling and without the in-depth modeling skills often required when using other software packages.

  8. A web GIS based integrated flood assessment modeling tool for coastal urban watersheds

    Science.gov (United States)

    Kulkarni, A. T.; Mohanty, J.; Eldho, T. I.; Rao, E. P.; Mohan, B. K.

    2014-03-01

    Urban flooding has become an increasingly important issue in many parts of the world. In this study, an integrated flood assessment model (IFAM) is presented for coastal urban flood simulation. A web-based GIS framework has been adopted to organize the spatial datasets for the study area considered and to run the model within this framework. The integrated flood model consists of a mass balance based 1-D overland flow model, a 1-D finite element based channel flow model based on the diffusion wave approximation, and a quasi 2-D raster flood inundation model based on the continuity equation. The model code is written in MATLAB and the application is integrated within a web GIS server product, viz. Web Gram Server™ (WGS), developed at IIT Bombay using Java, JSP and jQuery technologies. Its user interface is developed using OpenLayers and the attribute data are stored in a MySQL open source DBMS. The model is integrated within WGS and is called via JavaScript. The application has been demonstrated for two coastal urban watersheds of Navi Mumbai, India. Simulated flood extents for the extreme rainfall event of 26 July 2005 in the two urban watersheds of Navi Mumbai city are presented and discussed. The study demonstrates the effectiveness of the flood simulation tool in a web GIS environment in facilitating data access and the visualization of GIS datasets and simulation results.
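
    Of the three coupled components, the mass-balance overland flow model is the simplest to caricature. The sketch below (Python, not the authors' MATLAB code) advances a 1-D cell array one explicit continuity step; the routing coefficient, rainfall and infiltration rates are assumptions for illustration.

```python
import numpy as np

def overland_step(depth, rain, infil, k=0.5, dt=1.0):
    """One explicit mass-balance step for 1-D overland flow on a slope:
    each cell passes a fraction k*dt of its stored water to the next
    cell downslope (continuity only, no momentum)."""
    outflow = k * dt * depth
    inflow = np.roll(outflow, 1)
    inflow[0] = 0.0                            # upslope boundary: no inflow
    return np.maximum(depth + (rain - infil) * dt - outflow + inflow, 0.0)

depth = np.zeros(20)                           # 20 cells, initially dry
for _ in range(60):                            # one hour of 1-minute steps
    depth = overland_step(depth, rain=0.002, infil=0.0005)
print("depth at outlet cell:", depth[-1])
```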

  9. Population Density Modeling Tool

    Science.gov (United States)

    2014-02-05

    POPULATION DENSITY MODELING TOOL, report NAWCADPAX/TR-2012/194, by Davy Andrew Michael Knott and David Burke, Naval Air Warfare Center Aircraft Division, Patuxent River, Maryland, 26 June 2012.

  10. SBML-PET-MPI: a parallel parameter estimation tool for Systems Biology Markup Language based models.

    Science.gov (United States)

    Zi, Zhike

    2011-04-01

    Parameter estimation is crucial for the modeling and dynamic analysis of biological systems. However, implementing parameter estimation is time consuming and computationally demanding. Here, we introduced a parallel parameter estimation tool for Systems Biology Markup Language (SBML)-based models (SBML-PET-MPI). SBML-PET-MPI allows the user to perform parameter estimation and parameter uncertainty analysis by collectively fitting multiple experimental datasets. The tool is developed and parallelized using the message passing interface (MPI) protocol, which provides good scalability with the number of processors. SBML-PET-MPI is freely available for non-commercial use at http://www.bioss.uni-freiburg.de/cms/sbml-pet-mpi.html or http://sites.google.com/site/sbmlpetmpi/.

  11. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    Full Text Available In this work, DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling such as the use of modelling languages and general-purpose programming languages are analysed. The common set of capabilities required by the typical simulation software are discussed, and the shortcomings of the current approaches recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for the runtime model generation; (2) support for the runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth generation object-oriented general purpose programming language such as Python are discussed. The architecture and the software implementation details as well as the type of problems that can be solved using DAE Tools software are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as a software as a service is demonstrated.

  12. Modeling the milling tool wear by using an evolutionary SVM-based model from milling runs experimental data

    Science.gov (United States)

    Nieto, Paulino José García; García-Gonzalo, Esperanza; Vilán, José Antonio Vilán; Robleda, Abraham Segade

    2015-12-01

    The main aim of this research work is to build a new practical hybrid regression model to predict milling tool wear in a regular cut, as well as in the entry cut and exit cut, of a milling tool. The model is based on Particle Swarm Optimization (PSO) in combination with support vector machines (SVMs). This optimization mechanism involves kernel parameter setting in the SVM training procedure, which significantly influences the regression accuracy. Bearing this in mind, a PSO-SVM-based model, grounded in statistical learning theory, was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. To accomplish the objective of this study, the experimental dataset represents experiments from runs on a milling machine under various operating conditions. In this way, data sampled by three different types of sensors (acoustic emission sensor, vibration sensor and current sensor) were acquired at several positions. A second aim is to determine the factors with the greatest bearing on the milling tool flank wear, with a view to proposing improvements to milling machines. Firstly, this hybrid PSO-SVM-based regression model captures the main ideas of statistical learning theory in order to obtain a good prediction of the dependence between the flank wear (output variable) and the input variables (time, depth of cut, feed, etc.). Indeed, regression with optimal hyperparameters was performed and a determination coefficient of 0.95 was obtained. The agreement of this model with experimental data confirmed its good performance. Secondly, the main advantages of this PSO-SVM-based model are its capacity to produce a simple, easy-to-interpret model, its ability to estimate the contributions of the input variables, and its computational efficiency. Finally, the main conclusions of this study are presented.
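
    The PSO-SVM coupling amounts to a swarm searching SVM hyperparameter space with cross-validated accuracy as the fitness. Below is a self-contained Python sketch with a hand-rolled PSO over an RBF-SVR's (C, gamma); the synthetic data and all swarm constants are stand-ins for the milling dataset and the authors' settings.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# Toy stand-in for sensor-derived features -> flank wear.
X = rng.normal(size=(120, 6))
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=120)

def fitness(log_c, log_gamma):
    """Cross-validated R^2 of an RBF-SVR for given hyperparameters."""
    svr = SVR(C=10.0 ** log_c, gamma=10.0 ** log_gamma)
    return cross_val_score(svr, X, y, cv=3).mean()

# Minimal particle swarm over (log10 C, log10 gamma).
n, dims = 12, 2
pos = rng.uniform([-1, -3], [3, 1], size=(n, dims))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(*p) for p in pos])
gbest = pbest[pbest_val.argmax()]
for _ in range(15):
    r1, r2 = rng.random((n, dims)), rng.random((n, dims))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([fitness(*p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()]
print("best (log10 C, log10 gamma):", gbest, "R^2:", pbest_val.max())
```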

  13. BSim: an agent-based tool for modeling bacterial populations in systems and synthetic biology.

    Directory of Open Access Journals (Sweden)

    Thomas E Gorochowski

    Full Text Available Large-scale collective behaviors such as synchronization and coordination spontaneously arise in many bacterial populations. With systems biology attempting to understand these phenomena, and synthetic biology opening up the possibility of engineering them for our own benefit, there is growing interest in how bacterial populations are best modeled. Here we introduce BSim, a highly flexible agent-based computational tool for analyzing the relationships between single-cell dynamics and population level features. BSim includes reference implementations of many bacterial traits to enable the quick development of new models partially built from existing ones. Unlike existing modeling tools, BSim fully considers spatial aspects of a model allowing for the description of intricate micro-scale structures, enabling the modeling of bacterial behavior in more realistic three-dimensional, complex environments. The new opportunities that BSim opens are illustrated through several diverse examples covering: spatial multicellular computing, modeling complex environments, population dynamics of the lac operon, and the synchronization of genetic oscillators. BSim is open source software that is freely available from http://bsim-bccs.sf.net and distributed under the Open Source Initiative (OSI) recognized MIT license. Developer documentation and a wide range of example simulations are also available from the website. BSim requires Java version 1.6 or higher.

  14. Designing a new tool for modeling and simulation of discrete-event based systems

    OpenAIRE

    2009-01-01

    This paper discusses the design, development, and application of a new Petri net simulator for modeling and simulation of discrete event systems (e.g., information systems). The new tool is called GPenSIM (General Purpose Petri Net Simulator). First, this paper presents the reasons for developing a new tool, through a brief literature study. Second, the design and architectural issues of the tool are given. Finally, an example of the tool's application is presented.

  15. Hybrid Neural Network Approach Based Tool for the Modelling of Photovoltaic Panels

    Directory of Open Access Journals (Sweden)

    Antonino Laudani

    2015-01-01

    Full Text Available A hybrid neural network based tool for identifying the photovoltaic one-diode model is presented. The generalization capabilities of neural networks are used together with the robustness of the reduced form of the one-diode model. Indeed, from the studies performed by the authors and the works present in the literature, it was found that a direct computation of the five parameters via a multiple-input, multiple-output neural network is a very difficult task. The reduced form consists of a series of explicit formulae supporting the neural network which, in our case, is aimed at predicting just two of the five parameters identifying the model: the other three parameters are computed by the reduced form. The present hybrid approach is efficient in terms of computational cost and accurate in the estimation of the five parameters. It constitutes a complete and extremely easy tool suitable for implementation in a microcontroller based architecture. Validations are made on about 10000 PV panels belonging to the California Energy Commission database.
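
    The reduced form rests on the implicit one-diode equation, which for a given voltage must be solved numerically for the current. A minimal sketch follows, assuming plausible parameters for a 60-cell panel (illustrative values only; the neural-network stage of the hybrid approach is omitted).

```python
import numpy as np
from scipy.optimize import brentq

def pv_current(v, iph, i0, rs, rsh, n, vt=0.025852, ns=60):
    """Solve the implicit one-diode equation for the panel current at
    voltage v:  I = Iph - I0*(exp((v + I*Rs)/(n*Ns*Vt)) - 1) - (v + I*Rs)/Rsh."""
    def f(i):
        return (iph - i0 * np.expm1((v + i * rs) / (n * ns * vt))
                - (v + i * rs) / rsh - i)
    return brentq(f, -1.0, iph + 1.0)          # bracket around physical range

# Plausible parameters for a 60-cell panel (illustrative only).
params = dict(iph=8.0, i0=1e-9, rs=0.3, rsh=300.0, n=1.1)
for v in (0.0, 20.0, 30.0):
    print(f"{v:5.1f} V -> {pv_current(v, **params):6.3f} A")
```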

  16. Reticella: An Execution Trace Slicing and Visualization Tool Based on a Behavior Model

    Science.gov (United States)

    Noda, Kunihiro; Kobayashi, Takashi; Yamamoto, Shinichiro; Saeki, Motoshi; Agusa, Kiyoshi

    Program comprehension using dynamic information is one of the key tasks of software maintenance. Software visualization with sequence diagrams is a promising technique to help developers comprehend the behavior of object-oriented systems effectively. There are many tools that can support automatic generation of a sequence diagram from execution traces. However, it is still difficult to understand the behavior because the size of sequence diagrams generated automatically from massive execution traces tends to be beyond a developer's capacity. In this paper, we propose an execution trace slicing and visualization method. Our proposed method is capable of slice calculation based on a behavior model, which can treat dependencies derived from both static and dynamic analysis, and supports various programs, including those with exceptions and multi-threading. We also introduce our tool, which performs the proposed slice calculation on the Eclipse platform. We show the applicability of our proposed method by applying the tool to two Java programs as case studies. As a result, we confirm the effectiveness of our proposed method for understanding the behavior of object-oriented systems.

  17. Establishing a novel modeling tool: a python-based interface for a neuromorphic hardware system.

    Science.gov (United States)

    Brüderle, Daniel; Müller, Eric; Davison, Andrew; Muller, Eilif; Schemmel, Johannes; Meier, Karlheinz

    2009-01-01

    Neuromorphic hardware systems provide new possibilities for the neuroscience modeling community. Due to the intrinsic parallelism of the micro-electronic emulation of neural computation, such models are highly scalable without a loss of speed. However, the communities of software simulator users and neuromorphic engineering in neuroscience are rather disjoint. We present a software concept that provides the possibility to establish such hardware devices as valuable modeling tools. It is based on the integration of the hardware interface into a simulator-independent language which allows for unified experiment descriptions that can be run on various simulation platforms without modification, implying experiment portability and a huge simplification of the quantitative comparison of hardware and simulator results. We introduce an accelerated neuromorphic hardware device and describe the implementation of the proposed concept for this system. An example setup and results acquired by utilizing both the hardware system and a software simulator are demonstrated.

  18. Establishing a novel modeling tool: a python-based interface for a neuromorphic hardware system

    Directory of Open Access Journals (Sweden)

    Daniel Brüderle

    2009-06-01

    Full Text Available Neuromorphic hardware systems provide new possibilities for the neuroscience modeling community. Due to the intrinsic parallelism of the micro-electronic emulation of neural computation, such models are highly scalable without a loss of speed. However, the communities of software simulator users and neuromorphic engineering in neuroscience are rather disjoint. We present a software concept that provides the possibility to establish such hardware devices as valuable modeling tools. It is based on the integration of the hardware interface into a simulator-independent language which allows for unified experiment descriptions that can be run on various simulation platforms without modification, implying experiment portability and a huge simplification of the quantitative comparison of hardware and simulator results. We introduce an accelerated neuromorphic hardware device and describe the implementation of the proposed concept for this system. An example setup and results acquired by utilizing both the hardware system and a software simulator are demonstrated.
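
    The key idea in both records above, a simulator-independent experiment description, can be shown without any neuroscience: the same experiment function runs unchanged against interchangeable backends. The facade below is a conceptual Python sketch only, not the actual hardware interface described in the paper.

```python
class Backend:
    """Minimal stand-in for a simulation backend (software or hardware)."""
    def __init__(self, name, speedup=1.0):
        self.name, self.speedup = name, speedup

    def run(self, duration_ms):
        # A real backend would emulate the network; here we just report.
        print(f"{self.name}: emulating {duration_ms} ms of biological time "
              f"(wall-clock ~{duration_ms / self.speedup:.2f} ms)")

def experiment(sim):
    """One experiment description, runnable on any backend unchanged."""
    sim.run(1000.0)

for backend in (Backend("software simulator", speedup=0.1),
                Backend("accelerated neuromorphic hardware", speedup=1e4)):
    experiment(backend)
```

    The portability claim in the abstract is exactly this: `experiment` never mentions which backend executes it.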

  19. Model-based reasoning: using visual tools to reveal student learning.

    Science.gov (United States)

    Luckie, Douglas; Harrison, Scott H; Ebert-May, Diane

    2011-03-01

    Using visual models is common in science and should become more common in classrooms. Our research group has developed and completed studies on the use of a visual modeling tool, the Concept Connector. This modeling tool consists of an online concept mapping Java applet that has automatic scoring functions we refer to as Robograder. The Concept Connector enables students in large introductory science courses to visualize their thinking through online model building. The Concept Connector's flexible scoring system, based on tested grading schemes as well as instructor input, has enabled >1,000 physiology students to build maps of their ideas about plant and animal physiology with the guidance of automatic and immediate online scoring of homework. Criterion concept maps developed by instructors in this project contain numerous expert-generated or "correct" propositions connecting two concept words together with a linking phrase. In this study, holistic algorithms were used to test automated methods of scoring concept maps that might work as well as a human grader.

  20. Multirule Based Diagnostic Approach for the Fog Predictions Using WRF Modelling Tool

    Directory of Open Access Journals (Sweden)

    Swagata Payra

    2014-01-01

    Full Text Available The prediction of fog onset remains difficult despite the progress in numerical weather prediction. It is a complex process and requires adequate representation of the local perturbations in weather prediction models. It mainly depends upon microphysical and mesoscale processes that act within the boundary layer. This study utilizes a multirule based diagnostic (MRD) approach using postprocessing of the model simulations for fog predictions. The empiricism involved in this approach mainly bridges the gap between mesoscale and microscale variables, which are related to the mechanism of fog formation. Fog occurrence is a common phenomenon during the winter season over Delhi, India, with the passage of western disturbances across the northwestern part of the country accompanied by a significant amount of moisture. This study implements the above-cited approach for the prediction of fog occurrence and onset time over Delhi. For this purpose, a high resolution weather research and forecasting (WRF) model is used for fog simulations. The study involves model validation and postprocessing of the model simulations for the MRD approach and its subsequent application to fog predictions. Through this approach the model identified foggy and nonfoggy days successfully 94% of the time. Further, the onset of fog events is captured within an accuracy of 30–90 minutes. This study demonstrates that the multirule based postprocessing approach is a useful and highly promising tool for improving fog predictions.
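
    A multirule diagnostic reduces, at its core, to conjunctions of thresholded post-processed model fields. The toy Python rule set below conveys the shape of such a scheme; every variable and threshold is an illustrative stand-in, not a rule from the study.

```python
def fog_onset(rh, wind_10m, dT_dt, cloud_cover):
    """Toy multirule fog diagnostic on post-processed NWP fields.
    All thresholds are invented placeholders for the study's rules."""
    rules = [
        rh >= 95,             # near-saturated boundary layer (%)
        wind_10m <= 3.0,      # light 10-m winds (m/s)
        dT_dt < 0,            # ongoing surface cooling (K/h)
        cloud_cover <= 0.3,   # clear skies favour radiative fog
    ]
    return all(rules)

print(fog_onset(rh=97, wind_10m=1.5, dT_dt=-0.4, cloud_cover=0.1))  # True
```

    Scanning such a diagnostic hour by hour through the forecast gives an onset time, which is how a 30-90 minute onset accuracy can be scored.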

  1. Fourier transform based dynamic error modeling method for ultra-precision machine tool

    Science.gov (United States)

    Chen, Guoda; Liang, Yingchun; Ehmann, Kornel F.; Sun, Yazhou; Bai, Qingshun

    2014-08-01

    In some industrial fields, the workpiece surface needs to meet not only surface roughness demands but also strict requirements on multi-scale frequency domain errors. The ultra-precision machine tool is the most important carrier for the ultra-precision machining of parts, and its errors are the key factor influencing the multi-scale frequency domain errors of the machined surface. Volumetric error modeling is the bridge that links machine errors to machined surface errors. However, the error modeling methods available from previous research are hard to use for analyzing the relationship between the dynamic errors of the machine motion components and the multi-scale frequency domain errors of the machined surface, an analysis which provides an important reference for the design and accuracy improvement of ultra-precision machine tools. In this paper, a Fourier transform based dynamic error modeling method is presented, built on the theoretical basis of rigid body kinematics and homogeneous transformation matrices. A case study is carried out, which shows that the proposed method can successfully realize an identical and regular numerical description of the machine dynamic errors and the volumetric errors. The proposed method has strong potential for predicting the frequency domain errors on the machined surface, extracting information about multi-scale frequency domain errors, and analyzing the relationship between the machine motion components and the frequency domain errors of the machined surface.
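
    The Fourier-transform step of such a method decomposes a measured or simulated error motion into spatial-frequency components that can then be related to the machined surface. A minimal NumPy sketch with a synthetic straightness-error trace follows (all amplitudes and periods invented):

```python
import numpy as np

# Synthetic straightness-error trace of a machine slide (illustrative values).
n, dx = 4096, 1e-4                          # samples, sampling step (m)
x = np.arange(n) * dx                       # 0.4096 m of travel
err = (2e-7 * np.sin(2 * np.pi * x / 0.0064)     # 6.4 mm spatial period
       + 5e-8 * np.sin(2 * np.pi * x / 0.0016))  # 1.6 mm spatial period

spec = np.abs(np.fft.rfft(err))
freq = np.fft.rfftfreq(n, d=dx)             # spatial frequency (cycles/m)
top = freq[np.argsort(spec)[-2:]]           # two strongest components
print("dominant spatial wavelengths (mm):", sorted(1e3 / top))
```

    In the paper's framework, such per-axis spectra would be propagated through the homogeneous transformation chain to predict frequency domain errors on the surface.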

  2. Structure Based Thermostability Prediction Models for Protein Single Point Mutations with Machine Learning Tools.

    Science.gov (United States)

    Jia, Lei; Yarlagadda, Ramya; Reed, Charles C

    2015-01-01

    Thermostability issue of protein point mutations is a common occurrence in protein engineering. An application which predicts the thermostability of mutants can be helpful for guiding decision making process in protein design via mutagenesis. An in silico point mutation scanning method is frequently used to find "hot spots" in proteins for focused mutagenesis. ProTherm (http://gibk26.bio.kyutech.ac.jp/jouhou/Protherm/protherm.html) is a public database that consists of thousands of protein mutants' experimentally measured thermostability. Two data sets based on two differently measured thermostability properties of protein single point mutations, namely the unfolding free energy change (ddG) and melting temperature change (dTm) were obtained from this database. Folding free energy change calculation from Rosetta, structural information of the point mutations as well as amino acid physical properties were obtained for building thermostability prediction models with informatics modeling tools. Five supervised machine learning methods (support vector machine, random forests, artificial neural network, naïve Bayes classifier, K nearest neighbor) and partial least squares regression are used for building the prediction models. Binary and ternary classifications as well as regression models were built and evaluated. Data set redundancy and balancing, the reverse mutations technique, feature selection, and comparison to other published methods were discussed. Rosetta calculated folding free energy change ranked as the most influential features in all prediction models. Other descriptors also made significant contributions to increasing the accuracy of the prediction models.

  3. Structure Based Thermostability Prediction Models for Protein Single Point Mutations with Machine Learning Tools.

    Directory of Open Access Journals (Sweden)

    Lei Jia

    Full Text Available Thermostability issue of protein point mutations is a common occurrence in protein engineering. An application which predicts the thermostability of mutants can be helpful for guiding decision making process in protein design via mutagenesis. An in silico point mutation scanning method is frequently used to find "hot spots" in proteins for focused mutagenesis. ProTherm (http://gibk26.bio.kyutech.ac.jp/jouhou/Protherm/protherm.html) is a public database that consists of thousands of protein mutants' experimentally measured thermostability. Two data sets based on two differently measured thermostability properties of protein single point mutations, namely the unfolding free energy change (ddG) and melting temperature change (dTm), were obtained from this database. Folding free energy change calculation from Rosetta, structural information of the point mutations as well as amino acid physical properties were obtained for building thermostability prediction models with informatics modeling tools. Five supervised machine learning methods (support vector machine, random forests, artificial neural network, naïve Bayes classifier, K nearest neighbor) and partial least squares regression are used for building the prediction models. Binary and ternary classifications as well as regression models were built and evaluated. Data set redundancy and balancing, the reverse mutations technique, feature selection, and comparison to other published methods were discussed. Rosetta calculated folding free energy change ranked as the most influential features in all prediction models. Other descriptors also made significant contributions to increasing the accuracy of the prediction models.
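
    A skeletal version of the workflow, including the reverse-mutations augmentation the abstract mentions, fits in a few lines of scikit-learn. The features here are random stand-ins with column 0 playing the role of the Rosetta ddG term, and negating the feature vector as the "reverse mutation" is a deliberate simplification for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Toy stand-ins: column 0 ~ Rosetta ddG estimate, rest ~ structural/physical
# descriptors of the mutation site.
X = rng.normal(size=(400, 5))
ddg = 2.0 * X[:, 0] + 0.3 * rng.normal(size=400)
y = (ddg > 0).astype(int)                     # 1 = destabilising mutation

# Reverse-mutation augmentation: mutation A->B with ddG d implies B->A
# with -d (here caricatured by negating the whole feature vector).
X_aug = np.vstack([X, -X])
y_aug = np.concatenate([y, 1 - y])

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X_aug, y_aug, cv=5).mean().round(3))
print("feature importances:",
      clf.fit(X_aug, y_aug).feature_importances_.round(2))
```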

  4. Model-Based Fault Diagnosis Techniques Design Schemes, Algorithms and Tools

    CERN Document Server

    Ding, Steven X

    2013-01-01

    Guaranteeing a high system performance over a wide operating range is an important issue surrounding the design of automatic control systems with successively increasing complexity. As a key technology in the search for a solution, advanced fault detection and identification (FDI) is receiving considerable attention. This book introduces basic model-based FDI schemes, advanced analysis and design algorithms, and mathematical and control-theoretic tools. This second edition of Model-Based Fault Diagnosis Techniques contains: new material on fault isolation and identification, and fault detection in feedback control loops; extended and revised treatment of systematic threshold determination for systems with both deterministic unknown inputs and stochastic noises; addition of the continuously-stirred tank heater as a representative process-industrial benchmark; and enhanced discussion of residual evaluation in stochastic processes. Model-based Fault Diagno...

  5. MatVPC: A User-Friendly MATLAB-Based Tool for the Simulation and Evaluation of Systems Pharmacology Models.

    Science.gov (United States)

    Biliouris, K; Lavielle, M; Trame, M N

    2015-09-01

    Quantitative systems pharmacology (QSP) models are progressively entering the arena of contemporary pharmacology. The efficient implementation and evaluation of complex QSP models necessitates the development of flexible computational tools that are built into QSP mainstream software. To this end, we present MatVPC, a versatile MATLAB-based tool that accommodates QSP models of any complexity level. MatVPC executes Monte Carlo simulations as well as automatic construction of visual predictive checks (VPCs) and quantified VPCs (QVPCs).
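
    As an illustration of the Monte Carlo/VPC idea that MatVPC automates (MatVPC itself is MATLAB-based; this Python sketch and its one-compartment model are invented for illustration):

```python
# Simulate a population of PK profiles, then summarize them as the percentile
# bands that a visual predictive check (VPC) would plot.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 24, 49)                 # hours
n_sim = 1000

# One-compartment oral PK model with log-normal between-subject variability.
ka = 1.0 * np.exp(0.3 * rng.normal(size=n_sim))   # absorption rate (1/h)
ke = 0.2 * np.exp(0.3 * rng.normal(size=n_sim))   # elimination rate (1/h)
dose_over_v = 10.0                                 # dose/V (mg/L)

conc = dose_over_v * ka[:, None] / (ka - ke)[:, None] * (
    np.exp(-ke[:, None] * t) - np.exp(-ka[:, None] * t))

# A VPC summarizes the simulations as percentile bands over time.
p5, p50, p95 = np.percentile(conc, [5, 50, 95], axis=0)
print("median Cmax:", p50.max(), "90% band at t=2h:", p5[4], p95[4])
```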

  6. Stoffenmanager: a web-based control banding tool using an exposure process model

    NARCIS (Netherlands)

    Marquart, H.; Heussen, H.; Le Feber, M.; Noy, D.; Tielemans, E.; Schinkel, J.; West, J.; Schaaf, D. van der

    2008-01-01

    In the scope of a Dutch programme to reinforce the working conditions policy on hazardous substances, an internet-based tool was developed to help small- and medium-sized companies handle hazardous substances with more care. The heart of this tool, called the Stoffenmanager, is a risk banding scheme.

  7. MbT-Tool: An open-access tool based on Thermodynamic Electron Equivalents Model to obtain microbial-metabolic reactions to be used in biotechnological process.

    Science.gov (United States)

    Araujo, Pablo Granda; Gras, Anna; Ginovart, Marta

    2016-01-01

    Modelling cellular metabolism is a strategic factor in investigating microbial behaviour and interactions, especially for bio-technological processes. A key factor for modelling microbial activity is the calculation of nutrient amounts and products generated as a result of the microbial metabolism. Representing metabolic pathways through balanced reactions is a complex and time-consuming task for biologists, ecologists, modellers and engineers. A new computational tool to represent microbial pathways through microbial metabolic reactions (MMRs) using the approach of the Thermodynamic Electron Equivalents Model has been designed and implemented in the open-access framework NetLogo. This computational tool, called MbT-Tool (Metabolism based on Thermodynamics), can write MMRs for different microbial functional groups, such as aerobic heterotrophs, nitrifiers, denitrifiers, methanogens, sulphate reducers, sulphide oxidizers and fermenters. The MbT-Tool's code contains eighteen organic and twenty inorganic reduction half-reactions, four N-sources (NH4+, NO3-, NO2-, N2) for biomass synthesis and twenty-four microbial empirical formulas, one of which can be determined by the user (CnHaObNc). MbT-Tool is an open-source program capable of writing MMRs based on thermodynamic concepts, applicable to a wide range of academic research aimed at designing, optimizing and modelling microbial activity without requiring extensive chemical, microbiological and programming experience.
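
    The core bookkeeping of the Thermodynamic Electron Equivalents approach can be sketched in a few lines: half-reactions written per electron equivalent are combined as R = fe*Ra + fs*Rc - Rd, with fe + fs = 1. The Python sketch below (not the tool's NetLogo code; the fe/fs split is an arbitrary example) combines standard acetate-donor, oxygen-acceptor and cell-synthesis half-reactions.

```python
# Half-reactions per electron equivalent, written as reductions; coefficients
# are negative on the left-hand side and positive on the right-hand side.
from collections import defaultdict

Rd = {"CO2": -1/8, "HCO3-": -1/8, "H+": -1, "e-": -1,
      "CH3COO-": 1/8, "H2O": 3/8}                      # acetate (electron donor)
Ra = {"O2": -1/4, "H+": -1, "e-": -1, "H2O": 1/2}      # oxygen (electron acceptor)
Rc = {"CO2": -1/5, "HCO3-": -1/20, "NH4+": -1/20, "H+": -1, "e-": -1,
      "C5H7O2N": 1/20, "H2O": 9/20}                    # cell synthesis, NH4+ N source

def combine(fe, fs, Ra, Rc, Rd):
    """Overall reaction R = fe*Ra + fs*Rc - Rd; the electrons should cancel."""
    R = defaultdict(float)
    for coeff, half in ((fe, Ra), (fs, Rc), (-1.0, Rd)):
        for species, c in half.items():
            R[species] += coeff * c
    return {sp: round(c, 4) for sp, c in R.items() if abs(c) > 1e-9}

# Example split between energy (fe) and synthesis (fs); values are illustrative.
print(combine(fe=0.6, fs=0.4, Ra=Ra, Rc=Rc, Rd=Rd))
```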

  8. MbT-Tool: An open-access tool based on Thermodynamic Electron Equivalents Model to obtain microbial-metabolic reactions to be used in biotechnological process

    Directory of Open Access Journals (Sweden)

    Pablo Araujo Granda

    2016-01-01

    Full Text Available Modelling cellular metabolism is a strategic factor in investigating microbial behaviour and interactions, especially for bio-technological processes. A key factor for modelling microbial activity is the calculation of nutrient amounts and products generated as a result of the microbial metabolism. Representing metabolic pathways through balanced reactions is a complex and time-consuming task for biologists, ecologists, modellers and engineers. A new computational tool to represent microbial pathways through microbial metabolic reactions (MMRs) using the approach of the Thermodynamic Electron Equivalents Model has been designed and implemented in the open-access framework NetLogo. This computational tool, called MbT-Tool (Metabolism based on Thermodynamics), can write MMRs for different microbial functional groups, such as aerobic heterotrophs, nitrifiers, denitrifiers, methanogens, sulphate reducers, sulphide oxidizers and fermenters. The MbT-Tool's code contains eighteen organic and twenty inorganic reduction half-reactions, four N-sources (NH4+, NO3−, NO2−, N2) for biomass synthesis and twenty-four microbial empirical formulas, one of which can be determined by the user (CnHaObNc). MbT-Tool is an open-source program capable of writing MMRs based on thermodynamic concepts, applicable to a wide range of academic research aimed at designing, optimizing and modelling microbial activity without requiring extensive chemical, microbiological and programming experience.

  9. Thermal Error Modeling of the CNC Machine Tool Based on Data Fusion Method of Kalman Filter

    Directory of Open Access Journals (Sweden)

    Haitong Wang

    2017-01-01

    Full Text Available This paper presents a modeling methodology for the thermal error of a machine tool. The temperatures predicted by a modified lumped-mass method and the temperatures measured by sensors are fused using a Kalman filter. The fused temperatures, instead of the measured temperatures used in traditional methods, are applied to predict the thermal error. A genetic algorithm is implemented to optimize the parameters in the modified lumped-mass method and the covariances in the Kalman filter. The simulations indicate that the proposed method performs much better than the traditional method of multiple regression analysis (MRA), in terms of prediction accuracy and robustness under a variety of operating conditions. A compensation system was developed based on the Siemens 840D control system. The compensation experiment validated the approach: the thermal error after compensation was reduced dramatically.
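
    The fusion step described above can be illustrated with a minimal scalar Kalman filter in Python; all temperatures, noise covariances and the warming rate below are invented for illustration.

```python
# A scalar Kalman filter blends a lumped-mass model's temperature prediction
# with a noisy sensor reading; the fused temperature would then feed the
# thermal error model. Numbers are invented.
import numpy as np

rng = np.random.default_rng(2)
true_T, T = 20.0, 20.0
P, Q, Rn = 1.0, 0.05, 0.5      # state covariance, process and measurement noise

for step in range(60):
    true_T += 0.3                           # spindle warms up
    # Predict: lumped-mass model says temperature rises ~0.3 C per step.
    T_pred = T + 0.3
    P_pred = P + Q
    # Update with the sensor measurement.
    z = true_T + rng.normal(0, np.sqrt(Rn))
    K = P_pred / (P_pred + Rn)              # Kalman gain
    T = T_pred + K * (z - T_pred)
    P = (1 - K) * P_pred

print(f"fused temperature: {T:.2f} C (true {true_T:.2f} C)")
```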

  10. Development of class model based on blood biochemical parameters as a diagnostic tool of PSE meat.

    Science.gov (United States)

    Qu, Daofeng; Zhou, Xu; Yang, Feng; Tian, Shiyi; Zhang, Xiaojun; Ma, Lin; Han, Jianzhong

    2017-06-01

    A fast, sensitive and effective method based on blood biochemical parameters for the detection of PSE meat was developed in this study. A total of 200 pigs were slaughtered in the same slaughterhouse. Meat quality was evaluated by measuring pH, electrical conductivity and color at 45 min, 2 h and 24 h after slaughter in M. longissimus thoracis et lumborum (LD). Blood biochemical parameters were determined in blood samples collected during carcass bleeding. A principal component analysis (PCA) biplot showed that high levels of exsanguination creatine kinase, lactate dehydrogenase, aspartate aminotransferase, blood glucose and lactate were associated with PSE meat, and these five biochemical parameters were found to be good indicators of PSE meat. Discriminant function analysis (DFA) was able to clearly identify PSE meat using the five biochemical parameters as input data, and the class model is an effective diagnostic tool that can be used in pigs to detect PSE meat and reduce economic losses for the company.

  11. Universal geometric error modeling of the CNC machine tools based on the screw theory

    Science.gov (United States)

    Tian, Wenjie; He, Baiyan; Huang, Tian

    2011-05-01

    The methods to improve the precision of CNC (Computerized Numerical Control) machine tools can be classified into two categories: error prevention and error compensation. Error prevention improves precision through high accuracy in manufacturing and assembly. Error compensation analyzes the source errors that affect the machining accuracy, establishes the error model, and reaches the ideal position and orientation by modifying the trajectory in real time. Error modeling is the key to compensation, so the error modeling method is of great significance. Many researchers have focused on this topic and proposed many methods, but few can fully describe the 6-dimensional configuration error of the machine tools. In this paper, a universal geometric error model of CNC machine tools is obtained using screw theory. The 6-dimensional error vector is expressed as a twist, and the error vector transforms between different frames with the adjoint transformation matrix. This model can describe the overall position and orientation errors of the tool relative to the workpiece entirely. It provides the mathematical model for compensation, and also provides a guideline for the manufacture, assembly and precision synthesis of machine tools.
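
    The frame-transformation step can be sketched briefly: a 6-dimensional error twist is mapped between frames with the adjoint of the homogeneous transform between them. The Python sketch below (with an invented example transform and error twist, and twists ordered as linear-then-angular) illustrates the mechanics.

```python
# Map a 6-D error twist (v; w) between frames with the adjoint transformation.
import numpy as np

def skew(p):
    """3x3 skew-symmetric matrix [p]x such that [p]x @ a = p x a."""
    return np.array([[0, -p[2], p[1]],
                     [p[2], 0, -p[0]],
                     [-p[1], p[0], 0]])

def adjoint(R, p):
    """Adjoint of T = (R, p) acting on twists ordered (linear; angular)."""
    A = np.zeros((6, 6))
    A[:3, :3] = R
    A[:3, 3:] = skew(p) @ R
    A[3:, 3:] = R
    return A

# Example: a small geometric error twist expressed in an axis frame, mapped
# into a workpiece frame located 0.5 m away along x (values invented).
R = np.eye(3)
p = np.array([0.5, 0.0, 0.0])
err_twist = np.array([1e-6, 2e-6, 0.0, 0.0, 0.0, 5e-6])  # (v; w)
print(adjoint(R, p) @ err_twist)
```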

  12. Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool

    Science.gov (United States)

    Torlapati, Jagadish; Prabhakar Clement, T.

    2013-01-01

    We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
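
    Although RT1D itself is VBA code running inside EXCEL, the class of problem it solves can be sketched with a toy explicit finite-difference solver for 1-D advection-dispersion with first-order decay (all parameters below are invented):

```python
# Toy 1-D advection-dispersion-decay solver: upwind advection, central
# dispersion, explicit time stepping with a conservative stability limit.
import numpy as np

L, nx, nt = 1.0, 101, 2000
dx = L / (nx - 1)
v, D, k = 1e-3, 1e-5, 1e-4                # velocity (m/s), dispersion, decay
dt = 0.5 * min(dx / v, dx**2 / (2 * D))   # conservative explicit step

c = np.zeros(nx)
for _ in range(nt):
    cn = c.copy()
    c[1:-1] = (cn[1:-1]
               - v * dt / dx * (cn[1:-1] - cn[:-2])                # advection
               + D * dt / dx**2 * (cn[2:] - 2 * cn[1:-1] + cn[:-2])  # dispersion
               - k * dt * cn[1:-1])                                # decay
    c[0] = 1.0                            # constant-concentration inlet
print("outlet concentration:", c[-2])
```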

  13. PetriCode: A Tool for Template-Based Code Generation from CPN Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Code generation is an important part of model driven methodologies. In this paper, we present PetriCode, a software tool for generating protocol software from a subclass of Coloured Petri Nets (CPNs). The CPN subclass is comprised of hierarchical CPN models describing a protocol system at different...

  14. Hybrid ABC Optimized MARS-Based Modeling of the Milling Tool Wear from Milling Run Experimental Data

    Directory of Open Access Journals (Sweden)

    Paulino José García Nieto

    2016-01-01

    Full Text Available Milling cutters are important cutting tools used in milling machines to perform milling operations, and they are prone to wear and subsequent failure. In this paper, a practical new hybrid model to predict the milling tool wear in a regular cut, as well as entry cut and exit cut, of a milling tool is proposed. The model is based on the optimization technique termed artificial bee colony (ABC) in combination with the multivariate adaptive regression splines (MARS) technique. This optimization mechanism involves the parameter setting in the MARS training procedure, which significantly influences the regression accuracy. An ABC–MARS-based model was therefore successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. Regression with optimal hyperparameters was performed and a determination coefficient of 0.94 was obtained. The ABC–MARS-based model's goodness of fit to experimental data confirmed the good performance of the model. The new model also allowed us to ascertain the parameters most influential on the milling tool flank wear, with a view to proposing improvements to the milling machine. Finally, the conclusions of this study are presented.
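
    A minimal sketch of the ABC mechanics may help; in the paper the bees' objective is the MARS regression error, whereas the toy Python version below simply minimizes a 2-D test function (all colony settings are illustrative):

```python
# Minimal artificial bee colony (ABC): employed bees refine food sources,
# onlookers favor better sources, and scouts replace exhausted ones.
import numpy as np

rng = np.random.default_rng(3)
lo, hi, n_bees, limit, iters = -5.0, 5.0, 20, 10, 200

def f(x):                 # objective to minimize (stand-in for MARS CV error)
    return np.sum(x**2)

X = rng.uniform(lo, hi, (n_bees, 2))        # food sources
fit = np.apply_along_axis(f, 1, X)
trials = np.zeros(n_bees)

for _ in range(iters):
    for phase in ("employed", "onlooker"):
        if phase == "employed":
            idx = np.arange(n_bees)
        else:                               # onlookers prefer better sources
            prob = fit.max() - fit + 1e-12
            idx = rng.choice(n_bees, n_bees, p=prob / prob.sum())
        for i in idx:
            k = rng.integers(n_bees)        # random partner
            d = rng.integers(2)             # random dimension
            cand = X[i].copy()
            cand[d] += rng.uniform(-1, 1) * (X[i, d] - X[k, d])
            cand = np.clip(cand, lo, hi)
            fc = f(cand)
            if fc < fit[i]:
                X[i], fit[i], trials[i] = cand, fc, 0
            else:
                trials[i] += 1
    for i in np.where(trials > limit)[0]:   # scouts abandon exhausted sources
        X[i] = rng.uniform(lo, hi, 2)
        fit[i], trials[i] = f(X[i]), 0

print("best solution:", X[fit.argmin()], "value:", fit.min())
```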

  15. Integrating a Decision Management Tool with UML Modeling Tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Numerous design decisions are made while developing software systems, which influence the architecture of these systems as well as following decisions. A number of decision management tools already exist for capturing, documenting, and maintaining design decisions, but also for guiding developers...... the development process. In this report, we propose an integration of a decision management and a UML-based modeling tool, based on use cases we distill from a case study: the modeling tool shall show all decisions related to a model and allow its users to extend or update them; the decision management tool shall...... trigger the modeling tool to realize design decisions in the models. We define tool-independent concepts and architecture building blocks supporting these use cases and present how they can be implemented in the IBM Rational Software Modeler and Architectural Decision Knowledge Wiki. This seamless...

  16. Data Provenance as a Tool for Debugging Hydrological Models based on Python

    Science.gov (United States)

    Wombacher, A.; Huq, M.; Wada, Y.; Van Beek, R.

    2012-12-01

    There is an increase in the data volume used in hydrological modeling. The increasing data volume requires additional effort in debugging models, since a single output value is influenced by a multitude of input values, making it difficult to keep an overview of the data dependencies. Further, even knowing these dependencies, it is a tedious job to infer all the relevant data values. These data dependencies are also known as data provenance, i.e. the determination of how a particular value has been created and processed. The proposed tool infers the data provenance automatically from a Python script and visualizes the dependencies as a graph, without executing the script. To debug the model, the user specifies the value of interest in space and time; the tool infers all related data values and displays them in the graph. The tool has been evaluated by hydrologists developing a model for estimating the global water demand [1]. The model uses multiple different data sources. The script we analysed has 120 lines of code and used more than 3000 individual files, each of them representing a raster map of 360*720 cells. After importing the data of the files into a SQLite database, the data consume around 40 GB of storage. Using the proposed tool, a modeler is able to select individual values and infer which values have been used to calculate them. Especially in cases of outliers or missing values, it is a beneficial tool that provides the modeler with efficient information for investigating the unexpected behavior of the model. The proposed tool can be applied to many Python scripts and has been tested with other scripts in different contexts. In case a Python script contains an unknown function or class, the tool requests additional information about the used function or class to enable the inference. This information has to be entered only once and can be shared with colleagues or the community. Reference [1] Y. Wada, L. P. H. van Beek, D. Viviroli, H. H. Dürr, R
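
    The static-analysis idea, inferring dependencies from a Python script without executing it, can be illustrated with the standard ast module (a toy script and helper below; this is not the authors' tool):

```python
# Build a variable-dependency graph from a script's assignments via ast,
# then trace the provenance of one output variable.
import ast

src = """
a = load("rain.map")
b = load("temp.map")
c = a * 2
demand = c + b
"""

deps = {}
for node in ast.walk(ast.parse(src)):
    if isinstance(node, ast.Assign) and isinstance(node.targets[0], ast.Name):
        target = node.targets[0].id
        used = {n.id for n in ast.walk(node.value) if isinstance(n, ast.Name)}
        deps[target] = used

def provenance(var, deps):
    """All names that (transitively) influence `var`."""
    out, stack = set(), [var]
    while stack:
        for d in deps.get(stack.pop(), ()):
            if d not in out:
                out.add(d)
                stack.append(d)
    return out

print(provenance("demand", deps))   # {'c', 'b', 'a', 'load'}
```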

  17. Bio-AIMS Collection of Chemoinformatics Web Tools based on Molecular Graph Information and Artificial Intelligence Models.

    Science.gov (United States)

    Munteanu, Cristian R; Gonzalez-Diaz, Humberto; Garcia, Rafael; Loza, Mabel; Pazos, Alejandro

    2015-01-01

    Encoding molecular information into molecular descriptors is the first step in in silico chemoinformatics methods in drug design. Machine learning methods offer a powerful way to find prediction models for specific biological properties of molecules. These models connect molecular structure information, such as atom connectivity (molecular graphs) or the physical-chemical properties of an atom/group of atoms, to the molecular activity (Quantitative Structure-Activity Relationship, QSAR). Due to the complexity of proteins, the prediction of their activity is a complicated task and the interpretation of the models is more difficult. The current review presents a series of 11 prediction models for proteins, implemented as free Web tools on an Artificial Intelligence Model Server in Biosciences, Bio-AIMS (http://bio-aims.udc.es/TargetPred.php). Six tools predict protein activity, two models evaluate drug-protein target interactions and the other three calculate protein-protein interactions. The input information is based on the protein 3D structure for nine models, the 1D peptide amino acid sequence for three tools and drug SMILES formulas for two servers. The molecular graph descriptor-based machine learning models could be useful tools for in silico screening of new peptides/proteins as future drug targets for specific treatments.

  18. A Practitioner Model of the Use of Computer-Based Tools and Resources to Support Mathematics Teaching and Learning.

    Science.gov (United States)

    Ruthven, Kenneth; Hennessy, Sara

    2002-01-01

    Analyzes the pedagogical ideas underpinning teachers' accounts of the successful use of computer-based tools and resources to support the teaching and learning of mathematics. Organizes central themes to form a pedagogical model capable of informing the use of such technologies in classroom teaching and generating theoretical conjectures for…

  19. Graphical Modeling Language Tool

    NARCIS (Netherlands)

    Rumnit, M.

    2003-01-01

    A group in the faculty EE-Math-CS of the University of Twente is developing a graphical modeling language for specifying concurrency in software design. This graphical modeling language has a mathematical background based on the theory of CSP. This language contains the power to create trustworthy software designs.

  20. Model-based development of a course of action scheduling tool

    DEFF Research Database (Denmark)

    Kristensen, Lars Michael; Mechlenborg, Peter; Zhang, Lin

    2008-01-01

    The scheduling capabilities of COAST are based on state space exploration of the embedded CPN model. Planners interact with COAST using a domain-specific graphical user interface (GUI) that hides the embedded CPN model and analysis algorithms. This means that COAST is based on a rigorous semantical model, but the use of formal methods is transparent to the users. Trials of operational planning using COAST have been conducted within the Australian Defence Force.

  1. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    Science.gov (United States)

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.

  2. A New Browser-based, Ontology-driven Tool for Generating Standardized, Deep Descriptions of Geoscience Models

    Science.gov (United States)

    Peckham, S. D.; Kelbert, A.; Rudan, S.; Stoica, M.

    2016-12-01

    Standardized metadata for models is the key to reliable and greatly simplified coupling in model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System). This model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. While having this kind of standardized metadata for each model in a repository opens up a wide range of exciting possibilities, it is difficult to collect this information and a carefully conceived "data model" or schema is needed to store it. Automated harvesting and scraping methods can provide some useful information, but they often result in metadata that is inaccurate or incomplete, and this is not sufficient to enable the desired capabilities. In order to address this problem, we have developed a browser-based tool called the MCM Tool (Model Component Metadata) which runs on notebooks, tablets and smart phones. This tool was partially inspired by the TurboTax software, which greatly simplifies the necessary task of preparing tax documents. It allows a model developer or advanced user to provide a standardized, deep description of a computational geoscience model, including hydrologic models. Under the hood, the tool uses a new ontology for models built on the CSDMS Standard Names, expressed as a collection of RDF files (Resource Description Framework). This ontology is based on core concepts
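
    The kind of machine-readable "deep description" discussed above can be sketched with rdflib; the namespace and property names below are invented stand-ins, not the actual CSDMS ontology terms:

```python
# Express standardized model metadata as RDF triples and serialize to Turtle.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/model-metadata#")  # hypothetical namespace
g = Graph()
g.bind("ex", EX)

model = EX["TopoFlow"]   # example model name
g.add((model, RDF.type, EX.ComputationalModel))
g.add((model, EX.timeSteppingScheme, Literal("explicit Euler")))
g.add((model, EX.spatialGrid, Literal("uniform rectangular")))
g.add((model, EX.outputVariable,
       Literal("land_surface_water__runoff_volume_flux")))

print(g.serialize(format="turtle"))
```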

  3. GIS-based data model and tools for creating and managing two-dimensional cross sections

    Science.gov (United States)

    Whiteaker, Timothy L.; Jones, Norm; Strassberg, Gil; Lemon, Alan; Gallup, Doug

    2012-02-01

    While modern Geographic Information Systems (GIS) software is robust in handling maps and data in plan view, the software generally falls short when representing features in section view. Further complicating the issue is the fact that geologic cross sections are often drawn by connecting a series of wells together that do not fall along a single straight line. In this case, the x-axis of the cross section represents the distance along the set of individual lines connecting the series of wells, effectively "flattening out" the cross section along this path to create a view of the subsurface with which geologists often work in printed folios. Even 3D-enabled GIS cannot handle this type of cross section. A GIS data model and tools for creating and working with two-dimensional cross sections are presented. The data model and tools create a framework that can be applied using ESRI's ArcGIS software, enabling users to create, edit, manage, and print two-dimensional cross sections from within one of the most well-known GIS software packages. The data model is a component of the arc hydro groundwater data model, which means all two-dimensional cross sections are inherently linked to other features in the hydrogeologic domain, including those represented by xyz coordinates in real world space. Thus, the creation of two-dimensional cross sections can be guided by or completely driven from standard GIS data, and geologic interpretations established on two-dimensional cross sections can be translated back to real world coordinates to create three-dimensional features such as fence diagrams, giving GIS users the capacity to characterize the subsurface environment in a variety of integrated views that was not possible before. A case study for the Sacramento Regional Model in California demonstrates the application of the methodology in support of a regional groundwater management plan.
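
    The "flattening" of a bent section line can be sketched in a few lines: the section's x-axis is simply the cumulative distance along the path connecting the wells (well coordinates below are invented):

```python
# Compute section "station" distances along a bent line of wells: the x-axis
# of the flattened two-dimensional cross section.
import numpy as np

# Hypothetical well locations (map coordinates), in section order.
wells = np.array([[0.0, 0.0], [120.0, 35.0], [260.0, 20.0], [400.0, 90.0]])

seg = np.linalg.norm(np.diff(wells, axis=0), axis=1)   # segment lengths
station = np.concatenate([[0.0], np.cumsum(seg)])      # cumulative distance

for (x, y), s in zip(wells, station):
    print(f"well at ({x:6.1f}, {y:6.1f}) -> section station {s:7.1f}")
```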

  4. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    Energy Technology Data Exchange (ETDEWEB)

    JUDI, DAVID [Los Alamos National Laboratory; KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY [Los Alamos National Laboratory; BERSCHEID, ALAN [Los Alamos National Laboratory

    2007-01-17

    This paper describes an automated Channel Morphology Tool (CMT) developed in ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with the rapid cross-section generation the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of the cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.

  5. A new web-based modelling tool (Websim-MILQ) aimed at optimisation of thermal treatments in the dairy industry.

    Science.gov (United States)

    Schutyser, M A I; Straatsma, J; Keijzer, P M; Verschueren, M; De Jong, P

    2008-11-30

    In the framework of a cooperative EU research project (MILQ-QC-TOOL), a web-based modelling tool (Websim-MILQ) was developed for the optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs. It can be applied to existing products and processes, but also to reduce time to market for new products. Important aspects of the tool are its user-friendliness and its specifications customised to the needs of small dairy companies. To challenge the web-based tool, it was applied to optimise thermal treatments in 16 dairy companies producing yoghurt, fresh cream, chocolate milk and cheese. Optimisation with WebSim-MILQ resulted in concrete improvements with respect to the risk of microbial contamination, cheese yield, fouling and production costs. In this paper we illustrate the use of WebSim-MILQ for the optimisation of a cheese milk pasteurisation process, where we could increase the cheese yield (1 extra cheese for each 100 cheeses produced from the same amount of milk) and reduce the risk of contamination of pasteurised cheese milk with thermoresistant streptococci from critical to negligible. In another case we demonstrate the advantage of changing from an indirect to a direct heating method for a UHT process, resulting in 80% less fouling while improving product quality and maintaining product safety.

  6. SIGMA: A Knowledge-Based Simulation Tool Applied to Ecosystem Modeling

    Science.gov (United States)

    Dungan, Jennifer L.; Keller, Richard; Lawless, James G. (Technical Monitor)

    1994-01-01

    The need for better technology to facilitate building, sharing and reusing models is generally recognized within the ecosystem modeling community. The Scientists' Intelligent Graphical Modelling Assistant (SIGMA) creates an environment for model building, sharing and reuse which provides an alternative to more conventional approaches that too often yield poorly documented, awkwardly structured model code. The SIGMA interface presents the user with a list of model quantities which can be selected for computation. Equations to calculate the model quantities may be chosen from an existing library of ecosystem modeling equations, or built using a specialized equation editor. Inputs for the equations may be supplied by data or by calculation from other equations. Each variable and equation is expressed using ecological terminology and scientific units, and is documented with explanatory descriptions and optional literature citations. Automatic scientific unit conversion is supported and only physically-consistent equations are accepted by the system. The system uses knowledge-based semantic conditions to decide which equations in its library make sense to apply in a given situation, and supplies these to the user for selection. The equations and variables are graphically represented as a flow diagram which provides a complete summary of the model. Forest-BGC, a stand-level model that simulates photosynthesis and evapo-transpiration for conifer canopies, was originally implemented in Fortran and subsequently re-implemented using SIGMA. The SIGMA version reproduces daily results and also provides a knowledge base which greatly facilitates inspection, modification and extension of Forest-BGC.

  7. Nuclear fuel cycle system simulation tool based on high-fidelity component modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ames, David E.

    2014-02-01

    The DOE is currently directing extensive research into developing fuel cycle technologies that will enable the safe, secure, economic, and sustainable expansion of nuclear energy. The task is formidable considering the numerous fuel cycle options, the large dynamic systems that each represent, and the necessity to accurately predict their behavior. The path to successfully develop and implement an advanced fuel cycle is highly dependent on the modeling capabilities and simulation tools available for performing useful relevant analysis to assist stakeholders in decision making. Therefore a high-fidelity fuel cycle simulation tool that performs system analysis, including uncertainty quantification and optimization was developed. The resulting simulator also includes the capability to calculate environmental impact measures for individual components and the system. An integrated system method and analysis approach that provides consistent and comprehensive evaluations of advanced fuel cycles was developed. A general approach was utilized allowing for the system to be modified in order to provide analysis for other systems with similar attributes. By utilizing this approach, the framework for simulating many different fuel cycle options is provided. Two example fuel cycle configurations were developed to take advantage of used fuel recycling and transmutation capabilities in waste management scenarios leading to minimized waste inventories.

  8. Analysis of Utility and Use of a Web-Based Tool for Digital Signal Processing Teaching by Means of a Technological Acceptance Model

    Science.gov (United States)

    Toral, S. L.; Barrero, F.; Martinez-Torres, M. R.

    2007-01-01

    This paper presents an exploratory study about the development of a structural and measurement model for the technological acceptance (TAM) of a web-based educational tool. The aim consists of measuring not only the use of this tool, but also the external variables with a significant influence in its use for planning future improvements. The tool,…

  9. 2D Hydrodynamic Based Logic Modeling Tool for River Restoration Decision Analysis: A Quantitative Approach to Project Prioritization

    Science.gov (United States)

    Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.

    2014-12-01

    In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years, and with the availability of robust data sets and computing technology, it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40 mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool has recently been developed to analyze this complex river system, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM). This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors in determining the highest priority locations within the river corridor to implement restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land use limiting factors may constrain restoration options (Beechie et al. 2008). Applying natural resources management actions, like restoration prioritization, is

  10. Model-free stochastic processes studied with q-wavelet-based informational tools

    Energy Technology Data Exchange (ETDEWEB)

    Perez, D.G. [Instituto de Fisica, Pontificia Universidad Catolica de Valparaiso (PUCV), 23-40025 Valparaiso (Chile)]. E-mail: dario.perez@ucv.cl; Zunino, L. [Centro de Investigaciones Opticas, C.C. 124 Correo Central, 1900 La Plata (Argentina) and Departamento de Ciencias Basicas, Facultad de Ingenieria, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina) and Departamento de Fisica, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1900 La Plata (Argentina)]. E-mail: lucianoz@ciop.unlp.edu.ar; Martin, M.T. [Instituto de Fisica (IFLP), Facultad de Ciencias Exactas, Universidad Nacional de La Plata and Argentina's National Council (CONICET), C.C. 727, 1900 La Plata (Argentina)]. E-mail: mtmartin@venus.unlp.edu.ar; Garavaglia, M. [Centro de Investigaciones Opticas, C.C. 124 Correo Central, 1900 La Plata (Argentina) and Departamento de Fisica, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1900 La Plata (Argentina)]. E-mail: garavagliam@ciop.unlp.edu.ar; Plastino, A. [Instituto de Fisica (IFLP), Facultad de Ciencias Exactas, Universidad Nacional de La Plata and Argentina's National Council (CONICET), C.C. 727, 1900 La Plata (Argentina)]. E-mail: plastino@venus.unlp.edu.ar; Rosso, O.A. [Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina)]. E-mail: oarosso@fibertel.com.ar

    2007-04-30

    We undertake a model-free investigation of stochastic processes employing q-wavelet based quantifiers, which constitute a generalization of their Shannon counterparts. It is shown that (i) interesting physical information becomes accessible in this way, (ii) for special q values the quantifiers are more sensitive than the Shannon ones, and (iii) there exists an implicit relationship between the Hurst parameter H and q within this wavelet framework.
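
    A q-generalized wavelet quantifier of the sort described can be sketched with PyWavelets: relative wavelet energies from a discrete wavelet decomposition feed a Tsallis (q) entropy, shown next to its Shannon counterpart (the signal and q value are arbitrary illustrations):

```python
# Relative wavelet energies of a toy stochastic signal, then the Tsallis
# q-entropy S_q = (1 - sum(p^q)) / (q - 1) over those energies.
import numpy as np
import pywt

rng = np.random.default_rng(4)
signal = np.cumsum(rng.normal(size=4096))        # toy random-walk process

coeffs = pywt.wavedec(signal, "db2", level=8)
energies = np.array([np.sum(c**2) for c in coeffs[1:]])  # detail levels only
p = energies / energies.sum()                             # relative energies

q = 0.6
tsallis = (1.0 - np.sum(p**q)) / (q - 1.0)
shannon = -np.sum(p * np.log(p))
print(f"Tsallis S_q = {tsallis:.3f}   Shannon S = {shannon:.3f}")
```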

  11. Monte Carlo simulation as a tool to predict blasting fragmentation based on the Kuz-Ram model

    Science.gov (United States)

    Morin, Mario A.; Ficarazzo, Francesco

    2006-04-01

    Rock fragmentation is considered the most important aspect of production blasting because of its direct effects on the costs of drilling and blasting and on the economics of the subsequent operations of loading, hauling and crushing. Over the past three decades, significant progress has been made in the development of new technologies for blasting applications. These technologies include increasingly sophisticated computer models for blast design and blast performance prediction. Rock fragmentation depends on many variables, such as rock mass properties, site geology, in situ fracturing and blasting parameters, and as such has no complete theoretical solution for its prediction. However, empirical models for the estimation of the size distribution of rock fragments have been developed. In this study, a Monte Carlo-based blast fragmentation simulator, based on the Kuz-Ram fragmentation model, has been developed to predict the entire fragmentation size distribution, taking into account intact and jointed rock properties, the type and properties of explosives, and the drilling pattern. Results produced by this simulator compared quite favorably with real fragmentation data obtained from a quarry blast. It is anticipated that the use of Monte Carlo simulation will increase our understanding of the effects of rock mass and explosive properties on rock fragmentation by blasting, as well as increase our confidence in these empirical models. This understanding will translate into improvements in blasting operations, their corresponding costs and the overall economics of open pit mines and rock quarries.
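
    The Monte Carlo idea can be sketched compactly: sample the rock factor, push each sample through the Kuznetsov equation for the mean fragment size, and read fragment-size fractions off a Rosin-Rammler distribution. The Python sketch below assumes one common statement of these equations, and all blast parameters are invented:

```python
# Propagate rock-factor uncertainty through Kuznetsov + Rosin-Rammler.
import numpy as np

rng = np.random.default_rng(5)
n_sim = 10000

V0 = 8.0 * 3.0 * 10.0      # rock volume per blasthole (m^3)
Qe = 55.0                  # explosive mass per hole (kg)
E = 100.0                  # relative weight strength (ANFO = 100)
A = rng.normal(7.0, 1.0, n_sim)          # rock factor, assumed distribution

# One common form of the Kuznetsov mean fragment size (cm).
X50 = A * (V0 / Qe) ** 0.8 * Qe ** (1 / 6) * (115.0 / E) ** (19 / 30)

n = 1.2                                   # Rosin-Rammler uniformity index
x = 30.0                                  # screen size of interest (cm)
passing = 1.0 - np.exp(-0.693 * (x / X50) ** n)

print(f"X50: {X50.mean():.1f} +/- {X50.std():.1f} cm")
print(f"P(fragment < {x} cm): {passing.mean():.2f}")
```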

  12. Commissioning of a Geant4 based treatment plan simulation tool: linac model and dicom-rt interface

    CERN Document Server

    Cornelius, Iwan; Middlebrook, Nigel; Poole, Christopher; Oborn, Brad; Langton, Christian

    2011-01-01

    A Geant4 based simulation tool has been developed to perform Monte Carlo modelling of a 6 MV Varian iX Clinac. The computer aided design interface of Geant4 was used to accurately model the LINAC components, including the Millennium multi-leaf collimators (MLCs). The simulation tool was verified via simulation of standard commissioning dosimetry data acquired with an ionisation chamber in a water phantom. Verification of the MLC model was achieved by simulation of leaf leakage measurements performed using Gafchromic film in a solid water phantom. An absolute dose calibration capability was added by including a virtual monitor chamber in the simulation. Furthermore, a DICOM-RT interface was integrated with the application to allow the simulation of treatment plans in radiotherapy. The ability of the simulation tool to accurately model leaf movements and doses at each control point was verified by simulation of a widely used intensity-modulated radiation therapy (IMRT) quality assurance (QA) technique, the...

  13. An integrated suite of modeling tools that empower scientists in structure- and property-based drug design

    Science.gov (United States)

    Feng, Jianwen A.; Aliagas, Ignacio; Bergeron, Philippe; Blaney, Jeff M.; Bradley, Erin K.; Koehler, Michael F. T.; Lee, Man-Ling; Ortwine, Daniel F.; Tsui, Vickie; Wu, Johnny; Gobbi, Alberto

    2015-06-01

    Structure- and property-based drug design is an integral part of modern drug discovery, enabling the design of compounds aimed at improving potency and selectivity. However, building molecules using desktop modeling tools can easily lead to poor designs that appear to form many favorable interactions with the protein's active site. Although a proposed molecule looks good on screen and appears to fit into the protein site X-ray crystal structure or pharmacophore model, doing so might require a high-energy small molecule conformation, which would likely be inactive. To help scientists make better design decisions, we have built integrated, easy-to-use, interactive software tools to perform docking experiments, de novo design, shape and pharmacophore based database searches, small molecule conformational analysis and molecular property calculations. Using a combination of these tools helps scientists in assessing the likelihood that a designed molecule will be active and have desirable drug metabolism and pharmacokinetic properties. Small molecule discovery success requires project teams to rapidly design and synthesize potent molecules with good ADME properties. Empowering scientists to evaluate ideas quickly and make better design decisions with easy-to-access and easy-to-understand software on their desktop is now a key part of our discovery process.

  14. CAGEN: A Modern, PC-Based Computer Modeling Tool for Explosive MCG Generators and Attached Loads

    Science.gov (United States)

    Chase, J. B.; Chato, D.; Peterson, G.; Pincosy, P.; Kiuttu, G. F.

    2004-11-01

    We describe the PC-based computer program CAGEN. CAGEN models the performance of many varieties of Magneto-Cumulative Generators (MCGs), or Magnetic Flux Compression Generators (FCGs), that are energized with high explosive (HE). CAGEN models helically wound or coaxial types, which have HE in the interior. Any materials and any HE types may be used. The cylindrical radius of the windings (or outer conductor) and the radius of the armature may vary with axial position. Variable winding width, thickness, and pitch can be represented, and divided windings are allowed. The MHD equations are used to advance the diffusion of magnetic field into the conductors in order to compute resistance, melting, and contact effects. Magnetic pressure effects are included. The MCG model is treated as part of a lumped circuit, which includes the priming circuit, an opening fuse switch, an inline storage inductance, a transformer or a voltage-dividing fuse, a peaking circuit, and several interesting load models. A typical problem will complete in a few seconds to a few minutes. Graphical input, run control, and analysis of results are provided by MathGraf, a CARE'N CO. application.

  15. Left ventricular modelling: a quantitative functional assessment tool based on cardiac magnetic resonance imaging

    Science.gov (United States)

    Conti, C. A.; Votta, E.; Corsi, C.; De Marchi, D.; Tarroni, G.; Stevanella, M.; Lombardi, M.; Parodi, O.; Caiani, E. G.; Redaelli, A.

    2011-01-01

    We present the development and testing of a semi-automated tool to support the diagnosis of left ventricle (LV) dysfunctions from cardiac magnetic resonance (CMR). CMR short-axis images of the LVs were obtained in 15 patients and processed to detect endocardial and epicardial contours and compute volume, mass and regional wall motion (WM). Results were compared with those obtained from manual tracing by an expert cardiologist. Nearest neighbour tracking and finite-element theory were merged to calculate local myocardial strains and torsion. The method was tested on a virtual phantom, on a healthy LV and on two ischaemic LVs with different severities of the pathology. Automated analysis of CMR data was feasible in 13/15 patients: computed LV volumes and wall mass correlated well with manually extracted data. The detection of regional WM abnormalities showed good sensitivity (77.8%), specificity (85.1%) and accuracy (82%). On the virtual phantom, computed local strains differed by less than 14 per cent from the results of a commercial finite-element solver. Strain calculation on the healthy LV showed uniform and synchronized circumferential strains, with peak shortening of about 20 per cent at end systole, progressively higher systolic wall thickening going from base to apex, and a 10° torsion. In the two pathological LVs, synchronicity and homogeneity were partially lost, anomalies being more evident for the more severely injured LV. Moreover, LV torsion was dramatically reduced. Preliminary testing confirmed the validity of our approach, which allowed for the fast analysis of LV function, even though future improvements are possible. PMID:22670208

  16. Community capacity for sustainable community-based dengue prevention and control: domain, assessment tool and capacity building model

    Institute of Scientific and Technical Information of China (English)

    Charuai Suwanbamrung

    2010-01-01

    In order to understand community capacity for sustainable community-based dengue prevention and control, this paper builds on a previous study of the meaning and domains of dengue prevention and control, and proposes an assessment tool and a community capacity building model for sustainable community-based dengue prevention and control in southern Thailand. A study of dengue community capacity domains was conducted using a qualitative method, whereby ten initial community domains were identified by means of a literature review, in-depth interviews with sixty community leaders, and eight focus group discussions with sixty non-leaders in four sub-districts of southern Thailand. In the final study, there were 14 identifiable domains in the leaders group and 11 domains in the non-leaders group. The resulting dengue community capacity assessment tool (DCCAT) consisted of two parts: one for leaders (DCCAT-L) and the other for non-leaders (DCCAT-NL). DCCAT-L was composed of 115 items within 14 domains, and DCCAT-NL of 83 items within 11 domains. The key domains of leaders and non-leaders partially overlapped, covering critical situation management, personal leadership, health care provider capacity, needs assessment, sense of community, leader group networking, communication of dengue information, community leadership, religious capacity, leader group and community networking, resource mobilization, dengue working group, community participation, and continuing activities. The application of the new tool consists of five steps: 1) community preparation, 2) assessment, 3) a community hearing meeting, 4) interventions, and 5) conclusion and improvement. All stakeholders in the community should use the new tool based on a clear understanding of the measurement objectives, the desired outcomes, resources available and characteristics of their community. If communities need to develop and build dengue community capacity, then the designed pre

  17. Simulation of Metal Flow During Friction Stir Welding Based on the Model of Interactive Force Between Tool and Material

    Science.gov (United States)

    Chen, G. Q.; Shi, Q. Y.; Fujiya, Y.; Horie, T.

    2014-04-01

    In this research, the three-dimensional flow of metal in friction stir welding (FSW) was simulated using computational fluid dynamics. Conservation equations for mass, momentum, and energy were solved in three dimensions. The interactive force was imposed as a boundary condition on the tool/material boundary in the model. A strain rate- and temperature-dependent non-Newtonian viscosity was adopted for the calculation of metal flow. The distributions of temperature, velocity, and strain rate were simulated based on the above models. The simulated temperature distribution agreed well with the experimental results. The simulation results showed that the velocity on the pin was much higher than that on the shoulder. From the comparison between the simulation results and the experimental results, the contour line corresponding to a strain rate of 4 s-1 reflected reasonably well the shape of the stir zone, especially at the ground portion.

  18. A new tool for modeling dune field evolution based on an accessible, GUI version of the Werner dune model

    Science.gov (United States)

    Barchyn, Thomas E.; Hugenholtz, Chris H.

    2012-02-01

    Research into aeolian dune form and dynamics has benefited from simple and abstract cellular automata computer models. Many of these models are based upon a seminal framework proposed by Werner (1995). Unfortunately, most versions of this model are not publicly available or are not provided in a format that promotes widespread use. In our view, this hinders progress in linking model simulations to empirical data (and vice versa). To this end, we introduce an accessible, graphical user interface (GUI) version of the Werner model. The novelty of this contribution is that it provides a simple interface and detailed instructions that encourage widespread use and extension of the Werner dune model for research and training purposes. By lowering barriers for researchers to develop and test hypotheses about aeolian dune and dune field patterns, this release addresses recent calls to improve access to earth surface models.
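
    For readers unfamiliar with the Werner framework, a compact sketch of its core loop is shown below (in Python, with shadow zones omitted and avalanching crudely simplified; all parameters are illustrative):

```python
# Minimal Werner-style cellular automaton: sand slabs are picked at random,
# hop downwind, and deposit with higher probability on sandy cells.
import numpy as np

rng = np.random.default_rng(6)
ny, nx = 50, 100
h = np.full((ny, nx), 3)         # slab heights
L, p_sand, p_bare = 5, 0.6, 0.4  # hop length and deposition probabilities

for _ in range(200000):
    i, j = rng.integers(ny), rng.integers(nx)
    if h[i, j] == 0:
        continue
    h[i, j] -= 1                                  # erode one slab
    jj = j
    while True:
        jj = (jj + L) % nx                        # hop downwind (periodic)
        p = p_sand if h[i, jj] > 0 else p_bare
        if rng.random() < p:
            h[i, jj] += 1                         # deposit
            break
    # Crude avalanche: relax an over-steep downwind neighbour slope.
    jn = (jj + 1) % nx
    if h[i, jj] - h[i, jn] > 2:
        h[i, jj] -= 1
        h[i, jn] += 1

print("relief:", h.min(), "-", h.max())
```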

  19. Tools for Model Evaluation

    DEFF Research Database (Denmark)

    Olesen, H. R.

    1998-01-01

    Proceedings of the Twenty-Second NATO/CCMS International Technical Meeting on Air Pollution Modeling and Its Application, held June 6-10, 1997, in Clermont-Ferrand, France.

  20. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study focused on the functionalities that should be covered in order to provide guidance on how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; should a certified tools list one day exist, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product, or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, there was a high level of agreement, enabling the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. The list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions.

  1. A Turbine Based Combined Cycle Engine Inlet Model and Mode Transition Simulation Based on HiTECC Tool

    Science.gov (United States)

    Csank, Jeffrey T.; Stueber, Thomas J.

    2012-01-01

    An inlet system is being tested to evaluate methodologies for a turbine based combined cycle propulsion system to perform a controlled inlet mode transition. Prior to wind tunnel based hardware testing of controlled mode transitions, simulation models are used to test, debug, and validate potential control algorithms. One candidate simulation package for this purpose is the High Mach Transient Engine Cycle Code (HiTECC). The HiTECC simulation package models the inlet system, propulsion systems, thermal energy, geometry, nozzle, and fuel systems. This paper discusses the modification and redesign of the simulation package and control system to represent the NASA large-scale inlet model for Combined Cycle Engine mode transition studies, mounted in NASA Glenn's 10- by 10-Foot Supersonic Wind Tunnel. This model will be used for designing and testing candidate control algorithms before implementation.

  2. GIS-Based Analytical Tools for Transport Planning: Spatial Regression Models for Transportation Demand Forecast

    Directory of Open Access Journals (Sweden)

    Simone Becker Lopes

    2014-04-01

    Full Text Available Considering the importance of spatial issues in transport planning, the main objective of this study was to analyze the results obtained from different approaches to spatial regression models. In the presence of spatial autocorrelation, spatial dependence patterns should be incorporated in the models, since that dependence may affect their predictive power. The results obtained with the spatial regression models were also compared with the results of a multiple linear regression model of the kind typically used in trip generation estimation. The findings support the hypothesis that the inclusion of spatial effects in regression models is important, since the best results were obtained with the alternative models (spatial regression models or models with spatial variables included). This was observed in a case study carried out in the city of Porto Alegre, in the state of Rio Grande do Sul, Brazil, in the stages of specification and calibration of the models, with two distinct datasets.
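
    The inclusion of a spatial effect can be sketched simply: a row-standardized inverse-distance weight matrix W yields a spatially lagged regressor W·x that joins the design matrix. The Python sketch below uses synthetic zones; note that proper spatial lag models require specialized maximum-likelihood or GMM estimators rather than plain least squares:

```python
# Trip-generation regression with a spatially lagged population variable.
import numpy as np

rng = np.random.default_rng(7)
n = 200
xy = rng.uniform(0, 10, (n, 2))                 # zone centroids
pop = rng.uniform(1, 50, n)                     # population (thousands)

d = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
W = 1.0 / (d + np.eye(n))                       # inverse distance
np.fill_diagonal(W, 0.0)
W /= W.sum(axis=1, keepdims=True)               # row-standardize

trips = 2.0 * pop + 5.0 * (W @ pop) + rng.normal(0, 5, n)  # synthetic data

X = np.column_stack([np.ones(n), pop, W @ pop])  # intercept, pop, spatial lag
beta, *_ = np.linalg.lstsq(X, trips, rcond=None)
print("estimated coefficients:", beta)
```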

  3. Process-Based Quality (PBQ) Tools Development

    Energy Technology Data Exchange (ETDEWEB)

    Cummins, J.L.

    2001-12-03

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in the application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced by and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite modeling and downstream model-based technologies, using available or conceptual methods and tools, to achieve maximum economic advantage and advance process-based quality concepts.

  4. Modeling of tool-tissue interactions for computer-based surgical simulation: a literature review

    NARCIS (Netherlands)

    Misra, Sarthak; Ramesh, K.T.; Okamura, Allison M.

    2008-01-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators.

  5. Participatory medicine: model based tools for engaging and empowering the individual.

    Science.gov (United States)

    Sagar, Mark; Broadbent, Elizabeth

    2016-04-06

    The long-term goal of the Virtual Physiological Human and Digital Patient projects is to run 'simulations' of health and disease processes on the virtual or 'digital' patient, and use the results to make predictions about real health and determine the best treatment specifically for an individual. This is termed 'personalized medicine', and is intended to be the future of healthcare. How will people interact and engage with their virtual selves, and how can virtual models be used to motivate people to actively participate in their own healthcare? We discuss these questions, and describe our current efforts to integrate and realistically embody psychobiological models of face-to-face interaction to enliven and increase engagement of virtual humans in healthcare. Overall, this paper highlights the need for attention to the design of human-machine interfaces to address patient engagement in healthcare.

  6. A Microsoft Project-Based Planning, Tracking, and Management Tool for the National Transonic Facility's Model Changeover Process

    Science.gov (United States)

    Vairo, Daniel M.

    1998-01-01

    The removal and installation of sting-mounted wind tunnel models in the National Transonic Facility (NTF) is a multi-task process having a large impact on the annual throughput of the facility. Approximately ten model removal and installation cycles occur annually at the NTF with each cycle requiring slightly over five days to complete. The various tasks of the model changeover process were modeled in Microsoft Project as a template to provide a planning, tracking, and management tool. The template can also be used as a tool to evaluate improvements to this process. This document describes the development of the template and provides step-by-step instructions on its use and as a planning and tracking tool. A secondary role of this document is to provide an overview of the model changeover process and briefly describe the tasks associated with it.

  7. The South Florida Ecosystem Portfolio Model - A Map-Based Multicriteria Ecological, Economic, and Community Land-Use Planning Tool

    Science.gov (United States)

    Labiosa, William B.; Bernknopf, Richard; Hearn, Paul; Hogan, Dianna; Strong, David; Pearlstine, Leonard; Mathie, Amy M.; Wein, Anne M.; Gillen, Kevin; Wachter, Susan

    2009-01-01

    The South Florida Ecosystem Portfolio Model (EPM) prototype is a regional land-use planning Web tool that integrates ecological, economic, and social information and values of relevance to decision-makers and stakeholders. The EPM uses a multicriteria evaluation framework that builds on geographic information system-based (GIS) analysis and spatially-explicit models that characterize important ecological, economic, and societal endpoints and consequences that are sensitive to regional land-use/land-cover (LULC) change. The EPM uses both economics (monetized) and multiattribute utility (nonmonetized) approaches to valuing these endpoints and consequences. This hybrid approach represents a methodological middle ground between rigorous economic and ecological/ environmental scientific approaches. The EPM sacrifices some degree of economic- and ecological-forecasting precision to gain methodological transparency, spatial explicitness, and transferability, while maintaining credibility. After all, even small steps in the direction of including ecosystem services evaluation are an improvement over current land-use planning practice (Boyd and Wainger, 2003). There are many participants involved in land-use decision-making in South Florida, including local, regional, State, and Federal agencies, developers, environmental groups, agricultural groups, and other stakeholders (South Florida Regional Planning Council, 2003, 2004). The EPM's multicriteria evaluation framework is designed to cut across the objectives and knowledge bases of all of these participants. This approach places fundamental importance on social equity and stakeholder participation in land-use decision-making, but makes no attempt to determine normative socially 'optimal' land-use plans. The EPM is thus a map-based set of evaluation tools for planners and stakeholders to use in their deliberations of what is 'best', considering a balancing of disparate interests within a regional perspective. Although

  8. SITDEM: A simulation tool for disease/endpoint models of association studies based on single nucleotide polymorphism genotypes

    Science.gov (United States)

    Oh, Jung Hun; Deasy, Joseph O.

    2016-01-01

    The association analysis between single nucleotide polymorphisms (SNPs) and disease or endpoint in genome-wide association studies (GWAS) has been considered a powerful strategy for investigating genetic susceptibility and for identifying significant biomarkers. Statistical analysis approaches using simulated data have been widely used to evaluate experimental designs and performance measurements. In recent years, a number of authors have proposed methods for the simulation of biological data in the genomic field. However, these methods use large-scale genomic data as a reference to simulate experiments, which may limit their use when the data for a specific study are not available. Few methods use experimental results or observed parameters for simulation. The goal of this study is to develop a Web application called SITDEM to simulate disease/endpoint models with three different approaches based only on parameters observed in GWAS. In our simulation, a key task is to compute the probability of genotypes. Based on that, we randomly sample simulation data. Simulation results are shown as a function of p-value against odds ratio or relative risk of a SNP in dominant and recessive models. Our simulation results show the potential of SITDEM for simulating genotype data. SITDEM could be particularly useful for investigating the relationship among observed parameters for target SNPs and for estimating the number of variables (SNPs) required to result in significant p-values in multiple comparisons. The proposed simulation tool is freely available at http://www.snpmodel.com. PMID:24480173
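
    As an illustration of the genotype-probability step described above, the following minimal Python sketch simulates a dominant-model case-control study under Hardy-Weinberg equilibrium and reports a chi-square p-value; the allele frequency, odds ratio, and sample sizes are illustrative assumptions, not SITDEM's actual implementation.

        import numpy as np
        from scipy.stats import chi2_contingency

        def genotype_probs(maf):
            # Hardy-Weinberg genotype probabilities (AA, Aa, aa) for minor-allele frequency maf
            q = maf
            return np.array([(1 - q) ** 2, 2 * q * (1 - q), q ** 2])

        def simulate_dominant(maf, odds_ratio, n_cases, n_controls, rng):
            # Controls follow HWE; carrier odds in cases are inflated by the odds ratio (dominant model)
            p = genotype_probs(maf)
            p_carrier = p[1] + p[2]
            odds_case = p_carrier / (1 - p_carrier) * odds_ratio
            p_carrier_case = odds_case / (1 + odds_case)
            carriers_case = rng.binomial(n_cases, p_carrier_case)
            carriers_ctrl = rng.binomial(n_controls, p_carrier)
            table = [[carriers_case, n_cases - carriers_case],
                     [carriers_ctrl, n_controls - carriers_ctrl]]
            return chi2_contingency(table)[1]  # p-value of the 2x2 carrier/non-carrier table

        rng = np.random.default_rng(0)
        pvals = [simulate_dominant(0.2, 1.5, 1000, 1000, rng) for _ in range(100)]
        print("median p-value:", np.median(pvals))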

  9. Rapid Tooling Technique Based on Stereolithograph Prototype

    Institute of Scientific and Technical Information of China (English)

    丁浩; 狄平; 顾伟生; 朱世根

    2001-01-01

    A rapid tooling technique based on stereolithography prototypes is investigated. The epoxy tooling process is described in detail, including the formulation of a castable epoxy resin, the curing process, and the choice of release agents; the use of a transitional plaster model is also proposed. Using this technique, a mold for encapsulating mutual inductors in epoxy and an injection mold for plastic soapboxes were produced. Because the process essentially produces a faithful replica of the prototype, the tooling requires very little time and cost, which makes it well suited to trial runs and small-batch production.

  10. Application of flood risk modelling in a web-based geospatial decision support tool for coastal adaptation to climate change

    Directory of Open Access Journals (Sweden)

    P. J. Knight

    2015-02-01

    Full Text Available A pressing problem facing coastal decision makers is the conversion of "high level" but plausible climate change assessments into an effective basis for climate change adaptation at the local scale. Here, we describe a web-based, geospatial decision-support tool (DST) that provides an assessment of the potential flood risk for populated coastal lowlands arising from future sea-level rise, coastal storms and high river flows. This DST has been developed to support operational and strategic decision making by enabling the user to explore the flood hazard from extreme events, changes in the extent of the flood-prone areas with sea-level rise, and thresholds of sea-level rise where current policy and resource options are no longer viable. The DST is built in an open source GIS that uses freely available geospatial data. Flood risk assessments from a combination of LISFLOOD-FP and SWAB models are embedded within the tool; the user interface enables interrogation of different combinations of coastal and river events under rising sea-level scenarios. Users can readily vary the input parameters (sea level, storms, wave height and river flow) relative to the present-day topography and infrastructure to identify combinations where significant regime shifts or "tipping points" occur. Two case studies are used to demonstrate the attributes of the DST with respect to the wider coastal community and the UK energy sector. Examples report on the assets at risk and illustrate the extent of flooding in relation to infrastructure access. This informs an economic assessment of potential losses due to climate change and thus provides local authorities and energy operators with essential information on the feasibility of investment for building resilience into vulnerable components of their area of responsibility.

  11. TOOL FORCE MODEL FOR DIAMOND TURNING

    Institute of Scientific and Technical Information of China (English)

    Wang Hongxiang; Sun Tao; Li Dan; Dong Shen

    2004-01-01

    A new tool force model is presented, based on the process geometry and the characteristics of the force system, in which the forces acting on the tool rake face, the rounded cutting edge, and the clearance face are all considered, and the size effect is accounted for. The model is expected to apply well to conventional diamond turning and may be employed as a tool in the design of diamond tools. This approach is quite different from traditional investigations based primarily on empirical studies. As the depth of cut becomes the same order as the rounded cutting edge radius, sliding along the clearance face due to elastic recovery of the workpiece material and plowing due to the rounded cutting edge may become important in micro-machining, so the forces acting on the cutting edge rounding and the clearance face cannot be neglected. For this reason, it is very important to understand the influence of these parameters on tool forces and to develop a model of the relationship between them.

  12. Application of a flexible lattice Boltzmann method based simulation tool for modelling physico-chemical processes at different scales

    Science.gov (United States)

    Patel, Ravi A.; Perko, Janez; Jacques, Diederik

    2017-04-01

    Often, especially in disciplines related to natural porous media, such as vadose zone or aquifer hydrology or contaminant transport, the spatial and temporal scales on which we need to provide information are larger than the scales on which the processes actually occur. The usual techniques for dealing with these problems assume the existence of a representative elementary volume (REV). However, in order to understand the behavior on larger scales it is important to downscale the problem onto the relevant scale of the processes. Due to limitations of resources (time, memory), downscaling can only be carried to a certain lower scale. At this lower scale several scales may still co-exist: the scale which can be explicitly described, and a scale which needs to be conceptualized by effective properties. Hence, models which are supposed to provide effective properties on relevant scales should be flexible enough to represent a complex pore structure by explicit geometry on the one side and, on the other, processes defined differently (e.g., by effective properties) which emerge on a lower scale. In this work we present a state-of-the-art lattice Boltzmann method based simulation tool applicable to the advection-diffusion equation coupled to geochemical processes. The lattice Boltzmann transport solver can be coupled with an external geochemical solver, which allows a wide range of geochemical reaction networks to be accounted for through thermodynamic databases. Extension to multiphase systems is ongoing. We provide several examples related to the calculation of effective diffusion properties, permeability, and effective reaction rates on a continuum scale based on the pore-scale geometry.
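
    To make the method concrete, here is a self-contained Python sketch of a minimal D1Q2 lattice Boltzmann solver for one-dimensional diffusion in lattice units; the grid size and relaxation time are illustrative assumptions, and the sketch is far simpler than the coupled reactive-transport tool described above.

        import numpy as np

        # Minimal D1Q2 lattice Boltzmann solver for 1D diffusion (lattice units).
        # tau, grid size, and step count are illustrative, not the tool's settings.
        nx, tau, nsteps = 200, 1.0, 2000
        D = tau - 0.5                       # diffusivity implied by tau (D1Q2, lattice units)
        C = np.zeros(nx)
        C[nx // 2] = 1.0                    # initial concentration pulse
        f = np.empty((2, nx))
        f[0] = f[1] = 0.5 * C               # start at equilibrium

        for _ in range(nsteps):
            C = f[0] + f[1]                 # macroscopic concentration
            feq = 0.5 * C                   # equilibrium distribution (zero advection)
            f[0] += (feq - f[0]) / tau      # BGK collision
            f[1] += (feq - f[1]) / tau
            f[0] = np.roll(f[0], 1)         # stream right-movers
            f[1] = np.roll(f[1], -1)        # stream left-movers (periodic domain)

        print("D =", D, "| mass conserved:", np.isclose((f[0] + f[1]).sum(), 1.0))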

  13. Development and implementation of a GIS-based tool for spatial modeling of seismic vulnerability of Tehran

    Directory of Open Access Journals (Sweden)

    M. Hashemi

    2012-12-01

    Full Text Available Achieving sustainable development in countries prone to earthquakes is possible by taking effective measures to reduce vulnerability to earthquakes. In this context, damage assessment of hypothetical earthquakes and planning for disaster management are important issues. Having a computer tool capable of estimating structural and human losses from earthquakes in a specific region may facilitate the decision-making process before and during disasters. Interoperability of this tool with widespread spatial analysis frameworks will expedite the data transfer process. In this study, the earthquake damage assessment (EDA) software tool is developed as an embedded extension within a GIS (geographic information system) environment for the city of Tehran, Iran. This GIS-based extension provides users with a familiar environment to estimate and observe the probable damages and fatalities of a deterministic earthquake scenario. The productivity of this tool is later demonstrated for southern Karoon parish, Region 10, Tehran. Three case studies for three active faults in the area and a comparison of the results with other research substantiated the reliability of this tool for additional earthquake scenarios.

  14. simuwatt - A Tablet Based Electronic Auditing Tool

    Energy Technology Data Exchange (ETDEWEB)

    Macumber, Daniel; Parker, Andrew; Lisell, Lars; Metzger, Ian; Brown, Matthew

    2014-05-08

    'simuwatt Energy Auditor' (TM) is a new tablet-based electronic auditing tool that is designed to dramatically reduce the time and cost to perform investment-grade audits and improve quality and consistency. The tool uses the U.S. Department of Energy's OpenStudio modeling platform and integrated Building Component Library to automate modeling and analysis. simuwatt's software-guided workflow helps users gather required data, and provides the data in a standard electronic format that is automatically converted to a baseline OpenStudio model for energy analysis. The baseline energy model is calibrated against actual monthly energy use to ASHRAE Standard 14 guidelines. Energy conservation measures from the Building Component Library are then evaluated using OpenStudio's parametric analysis capability. Automated reporting creates audit documents that describe recommended packages of energy conservation measures. The development of this tool was partially funded by the U.S. Department of Defense's Environmental Security Technology Certification Program. As part of this program, the tool is being tested at 13 buildings on 5 Department of Defense sites across the United States. Results of the first simuwatt audit tool demonstration are presented in this paper.

  15. Modeling Tool Advances Rotorcraft Design

    Science.gov (United States)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  16. Folder: a MATLAB-based tool for modelling deformation in layered media subject to layer parallel shortening or extension

    Science.gov (United States)

    Adamuszek, Marta; Dabrowski, Marcin; Schmid, Daniel W.

    2016-04-01

    We present Folder, a numerical tool to simulate and analyse structure development in mechanically layered media during layer-parallel shortening or extension. Folder includes a graphical user interface that allows for easy design of complex geometrical models, definition of material parameters (including linear and non-linear rheology), and specification of the type and amount of deformation. It also includes a range of features that facilitate the visualization and examination of relevant quantities, e.g. velocities, stress, rate of deformation, pressure, and finite strain. Folder contains a separate application that illustrates analytical solutions of growth rate spectra for layer-parallel shortening and extension of a single viscous layer. In the study, we also demonstrate a Folder application in which the role of confinement on the growth rate spectrum and the fold shape evolution is presented for the deformation of a single layer subject to layer-parallel shortening. In the case of linear viscous materials used for the layer and matrix, close wall proximity leads to a decrease of the growth rate values. The decrease is more pronounced for larger wavelengths than for smaller ones, and the reduction is greater when the walls are set closer to the layer. Close confinement can also affect the wavelength selection process and significantly shift the position of the dominant wavelength. The influence of wall proximity on the growth rate spectrum in the case of non-linear viscous materials used for the layer and/or matrix is very different from the linear viscous case. We observe multiple maxima in the growth rate spectrum; the number of growth rate maxima, their values and their positions strongly depend on the closeness of the confinement. The maximum growth rate value for a selected range of layer-wall distances is much larger than in the case when the confinement effect is not taken
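
    For orientation, the classical Biot result for a single linear viscous layer in an infinite matrix gives the dominant wavelength around which such growth-rate spectra are organized; the short Python sketch below evaluates it for illustrative viscosities and is not Folder's own implementation, which also handles confinement and non-linear rheology.

        import numpy as np

        # Dominant wavelength of single-layer folding (classical Biot theory for a
        # linear viscous layer in an infinite matrix); values are illustrative.
        def biot_dominant_wavelength(h, mu_layer, mu_matrix):
            """h: layer thickness; mu_layer, mu_matrix: viscosities."""
            return 2.0 * np.pi * h * (mu_layer / (6.0 * mu_matrix)) ** (1.0 / 3.0)

        # Viscosity contrast of 100 gives a dominant wavelength of roughly 16 h.
        print(biot_dominant_wavelength(h=1.0, mu_layer=100.0, mu_matrix=1.0))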

  17. Alien wavelength modeling tool and field trial

    DEFF Research Database (Denmark)

    Sambo, N.; Sgambelluri, A.; Secondini, M.

    2015-01-01

    A modeling tool is presented for pre-FEC BER estimation of PM-QPSK alien wavelength signals. A field trial is demonstrated and used as validation of the tool's correctness. A very close correspondence between the performance of the field trial and the one predicted by the modeling tool has been observed.
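
    For context, pre-FEC BER for QPSK-type formats is commonly estimated from the textbook waterfall curve; the Python sketch below evaluates that generic relation as a function of Eb/N0 and is not the paper's alien-wavelength model, which accounts for link-specific impairments.

        import numpy as np
        from scipy.special import erfc

        # Textbook BER of coherently detected (PM-)QPSK versus Eb/N0 in dB;
        # a generic sketch, not the modeling tool described in the record.
        def qpsk_ber(ebn0_db):
            ebn0 = 10.0 ** (np.asarray(ebn0_db, dtype=float) / 10.0)
            return 0.5 * erfc(np.sqrt(ebn0))

        for snr_db in (6, 8, 10):
            print(snr_db, "dB ->", qpsk_ber(snr_db))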

  18. Hard- and software of real time simulation tools of Electric Power System for adequate modeling power semiconductors in voltage source convertor based HVDC and FACTS

    Directory of Open Access Journals (Sweden)

    Ufa Ruslan A.

    2014-01-01

    Full Text Available The motivation of the presented research is based on the need for new methods and tools for adequate simulation of Flexible Alternating Current Transmission System (FACTS) devices and High Voltage Direct Current Transmission (HVDC) systems as part of real electric power systems (EPS). To that end, a hybrid approach for advanced simulation of FACTS and HVDC based on voltage source converters (VSC) is proposed. The presented simulation results of the developed hybrid VSC model confirm the achievement of the desired properties of the model and the effectiveness of the proposed solutions.

  19. Modeling bacteriophage amplification as a predictive tool for optimized MALDI-TOF MS-based bacterial detection.

    Science.gov (United States)

    Cox, Christopher R; Rees, Jon C; Voorhees, Kent J

    2012-11-01

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) is a valuable tool for rapid bacterial detection and identification but is limited by the need for relatively high cell count samples grown under strictly controlled conditions. These requirements can be eliminated by the natural infection of a viable bacterial species of interest with a host-specific phage. This produces a rapid increase in phage protein concentrations relative to bacterial concentrations, which can in turn be exploited as a method for signal amplification during MALDI-TOF MS. One drawback to this approach is the need for repetitive, time-consuming sample preparation and analysis over the course of a phage infection, in order to monitor phage concentrations as a function of time and so determine the MALDI-TOF MS detection limit. To reduce the need for repeated preparation and analysis, a modified phage therapy model was investigated as a means of predicting the time during a given phage infection when a detectable signal would occur. The modified model used a series of three differential equations composed of predetermined experimental parameters, including phage burst size and burst time, to predict progeny phage concentrations as a function of time. Using Yersinia pestis with the plague diagnostic phage φA1122 and Escherichia coli with phage MS2 as two separate, well-characterized model phage-host pairs, we conducted in silico modeling of the infection process and compared it with experimental infections monitored in real time by MALDI-TOF MS. Significant agreement between mathematically calculated phage growth curves and those experimentally obtained by MALDI-TOF MS was observed, thus verifying this method's utility for significant time and labor reduction.
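
    A minimal sketch of this kind of three-ODE phage amplification model is shown below (susceptible hosts, infected hosts, free phage); the rate constants, burst size, and initial titers are illustrative assumptions, not the measured parameters of φA1122 or MS2.

        import numpy as np
        from scipy.integrate import odeint

        # Three-ODE phage amplification model: susceptible bacteria S, infected
        # bacteria I, free phage P. All parameter values are illustrative.
        def rates(y, t, mu, k, lam, burst):
            S, I, P = y
            dS = mu * S - k * S * P            # host growth minus adsorption
            dI = k * S * P - lam * I           # infection minus lysis (lam ~ 1/burst time)
            dP = burst * lam * I - k * S * P   # burst release minus adsorption losses
            return [dS, dI, dP]

        t = np.linspace(0.0, 3.0, 300)         # hours
        y0 = [1e7, 0.0, 1e5]                   # initial S (CFU/mL), I, P (PFU/mL)
        sol = odeint(rates, y0, t, args=(0.5, 1e-9, 2.0, 100.0))
        print("final phage titer ~ %.2e PFU/mL" % sol[-1, 2])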

  20. PROCARB: A Database of Known and Modelled Carbohydrate-Binding Protein Structures with Sequence-Based Prediction Tools

    Directory of Open Access Journals (Sweden)

    Adeel Malik

    2010-01-01

    Full Text Available Understanding of the three-dimensional structures of proteins that interact with carbohydrates covalently (glycoproteins) as well as noncovalently (protein-carbohydrate complexes) is essential to many biological processes and plays a significant role in normal and disease-associated functions. It is important to have a central repository of knowledge available about these protein-carbohydrate complexes as well as preprocessed data of predicted structures. This can be significantly enhanced by tools that predict carbohydrate-binding sites de novo for proteins whose binding site structure is not experimentally known. PROCARB is an open-access database comprising three independently working components, namely, (i) the Core PROCARB module, consisting of three-dimensional structures of protein-carbohydrate complexes taken from the Protein Data Bank (PDB); (ii) the Homology Models module, consisting of manually developed three-dimensional models of N-linked and O-linked glycoproteins of unknown three-dimensional structure; and (iii) the CBS-Pred prediction module, consisting of web servers to predict carbohydrate-binding sites using a single sequence or server-generated PSSM. Several precomputed structural and functional properties of complexes are also included in the database for quick analysis. In particular, information about function, secondary structure, solvent accessibility, hydrogen bonds, literature references, and so forth, is included. In addition, each protein in the database is mapped to Uniprot, Pfam, PDB, and so forth.

  1. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    Science.gov (United States)

    Staccini, Pascal M; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate their conceptual applicability and practical understandability by clinical staff and members of user teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of the clinical context while providing elements for analysis. One solution for filling this gap is to consider the process model itself in the role of a hub, as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed, notably regarding its ability to generate data dictionaries and to be used as a navigation tool through hospital-wide documentation.

  2. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  3. Fire behavior modeling-a decision tool

    Science.gov (United States)

    Jack Cohen; Bill Bradshaw

    1986-01-01

    The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...

  4. Spatial Modeling Tools for Cell Biology

    Science.gov (United States)

    2006-10-01

    [Front-matter and figure residue from the original report; recoverable captions: Figure 4.1: Cell biology simulation environment (Open Microscopy Environment, pre-CoBi model library, CFDRC CoBi tools, JigCell tools); Figure 5.1: Computational results for a diffusion problem on a planar square thin film.]

  5. On the Development of a Java-Based Tool for Multifidelity Modeling of Coupled Systems LDRD Final Report

    CERN Document Server

    Gardner, D R; Gonzáles, M A; Hennigan, G L; Young, M

    2002-01-01

    This report describes research and development of methods to couple vastly different subsystems and physical models and to encapsulate these methods in a Java™-based framework. The work described here focused on developing a capability to enable design engineers and safety analysts to perform multifidelity, multiphysics analyses more simply. In particular this report describes a multifidelity algorithm for thermal radiative heat transfer and illustrates its performance. Additionally, it describes a module-based computer software architecture that facilitates multifidelity, multiphysics simulations. The architecture is currently being used to develop an environment for modeling the effects of radiation on electronic circuits in support of the FY 2003 Hostile Environments Milestone for the Accelerated Strategic Computing Initiative.

  6. A Performance-Based Web Budget Tool

    Science.gov (United States)

    Abou-Sayf, Frank K.; Lau, Wilson

    2007-01-01

    A web-based formula-driven tool has been developed for the purpose of performing two distinct academic department budgeting functions: allocating funding to the department, and budget management by the department. The tool's major features are discussed and its uses demonstrated. The tool's advantages are presented. (Contains 10 figures.)

  7. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it, need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial and error based experiments......-based framework is that in the design, development and/or manufacturing of a chemical product-process, the knowledge of the applied phenomena together with the product-process design details can be provided with diverse degrees of abstractions and details. This would allow the experimental resources......, are the needed models for such a framework available? Or, are modelling tools that can help to develop the needed models available? Can such a model-based framework provide the needed model-based work-flows matching the requirements of the specific chemical product-process design problems? What types of models...

  8. Aligning building information model tools and construction management methods

    NARCIS (Netherlands)

    Hartmann, Timo; van Meerveld, H.J.; Vossebeld, N.; Adriaanse, Adriaan Maria

    2012-01-01

    Few empirical studies exist that can explain how different Building Information Model (BIM) based tool implementation strategies work in practical contexts. To help overcome this gap, this paper describes the implementation of two BIM based tools, the first to support the activities at an estimat

  9. High Performance Multiphase Combustion Tool Using Level Set-Based Primary Atomization Coupled with Flamelet Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovative methodologies proposed in this STTR Phase 1 project will enhance Loci-STREAM which is a high performance, high fidelity simulation tool already being...

  10. High Performance Multiphase Combustion Tool Using Level Set-Based Primary Atomization Coupled with Flamelet Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovative methodologies proposed in this STTR Phase 2 project will enhance Loci-STREAM which is a high performance, high fidelity simulation tool already being...

  11. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define...... a taxonomy of aspects around conservation, constraints and constitutive relations. Aspects of the ICAS-MoT toolbox are given to illustrate the functionality of a computer aided modelling tool, which incorporates an interface to MS Excel....

  12. GridTool: A surface modeling and grid generation tool

    Science.gov (United States)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to but not on the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool are stored parametrically, so that once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI C, the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. Memory is allocated dynamically; therefore, memory size will depend on the complexity of the geometry/grid. GridTool's data structure is based on a linked-list structure which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects such as points, curves, patches, sources and surfaces. At any given time there is always an active object, which is drawn in magenta or in its highlighted color as defined by the resource file discussed later.

  13. XLISP-Stat Tools for Building Generalised Estimating Equation Models

    Directory of Open Access Journals (Sweden)

    Thomas Lumley

    1996-12-01

    Full Text Available This paper describes a set of Lisp-Stat tools for building Generalised Estimating Equation models to analyse longitudinal or clustered measurements. The user interface is based on the built-in regression and generalised linear model prototypes, with the addition of object-based error functions, correlation structures and model formula tools. Residual and deletion diagnostic plots are available on the cluster and observation level and use the dynamic graphics capabilities of Lisp-Stat.

  14. Model Analysis ToolKit

    Energy Technology Data Exchange (ETDEWEB)

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes:
    - define parameters
    - define observations
    - define model (python function)
    - define samplesets (sets of parameter combinations)
    Currently supported functionality includes:
    - forward model runs
    - Latin-Hypercube sampling of parameters
    - multi-dimensional parameter studies
    - parallel execution of parameter samples
    - model calibration using internal Levenberg-Marquardt algorithm
    - model calibration using lmfit package
    - model calibration using levmar package
    - Markov Chain Monte Carlo using pymc package
    MATK facilitates model analysis using:
    - scipy - calibration (scipy.optimize)
    - rpy2 - Python interface to R
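
    The Latin-Hypercube parameter-study workflow that MATK automates can be illustrated in plain scipy; note that this sketch does not use MATK's own API, and the toy model and bounds are assumptions for illustration.

        import numpy as np
        from scipy.stats import qmc

        # Latin-hypercube parameter study in plain scipy, illustrating the kind
        # of workflow MATK automates (this is NOT MATK's API).
        def model(params):
            k1, k2 = params
            return k1 * np.exp(-k2)                    # toy forward model

        sampler = qmc.LatinHypercube(d=2, seed=42)
        unit = sampler.random(n=50)                    # 50 samples in [0, 1)^2
        samples = qmc.scale(unit, l_bounds=[0.1, 0.0], u_bounds=[10.0, 2.0])
        outputs = np.array([model(p) for p in samples])  # forward model runs
        print("output range:", outputs.min(), "-", outputs.max())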

  15. DATA QUALITY TOOLS FOR DATAWAREHOUSE MODELS

    Directory of Open Access Journals (Sweden)

    JASPREETI SINGH

    2015-05-01

    Full Text Available Data quality tools aim at detecting and correcting data problems that influence the accuracy and efficiency of data analysis applications. Data warehousing activities require data quality tools to prepare the data and ensure that clean data populates the warehouse, thus raising the usability of the warehouse. This research targets the data problems that are addressed by data quality tools. We classify data quality tools based on the data warehouse stages at which they operate and on the tool features that address data quality problems, and we describe their functionalities.

  16. GPCR-SSFE 2.0-a fragment-based molecular modeling web tool for Class A G-protein coupled receptors.

    Science.gov (United States)

    Worth, Catherine L; Kreuchwig, Franziska; Tiemann, Johanna K S; Kreuchwig, Annika; Ritschel, Michele; Kleinau, Gunnar; Hildebrand, Peter W; Krause, Gerd

    2017-06-05

    G-protein coupled receptors (GPCRs) are key players in signal transduction, and therefore a large proportion of pharmaceutical drugs target these receptors. Structural data of GPCRs are sparse yet important for elucidating the molecular basis of GPCR-related diseases and for performing structure-based drug design. To ameliorate this problem, GPCR-SSFE 2.0 (http://www.ssfa-7tmr.de/ssfe2/), an intuitive web server dedicated to providing three-dimensional Class A GPCR homology models, has been developed. The updated web server includes 27 inactive template structures and incorporates various new functionalities. Uniquely, it uses a fingerprint correlation scoring strategy for identifying the optimal templates, which we demonstrate captures structural features that sequence similarity alone cannot. Template selection is carried out separately for each helix, allowing both single-template models and fragment-based models to be built. Additionally, GPCR-SSFE 2.0 stores a comprehensive set of pre-calculated and downloadable homology models and also incorporates interactive loop modeling using the tool SL2, allowing knowledge-based input by the user to guide the selection process. For visual analysis, the NGL viewer is embedded into the result pages. Finally, blind testing using two recently published structures shows that GPCR-SSFE 2.0 performs comparably to or better than other state-of-the-art GPCR modeling web servers. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
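
    A schematic Python sketch of per-helix template selection by fingerprint correlation follows; the random fingerprint vectors and template names are placeholders, and GPCR-SSFE's actual fingerprint encoding and scoring details may differ.

        import numpy as np

        # Schematic per-helix template selection: score each candidate template by
        # the Pearson correlation of its helix "fingerprint" vector with the
        # target's, and pick the best template per helix. All data are invented.
        rng = np.random.default_rng(1)
        target_fp = {f"TM{i}": rng.random(20) for i in range(1, 8)}   # 7 helices
        templates = {name: {h: rng.random(20) for h in target_fp}
                     for name in ("ADRB2", "RHO", "A2AR")}

        for helix, fp in target_fp.items():
            scores = {name: np.corrcoef(fp, tmpl[helix])[0, 1]
                      for name, tmpl in templates.items()}
            best = max(scores, key=scores.get)
            print(helix, "->", best, round(scores[best], 2))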

  17. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere- ocean model called ModelE, written in fixed format Fortran.

  18. Collaboro: a collaborative (meta modeling tool

    Directory of Open Access Journals (Sweden)

    Javier Luis Cánovas Izquierdo

    2016-10-01

    Full Text Available Software development is becoming more and more collaborative, emphasizing the role of end-users in the development process to make sure the final product will satisfy customer needs. This is especially relevant when developing Domain-Specific Modeling Languages (DSMLs), which are modeling languages specifically designed to carry out the tasks of a particular domain. While end-users are actually the experts of the domain for which a DSML is developed, their participation in the DSML specification process is still rather limited nowadays. In this paper, we propose a more community-aware language development process by enabling the active participation of all community members (both developers and end-users) from the very beginning. Our proposal, called Collaboro, is based on a DSML itself, enabling the representation of change proposals during the language design and the discussion (and trace back) of possible solutions, comments and decisions that arise during the collaboration. Collaboro also incorporates a metric-based recommender system to help community members to define high-quality notations for the DSMLs. We also show how Collaboro can be used at the model level to facilitate the collaborative specification of software models. Tool support is available both as an Eclipse plug-in and as a web-based solution.

  19. Constraint-based model of Shewanella oneidensis MR-1 metabolism: a tool for data analysis and hypothesis generation.

    Directory of Open Access Journals (Sweden)

    Grigoriy E Pinchuk

    2010-06-01

    Full Text Available Shewanellae are gram-negative facultatively anaerobic metal-reducing bacteria commonly found in chemically (i.e., redox) stratified environments. Occupying such niches requires the ability to rapidly acclimate to changes in electron donor/acceptor type and availability; hence, the ability to compete and thrive in such environments must ultimately be reflected in the organization and utilization of electron transfer networks, as well as central and peripheral carbon metabolism. To understand how Shewanella oneidensis MR-1 utilizes its resources, the metabolic network was reconstructed. The resulting network consists of 774 reactions, 783 genes, and 634 unique metabolites and contains biosynthesis pathways for all cell constituents. Using constraint-based modeling, we investigated aerobic growth of S. oneidensis MR-1 on numerous carbon sources. To achieve this, we (i) used experimental data to formulate a biomass equation and estimate cellular ATP requirements, (ii) developed an approach to identify cycles (such as futile cycles and circulations), (iii) classified how reaction usage affects cellular growth, (iv) predicted cellular biomass yields on different carbon sources and compared model predictions to experimental measurements, and (v) used experimental results to refine metabolic fluxes for growth on lactate. The results revealed that aerobic lactate-grown cells of S. oneidensis MR-1 used less efficient enzymes to couple electron transport to proton motive force generation, and possibly operated at least one futile cycle involving malic enzymes. Several examples are provided whereby model predictions were validated by experimental data, in particular the role of serine hydroxymethyltransferase and the glycine cleavage system in the metabolism of one-carbon units, and growth on different sources of carbon and energy. This work illustrates how integration of computational and experimental efforts facilitates the understanding of microbial metabolism at a
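
    The constraint-based (flux balance) analysis underlying such predictions reduces to a linear program: maximize a biomass flux subject to steady-state mass balance and flux bounds. The toy three-reaction network below is an illustrative assumption, orders of magnitude smaller than the 774-reaction reconstruction described above.

        import numpy as np
        from scipy.optimize import linprog

        # Toy flux balance analysis: maximize biomass flux v_biomass subject to
        # steady-state mass balance S @ v = 0 and flux bounds. Network is invented.
        #            v_uptake  v_convert  v_biomass
        S = np.array([[ 1.0,     -1.0,      0.0],     # metabolite A balance
                      [ 0.0,      1.0,     -1.0]])    # metabolite B balance
        bounds = [(0.0, 10.0), (0.0, None), (0.0, None)]   # uptake capped at 10
        c = [0.0, 0.0, -1.0]                  # linprog minimizes, so negate biomass
        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
        print("max biomass flux:", res.x[2])  # limited by the uptake bound (10)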

  20. Play it forward: A Game-based tool for Sustainable Product and Business Model Innovation in the Fuzzy Front End

    NARCIS (Netherlands)

    Dewulf, K.R.

    2010-01-01

    Dealing with sustainability in the fuzzy front end of innovation is complex and often hard. There are a number of tools available to guide designers, engineers and managers in the design process after the specifications of the product or service are already set, but methods supporting goal finding

  3. Simulation Tool for Inventory Models: SIMIN

    OpenAIRE

    Pratiksha Saxen; Tulsi Kushwaha

    2014-01-01

    In this paper, an integrated simulation optimization model for the inventory system is developed, together with an effective algorithm to evaluate and analyze the stored simulation results. The paper proposes the simulation tool SIMIN (Inventory Simulation) to simulate inventory models; SIMIN simulates and compares the results of different inventory models. To overcome various restrictive practical assumptions, SIMIN provides values for a number of performance measurement...

  4. Combination Machine Tool Design System Based on the B/S Model

    Institute of Scientific and Technical Information of China (English)

    郭志强; 李月琴

    2011-01-01

    Taking PRO/E, the SQL Server 2005 database, and VS.NET as the development platform, and adopting top-down design techniques, a parametric 3D model library of combination machine tools was established and a combination machine tool design system based on the B/S (browser/server) model was developed. Guided by the system through a browser, users can complete the overall design of a combination machine tool, including power calculation for the drive unit, automatic matching of machine tool modules, automatic assembly, and automatic generation of two-dimensional drawings, as well as the detailed design of modules such as the spindle box. The system improves product development capability, shortens the product design cycle, and reduces production cost.

  5. ANSYS tools in modeling tires

    Science.gov (United States)

    Ali, Ashraf; Lovell, Michael

    1995-08-01

    This presentation summarizes the capabilities in the ANSYS program that relate to the computational modeling of tires. The power and the difficulties associated with modeling nearly incompressible rubber-like materials using hyperelastic constitutive relationships are highlighted from a developer's point of view. The topics covered include a hyperelastic material constitutive model for rubber-like materials, a general overview of contact-friction capabilities, and the acoustic fluid-structure interaction problem for noise prediction. Brief theoretical development and example problems are presented for each topic.

  6. The mathematical and computer modeling of the worm tool shaping

    Science.gov (United States)

    Panchuk, K. L.; Lyashkov, A. A.; Ayusheev, T. V.

    2017-06-01

    Traditionally, mathematical profiling of the worm tool is carried out by T. Olivier's first method, known in the theory of gearing, which requires obtaining an intermediate surface of the generating rack. This complicates the profiling process and its realization by means of computer 3D modeling. The purpose of this work is to improve the mathematical model of profiling and to realize it using 3D modeling methods. The research problems are: deriving a mathematical model of profiling that excludes the generating rack; realizing the resulting model by means of wireframe and surface modeling; and developing and testing a solid modeling technology for solving the profiling problem. The kinematic method for studying mutually enveloping surfaces is taken as the basis. Computer research is carried out by means of CAD based on 3D modeling methods. We have developed a mathematical model of worm tool profiling; wireframe, surface, and solid models of the shaping of the mutually enveloping surfaces of the workpiece and the tool have been obtained. The proposed mathematical models and 3D modeling technologies are tools for theoretical and experimental profiling of the worm tool. The results of this research can be used in the design of metal-cutting tools.

  7. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Science.gov (United States)

    Biggs, Matthew B; Papin, Jason A

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  9. Advanced REACH tool: A Bayesian model for occupational exposure assessment

    NARCIS (Netherlands)

    McNally, K.; Warren, N.; Fransman, W.; Entink, R.K.; Schinkel, J.; Van Tongeren, M.; Cherrie, J.W.; Kromhout, H.; Schneider, T.; Tielemans, E.

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate sourc

  10. A Components Library System Model and the Support Tool

    Institute of Scientific and Technical Information of China (English)

    MIAO Huai-kou; LIU Hui; LIU Jing; LI Xiao-bo

    2004-01-01

    Component-based development needs a well-designed components library and a set of support tools. This paper presents the design and implementation of a components library system model and its support tool UMLCASE. A set of practical CASE tools is constructed. UMLCASE uses UML to design Use Case Diagrams, Class Diagrams, etc., and integrates with the components library system.

  11. A Hybrid Tool for User Interface Modeling and Prototyping

    Science.gov (United States)

    Trætteberg, Hallvard

    Although many methods have been proposed, model-based development methods have only to some extent been adopted for UI design. In particular, they are not easy to combine with user-centered design methods. In this paper, we present a hybrid UI modeling and GUI prototyping tool, which is designed to fit better with IS development and UI design traditions. The tool includes a diagram editor for domain and UI models and an execution engine that integrates UI behavior, live UI components and sample data. Thus, both model-based user interface design and prototyping-based iterative design are supported.

  12. Knowledge base navigator facilitating regional analysis inter-tool communication.

    Energy Technology Data Exchange (ETDEWEB)

    Hampton, Jeffery Wade; Chael, Eric Paul; Hart, Darren M.; Merchant, Bion John; Chown, Matthew N.

    2004-08-01

    To make use of some portions of the National Nuclear Security Administration (NNSA) Knowledge Base (KB) for which no current operational monitoring applications were available, Sandia National Laboratories have developed a set of prototype regional analysis tools (MatSeis, EventID Tool, CodaMag Tool, PhaseMatch Tool, Dendro Tool, Infra Tool, etc.), and we continue to maintain and improve these. Individually, these tools have proven effective in addressing specific monitoring tasks, but collectively their number and variety tend to overwhelm KB users, so we developed another application - the KB Navigator - to launch the tools and facilitate their use for real monitoring tasks. The KB Navigator is a flexible, extensible java application that includes a browser for KB data content, as well as support to launch any of the regional analysis tools. In this paper, we will discuss the latest versions of KB Navigator and the regional analysis tools, with special emphasis on the new overarching inter-tool communication methodology that we have developed to make the KB Navigator and the tools function together seamlessly. We use a peer-to-peer communication model, which allows any tool to communicate with any other. The messages themselves are passed as serialized XML, and the conversion from Java to XML (and vice versa) is done using Java Architecture for XML Binding (JAXB).
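
    The peer-to-peer, serialized-XML message passing described above can be sketched as follows; Python stands in for the Java/JAXB implementation, and every element and attribute name here is invented for illustration.

        import xml.etree.ElementTree as ET

        # Sketch of a tool-to-tool message serialized as XML, in the spirit of the
        # peer-to-peer communication model described above. Names are invented.
        def make_message(sender, receiver, command, payload):
            msg = ET.Element("toolMessage", sender=sender, receiver=receiver)
            ET.SubElement(msg, "command").text = command
            body = ET.SubElement(msg, "payload")
            for key, value in payload.items():
                ET.SubElement(body, "param", name=key).text = str(value)
            return ET.tostring(msg, encoding="unicode")

        xml = make_message("MatSeis", "EventID Tool", "loadEvent",
                           {"eventId": 12345, "phase": "Pn"})
        print(xml)   # a receiving tool would parse this with ET.fromstring(xml)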

  13. miRNA expression profiling in a human stem cell-based model as a tool for developmental neurotoxicity testing

    OpenAIRE

    2013-01-01

    The main aim of this study was to evaluate whether microRNA (miRNA) profiling could be a useful tool for in vitro developmental neurotoxicity (DNT) testing. Therefore, to identify possible DNT biomarkers among miRNAs, we studied the changes in miRNA expression in a mixed neuronal/glial culture derived from carcinoma pluripotent stem cells (NT2 cell line) after exposure to MetHgCl during the process of neuronal differentiation (2-36 DIV). The obtained results identified the presence ...

  14. Comparison of two different modelling tools

    DEFF Research Database (Denmark)

    Brix, Wiebke; Elmegaard, Brian

    2009-01-01

    In this paper a test case is solved using two different modelling tools, Engineering Equation Solver (EES) and WinDali, in order to compare the tools. The system of equations solved, is a static model of an evaporator used for refrigeration. The evaporator consists of two parallel channels......, and it is investigated how a non-uniform airflow influences the refrigerant mass flow rate distribution and the total cooling capacity of the heat exchanger. It is shown that the cooling capacity decreases significantly with increasing maldistribution of the airflow. Comparing the two simulation tools it is found...

  15. MATT: Multi Agents Testing Tool Based Nets within Nets

    Directory of Open Access Journals (Sweden)

    Sara Kerraoui

    2016-12-01

    As part of this effort, we propose a model-based testing approach for multi-agent systems built on reference nets, together with a tool that aims to provide a uniform and automated approach. The feasibility and the advantages of the proposed approach are shown through a short case study.

  16. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    -friendly system, which will make the model development process easier and faster and provide the way for unified and consistent model documentation. The modeller can use the template for their specific problem or to extend and/or adopt a model. This is based on the idea of model reuse, which emphasizes the use...... and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer...... aided methods and tools, that include procedures to perform model translation, model analysis, model verification/validation, model solution and model documentation; 4) model transfer – export/import to/from other application for further extension and application – several types of formats, such as XML...

  18. Using ComBase Predictor and Pathogen Modeling Program as support tools in outbreak investigation: an example from Denmark

    DEFF Research Database (Denmark)

    Møller, Cleide; Hansen, Tina Beck; Andersen, Jens Kirk

    2009-01-01

    of salt to the batter. A deterministic model was constructed in Microsoft Excel using information on the production of the implicated sausage. This model predicted the level of Y. enterocolitica to increase 2.3, 4.2 and 7.8 log-units during fermentation, drying and storage, respectively. At the point...
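
    A minimal sketch of the bookkeeping behind such a deterministic growth model is shown below; the log-unit increases are those quoted in the record (read here as cumulative increases), while the initial contamination level is an illustrative assumption.

        import numpy as np

        # Cumulative predicted log10 increases per stage, taken from the record;
        # the assumed initial level (1 CFU/g) is illustrative.
        cumulative = {"fermentation": 2.3, "drying": 4.2, "storage": 7.8}
        level0 = np.log10(1.0)
        for stage, total in cumulative.items():
            print(f"after {stage:12s}: 10^{level0 + total:.1f} CFU/g")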

  19. A comprehensive tool for image-based generation of fetus and pregnant women mesh models for numerical dosimetry studies

    Science.gov (United States)

    Dahdouh, S.; Varsier, N.; Serrurier, A.; De la Plata, J.-P.; Anquez, J.; Angelini, E. D.; Wiart, J.; Bloch, I.

    2014-08-01

    Fetal dosimetry studies require the development of accurate numerical 3D models of the pregnant woman and the fetus. This paper proposes a 3D articulated fetal growth model covering the main phases of pregnancy and a pregnant woman model combining the utero-fetal structures and a deformable non-pregnant woman body envelope. The structures of interest were automatically or semi-automatically (depending on the stage of pregnancy) segmented from a database of images and surface meshes were generated. By interpolating linearly between fetal structures, each one can be generated at any age and in any position. A method is also described to insert the utero-fetal structures in the maternal body. A validation of the fetal models is proposed, comparing a set of biometric measurements to medical reference charts. The usability of the pregnant woman model in dosimetry studies is also investigated, with respect to the influence of the abdominal fat layer.

  20. Cockpit System Situational Awareness Modeling Tool

    Science.gov (United States)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  1. Stress urinary incontinence animal models as a tool to study cell-based regenerative therapies targeting the urethral sphincter.

    Science.gov (United States)

    Herrera-Imbroda, Bernardo; Lara, María F; Izeta, Ander; Sievert, Karl-Dietrich; Hart, Melanie L

    2015-03-01

    Urinary incontinence (UI) is a major health problem causing a significant social and economic impact, affecting more than 200 million people (women and men) worldwide. Over the past few years, researchers have been investigating cell therapy as a promising approach for the treatment of stress urinary incontinence (SUI), since such an approach may improve the function of a weakened sphincter. Currently, a diverse collection of SUI animal models is available. We describe the features of the different models of SUI/urethral dysfunction and the pros and cons of these animal models with regard to cell therapy applications. We also discuss different cell therapy approaches and cell types tested in preclinical animal models. Finally, we propose new research approaches and perspectives to ensure that the use of cellular therapy becomes a real treatment option for SUI.

  2. A pandemic influenza modeling and visualization tool

    Energy Technology Data Exchange (ETDEWEB)

    Maciejewski, Ross; Livengood, Philip; Rudolph, Stephen; Collins, Timothy F.; Ebert, David S.; Brigantic, Robert T.; Corley, Courtney D.; Muller, George A.; Sanders, Stephen W.

    2011-08-01

    The National Strategy for Pandemic Influenza outlines a plan for community response to a potential pandemic. In this outline, state and local communities are charged with enhancing their preparedness. In order to help public health officials better understand these charges, we have developed a modeling and visualization toolkit (PanViz) for analyzing the effect of decision measures implemented during a simulated pandemic influenza scenario. Spread vectors based on the point of origin and distance traveled over time are calculated, and the factors of age distribution and population density are taken into account. Healthcare officials are able to explore the effects of the pandemic on the population through a spatiotemporal view, moving forward and backward through time and inserting decision points at various days to determine the impact. Linked statistical displays are also shown, providing county-level summaries of data in terms of the number of sick, hospitalized and dead as a result of the outbreak. Currently, this tool has been deployed in Indiana State Department of Health planning and preparedness exercises, and as an educational tool for demonstrating the impact of social distancing strategies during the recent H1N1 (swine flu) outbreak.

  3. A Telemetry Parameter Configuration Tool Based on Graphic Modeling

    Institute of Scientific and Technical Information of China (English)

    罗毓芳; 李强; 韩洪波

    2012-01-01

    A graphic modeling tool is designed for configuring the telemetry parameters of multiple satellites. Physical objects are represented as independent modules, and graphic elements and a GUI (Graphic User Interface) are used to establish their logical model; telemetry parameter configuration information and graphic models can be bound to it. Based on the characteristics of the model, a specification for describing telemetry parameter configuration information is drawn up, the logical model is described with this specification, and the resulting model is stored in XML (eXtensible Markup Language) file format. The tool can be reused among different satellites. Engineering applications show that the tool makes the configuration process visual and accessible to non-specialists; at the same time, it reduces repetitive workload, lowers the risk of mistakes, and increases the efficiency and quality of the configuration process by batch-processing multi-satellite telemetry parameter configuration.
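
    To illustrate the kind of XML model file such a tool might emit, here is a minimal Python sketch; the element and attribute names (satelliteModel, module, parameter, bind) are hypothetical stand-ins, not the tool's actual schema.

```python
# Minimal sketch of storing a telemetry-parameter configuration model as XML.
# All tag and attribute names below are invented for illustration.
import xml.etree.ElementTree as ET

model = ET.Element("satelliteModel", name="SAT-01")
module = ET.SubElement(model, "module", id="power-subsystem")   # one physical object
param = ET.SubElement(module, "parameter", code="TM1023", unit="V")
param.set("calibration", "linear")
ET.SubElement(param, "bind", element="battery-bus-icon")        # graphic-model binding

ET.ElementTree(model).write("sat01_telemetry.xml", encoding="utf-8",
                            xml_declaration=True)
```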

  4. Plant microRNA-target interaction identification model based on the integration of prediction tools and support vector machine.

    Directory of Open Access Journals (Sweden)

    Jun Meng

    Full Text Available Confident identification of microRNA-target interactions is significant for studying the function of microRNA (miRNA). Although some computational miRNA target prediction methods have been proposed for plants, the results of different methods tend to be inconsistent and usually lead to many false positives. To address these issues, we developed an integrated model for identifying plant miRNA-target interactions. Three online miRNA target prediction toolkits and machine learning algorithms were integrated to identify and analyze Arabidopsis thaliana miRNA-target interactions. Principal component analysis (PCA) feature extraction and self-training technology were introduced to improve the performance. Results showed that the proposed model outperformed previously existing methods. The results were validated using degradome-sequencing-supported Arabidopsis thaliana miRNA-target interactions. The proposed model, constructed on Arabidopsis thaliana, was run over Oryza sativa and Vitis vinifera to demonstrate that it is effective for other plant species. The integrated model of online predictors and a local PCA-SVM classifier yielded credible, high-quality miRNA-target interactions. The supervised learning algorithm of the PCA-SVM classifier was employed in plant miRNA target identification for the first time. Its performance can be substantially improved if more experimentally proven training samples are provided.
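
    A minimal scikit-learn sketch of a PCA-SVM classifier with self-training, in the spirit of the pipeline described above; the feature matrix and labels below are synthetic placeholders, and the real system additionally integrates three online target-prediction toolkits.

```python
# Sketch of a PCA-SVM classifier wrapped in self-training (semi-supervised).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))            # placeholder miRNA-target pair features
y = np.where(rng.random(200) < 0.5, 1, 0) # placeholder labels
y[:120] = -1                              # -1 marks unlabeled samples for self-training

base = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(probability=True))
clf = SelfTrainingClassifier(base, threshold=0.9).fit(X, y)
print(clf.predict(X[:5]))                 # predicted interaction labels
```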

  5. Rapid response tools and datasets for post-fire modeling: Linking Earth Observations and process-based hydrological models to support post-fire remediation

    Science.gov (United States)

    M. E. Miller; M. Billmire; W. J. Elliot; K. A. Endsley; P. R. Robichaud

    2015-01-01

    Preparation is key to utilizing Earth Observations and process-based models to support post-wildfire mitigation. Post-fire flooding and erosion can pose a serious threat to life, property and municipal water supplies. Increased runoff and sediment delivery due to the loss of surface cover and fire-induced changes in soil properties are of great concern. Remediation...

  6. Geospatial simulation: an open-source, model-independent geospatial tool for managing point-based environmental models at multiple spatial locations

    Science.gov (United States)

    Many point-based models have been designed to simulate hydrology, gas flux, nutrient dynamics, and/or plant growth processes at a single point on the landscape. However, these environmental processes are known to be spatially variable. Simulations at different spatial locations therefore require adj...

  7. A model-independent open-source geospatial tool for managing point-based environmental model simulations at multiple spatial locations

    Science.gov (United States)

    Most environmental processes are known to be spatially variable across the landscape. However, many of the simulation models used to estimate and quantify these processes are point-based, meaning they simulate conditions at only one point on the landscape. The objective of this paper is to describ...

  8. The european Trans-Tools transport model

    NARCIS (Netherlands)

    Rooijen, T. van; Burgess, A.

    2008-01-01

    The paper presents the use of ArcGIS in the Transtools Transport Model, TRANS-TOOLS, created by an international consortium for the European Commission. The model describes passenger as well as freight transport in Europe with all medium- and long-distance modes (cars, vans, trucks, train, inland waterways).

  9. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh;

    …, called ForSyDe. ForSyDe is available under the open-source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system-level modeling of a simple industrial use case, and we...

  10. Assessment of low contrast detection in CT using model observers. Developing a clinically-relevant tool for characterising adaptive statistical and model-based iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Ott, Julien G.; Ba, Alexandre; Racine, Damien; Viry, Anais; Bochud, Francois O.; Verdun, Francis R. [Univ. Hospital Lausanne (Switzerland). Inst. of Radiation Physics

    2017-08-01

    This study aims to assess CT image quality in a way that meets the specific requirements of clinical practice. Physics metrics such as Fourier-transform-derived metrics have traditionally been employed for this purpose. However, assessment methods based on a detection task have also developed quite extensively lately, and we chose to rely on this modality for image quality assessment. Our goal was to develop a tool adapted for fast and reliable CT image quality assessment in order to pave the way for new CT benchmarking techniques in a clinical context. Additionally, we also used this method to estimate the benefits brought by some IR algorithms. A modified QRM chest phantom containing spheres of 5 and 8 mm at contrast levels of 10 and 20 HU at 120 kVp was used. Images of the phantom were acquired at CTDIvol of 0.8, 3.6, 8.2 and 14.5 mGy, before being reconstructed using FBP, ASIR 40 and MBIR on a GE HD 750 CT scanner. They were then assessed by eight human observers undergoing a 4-AFC test. After that, these data were compared with the results obtained from two different model observers (NPWE and CHO with DDoG channels). The study investigated the effects of the acquisition conditions as well as the reconstruction methods. The NPWE and CHO models both gave coherent results and approximated human observer results well. Moreover, the reconstruction technique used to retrieve the images had a clear impact on the PC values. Both models suggest that switching from FBP to ASIR 40, and particularly to MBIR, produces an increase in low-contrast detection, provided a minimum level of exposure is reached. Our work shows that the CHO with DDoG channels and the NPWE model both approximate the trend of humans performing a detection task. Both models also suggest that the use of MBIR goes along with an increase of the PCs, indicating that further dose reduction is still possible when using those techniques. Eventually, the CHO model associated with the protocol we described in this study

  11. Homology modeling and metabolism prediction of human carboxylesterase-2 using docking analyses by GriDock: a parallelized tool based on AutoDock 4.0.

    Science.gov (United States)

    Vistoli, Giulio; Pedretti, Alessandro; Mazzolari, Angelica; Testa, Bernard

    2010-09-01

    Metabolic problems lead to numerous failures during clinical trials, and much effort is now devoted to developing in silico models predicting metabolic stability and metabolites. Such models are well known for cytochromes P450 and some transferases, whereas less has been done to predict the activity of human hydrolases. The present study was undertaken to develop a computational approach able to predict the hydrolysis of novel esters by human carboxylesterase hCES2. The study involved first a homology modeling of the hCES2 protein based on the model of hCES1, since the two proteins share a high degree of homology (≅73%). A set of 40 known substrates of hCES2 was taken from the literature; the ligands were docked in both their neutral and ionized forms using GriDock, a parallel tool based on the AutoDock 4.0 engine which can perform efficient and easy virtual screening analyses of large molecular databases by exploiting multi-core architectures. Useful statistical models (e.g., r² = 0.91 for substrates in their unprotonated state) were calculated by correlating experimental pKm values with the distance between the carbon atom of the substrate's ester group and the hydroxy function of Ser228. Additional parameters in the equations accounted for hydrophobic and electrostatic interactions between substrates and contributing residues. The negatively charged residues in the hCES2 cavity explained the preference of the enzyme for neutral substrates and, more generally, suggested that ligands which interact too strongly by ionic bonds (e.g., ACE inhibitors) cannot be good CES2 substrates because they are trapped in the cavity in unproductive modes and behave as inhibitors. The effects of protonation on substrate recognition and the contrasting behavior of substrates and products were finally investigated by MD simulations of some CES2 complexes.
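
    A toy sketch of the kind of statistical model reported above: a linear fit of pKm against the docking-derived distance to Ser228. The numbers below are synthetic, not the paper's dataset.

```python
# Linear regression of (synthetic) pKm values against the distance between the
# substrate's ester carbon and the Ser228 hydroxyl, mimicking the correlation
# described in the abstract.
import numpy as np

distance_A = np.array([3.1, 3.4, 3.9, 4.5, 5.2, 6.0])   # Angstrom, synthetic
pkm = np.array([4.8, 4.6, 4.1, 3.7, 3.1, 2.6])          # synthetic pKm values

slope, intercept = np.polyfit(distance_A, pkm, 1)
pred = slope * distance_A + intercept
r2 = 1 - np.sum((pkm - pred) ** 2) / np.sum((pkm - pkm.mean()) ** 2)
print(f"pKm = {slope:.2f} * d + {intercept:.2f}, r^2 = {r2:.2f}")
```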

  12. A mechanism-based mathematical model of aryl hydrocarbon receptor-mediated CYP1A induction in rats using beta-naphthoflavone as a tool compound.

    Science.gov (United States)

    Chen, Emile P; Chen, Liangfu; Ji, Yan; Tai, Guoying; Wen, Yuan H; Ellens, Harma

    2010-12-01

    β-Naphthoflavone (BNF) is a synthetic flavone that selectively and potently induces CYP1A enzymes via aryl hydrocarbon receptor activation. Mechanism-based mathematical models of CYP1A enzyme induction were developed to predict the time course of enzyme induction and quantitatively evaluate the interrelationship between BNF plasma concentrations, hepatic CYP1A1 and CYP1A2 mRNA levels, and CYP1A enzyme activity in rats in vivo. Male Sprague-Dawley rats received a continuous intravenous infusion of vehicle or 1.5 or 6 mg·kg⁻¹·h⁻¹ BNF for 6 h, with blood and liver sampling. Plasma BNF concentrations were determined by liquid chromatography-tandem mass spectrometry. Hepatic mRNA levels of CYP1A1 and CYP1A2 were determined by TaqMan. Ethoxyresorufin O-deethylation was used to measure the increase in CYP1A enzyme activity as a result of induction. The induction of hepatic CYP1A1/CYP1A2 mRNA and CYP1A activity occurred within 2 h after BNF administration. This caused a rapid increase in metabolic clearance of BNF, resulting in plasma concentrations declining during the infusion. Overall, the enzyme induction models developed in this study adequately captured the time course of BNF pharmacokinetics, CYP1A1/CYP1A2 mRNA levels, and increases in CYP1A enzyme activity data for both dose groups simultaneously. The model-predicted degradation half-life of CYP1A enzyme activity is comparable with previously reported values. The present results also confirm a previous in vitro finding that CYP1A1 is the predominant contributor to CYP1A induction. These physiologically based models provide a basis for predicting drug-induced toxicity in humans from in vitro and preclinical data and can be a valuable tool in drug development.
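
    The general structure of such a mechanism-based induction model can be sketched as a pair of turnover equations, with drug concentration stimulating mRNA production and mRNA driving enzyme synthesis. The sketch below is illustrative only; the parameter values and the concentration profile are invented, not the paper's estimates.

```python
# Sketch of a turnover-type induction model: drug stimulates mRNA production,
# mRNA drives enzyme synthesis; both species degrade first-order.
import numpy as np
from scipy.integrate import solve_ivp

k_in_m, k_out_m = 1.0, 0.5      # mRNA production/degradation rates (1/h), invented
k_in_e, k_out_e = 0.2, 0.05     # enzyme synthesis/degradation rates (1/h), invented
emax, ec50 = 8.0, 0.5           # stimulation of mRNA production by the drug

def conc(t):                    # toy plasma profile: 6-h infusion, then decay
    return 1.0 if t < 6 else np.exp(-0.3 * (t - 6))

def rhs(t, y):
    mrna, enz = y
    stim = 1 + emax * conc(t) / (ec50 + conc(t))
    return [k_in_m * stim - k_out_m * mrna,
            k_in_e * mrna - k_out_e * enz]

mrna0 = k_in_m / k_out_m                      # baseline steady state
y0 = [mrna0, mrna0 * k_in_e / k_out_e]
sol = solve_ivp(rhs, (0, 48), y0, dense_output=True)
print(sol.y[1, -1] / y0[1])                   # fold-induction of enzyme at 48 h
```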

  13. Aviation Safety Modeling and Simulation (ASMM) Propulsion Fleet Modeling: A Tool for Semi-Automatic Construction of CORBA-based Applications from Legacy Fortran Programs

    Science.gov (United States)

    Sang, Janche

    2003-01-01

    Within NASA's Aviation Safety Program, NASA GRC participates in the Modeling and Simulation Project called ASMM. NASA GRC's focus is to characterize propulsion system performance from a fleet management and maintenance perspective and, through modeling and simulation, predict the characteristics of two classes of commercial engines (CFM56 and GE90). In prior years, the High Performance Computing and Communication (HPCC) program funded NASA Glenn to develop large-scale, detailed simulations for the analysis and design of aircraft engines, called the Numerical Propulsion System Simulation (NPSS). Three major aspects of this modeling - the integration of different engine components, the coupling of multiple disciplines, and engine component zooming at the appropriate level of fidelity - require relatively tight coupling of different analysis codes. Most of these codes in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase their reusability. Aviation Safety's modeling and simulation use in characterizing fleet management has similar needs. The modeling and simulation of these propulsion systems use existing Fortran and C codes that are instrumental in determining the performance of the fleet. The research centers on building a CORBA-based development environment for programmers to easily wrap and couple legacy Fortran codes. This environment consists of a C++ wrapper library to hide the details of CORBA and an efficient remote variable scheme to facilitate data exchange between the client and the server model. Additionally, a Web Service model should also be constructed for evaluation of this technology's use over the next two to three years.

  14. Knowledge Based Product Configuration - a documentation tool for configuration projects

    DEFF Research Database (Denmark)

    Hvam, Lars; Malis, Martin

    2003-01-01

    How can complex product models be documented in a formalised way that considers both development and maintenance? The need for an effective documentation tool has emerged in order to document the development of product models, which have become more and more complex and comprehensive. This paper deals with the development of a Lotus Notes application that serves as a knowledge-based documentation tool for configuration projects. A prototype has been developed and tested empirically in an industrial case company, where it has proved to be a success.

  15. A review of electricity market modelling tools

    Directory of Open Access Journals (Sweden)

    Sandra Milena Londoño Hernández

    2010-05-01

    Full Text Available Deregulating electricity markets around the world in the search for efficiency has introduced competition into the electricity marketing and generation business. Studying interactions amongst the participants has thus acquired great importance for regulators and market participants for analysing market evolution and suitably defining their bidding strategies. Different tools have therefore been used for modelling competitive electricity markets during the last few years. This paper presents an analytical review of the bibliography found regarding this subject; it also presents the most used tools along with their advantages and disadvantages. Such analysis was done by comparing the models used, identifying the main market characteristics such as market structure, bid structure and kind of bidding. This analysis concluded that the kind of tool to be used mainly depends on a particular study's goal and scope.

  16. HYDROLOGICAL PROCESSES MODELLING USING ADVANCED HYDROINFORMATIC TOOLS

    Directory of Open Access Journals (Sweden)

    BEILICCI ERIKA

    2014-03-01

    Full Text Available Water has an essential role in the functioning of ecosystems, integrating the complex physical, chemical, and biological processes that sustain life. Water is a key factor in determining the productivity of ecosystems, biodiversity and species composition. Water is also essential for humanity: water supply systems for the population, agriculture, fisheries, industries, and hydroelectric power depend on water supplies. The modelling of hydrological processes is an important activity for water resources management, especially now, when climate change is one of the major challenges of our century, with a strong influence on the dynamics of hydrological processes. Climate change and the need for more knowledge of water resources require the use of advanced hydroinformatic tools in hydrological process modelling. The rationale and purpose of advanced hydroinformatic tools is to develop a new relationship between the stakeholders and the users and suppliers of the systems: to offer the basis (systems which supply useable results, the validity of which cannot be put in reasonable doubt by any of the stakeholders involved). Successful modelling of hydrological processes also needs specialists who are well trained and able to use advanced hydroinformatic tools. The results of modelling can be a useful tool for decision makers in taking efficient measures in the social, economic and ecological domains regarding water resources, in support of integrated water resources management.

  17. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives identified that are developing tools for clinical information modeling, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  18. Tools for the Knowledge-Based Organization

    DEFF Research Database (Denmark)

    Ravn, Ib

    2002-01-01

    1. EXECUTIVE SUMMARY • It is proposed that a consortium for research on and development of tools for the knowledge-based organization be established at Learning Lab Denmark. • The knowledge-based organization must refine and use the knowledge held by its members and not confuse it with the information held by its computers. Knowledge specialists cannot be managed and directed in the classical sense. The organization needs to be rehumanized, and conditions for reflection, learning and autonomy enhanced, so that its collective knowledge may be better used to create real value for its stakeholders. • To help organizations do this, tools need to be researched, sophisticated or invented. Broadly conceived, tools include ideas, such as theories, missions and business plans; practices, such as procedures and behaviors; and instruments, such as questionnaires, indicators, agendas and methods...

  1. Web-Based Learning Design Tool

    Science.gov (United States)

    Bruno, F. B.; Silva, T. L. K.; Silva, R. P.; Teixeira, F. G.

    2012-01-01

    Purpose: The purpose of this paper is to propose a web-based tool that enables the development and provision of learning designs and their reuse and re-contextualization as generative learning objects, aimed at developing educational materials. Design/methodology/approach: The use of learning objects can facilitate the process of production and…

  2. Tool-Based Curricula and Visual Learning

    Directory of Open Access Journals (Sweden)

    Dragica Vasileska

    2013-12-01

    Full Text Available In the last twenty years nanotechnology has revolutionized the world of information theory, computers and other important disciplines, such as medicine, where it has contributed significantly in the creation of more sophisticated diagnostic tools. Therefore, it is important for people working in nanotechnology to better understand basic concepts to be more creative and productive. To further foster the progress on nanotechnology in the USA, the National Science Foundation has created the Network for Computational Nanotechnology (NCN), and the dissemination of all the information from member and non-member participants of the NCN is enabled by the community website www.nanoHUB.org. nanoHUB's signature service, online simulation, enables the operation of sophisticated research and educational simulation engines with a common browser. No software installation or local computing power is needed. The simulation tools as well as nano-concepts are augmented by educational materials, assignments, and tool-based curricula, which are assemblies of tools that help students excel in a particular area. As elaborated later in the text, it is the visual mode of learning that we are exploiting in achieving faster and better results with students that go through simulation tool-based curricula. There are several tool-based curricula already developed on the nanoHUB and undergoing further development, out of which five are directly related to nanoelectronics. They are: ABACUS – device simulation module; ACUTE – Computational Electronics module; ANTSY – bending toolkit; and AQME – quantum mechanics module. The methodology behind tool-based curricula is discussed in detail. Then, the current status of each module is presented, including user statistics and student learning indicatives. A particular simulation tool is explored further to demonstrate the ease by which students can grasp information. Representative of ABACUS is PN-Junction Lab; representative of AQME is the PCPBT tool; and

  3. An analytical model for resistivity tools

    Energy Technology Data Exchange (ETDEWEB)

    Hovgaard, J.

    1991-04-01

    An analytical model for resistivity tools is developed. It takes into account the effect of the borehole and the actual shape of the electrodes. The model is two-dimensional, i.e. it does not deal with eccentricity. The electrical potential around a current source satisfies Poisson's equation. The method used here to solve Poisson's equation is the expansion of the potential function in terms of a complete set of functions involving one of the coordinates, with coefficients which are undetermined functions of the other coordinate. Numerical examples of the use of the model are presented. The results are compared with results given in the literature. (au).
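
    For orientation, the underlying physics can be sketched with the simplest special case: the potential of a point current source in a homogeneous medium, V = ρI/(4πr). The paper's model adds the borehole, the electrode shape, and the series expansion, none of which is shown here.

```python
# Toy sketch: potential of a point current source in a homogeneous formation,
# and the apparent resistivity read back from it.
import numpy as np

rho, current = 20.0, 1.0             # ohm-m formation resistivity, 1 A source
r = np.array([0.1, 0.5, 1.0, 2.0])   # metres from the electrode
v = rho * current / (4 * np.pi * r)  # V = rho*I/(4*pi*r)
rho_apparent = 4 * np.pi * r * v / current
print(v, rho_apparent)               # rho_apparent recovers 20 ohm-m everywhere
```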

  4. Induction generator models in dynamic simulation tools

    DEFF Research Database (Denmark)

    Knudsen, Hans; Akhmatov, Vladislav

    1999-01-01

    For AC networks with a large amount of induction generators (windmills) the paper demonstrates a significant discrepancy in the simulated voltage recovery after faults in weak networks when comparing dynamic and transient stability descriptions, and the reasons for the discrepancies are explained. It is found to be possible to include a transient model in dynamic stability tools and, then, obtain correct results also in dynamic tools. The representation of the rotating system influences the voltage recovery shape, which is an important observation in the case of windmills, where a heavy mill is connected...

  5. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user/developer to identify the processing field at the top of the sequence and to send to the computing module only the data related to the requested result; the remaining data is not relevant and would only slow down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and costs. To do so, we must review the processing sequence by adding several modeling tools. The existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases the computing time and costs. In this paper we provide a study of the main modeling tools for Big Data and a new model based on pre-processing.

  6. Animal models: an important tool in mycology.

    Science.gov (United States)

    Capilla, Javier; Clemons, Karl V; Stevens, David A

    2007-12-01

    Animal models of fungal infections are, and will remain, a key tool in the advancement of medical mycology. Many different types of animal models of fungal infection have been developed, with murine models the most frequently used, for studies of pathogenesis, virulence, immunology, diagnosis, and therapy. The ability to control numerous variables when performing the model allows us to mimic human disease states and quantitatively monitor the course of the disease. However, no single model can answer all questions, and different animal species or different routes of infection can show somewhat different results. Thus, the choice of which animal model to use must be made carefully, addressing issues of the type of human disease to mimic, the parameters to follow, and collection of the appropriate data to answer those questions being asked. This review addresses a variety of uses for animal models in medical mycology. It focuses on the most clinically important diseases affecting humans and cites various examples of the different types of studies that have been performed. Overall, animal models of fungal infection will continue to be valuable tools in addressing questions concerning fungal infections and contribute to our deeper understanding of how these infections occur, progress and can be controlled and eliminated.

  7. Modeling Tools for Drilling, Reservoir Navigation, and Formation Evaluation

    Directory of Open Access Journals (Sweden)

    Sushant Dutta

    2012-06-01

    Full Text Available The oil and gas industry routinely uses borehole tools for measuring or logging rock and fluid properties of geologic formations to locate hydrocarbons and maximize their production. Pore fluids in formations of interest are usually hydrocarbons or water. Resistivity logging is based on the fact that oil and gas have a substantially higher resistivity than water. The first resistivity log was acquired in 1927, and resistivity logging is still the foremost measurement used for drilling and evaluation. However, the acquisition and interpretation of resistivity logging data has grown in complexity over the years. Resistivity logging tools operate in a wide range of frequencies (from DC to GHz) and encounter extremely high (several orders of magnitude) conductivity contrast between the metal mandrel of the tool and the geologic formation. Typical challenges include arbitrary angles of tool inclination, full tensor electric and magnetic field measurements, and interpretation of complicated anisotropic formation properties. These challenges combine to form some of the most intractable computational electromagnetic problems in the world. Reliable, fast, and convenient numerical modeling of logging tool responses is critical for tool design, sensor optimization, virtual prototyping, and log data inversion. This spectrum of applications necessitates both depth and breadth of modeling software - from blazing fast one-dimensional (1-D) modeling codes to advanced three-dimensional (3-D) modeling software, and from in-house developed codes to commercial modeling packages. In this paper, with the help of several examples, we demonstrate our approach for using different modeling software to address different drilling and evaluation applications. In one example, fast 1-D modeling provides proactive geosteering information from a deep-reading azimuthal propagation resistivity measurement. In the second example, a 3-D model with multiple vertical resistive fractures

  8. DASY Based Tool for the Design of ICE Mechanisms

    Directory of Open Access Journals (Sweden)

    Tichánek Radek

    2015-12-01

    Full Text Available This article presents a tool for designing new mechanisms of internal combustion engines based on the DASY knowledge database. An OHC valve train has been chosen for developing and testing the presented tool. The tool includes both a kinematic and dynamic model connected to a crank train. Values of unknown parameters have been obtained using detailed calibration and consequent validation of three dynamic models with measured data. The values remain stored in DASY and many of them can be used directly to design new mechanisms, even in cases where the geometries of some parts are different. The paper presents three methods which have been used not only for the calibration, but also for the identification of the influence of unknown parameters on valve acceleration and its vibration. The tool has been used to design the cam shapes for a prototype of the new mechanism.

  9. Development of Computational Tools for Modeling Thermal and Radiation Effects on Grain Boundary Segregation and Precipitation in Fe-Cr-Ni-based Alloys

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Ying [ORNL

    2017-08-01

    This work aims at developing computational tools for modeling thermal and radiation effects on solute segregation at grain boundaries (GBs) and precipitation. The report describes two major efforts. One is the development of computational tools for integrated modeling of thermal equilibrium segregation (TES) and radiation-induced segregation (RIS), in which the synergistic effects of temperature and radiation as well as pre-existing GB segregation are taken into consideration. This integrated modeling was used in describing the Cr and Ni segregation in Fe-Cr-Ni alloys. The other effort is thermodynamic modeling of the Fe-Cr-Ni-Mo system, which includes the major alloying elements in the alloys investigated in the Advanced Radiation Resistant Materials (ARRM) program. Through thermodynamic calculation, we provide the baseline thermodynamic stability of the hardening phase Ni2(Cr,Mo) in selected Ni-based superalloys, and contribute to the mechanistic understanding of the formation of Ni2(Cr,Mo) in irradiated materials. The major outcomes from this work are the following: 1) Under simultaneous thermal and irradiation conditions, radiation-induced segregation plays a dominant role in GB segregation. Pre-existing GB segregation only affects the subsequent radiation-induced segregation at short times. For the same element, the segregation tendency of Cr and Ni due to TES is opposite to that from RIS. The opposite tendencies can lead to the formation of a W-shaped profile. These findings are consistent with literature observations of the transitory W-shaped profile. 2) While TES affects only one or two atomic layers from the GB, RIS can affect a broader distance from the GB. Therefore, the W-shape due to pre-existing GB segregation is much narrower than that due to the composition gradient formed during the transient state. Considering the measurement resolution of Auger or STEM analysis, the segregation tendency due to RIS should play a dominant

  10. A Model for Beliefs, Tool Acceptance Levels and Web Pedagogical Content Knowledge of Science and Technology Preservice Teachers towards Web Based Instruction

    Science.gov (United States)

    Horzum, Mehmet Baris; Canan Gungoren, Ozlem

    2012-01-01

    One of the most widely used applications nowadays is web-based instruction (WBI). Although there are many studies on WBI, no study was found among them that researched the relations between beliefs about WBI, WBI tool acceptance levels, and the web pedagogical content knowledge (WPCK) of science and technology pre-service teachers. The aim of this…

  11. Graphics-Based Parallel Programming Tools

    Science.gov (United States)

    1991-09-01

    AD-A254 406. Final report, August 1992. Graphics-Based Parallel Programming Tools. Janice E. Cuny, Principal Investigator, Department of... ...suggest parallel (either because we use a parallel graph rewriting mechanism or because we apply our results to parallel programming), we interpret it to... ...was to provide support for the explicit representation of graphs for use within a parallel programming environment. In our environment, we view...

  12. A tool box for implementing supersymmetric models

    Science.gov (United States)

    Staub, Florian; Ohl, Thorsten; Porod, Werner; Speckner, Christian

    2012-10-01

    We present a framework for performing a comprehensive analysis of a large class of supersymmetric models, including spectrum calculation, dark matter studies and collider phenomenology. To this end, the respective model is defined in an easy and straightforward way using the Mathematica package SARAH. SARAH then generates model files for CalcHep which can be used with micrOMEGAs as well as model files for WHIZARD and O'Mega. In addition, Fortran source code for SPheno is created which facilitates the determination of the particle spectrum using two-loop renormalization group equations and one-loop corrections to the masses. As an additional feature, the generated SPheno code can write out input files suitable for use with HiggsBounds to apply bounds coming from the Higgs searches to the model. Combining all programs provides a closed chain from model building to phenomenology. Program summary Program title: SUSY Phenomenology toolbox. Catalog identifier: AEMN_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMN_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 140206. No. of bytes in distributed program, including test data, etc.: 1319681. Distribution format: tar.gz. Programming language: Autoconf, Mathematica. Computer: PC running Linux, Mac. Operating system: Linux, Mac OS. Classification: 11.6. Nature of problem: Comprehensive studies of supersymmetric models beyond the MSSM are considerably complicated by the number of different tasks that have to be accomplished, including the calculation of the mass spectrum and the implementation of the model into tools for performing collider studies, calculating the dark matter density and checking the compatibility with existing collider bounds (in particular, from the Higgs searches). Solution method: The

  13. SMILE Maker: A Web-Based Tool for Problem Solving.

    Science.gov (United States)

    Stoyanov, Svetoslav; Aroyo, Lora; Kommers, Piet; Kurtev, Ivan

    This paper focuses on the purposes, theoretical model, and functionality of the SMILE (Solution Mapping Intelligent Learning Environment) Maker--a World Wide Web-based problem-solving tool. From an instructional design point of view, an attempt to establish a balance between constructivism/instructivism, content-treatment…

  14. Risk Reduction and Training using Simulation Based Tools - 12180

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Irin P. [Newport News Shipbuilding, Newport News, Virginia 23607 (United States)

    2012-07-01

    Process Modeling and Simulation (M&S) has been used for many years in manufacturing and similar domains, as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time and cost prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M&S based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation-based tool, a fuel handling facility simulation-based tool and a tool for dynamic radiation exposure tracking. The next generation of M&S applications includes expanding simulation-based tools into immersive and interactive training. The applications discussed here take a tool-box approach to creating simulation-based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to understand every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques. For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition

  15. Web Based Personal Nutrition Management Tool

    Science.gov (United States)

    Bozkurt, Selen; Zayim, Neşe; Gülkesen, Kemal Hakan; Samur, Mehmet Kemal

    The Internet is being used increasingly as a resource for accessing health-related information because of its several advantages. Therefore, Internet tailoring has recently become quite preferable in health education and personal health management. Today, there are many web-based health programs designed for individuals. Among these, nutrition and weight management studies are popular because obesity has become a heavy burden for populations worldwide. In this study, we designed a web-based personal nutrition education and management tool, The Nutrition Web Portal, in order to enhance patients' nutrition knowledge and support behavioral change against obesity. The present paper reports the analysis, design and development processes of The Nutrition Web Portal.

  16. Web-based pre-Analysis Tools

    CERN Document Server

    Moskalets, Tetiana

    2014-01-01

    The project consists of the initial development of web-based and cloud computing services to allow students and researchers to perform fast and very useful cut-based pre-analysis in a browser, using real data and official Monte-Carlo simulations (MC). Several tools are considered: a ROOT file filter, JavaScript multivariable cross-filter, JavaScript ROOT browser and JavaScript scatter-matrix libraries. Preliminary but satisfactory results have been deployed online for testing and future upgrades.

  17. CREST Cost of Renewable Energy Spreadsheet Tool: A Model for Developing Cost-based Incentives in the United States. User Manual Version 1

    Energy Technology Data Exchange (ETDEWEB)

    Gifford, Jason S. [Sustainable Energy Advantage, LLC, Framingham, MA (United States); Grace, Robert C. [Sustainable Energy Advantage, LLC, Framingham, MA (United States)

    2011-03-01

    This user manual helps model users understand how to use the CREST model to support renewable energy incentives, feed-in tariffs (FITs), and other renewable energy rate-setting processes. It reviews the spreadsheet tool, including its layout and conventions, offering context on how and why it was created. It also provides instructions on how to populate the model with inputs that are appropriate for a specific jurisdiction's policymaking objectives and context. Finally, it describes the results and outlines how these results may inform decisions about long-term renewable energy support programs.
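
    The core idea of a cost-based tariff can be sketched in a few lines: find the flat price per kWh whose discounted revenues recover capital cost, operating cost and a target return. The sketch below is a drastic simplification of what CREST actually models (taxes, incentives, debt structure and more), with invented figures.

```python
# Toy cost-based tariff: the $/kWh price that makes the project's NPV zero.
def cost_based_tariff(capex, om_per_year, kwh_per_year, years, discount_rate):
    # present value of a $1/year annuity over the project life
    annuity = sum(1 / (1 + discount_rate) ** t for t in range(1, years + 1))
    pv_costs = capex + om_per_year * annuity     # discounted lifetime costs
    pv_energy = kwh_per_year * annuity           # discounted lifetime energy
    return pv_costs / pv_energy

print(cost_based_tariff(capex=2_000_000, om_per_year=40_000,
                        kwh_per_year=1_500_000, years=20, discount_rate=0.07))
```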

  18. Risk Assessment in Fractured Clayey Tills - Which Modeling Tools?

    DEFF Research Database (Denmark)

    Chambon, Julie Claire Claudia; Bjerg, Poul Løgstrup; Binning, Philip John

    2012-01-01

    The article presents different tools available for risk assessment in fractured clayey tills, and their advantages and limitations are discussed. Because of the complex processes occurring during contaminant transport through fractured media, the development of simple practical tools for risk assessment is challenging and the inclusion of the relevant processes is difficult. Furthermore, the lack of long-term monitoring data prevents verifying the accuracy of the different conceptual models. Further investigations based on long-term data and numerical modeling are needed to accurately...

  1. Web Based Tool for Mission Operations Scenarios

    Science.gov (United States)

    Boyles, Carole A.; Bindschadler, Duane L.

    2008-01-01

    A conventional practice for spaceflight projects is to document scenarios in a monolithic Operations Concept document. Such documents can be hundreds of pages long and may require laborious updates. Software development practice utilizes scenarios in the form of smaller, individual use cases, which are often structured and managed using UML. We have developed a process and a web-based scenario tool that utilizes a similar philosophy of smaller, more compact scenarios (but avoids the formality of UML). The need for a scenario process and tool became apparent during the authors' work on a large astrophysics mission. It was noted that every phase of the Mission (e.g., formulation, design, verification and validation, and operations) looked back to scenarios to assess completeness of requirements and design. It was also noted that terminology needed to be clarified and structured to assure communication across all levels of the project. Attempts to manage, communicate, and evolve scenarios at all levels of a project using conventional tools (e.g., Excel) and methods (Scenario Working Group meetings) were not effective given limitations on budget and staffing. The objective of this paper is to document the scenario process and tool created to offer projects a low-cost capability to create, communicate, manage, and evolve scenarios throughout project development. The process and tool have the further benefit of allowing the association of requirements with particular scenarios, establishing and viewing relationships between higher- and lower-level scenarios, and the ability to place all scenarios in a shared context. The resulting structured set of scenarios is widely visible (using a web browser), easily updated, and can be searched according to various criteria including the level (e.g., Project, System, and Team) and Mission Phase. Scenarios are maintained in a web-accessible environment that provides a structured set of scenario fields and allows for maximum

  2. An MCMC Circumstellar Disks Modeling Tool

    Science.gov (United States)

    Wolff, Schuyler; Perrin, Marshall D.; Mazoyer, Johan; Choquet, Elodie; Soummer, Remi; Ren, Bin; Pueyo, Laurent; Debes, John H.; Duchene, Gaspard; Pinte, Christophe; Menard, Francois

    2016-01-01

    We present an enhanced software framework for the Markov Chain Monte Carlo (MCMC) modeling of circumstellar disk observations, including spectral energy distributions and multiwavelength images from a variety of instruments (e.g. GPI, NICI, HST, WFIRST). The goal is to self-consistently and simultaneously fit a wide variety of observables in order to place constraints on the physical properties of a given disk, while also rigorously assessing the uncertainties in the derived properties. This modular code is designed to work with a collection of existing modeling tools, ranging from simple scripts to define the geometry for optically thin debris disks, to full radiative transfer modeling of complex grain structures in protoplanetary disks (using the MCFOST radiative transfer modeling code). The MCMC chain relies on direct chi-squared comparison of model images/spectra to observations. We will include a discussion of how best to weight different observations in the modeling of a single disk and how to incorporate forward modeling from PCA PSF subtraction techniques. The code is open source, written in Python, and available from GitHub. Results for several disks at various evolutionary stages will be discussed.
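
    A minimal sketch of the direct chi-squared model-to-data comparison inside an MCMC loop, using the emcee package; the power-law forward model below stands in for a real radiative-transfer call (e.g. to MCFOST), and all data are synthetic.

```python
# Chi-squared likelihood inside an MCMC fit with emcee.
import numpy as np
import emcee

data_x = np.linspace(0.5, 50, 40)                     # e.g. a wavelength grid
data_y = 2.0 * data_x ** -0.7 + np.random.default_rng(1).normal(0, 0.05, data_x.size)
sigma = 0.05

def forward_model(theta, x):                          # placeholder disk model
    amp, slope = theta
    return amp * x ** slope

def log_prob(theta):
    if not (0 < theta[0] < 10 and -2 < theta[1] < 0): # flat priors
        return -np.inf
    chi2 = np.sum(((data_y - forward_model(theta, data_x)) / sigma) ** 2)
    return -0.5 * chi2                                # direct chi-squared comparison

ndim, nwalkers = 2, 16
p0 = np.array([2.0, -0.7]) + 1e-3 * np.random.default_rng(2).normal(size=(nwalkers, ndim))
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, 500)
print(sampler.get_chain(flat=True, discard=100).mean(axis=0))
```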

  3. Accelerator Based Tools of Stockpile Stewardship

    Science.gov (United States)

    Seestrom, Susan

    2017-01-01

    The Manhattan Project had to solve difficult challenges in physics and materials science. During the Cold War a large nuclear stockpile was developed. In both cases, the approach was largely empirical. Today that stockpile must be certified without nuclear testing, a task that becomes more difficult as the stockpile ages. I will discuss the role of modern accelerator-based experiments, such as x-ray radiography, proton radiography, and neutron and nuclear physics experiments, in stockpile stewardship. These new tools provide data of exceptional sensitivity and are answering questions about the stockpile, improving our scientific understanding, and providing validation for the computer simulations that are relied upon to certify today's stockpile.

  4. WMT: The CSDMS Web Modeling Tool

    Science.gov (United States)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged
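
    As a purely hypothetical sketch of driving such a RESTful service from a script (the endpoint paths and JSON fields below are illustrative guesses, not the documented CSDMS API):

```python
# Hypothetical client for a WMT-style REST API: create a coupled model, then
# submit a run. Every URL and JSON field here is an assumption for illustration.
import requests

BASE = "https://csdms.example.edu/wmt-api"          # placeholder URL

model = {
    "name": "delta-progradation",
    "components": ["hydrotrend", "cem"],            # hypothetical component ids
    "parameters": {"hydrotrend": {"run_duration": 3650}},
}
r = requests.post(f"{BASE}/models", json=model, timeout=30)
model_id = r.json()["id"]

run = requests.post(f"{BASE}/runs", json={"model": model_id, "host": "hpc"},
                    timeout=30)
print("submitted run:", run.json())
```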

  5. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    In the last 30 years numerous model-generated software systems have been offered, targeting problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar arguments in regard to Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending with a code generator (which could use a scripting or domain-specific (DSL) language), a transformation tool to produce new artifacts from the manually created ones, and a transformation definition editor to define new transformations for the most advanced ones. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  6. First-Stage Development and Validation of a Web-Based Automated Dietary Modeling Tool: Using Constraint Optimization Techniques to Streamline Food Group and Macronutrient Focused Dietary Prescriptions for Clinical Trials

    Science.gov (United States)

    Morrison, Evan; Sullivan, Emma; Dam, Hoa Khanh

    2016-01-01

    Background Standardizing the background diet of participants during a dietary randomized controlled trial is vital to trial outcomes. For this process, dietary modeling based on food groups and their target servings is employed via a dietary prescription before an intervention, often using a manual process. Partial automation has employed the use of linear programming. Validity of the modeling approach is critical to allow trial outcomes to be translated to practice. Objective This paper describes the first-stage development of a tool to automatically perform dietary modeling using food group and macronutrient requirements as a test case. The Dietary Modeling Tool (DMT) was then compared with existing approaches to dietary modeling (manual and partially automated), which were previously available to dietitians working within a dietary intervention trial. Methods Constraint optimization techniques were implemented to determine whether nonlinear constraints are best suited to the development of the automated dietary modeling tool, using food composition and food consumption data. Dietary models were produced and compared with a manual Microsoft Excel calculator, a partially automated Excel Solver approach, and the automated DMT that was developed. Results The web-based DMT was produced using nonlinear constraint optimization, incorporating estimated energy requirement calculations, nutrition guidance systems, and the flexibility to amend food group targets for individuals. Percentage differences between modeling tools revealed similar results for the macronutrients. Polyunsaturated fatty acids and monounsaturated fatty acids showed greater variation between tools (practically equating to a 2-teaspoon difference), although this was not considered clinically significant when the whole diet, as opposed to targeted nutrients or energy requirements, was being addressed. Conclusions Automated modeling tools can streamline the modeling process for dietary intervention trials
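
    For illustration, the constraint-optimization core of such a modeling step might look like the following scipy sketch, which chooses food-group servings to meet macronutrient targets at minimum total servings. A linear-programming variant is shown for simplicity (the tool itself uses nonlinear constraints), and the nutrient values and targets are invented.

```python
# Choose servings per food group to meet macronutrient targets (LP sketch).
import numpy as np
from scipy.optimize import linprog

# grams of (protein, fat, carbohydrate) per serving of each food group (invented)
foods = ["grains", "dairy", "meat", "vegetables"]
nutrients = np.array([[3.0, 1.0, 15.0],
                      [8.0, 5.0, 12.0],
                      [25.0, 10.0, 0.0],
                      [2.0, 0.0, 5.0]])
target = np.array([90.0, 60.0, 220.0])   # daily macronutrient prescription (invented)

# minimise total servings s.t. nutrients provided >= targets, servings capped at 12
res = linprog(c=np.ones(len(foods)),
              A_ub=-nutrients.T, b_ub=-target,       # A x >= b written as -A x <= -b
              bounds=[(0, 12)] * len(foods), method="highs")
for name, servings in zip(foods, res.x):
    print(f"{name}: {servings:.1f} servings/day")
```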

  7. Integrating decision management with UML modeling concepts and tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    2009-01-01

    Numerous design decisions including architectural decisions are made while developing a software system, which influence the architecture of the system as well as subsequent decisions. Several tools already exist for managing design decisions, i.e. capturing, documenting, and maintaining them, but also for guiding the user by proposing subsequent decisions. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, the decisions are typically not connected to these models ... to enforce design decisions (modify the models). We define tool-independent concepts and architecture building blocks supporting these requirements and present first ideas how this can be implemented in the IBM Rational Software Modeler and Architectural Decision Knowledge Wiki. This seamless integration ...

  8. Hypermedia as an experiential learning tool: a theoretical model

    Directory of Open Access Journals (Sweden)

    Jose Miguel Baptista Nunes

    1996-01-01

    The process of methodical design and development is of extreme importance in the production of educational software. However, this process will only be effective if it is based on a theoretical model that explicitly defines what educational approach is being used and how specific features of the technology can best support it. This paper proposes a theoretical model of how hypermedia can be used as an experiential learning tool. The development of the model was based on an experiential learning approach and simultaneously aims at minimising the inherent problems of hypermedia as the underlying support technology.

  9. Software tools overview : process integration, modelling and optimisation for energy saving and pollution reduction

    OpenAIRE

    Lam, Hon Loong; Klemeš, Jiri; Kravanja, Zdravko; Varbanov, Petar

    2012-01-01

    This paper provides an overview of software tools based on long experience and applications in the area of process integration, modelling and optimisation. The first part reviews the current design practice and the development of supporting software tools. Those are categorised as: (1) process integration and retrofit analysis tools, (2) general mathematical modelling suites with optimisation libraries, (3) flowsheeting simulation and (4) graph-based process optimisation tools. The second part...

  10. Conversion and Validation of Distribution System Model from a QSTS-Based Tool to a Real-Time Dynamic Phasor Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan; Baggu, Murali M.

    2017-05-11

    A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.

  11. Conversion and Validation of Distribution System Model from a QSTS-Based Tool to a Real-Time Dynamic Phasor Simulator: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan; Baggu, Murali M.

    2017-04-11

    A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.

  12. Tool Steel Heat Treatment Optimization Using Neural Network Modeling

    Science.gov (United States)

    Podgornik, Bojan; Belič, Igor; Leskovšek, Vojteh; Godec, Matjaz

    2016-11-01

    Optimization of tool steel properties and the corresponding heat treatment is mainly based on a trial and error approach, which requires tremendous experimental work and resources. There is therefore a huge need for tools allowing prediction of mechanical properties of tool steels as a function of composition and heat treatment process variables. The aim of the present work was to explore the potential and possibilities of artificial neural network-based modeling to select and optimize vacuum heat treatment conditions depending on the hot work tool steel composition and required properties. In the current case, training of a four-layer (8-20-20-2) feedforward neural network with an error-backpropagation training scheme was based on the experimentally obtained tempering diagrams for ten different hot work tool steel compositions and at least two austenitizing temperatures. Results show that this type of modeling can be successfully used for detailed and multifunctional analysis of different influential parameters as well as to optimize the heat treatment process of hot work tool steels depending on the composition. In terms of composition, V was found to be the most beneficial alloying element, increasing hardness and fracture toughness of hot work tool steel; Si, Mn, and Cr increase hardness but lead to reduced fracture toughness, while Mo has the opposite effect. An optimum concentration providing high KIc/HRC ratios would include 0.75 pct Si, 0.4 pct Mn, 5.1 pct Cr, 1.5 pct Mo, and 0.5 pct V, with the optimum heat treatment performed at lower austenitizing and intermediate tempering temperatures.
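
    As a rough illustration of the network described above, the sketch below fits an 8-20-20-2 feedforward regressor with scikit-learn; the inputs (alloy contents plus heat-treatment temperatures) and the two outputs (hardness, fracture toughness) are synthetic placeholders, not the paper's tempering data:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# 8 inputs: assumed to be Si, Mn, Cr, Mo, V contents plus process temperatures.
X = rng.uniform(size=(200, 8))
# 2 outputs: synthetic "hardness" and "fracture toughness" with added noise.
y = np.column_stack([
    50 + 10 * X[:, 4] - 5 * X[:, 0],
    40 - 8 * X[:, 0] + 6 * X[:, 4],
]) + rng.normal(scale=0.5, size=(200, 2))

# Two hidden layers of 20 neurons reproduce the 8-20-20-2 topology.
net = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000,
                   random_state=0).fit(X, y)
print(net.predict(X[:3]))
```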

  13. Developing a management tool based on a panoramic visual model

    Institute of Scientific and Technical Information of China (English)

    王亚玲; 杨超; 王涛; 时迎勇

    2015-01-01

    In order to improve the management efficiency of software development, a design idea and realization plan for a model-driven research and development management tool is proposed. Different management strategies are generated by establishing models with visual designers, and management services are supplied by a management engine that interprets these strategies. First, panoramic visual models are designed from two aspects, domain models and business process models; the management engine then parses the visual models and generates web scripts and back-end services, completing the tool's implementation. Practical application results indicate that the tool can monitor, manage and coordinate software projects more effectively, reduce the pressure and cost of development management, reduce the influence of risks, and assist project decision-making by previewing future project status. The tool has significant advantages in real-time management of multiple projects and can be applied to projects of State Grid information construction.

  14. A web service based tool to plan atmospheric research flights

    Directory of Open Access Journals (Sweden)

    M. Rautenhaus

    2011-09-01

    We present a web service based tool for the planning of atmospheric research flights. The tool provides online access to horizontal maps and vertical cross-sections of numerical weather prediction data and in particular allows the interactive design of a flight route in direct relation to the predictions. It thereby fills a crucial gap in the set of currently available tools for using data from numerical atmospheric models for research flight planning. A distinct feature of the tool is its lightweight, web service based architecture, requiring only commodity hardware and a basic Internet connection for deployment. Access to visualisations of prediction data is achieved by using an extended version of the Open Geospatial Consortium Web Map Service (WMS) standard, a technology that has gained increased attention in meteorology in recent years. With the WMS approach, we avoid the transfer of large forecast model output datasets while enabling on-demand generated visualisations of the predictions at campaign sites with limited Internet bandwidth. Usage of the Web Map Service standard also enables access to third-party sources of georeferenced data. We have implemented the software using the open-source programming language Python. In the present article, we describe the architecture of the tool. As an example application, we discuss a case study research flight planned for the scenario of the 2010 Eyjafjalla volcano eruption. Usage and implementation details are provided as a Supplement.
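
    The WMS interaction underlying such a tool reduces to parameterized HTTP GetMap requests; a minimal sketch in Python (the server URL, layer name, and time stamp below are hypothetical):

```python
import requests

params = {
    "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
    "LAYERS": "forecast_geopotential_500hPa",   # hypothetical layer name
    "TIME": "2011-09-15T12:00:00Z",             # forecast valid time
    "BBOX": "-30,40,20,70", "SRS": "EPSG:4326", # lon/lat region of interest
    "WIDTH": 800, "HEIGHT": 600, "FORMAT": "image/png",
}
r = requests.get("https://example.org/forecast/wms", params=params, timeout=30)
with open("horizontal_map.png", "wb") as f:
    f.write(r.content)   # map rendered on demand by the server
```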

  15. Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model: A Web-based program designed to evaluate the cost-effectiveness of disease management programs in heart failure.

    Science.gov (United States)

    Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C

    2015-11-01

    Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure.
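
    The headline output of such a simulation is an incremental cost-effectiveness ratio (ICER) computed over many pairs of virtual cohorts; a stripped-down sketch of that final calculation (the cost and QALY distributions below are invented, not the model's Seattle Heart Failure Model-based projections):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # pairs of virtual cohorts, as in the model description

cost_usual = rng.normal(40_000, 5_000, n)   # lifetime cost, usual care
cost_dm    = rng.normal(43_000, 5_000, n)   # disease management adds cost...
qaly_usual = rng.normal(4.0, 0.5, n)
qaly_dm    = rng.normal(4.3, 0.5, n)        # ...but improves QALYs

icer = (cost_dm.mean() - cost_usual.mean()) / (qaly_dm.mean() - qaly_usual.mean())
print(f"ICER: ${icer:,.0f} per QALY gained")
```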

  16. Designing tools for oil exploration using nuclear modeling

    Directory of Open Access Journals (Sweden)

    Mauborgne Marie-Laure

    2017-01-01

    When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time-consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section databases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  17. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    Science.gov (United States)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on optimization problems have been successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulations for model validation, and integrated design may result in reduced uncertainties in the aeroservoelastic model and increased flight safety.

  18. Bayesian Based Comment Spam Defending Tool

    CERN Document Server

    Nagamalai, Dhinaharan; Lee, Jae Kwang; 10.5121/ijnsa.2010.2420

    2010-01-01

    Spam messes up users' inboxes, consumes network resources, and spreads worms and viruses. Spam is the flooding of unsolicited, unwanted e-mail. Spam in blogs is called blog spam or comment spam. It is done by posting comments or flooding spam to services such as blogs, forums, news sites, e-mail archives, and guestbooks. Blog spam generally appears on guestbooks or comment pages, where spammers fill a comment box with spam words. In addition to wasting users' time with unwanted comments, spam also consumes a lot of bandwidth. In this paper, we propose a software tool to prevent such blog spam by using a technique based on the Bayesian algorithm, which is derived from Bayes' theorem. It gives as output the probability that a comment is spam, given that it contains certain words. Using our past entries and a comment entry, this value is obtained and compared with a threshold value to determine whether it exceeds the threshold. By using this concept, we developed a software tool to block comment spam. The experimental...
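
    The scoring step can be sketched as a naive application of Bayes' theorem over per-word spam probabilities learned from past entries; the word statistics and threshold below are invented for illustration:

```python
from math import prod

# P(word | spam) and P(word | ham), estimated from past labelled comments.
p_w_spam = {"cheap": 0.8, "pills": 0.9, "meeting": 0.05}
p_w_ham  = {"cheap": 0.1, "pills": 0.02, "meeting": 0.4}
P_SPAM = 0.5  # prior probability that a comment is spam

def spam_probability(words):
    # Bayes' theorem with a word-independence assumption; unseen words
    # contribute a neutral 0.5.
    ps = P_SPAM * prod(p_w_spam.get(w, 0.5) for w in words)
    ph = (1 - P_SPAM) * prod(p_w_ham.get(w, 0.5) for w in words)
    return ps / (ps + ph)

if spam_probability(["cheap", "pills"]) > 0.9:  # assumed threshold
    print("comment blocked as spam")
```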

  19. AlgiMatrix™-Based 3D Cell Culture System as an In Vitro Tumor Model: An Important Tool in Cancer Research.

    Science.gov (United States)

    Godugu, Chandraiah; Singh, Mandip

    2016-01-01

    Routinely used two-dimensional cell culture-based models often fail when the observations are translated into in vivo models. This setback is more common in cancer research, for several reasons. The extracellular matrix and cell-to-cell interactions are not present in two-dimensional (2D) cell culture models. Diffusion of drug molecules into cancer cells is hindered by barriers of extracellular components under in vivo conditions; these barriers are absent in 2D cell culture models. To better mimic or simulate the in vivo conditions present in tumors, the current study used the alginate-based three-dimensional cell culture (AlgiMatrix™) model, which closely resembles in vivo tumor models. The current study explains the detailed protocols involved in AlgiMatrix™-based in vitro non-small-cell lung cancer (NSCLC) models. The suitability of this model was studied by evaluating cytotoxicity, apoptosis, and penetration of nanoparticles into the in vitro tumor spheroids. This study also demonstrated the effect of EphA2 receptor-targeted docetaxel-loaded nanoparticles on MDA-MB-468 TNBC cell lines. The methods section is subdivided into three subsections: (1) preparation of AlgiMatrix™-based 3D in vitro tumor models and cytotoxicity assays, (2) free drug and nanoparticle uptake into spheroid studies, and (3) western blot, IHC, and RT-PCR studies.

  20. Collaborative Inquiry Learning: Models, tools, and challenges

    Science.gov (United States)

    Bell, Thorsten; Urhahne, Detlef; Schanze, Sascha; Ploetzner, Rolf

    2010-02-01

    Collaborative inquiry learning is one of the most challenging and exciting ventures for today's schools. It aims at bringing a new and promising culture of teaching and learning into the classroom where students in groups engage in self-regulated learning activities supported by the teacher. It is expected that this way of learning fosters students' motivation and interest in science, that they learn to perform steps of inquiry similar to scientists and that they gain knowledge on scientific processes. Starting from general pedagogical reflections and science standards, the article reviews some prominent models of inquiry learning. This comparison results in a set of inquiry processes being the basis for cooperation in the scientific network NetCoIL. Inquiry learning is conceived in several ways with emphasis on different processes. For an illustration of the spectrum, some main conceptions of inquiry and their focuses are described. In the next step, the article describes exemplary computer tools and environments from within and outside the NetCoIL network that were designed to support processes of collaborative inquiry learning. These tools are analysed by describing their functionalities as well as effects on student learning known from the literature. The article closes with challenges for further developments elaborated by the NetCoIL network.

  1. TRANSOL - An energy calculation tool for solar thermal systems based on dynamic simulations of simplified real models; Transol- Herramienta de calculo energetico para sistemas solares termicos basada en simulaciones dinamicas de modelos reales simplicados

    Energy Technology Data Exchange (ETDEWEB)

    Salom, J.; Schweiger, H.; Gonzalez, D.; Gurruchaga, J.; Grau, J.

    2004-07-01

    In recent years, the implementation, in different cities of Spain, of Municipal Solar Ordinances, as well as the stimulus of aids and subsidies from the different administrations, has caused a growing proliferation of solar thermal systems. This expansion and diversification has not been accompanied by regulation, by a more exhaustive calculation methodology, or by new calculation tools that would permit evaluation of the behavior of this diversity of thermal systems. We propose a new tool, TRANSOL, calculation software accessible to the designer, whatever their technical qualification, based on dynamic simulations of simplified real models of the solar systems with the greatest current application. (Author)

  2. The ADAPT Tool: From AADL Architectural Models to Stochastic Petri Nets through Model Transformation

    CERN Document Server

    Rugina, Ana E; Kaaniche, Mohamed

    2008-01-01

    ADAPT is a tool that aims at easing the task of evaluating dependability measures in the context of modern model driven engineering processes based on AADL (Architecture Analysis and Design Language). Hence, its input is an AADL architectural model annotated with dependability-related information. Its output is a dependability evaluation model in the form of a Generalized Stochastic Petri Net (GSPN). The latter can be processed by existing dependability evaluation tools, to compute quantitative measures such as reliability, availability, etc. ADAPT interfaces OSATE (the Open Source AADL Tool Environment) on the AADL side and SURF-2 on the dependability evaluation side. In addition, ADAPT provides the GSPN in XML/XMI format, which represents a gateway to other dependability evaluation tools, as the processing techniques for XML files allow it to be easily converted to a tool-specific GSPN.

  3. Hepatocyte composition-based model as a mechanistic tool for predicting the cell suspension: aqueous phase partition coefficient of drugs in in vitro metabolic studies.

    Science.gov (United States)

    Poulin, Patrick; Haddad, Sami

    2013-08-01

    This study is an extension of a previously published microsome composition-based model by Poulin and Haddad (Poulin and Haddad. 2011. J Pharm Sci 100:4501-4517), which was converted to the hepatocyte composition-based model. The first objective was to investigate the ability of the composition-based model to predict nonspecific binding of drugs in hepatocytes suspended in the incubation medium in in vitro metabolic studies. The hepatocyte composition-based model describes the cell suspension-aqueous phase partition coefficients, which were used to estimate fraction unbound in the incubation medium (fuinc ) for each drug. The second objective was to make a comparative analysis between the proposed hepatocyte composition-based model and an empirical regression equation published in the literature by Austin et al. (Austin RP, Barton P, Mohmed S, Riley RJ. 2004. Drug Metab Dispos 33:419-425). The assessment was confined by the availability of experimentally determined in vitro fuinc values at diverse hepatocyte concentrations for 92 drugs. The model that made use of hepatocyte composition data provides comparable or superior prediction performance compared with the regression equation that relied solely on physicochemical data; therefore, this demonstrates the ability of predicting fuinc also based on mechanisms of drug tissue distribution. The accuracy of the predictions differed depending on the class of drugs (neutrals vs. ionized drugs) and species (rat vs. human) for each method. This study for hepatocytes corroborates a previous study for microsomes. Overall, this work represents a significant first step toward the development of a generic and mechanistic calculation method of fuinc in incubations of hepatocytes, which should facilitate rational interindividual and interspecies extrapolations of fuinc by considering differences in lipid composition of hepatocytes, for clearance prediction in the physiologically-based pharmacokinetics (PBPK) models.
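
    As a hedged illustration of the final mass-balance step such a model supports, the function below converts a predicted cell suspension:aqueous partition coefficient into fuinc using the volume fractions of cells and medium (the paper's exact parameterization may differ):

```python
def fu_inc(kp_cell_aq: float, cell_volume_fraction: float) -> float:
    """Fraction unbound in a hepatocyte incubation.

    Assumes unbound drug resides in the aqueous phase and bound drug
    scales with the cell:aqueous partition coefficient Kp.
    """
    v_cell = cell_volume_fraction
    v_aq = 1.0 - v_cell
    return v_aq / (v_aq + kp_cell_aq * v_cell)

# Example: Kp = 50 at an assumed 0.25% cell volume fraction.
print(fu_inc(kp_cell_aq=50.0, cell_volume_fraction=0.0025))  # ~0.89
```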

  4. Modular target acquisition model & visualization tool

    NARCIS (Netherlands)

    Bijl, P.; Hogervorst, M.A.; Vos, W.K.

    2008-01-01

    We developed a software framework for image-based simulation models in the chain: scene-atmosphere-sensor-image enhancement-display-human observer: EO-VISTA. The goal is to visualize the steps and to quantify (Target Acquisition) task performance. EO-VISTA provides an excellent means to systematical

  5. Research and Development of a System Model Based on ETL Tools

    Institute of Scientific and Technical Information of China (English)

    刘荷花

    2012-01-01

    Data integration and exchange must solve extraction, transformation and loading (Extract, Transform, Load), which was difficult to achieve with past heterogeneous systems. Addressing data extraction, transformation and loading, the requirements were analyzed in a structured way and the design was carried out from three aspects: requirements analysis, system design and system implementation, building an ETL-based system logic model. The overall ETL framework, the UI and the class packages are explored; several key techniques for the UI and for data processing are introduced; the expected goals are achieved, which supports the subsequent construction of a data warehouse.
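
    The three steps the model organizes can be pictured as a minimal pipeline; this sketch (the CSV source and target schema are invented) extracts rows from a file, cleans and retypes the fields, and loads them into a SQLite "warehouse":

```python
import csv
import sqlite3

def extract(path):
    with open(path, newline="", encoding="utf-8") as f:
        yield from csv.DictReader(f)

def transform(rows):
    for row in rows:
        # Clean whitespace and convert the amount field to a number.
        yield (row["id"].strip(), row["name"].strip().title(),
               float(row["amount"]))

def load(records, db_path="warehouse.db"):
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS facts (id TEXT, name TEXT, amount REAL)")
    con.executemany("INSERT INTO facts VALUES (?, ?, ?)", records)
    con.commit()
    con.close()

load(transform(extract("source.csv")))
```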

  6. Scenario Evaluator for Electrical Resistivity survey pre-modeling tool

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D; Halford, Keith J.; Binley, Andrew; Lane, John; Werkema, Dale

    2017-01-01

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.

  7. Using EPA Tools and Data Services to Inform Changes to Design Storm Definitions for Wastewater Utilities based on Climate Model Projections

    Science.gov (United States)

    Tryby, M.; Fries, J. S.; Baranowski, C.

    2014-12-01

    Extreme precipitation events can cause significant impacts to drinking water and wastewater utilities, including facility damage, water quality impacts, service interruptions and potential risks to human health and the environment due to localized flooding and combined sewer overflows (CSOs). These impacts will become more pronounced with the projected increases in frequency and intensity of extreme precipitation events due to climate change. To model the impacts of extreme precipitation events, wastewater utilities often develop Intensity, Duration, and Frequency (IDF) rainfall curves and "design storms" for use in the U.S. Environmental Protection Agency's (EPA) Storm Water Management Model (SWMM). Wastewater utilities use SWMM for planning, analysis, and facility design related to stormwater runoff, combined and sanitary sewers, and other drainage systems in urban and non-urban areas. SWMM tracks (1) the quantity and quality of runoff made within each sub-catchment; and (2) the flow rate, flow depth, and quality of water in each pipe and channel during a simulation period made up of multiple time steps. In its current format, EPA SWMM does not consider climate change projection data. Climate change may affect the relationship between intensity, duration, and frequency described by past rainfall events. Therefore, EPA is integrating climate projection data available in the Climate Resilience Evaluation and Awareness Tool (CREAT) into SWMM. CREAT is a climate risk assessment tool for utilities that provides downscaled climate change projection data for changes in the amount of rainfall in a 24-hour period for various extreme precipitation events (e.g., from 5-year to 100-year storm events). Incorporating climate change projections into SWMM will provide wastewater utilities with more comprehensive data they can use in planning for future storm events, thereby reducing the impacts to the utility and customers served from flooding and stormwater issues.
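
    The essence of the proposed integration can be pictured as rescaling historical 24-hour design-storm depths by downscaled projection factors before they are written into a SWMM rain gage; a toy sketch (the depths and factors below are illustrative, not CREAT outputs):

```python
# Historical 24-hour design storm depths (inches) by return period (years).
historical_depth = {5: 4.2, 10: 5.1, 25: 6.3, 100: 8.0}
# Assumed mid-century change factors from downscaled climate projections.
projection_factor = {5: 1.08, 10: 1.12, 25: 1.15, 100: 1.21}

for rp, depth in historical_depth.items():
    future = depth * projection_factor[rp]
    print(f"{rp:>3}-yr storm: {depth:.1f} in -> {future:.2f} in")
```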

  8. Computer-Based Cognitive Tools: Description and Design.

    Science.gov (United States)

    Kennedy, David; McNaught, Carmel

    With computers, tangible tools are represented by the hardware (e.g., the central processing unit, scanners, and video display unit), while intangible tools are represented by the software. There is a special category of computer-based software tools (CBSTs) that have the potential to mediate cognitive processes--computer-based cognitive tools…

  9. General model for boring tool optimization

    Science.gov (United States)

    Moraru, G. M.; Zerbes, M. V.; Popescu, L. G.

    2016-08-01

    Optimizing a tool (and therefore a boring tool) consists in improving its performance by maximizing the objective functions chosen by the designer and/or by the user. Numerous features and performance requirements demanded by tool users contribute to defining and implementing the proposed objective functions. Incorporating new features makes the cutting tool competitive in the market and able to meet user requirements.

  10. Kernel-Based, Partial Least Squares Quantitative Structure-Retention Relationship Model for UPLC Retention Time Prediction: A Useful Tool for Metabolite Identification.

    Science.gov (United States)

    Falchi, Federico; Bertozzi, Sine Mandrup; Ottonello, Giuliana; Ruda, Gian Filippo; Colombano, Giampiero; Fiorelli, Claudio; Martucci, Cataldo; Bertorelli, Rosalia; Scarpelli, Rita; Cavalli, Andrea; Bandiera, Tiziano; Armirotti, Andrea

    2016-10-04

    We propose a new QSRR model based on a kernel-based partial least-squares method for predicting UPLC retention times in reversed phase mode. The model was built using a combination of classical (physicochemical and topological) and nonclassical (fingerprint) molecular descriptors of 1383 compounds, encompassing different chemical classes and structures, and their accurately measured retention time values. Following a random splitting of the data set into a training and a test set, we tested the ability of the model to predict the retention time of all the compounds. The best predicted/experimental R² value was higher than 0.86, while the best Q² value we observed was close to 0.84. A comparison of our model with traditional and simpler MLR and PLS regression models shows that KPLS performs better in terms of correlation (R²), prediction (Q²), and support for MetID peak assignment. The KPLS model succeeded in two real-life MetID tasks by correctly predicting the elution order of Phase I metabolites, including isomeric monohydroxylated compounds. We also show in this paper that the model's predictive power can be extended to different gradient profiles by simple mathematical extrapolation using a known equation, thus offering very broad flexibility. Moreover, the current study includes a deep investigation of the different types of chemical descriptors used to build the structure-retention relationship.
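
    One common way to realize a kernel PLS workflow, sketched below under stated assumptions (an RBF kernel over synthetic descriptors; the paper's actual kernel, descriptors, and hyperparameters are not specified here), is to compute a kernel matrix and fit ordinary PLS on it:

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X_train = rng.normal(size=(100, 50))   # molecular descriptors (synthetic)
rt_train = X_train[:, :3].sum(axis=1) + rng.normal(scale=0.1, size=100)

K_train = rbf_kernel(X_train, X_train, gamma=0.01)   # kernel matrix
pls = PLSRegression(n_components=10).fit(K_train, rt_train)

X_new = rng.normal(size=(5, 50))
K_new = rbf_kernel(X_new, X_train, gamma=0.01)       # kernel vs. training set
print(pls.predict(K_new).ravel())                    # predicted retention times
```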

  11. Internet-based Modeling, Mapping, and Analysis for the Greater Everglades (IMMAGE; Version 1.0): web-based tools to assess the impact of sea level rise in south Florida

    Science.gov (United States)

    Hearn, Paul; Strong, David; Swain, Eric; Decker, Jeremy

    2013-01-01

    South Florida's Greater Everglades area is particularly vulnerable to sea level rise, due to its rich endowment of animal and plant species and its heavily populated urban areas along the coast. Rising sea levels are expected to have substantial impacts on inland flooding, the depth and extent of surge from coastal storms, the degradation of water supplies by saltwater intrusion, and the integrity of plant and animal habitats. Planners and managers responsible for mitigating these impacts require advanced tools to help them more effectively identify areas at risk. The U.S. Geological Survey's (USGS) Internet-based Modeling, Mapping, and Analysis for the Greater Everglades (IMMAGE) Web site has been developed to address these needs by providing more convenient access to projections from models that forecast the effects of sea level rise on surface water and groundwater, the extent of surge and resulting economic losses from coastal storms, and the distribution of habitats. IMMAGE not only provides an advanced geographic information system (GIS) interface to support decision making, but also includes topic-based modules that explain and illustrate key concepts for nontechnical users. The purpose of this report is to familiarize both technical and nontechnical users with the IMMAGE Web site and its various applications.

  12. Microsystem design framework based on tool adaptations and library developments

    Science.gov (United States)

    Karam, Jean Michel; Courtois, Bernard; Rencz, Marta; Poppe, Andras; Szekely, Vladimir

    1996-09-01

    Besides foundry facilities, Computer-Aided Design (CAD) tools are also required to move microsystems from research prototypes to an industrial market. This paper describes a Computer-Aided-Design Framework for microsystems, based on selected existing software packages adapted and extended for microsystem technology, assembled with libraries where models are available in the form of standard cells described at different levels (symbolic, system/behavioral, layout). In microelectronics, CAD has already attained a highly sophisticated and professional level, where complete fabrication sequences are simulated and the device and system operation is completely tested before manufacturing. In comparison, the art of microsystem design and modelling is still in its infancy. However, at least for the numerical simulation of the operation of single microsystem components, such as mechanical resonators, thermo-elements, elastic diaphragms, reliable simulation tools are available. For the different engineering disciplines (like electronics, mechanics, optics, etc) a lot of CAD-tools for the design, simulation and verification of specific devices are available, but there is no CAD-environment within which we could perform a (micro-)system simulation due to the different nature of the devices. In general there are two different approaches to overcome this limitation: the first possibility would be to develop a new framework tailored for microsystem-engineering. The second approach, much more realistic, would be to use the existing CAD-tools which contain the most promising features, and to extend these tools so that they can be used for the simulation and verification of microsystems and of the devices involved. These tools are assembled with libraries in a microsystem design environment allowing a continuous design flow. The approach is driven by the wish to make microsystems accessible to a large community of people, including SMEs and non-specialized academic institutions.

  13. Web-Based Tools in Education

    Directory of Open Access Journals (Sweden)

    Lupasc Adrian

    2016-07-01

    Technology is advancing at a rapid pace, and what we knew a year ago is likely to no longer apply today. With it, technology brings new ways of transmitting, processing and storing information, and of socializing. The continuous development of information technologies contributes more than ever to increased access to information in any field of activity, including education. For this reason, education must help young people (pupils and students) to collect and select from the sheer volume of information available, to access it and to learn how to use it. Therefore, education must constantly adapt to social change; it must pass on the achievements and richness of human experience. At the same time, technology supports didactic activity because it takes learning beyond the classroom, involving all actors in the school community, and prepares young people for their profession. Moreover, web tools available for education can yield added benefits, which is why, especially at higher levels of the education system, their integration is becoming more visible and the results are soon to be seen. Information technologies also change the classic way of learning, which is undergoing rapid and profound transformations. In addition, current information technologies offer many types of applications, which is the argument for a new system of providing education and for building knowledge. In this regard, the paper aims to highlight the impact and benefits of current, particularly web-based, information technologies on the educational process.

  14. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  15. Modeling, methodologies and tools for molecular and nano-scale communications modeling, methodologies and tools

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront in their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  16. Programming Models and Tools for Intelligent Embedded Systems

    DEFF Research Database (Denmark)

    Sørensen, Peter Verner Bojsen

    Design automation and analysis tools targeting embedded platforms, developed using a component-based design approach, must be able to reason about the capabilities of the platforms. In the general case, where nothing is assumed about the components comprising a platform or the platform topology, analysis must be employed to determine its capabilities. This kind of analysis is the subject of this dissertation. The main contribution of this work is the Service Relation Model used to describe and analyze the flow of service in models of platforms and systems composed of re-usable components...

  17. A Generic Individual-Based Spatially Explicit Model as a Novel Tool for Investigating Insect-Plant Interactions: A Case Study of the Behavioural Ecology of Frugivorous Tephritidae.

    Directory of Open Access Journals (Sweden)

    Ming Wang

    Computational modelling of mechanisms underlying processes in the real world can be of great value in understanding complex biological behaviours. Uptake in general biology and ecology has been rapid. However, it often requires specific data sets that are overly costly in time and resources to collect. The aim of the current study was to test whether a generic behavioural ecology model constructed using published data could give realistic outputs for individual species. An individual-based model was developed using the Pattern-Oriented Modelling (POM) strategy and protocol, based on behavioural rules associated with insect movement choices. Frugivorous Tephritidae (fruit flies) were chosen because of their economic significance in global agriculture and the multiple published data sets available for a range of species. The Queensland fruit fly (Qfly), Bactrocera tryoni, was identified as a suitable individual species for testing. Plant canopies with modified architecture were used to run predictive simulations. A field study was then conducted to validate our model predictions on how plant architecture affects fruit flies' behaviours. Characteristics of plant architecture, such as different shapes (e.g., closed-canopy and vase-shaped), affected fly movement patterns and time spent on host fruit. The number of visits to host fruit also differed between the edge and centre in closed-canopy plants. Compared to plant architecture, host fruit contributed less to the effects on flies' movement patterns. The results from this model, combined with our field study and published empirical data, suggest that placing fly traps in the upper canopy at the edge should work best. Such a modelling approach allows rapid testing of ideas about organismal interactions with environmental substrates in silico rather than in vivo, to generate new perspectives. Using published data provides a saving in time and resources. Adjustments for specific questions can be achieved by...
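
    A minimal individual-based movement rule in the spirit of such a model might look as follows; the canopy grid, fruit positions, and behavioural bias are invented placeholders, not the published Qfly rules:

```python
import random

random.seed(1)
canopy = {(x, y) for x in range(10) for y in range(10)}   # nodes a fly can occupy
fruit = {(2, 3), (7, 7), (5, 1)}

pos, visits = (0, 0), 0
for _ in range(200):
    neighbours = [(pos[0] + dx, pos[1] + dy)
                  for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
    neighbours = [n for n in neighbours if n in canopy]
    # Behavioural rule: move to an adjacent fruit node when one exists.
    fruity = [n for n in neighbours if n in fruit]
    pos = random.choice(fruity or neighbours)
    visits += pos in fruit

print(visits, "fruit visits in 200 steps")
```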

  18. Neural Networks for Hydrological Modeling Tool for Operational Purposes

    Science.gov (United States)

    Bhatt, Divya; Jain, Ashu

    2010-05-01

    Hydrological models are useful in many water resources applications such as flood control, irrigation and drainage, hydropower generation, water supply, erosion and sediment control, etc. Estimates of runoff are needed in many water resources planning, design, development, operation and maintenance activities. Runoff is generally computed using rainfall-runoff models. Computer-based hydrologic models have become popular for obtaining hydrological forecasts and for managing water systems. The Rainfall-Runoff Library (RRL) is computer software developed by the Cooperative Research Centre for Catchment Hydrology (CRCCH), Australia, consisting of five different conceptual rainfall-runoff models, and has been in operation in many water resources applications in Australia. Recently, soft artificial intelligence tools such as Artificial Neural Networks (ANNs) have become popular for research purposes but have not been adopted in operational hydrological forecasts. There is a strong need to develop ANN models based on real catchment data and compare them with the conceptual models actually in use in real catchments. In this paper, the results from an investigation on the use of RRL and ANNs are presented. Out of the five conceptual models in the RRL toolkit, the SimHyd model has been used. A Genetic Algorithm has been used as the optimizer in the RRL to calibrate the SimHyd model. Trial and error procedures were employed to arrive at the best values of the various parameters involved in the GA optimizer to develop the SimHyd model. The results obtained from the best configuration of the SimHyd model are presented here. A feed-forward neural network model structure trained by the back-propagation training algorithm has been adopted to develop the ANN models. The daily rainfall and runoff data derived from the Bird Creek Basin, Oklahoma, USA have been employed to develop all the models included here. A wide range of error statistics have been used to evaluate the performance of all the models...

  19. Reliability-Based Electronics Shielding Design Tools

    Science.gov (United States)

    Wilson, J. W.; O'Neill, P. J.; Zang, T. A.; Pandolf, J. E.; Tripathi, R. K.; Koontz, Steven L.; Boeder, P.; Reddell, B.; Pankop, C.

    2007-01-01

    Shielding design on large human-rated systems allows minimization of radiation impact on electronic systems. Shielding design tools require adequate methods for evaluation of design layouts, guiding qualification testing, and adequate follow-up on final design evaluation.

  20. T:XML: A Tool Supporting User Interface Model Transformation

    Science.gov (United States)

    López-Jaquero, Víctor; Montero, Francisco; González, Pascual

    Model driven development of user interfaces is based on the transformation of an abstract specification into the final user interface the user will interact with. The design of transformation rules to carry out this transformation process is a key issue in any model-driven user interface development approach. In this paper, we introduce T:XML, an integrated development environment for managing, creating and previewing transformation rules. The tool supports the specification of transformation rules by using a graphical notation that works on the basis of the transformation of the input model into a graph-based representation. T:XML allows the design and execution of transformation rules in an integrated development environment. Furthermore, the designer can also preview how the generated user interface looks after the transformations have been applied. These previewing capabilities can be used to quickly create prototypes to discuss with the users in user-centered design methods.

  1. A data-based conservation planning tool for Florida panthers

    Science.gov (United States)

    Murrow, Jennifer L.; Thatcher, Cindy A.; Van Manen, Frank T.; Clark, Joseph D.

    2013-01-01

    Habitat loss and fragmentation are the greatest threats to the endangered Florida panther (Puma concolor coryi). We developed a data-based habitat model and user-friendly interface so that land managers can objectively evaluate Florida panther habitat. We used a geographic information system (GIS) and the Mahalanobis distance statistic (D2) to develop a model based on broad-scale landscape characteristics associated with panther home ranges. Variables in our model were Euclidean distance to natural land cover, road density, distance to major roads, human density, amount of natural land cover, amount of semi-natural land cover, amount of permanent or semi-permanent flooded area–open water, and a cost–distance variable. We then developed a Florida Panther Habitat Estimator tool, which automates and replicates the GIS processes used to apply the statistical habitat model. The estimator can be used by persons with moderate GIS skills to quantify effects of land-use changes on panther habitat at local and landscape scales. Example applications of the tool are presented.
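
    The D² scoring at the heart of the model can be sketched directly: compute the Mahalanobis distance of each pixel's landscape covariates from the mean conditions inside panther home ranges (the covariates below are synthetic stand-ins for the variables listed above):

```python
import numpy as np
from scipy.spatial.distance import mahalanobis

rng = np.random.default_rng(7)
# Rows: samples from panther home ranges; columns: landscape covariates
# (e.g., road density, distance to natural cover, ...).
home_range = rng.normal(size=(500, 4))
mu = home_range.mean(axis=0)
VI = np.linalg.inv(np.cov(home_range, rowvar=False))  # inverse covariance

pixels = rng.normal(size=(3, 4))                      # candidate map pixels
d2 = [mahalanobis(p, mu, VI) ** 2 for p in pixels]
print(d2)  # lower D2 = more similar to used panther habitat
```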

  2. Modeling and Simulation Tools for Heavy Lift Airships

    Science.gov (United States)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed-wing and rotary-wing aircraft, a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that differ from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the National laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools would mitigate the reliance on proprietary LTA design tools in use today. A set of well-researched, open-source, high-fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  3. Fuzzy regression modeling for tool performance prediction and degradation detection.

    Science.gov (United States)

    Li, X; Er, M J; Lim, B S; Zhou, J H; Gan, O P; Rutkowski, L

    2010-10-01

    In this paper, the viability of using a Fuzzy-Rule-Based Regression Modeling (FRM) algorithm for tool performance and degradation detection is investigated. The FRM is developed based on a multi-layered fuzzy-rule-based hybrid system with Multiple Regression Models (MRM) embedded into a fuzzy logic inference engine that employs Self-Organizing Maps (SOM) for clustering. The FRM converts a complex nonlinear problem into a simplified linear format in order to further increase the accuracy of prediction and the rate of convergence. The efficacy of the proposed FRM is tested through a case study, namely to predict the remaining useful life of a ball nose milling cutter during a dry machining process of hardened tool steel with a hardness of 52-54 HRC. A comparative study is further made between four predictive models using the same set of experimental data. It is shown that the FRM is superior to conventional MRM, Back Propagation Neural Networks (BPNN) and Radial Basis Function Networks (RBFN) in terms of prediction accuracy and learning speed.
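
    A simplified analogue of the FRM pipeline, under stated substitutions (KMeans in place of SOM clustering, one plain linear model per cluster, synthetic data instead of machining signals), shows the partition-then-regress idea:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
X = rng.uniform(size=(300, 2))                 # e.g., force features, cutting time
y = np.where(X[:, 0] < 0.5, 2 * X[:, 1], 5 - 3 * X[:, 1])  # piecewise response

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
local = {c: LinearRegression().fit(X[km.labels_ == c], y[km.labels_ == c])
         for c in range(4)}

x_new = np.array([[0.2, 0.7]])
c = km.predict(x_new)[0]        # route the query to its local linear model
print(local[c].predict(x_new))
```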

  4. Evaluation of air pollution modelling tools as environmental engineering courseware.

    Science.gov (United States)

    Souto González, J A; Bello Bugallo, P M; Casares Long, J J

    2004-01-01

    The study of phenomena related to the dispersion of pollutants usually relies on mathematical models that describe the different processes involved. This educational approach is especially important in air pollution dispersion, where the processes follow non-linear behaviour, so it is difficult to understand the relationships between inputs and outputs, and where a 3D context makes it hard to analyze alphanumeric results. In this work, three different software tools, serving as computer solvers for typical air pollution dispersion phenomena, are presented. Each software tool, developed to be implemented on PCs, follows an approach representing one of three generations of programming languages (Fortran 77, VisualBasic and Java), applied over three different environments: MS-DOS, MS-Windows and the world wide web. The software tools were tested by students of environmental engineering (undergraduate) and chemical engineering (postgraduate) in order to evaluate their ability to improve both theoretical and practical knowledge of the air pollution dispersion problem, and the impact of the different environments on the learning process in terms of content, ease of use and visualization of results.
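
    A classic hand-calculation such courseware lets students check is the Gaussian plume formula for ground-level concentration downwind of an elevated point source; the sketch below uses a crude power-law stand-in for the dispersion coefficients rather than a full Pasquill-Gifford parameterization:

```python
import numpy as np

def plume(Q, u, x, y, z, H, a=0.08, b=0.06):
    """Gaussian plume concentration (g/m^3) for emission rate Q (g/s),
    wind speed u (m/s), and effective stack height H (m)."""
    sigma_y = a * x ** 0.9          # assumed lateral spread growth
    sigma_z = b * x ** 0.85         # assumed vertical spread growth
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                np.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground reflection term
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

print(plume(Q=100.0, u=4.0, x=1000.0, y=0.0, z=0.0, H=50.0))
```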

  5. A quality assessment tool for markup-based clinical guidelines.

    Science.gov (United States)

    Shalom, Erez; Shahar, Yuval; Taieb-Maimon, Meirav; Lunenfeld, Eitan

    2008-11-06

    We introduce a tool for quality assessment of procedural and declarative knowledge. We developed this tool for evaluating the specification of markup-based clinical guidelines (GLs). Using this graphical tool, the expert physician and the knowledge engineer collaborate to score, using a pre-defined scoring scale, each of the knowledge roles of the mark-ups, comparing them to a gold standard. The tool enables scoring the mark-ups simultaneously at different sites by different users at different locations.

  6. Image-Based Computational Fluid Dynamics in Blood Vessel Models: Toward Developing a Prognostic Tool to Assess Cardiovascular Function Changes in Prolonged Space Flights

    Science.gov (United States)

    Chatzimavroudis, George P.; Spirka, Thomas A.; Setser, Randolph M.; Myers, Jerry G.

    2004-01-01

    One of NASA's objectives is to be able to perform a complete, pre-flight, evaluation of cardiovascular changes in astronauts scheduled for prolonged space missions. Computational fluid dynamics (CFD) has shown promise as a method for estimating cardiovascular function during reduced gravity conditions. For this purpose, MRI can provide geometrical information, to reconstruct vessel geometries, and measure all spatial velocity components, providing location specific boundary conditions. The objective of this study was to investigate the reliability of MRI-based model reconstruction and measured boundary conditions for CFD simulations. An aortic arch model and a carotid bifurcation model were scanned in a 1.5T Siemens MRI scanner. Axial MRI acquisitions provided images for geometry reconstruction (slice thickness 3 and 5 mm; pixel size 1x1 and 0.5x0.5 square millimeters). Velocity acquisitions provided measured inlet boundary conditions and localized three-directional steady-flow velocity data (0.7-3.0 L/min). The vessel walls were isolated using NIH provided software (ImageJ) and lofted to form the geometric surface. Constructed and idealized geometries were imported into a commercial CFD code for meshing and simulation. Contour and vector plots of the velocity showed identical features between the MRI velocity data, the MRI-based CFD data, and the idealized-geometry CFD data, with less than 10% differences in the local velocity values. CFD results on models reconstructed from different MRI resolution settings showed insignificant differences (less than 5%). This study illustrated, quantitatively, that reliable CFD simulations can be performed with MRI reconstructed models and gives evidence that a future, subject-specific, computational evaluation of the cardiovascular system alteration during space travel is feasible.

  7. Hierarchical Cluster-based Partial Least Squares Regression (HC-PLSR is an efficient tool for metamodelling of nonlinear dynamic models

    Directory of Open Access Journals (Sweden)

    Omholt Stig W

    2011-06-01

    Background: Deterministic dynamic models of complex biological systems contain a large number of parameters and state variables, related through nonlinear differential equations with various types of feedback. A metamodel of such a dynamic model is a statistical approximation model that maps variation in parameters and initial conditions (inputs) to variation in features of the trajectories of the state variables (outputs) throughout the entire biologically relevant input space. A sufficiently accurate mapping can be exploited both instrumentally and epistemically. Multivariate regression methodology is a commonly used approach for emulating dynamic models. However, when the input-output relations are highly nonlinear or non-monotone, a standard linear regression approach is prone to give suboptimal results. We therefore hypothesised that a more accurate mapping can be obtained by locally linear or locally polynomial regression. We present here a new method for local regression modelling, Hierarchical Cluster-based PLS regression (HC-PLSR), where fuzzy C-means clustering is used to separate the data set into parts according to the structure of the response surface. We compare the metamodelling performance of HC-PLSR with polynomial partial least squares regression (PLSR) and ordinary least squares (OLS) regression on various systems: six different gene regulatory network models with various types of feedback, a deterministic mathematical model of the mammalian circadian clock and a model of mouse ventricular myocyte function. Results: Our results indicate that multivariate regression is well suited for emulating dynamic models in systems biology. The hierarchical approach turned out to be superior to both polynomial PLSR and OLS regression in all three test cases. The advantage, in terms of explained variance and prediction accuracy, was largest in systems with highly nonlinear functional relationships and in systems with positive feedback...
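
    The HC-PLSR idea can be sketched with soft cluster memberships weighting local PLS models; here a Gaussian mixture stands in for fuzzy C-means and the data are synthetic, so this is an analogue of the published method rather than a reimplementation:

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
X = rng.uniform(-1, 1, size=(400, 6))        # model parameters (inputs)
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2       # nonlinear, non-monotone output

gm = GaussianMixture(n_components=3, random_state=0).fit(X)
labels = gm.predict(X)
local = [PLSRegression(n_components=3).fit(X[labels == k], y[labels == k])
         for k in range(3)]

X_new = rng.uniform(-1, 1, size=(5, 6))
w = gm.predict_proba(X_new)                  # soft memberships per component
y_hat = sum(w[:, k] * local[k].predict(X_new).ravel() for k in range(3))
print(y_hat)
```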

  8. Tool for physics beyond the standard model

    Science.gov (United States)

    Newby, Christopher A.

    The standard model (SM) of particle physics is a well studied theory, but there are hints that the SM is not the final story. What the full picture is, no one knows, but this thesis looks into three methods useful for exploring a few of the possibilities. To begin I present a paper by Spencer Chang, Nirmal Raj, Chaowaroj Wanotayaroj, and me, that studies the Higgs boson. The scalar particle first seen in 2012 may be the vanilla SM version, but there is some evidence that its couplings are different than predicted. By means of increasing the Higgs' coupling to vector bosons and fermions, we can be more consistent with the data. Next, in a paper by Spencer Chang, Gabriel Barello, and me, we elaborate on a tool created to study dark matter (DM) direct detection. The original work by Anand et al. focused on elastic dark matter, whereas we extended this work to include the inelastic case, where different DM mass states enter and leave the collision. We also examine several direct detection experiments with our new framework to see if DAMA's modulation can be explained while avoiding the strong constraints imposed by the other experiments. We find that there are several operators that can do this. Finally, in a paper by Spencer Chang, Gabriel Barello, and me, we study an interesting phenomenon known as kinetic mixing, where two gauge bosons can share interactions with particles even though these particles aren't charged under both gauge groups. This, in and of itself, is not new, but we discuss a different method of obtaining this mixing where, instead of mixing between two Abelian groups, one of the groups is non-Abelian. Using this we then see that there is an inherent mass scale in the mixing strength, something that is absent in the Abelian-Abelian case. Furthermore, if the non-Abelian symmetry is the SU(2)L of the SM then the mass scale of the physics responsible for the mixing is about 1 TeV, right around the sweet spot for detection at the LHC. This dissertation

  9. Networking Sensor Observations, Forecast Models & Data Analysis Tools

    Science.gov (United States)

    Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

    2009-12-01

    This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of the wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations and a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We describe the process for enabling the fire location, smoke forecast, smoke observation, and

  10. A Decision Support Model and Tool to Assist Financial Decision-Making in Universities

    Science.gov (United States)

    Bhayat, Imtiaz; Manuguerra, Maurizio; Baldock, Clive

    2015-01-01

    In this paper, a model and tool is proposed to assist universities and other mission-based organisations to ascertain systematically the optimal portfolio of projects, in any year, meeting the organisation's risk tolerances and available funds. The model and tool presented build on previous work on university operations and decision support systems…

  11. Advanced Reach Tool (ART): development of the mechanistic model.

    Science.gov (United States)

    Fransman, Wouter; Van Tongeren, Martie; Cherrie, John W; Tischer, Martin; Schneider, Thomas; Schinkel, Jody; Kromhout, Hans; Warren, Nick; Goede, Henk; Tielemans, Erik

    2011-11-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe. The ART mechanistic model is based on a conceptual framework that adopts a source receptor approach, which describes the transport of a contaminant from the source to the receptor and defines seven independent principal modifying factors: substance emission potential, activity emission potential, localized controls, segregation, personal enclosure, surface contamination, and dispersion. ART currently differentiates between three different exposure types: vapours, mists, and dust (fumes, fibres, and gases are presently excluded). Various sources were used to assign numerical values to the multipliers to each modifying factor. The evidence used to underpin this assessment procedure was based on chemical and physical laws. In addition, empirical data obtained from literature were used. Where this was not possible, expert elicitation was applied for the assessment procedure. Multipliers for all modifying factors were peer reviewed by leading experts from industry, research institutes, and public authorities across the globe. In addition, several workshops with experts were organized to discuss the proposed exposure multipliers. The mechanistic model is a central part of the ART tool and with advancing knowledge on exposure, determinants will require updates and refinements on a continuous basis, such as the effect of worker behaviour on personal exposure, 'best practice' values that describe the maximum achievable effectiveness of control measures, the intrinsic emission potential of various solid objects (e.g. metal, glass, plastics, etc.), and extending the applicability domain to certain types of exposures (e.g. gas, fume, and fibre exposure).
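
    The source-receptor structure described above is multiplicative: an exposure score is the product of one multiplier per principal modifying factor. The sketch below shows only this structure; the factor values are invented placeholders, not the calibrated ART multipliers.

```python
from math import prod

# Hypothetical illustration of ART's multiplicative structure: exposure is
# scored as a product of multipliers, one per principal modifying factor.
# All numbers are invented placeholders, not calibrated ART values.
modifying_factors = {
    "substance_emission_potential": 0.5,
    "activity_emission_potential": 3.0,
    "localized_controls": 0.3,    # e.g. local exhaust ventilation
    "segregation": 1.0,           # source not segregated
    "personal_enclosure": 1.0,    # worker not enclosed
    "surface_contamination": 1.2,
    "dispersion": 0.8,            # room size / ventilation regime
}

relative_exposure = prod(modifying_factors.values())
print(f"relative exposure score: {relative_exposure:.3f}")
```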

  12. Thread Group Multithreading: Accelerating the Computation of an Agent-Based Power System Modeling and Simulation Tool -- GridLAB-D

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Shuangshuang; Chassin, David P.

    2014-01-06

    GridLAB-D™ is an open source next generation agent-based smart-grid simulator that provides unprecedented capability to model the performance of smart grid technologies. Over the past few years, GridLAB-D has been used to conduct important analyses of smart grid concepts, but it is still quite limited by its computational performance. In order to break through the performance bottleneck to meet the need for large-scale power grid simulations, we develop a thread group mechanism to implement highly granular multithreaded computation in GridLAB-D. For a benchmark simple house model, we achieve close to linear speedups with the multithreaded version compared against the single-threaded version of the same code running on general-purpose multi-core commodity hardware. The performance of the multithreading code shows favorable scalability properties and resource utilization, and much shorter execution times for large-scale power grid simulations.
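
    The thread-group idea, partitioning the agent population into one group per worker thread and synchronising once per simulation step, can be shown schematically with Python's standard library. This is a structural sketch only (CPython's GIL prevents an actual speedup for pure-Python work); GridLAB-D's implementation uses native threads in C/C++, and the house model below is a placeholder of ours.

```python
from concurrent.futures import ThreadPoolExecutor

def step_house(house):
    # placeholder for one agent's per-timestep state update
    house["temp"] += 0.1 * (house["setpoint"] - house["temp"])

def step_group(group):
    for house in group:
        step_house(house)

def simulate(houses, n_threads=4, n_steps=100):
    # one agent group per thread, advanced in lockstep each timestep
    groups = [houses[i::n_threads] for i in range(n_threads)]
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        for _ in range(n_steps):
            # map() consumed with list() acts as the per-step barrier
            list(pool.map(step_group, groups))

houses = [{"temp": 15.0, "setpoint": 20.0} for _ in range(10_000)]
simulate(houses)
print(f"house 0 after simulation: {houses[0]['temp']:.2f} C")
```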

  13. Right approach to 3D modeling using CAD tools

    Science.gov (United States)

    Baddam, Mounica Reddy

    The thesis provides a step-by-step methodology to enable an instructor dealing with CAD tools to optimally guide his/her students through an understandable 3D modeling approach, one which will not only enhance their knowledge about the tool's usage but also enable them to achieve their desired result in comparatively less time. In current practice, there is very little information available on applying CAD skills to formal beginners' training sessions. Additionally, the advent of new software in the 3D domain makes keeping material up to date a more difficult task. Keeping up with the industry's advanced requirements emphasizes the importance of more skilled hands in the field of CAD development, rather than just prioritizing manufacturing in terms of complex software features. The thesis analyses different 3D modeling approaches specific to the varieties of CAD tools currently available in the market. Utilizing performance-time databases, learning curves have been generated to measure performance time, feature count, etc. Based on the results, improvement parameters have also been provided (Asperl, 2005).

  14. An ensemble model of QSAR tools for regulatory risk assessment.

    Science.gov (United States)

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

    Quantitative structure activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model allows for varying a cut-off parameter that governs the desired trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) to predict carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). Leave-one-out cross-validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8% and 80.4%, and balanced accuracy: 80.6% and 80.8%) and highest inter-rater agreement [kappa (κ): 0
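
    One way to read "ensemble QSAR model using Bayesian classification" is a naive-Bayes combination of the individual tools' binary calls, weighted by each tool's sensitivity and specificity, with a movable cut-off on the posterior. The sketch below shows that reading only; the tool names and performance numbers are invented, not the validated statistics of Toxtree, Lazar, the OECD Toolbox or the Danish QSAR database.

```python
# Naive-Bayes combination of binary QSAR tool calls. Sensitivities and
# specificities below are invented, not validated tool statistics.
tools = {
    "tool_A": {"sens": 0.80, "spec": 0.70},
    "tool_B": {"sens": 0.65, "spec": 0.85},
    "tool_C": {"sens": 0.75, "spec": 0.75},
}

def posterior_positive(calls, prior=0.5):
    """P(carcinogen | tool calls) under conditional independence."""
    p_pos, p_neg = prior, 1.0 - prior
    for name, call in calls.items():
        sens, spec = tools[name]["sens"], tools[name]["spec"]
        if call:          # this tool predicts "positive"
            p_pos *= sens
            p_neg *= 1.0 - spec
        else:             # this tool predicts "negative"
            p_pos *= 1.0 - sens
            p_neg *= spec
    return p_pos / (p_pos + p_neg)

p = posterior_positive({"tool_A": True, "tool_B": False, "tool_C": True})
cutoff = 0.4   # lower cut-off favours sensitivity, higher favours specificity
print(f"posterior = {p:.2f} -> {'positive' if p >= cutoff else 'negative'}")
```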

  15. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper;

    2003-01-01

    Experimental modelling is an important tool for the study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models, and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied for improvement of the reliability of physical model results. This paper demonstrates by examples that numerical modelling benefits in various ways from experimental studies (in large and small laboratory facilities). The examples range from very general hydrodynamic descriptions of wave phenomena to specific hydrodynamic interaction with structures. The examples also show that numerical model development benefits from international co-operation and sharing of high quality results.

  16. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  17. Storm Water Management Model Climate Adjustment Tool (SWMM-CAT)

    Science.gov (United States)

    The US EPA’s newest tool, the Stormwater Management Model (SWMM) – Climate Adjustment Tool (CAT) is meant to help municipal stormwater utilities better address potential climate change impacts affecting their operations. SWMM, first released in 1971, models hydrology and hydrauli...
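
    The adjustment idea is simple arithmetic: scale a historical rainfall series by month-specific climate change factors before it is fed to the hydrologic model. The sketch below shows that step only, with invented factors rather than the downscaled climate projections the real tool ships with.

```python
# Sketch of the climate-adjustment step: scale historical monthly rainfall
# by month-specific adjustment factors. All numbers are invented examples.
monthly_rain_mm = [78, 64, 90, 85, 102, 110, 95, 88, 76, 81, 92, 84]
adjustment = [1.05, 1.04, 1.02, 1.00, 0.98, 0.95,
              0.93, 0.94, 0.99, 1.03, 1.06, 1.07]   # hypothetical ratios

adjusted = [rain * f for rain, f in zip(monthly_rain_mm, adjustment)]
print([round(x, 1) for x in adjusted])
```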

  18. Suggestion: Human Factor Based User Interface Design Tool

    OpenAIRE

    S.Q. Abbas; Rizwan Beg; Shahnaz Fatima

    2011-01-01

    In this paper, we introduce HFBUIT, a Human Factor Based User Interface Tool that enables designers and engineers to create human-factor-based user interfaces. This tool will help the designer to utilize knowledge about the user to configure the interface for different users, i.e. each user may have different skills, level of experience, or cognitive and physical disabilities. The tool makes it easy to apply knowledge of human factors and to reduce the number of usability problems. HFBUIT can be used in real...

  19. A Framework for IT-based Design Tools

    DEFF Research Database (Denmark)

    Hartvig, Susanne C

    The thesis presents a new approach to developing design tools that can be integrated, by presenting a framework consisting of a set of guidelines for design tools, an integration and communication scheme, and a set of design tool schemes. This framework has been based on analysis of requirements to integrated design environments, and analysis of engineering design and design problem solving methods. The developed framework has been tested by applying it to the development of prototype design tools for realistic design scenarios.

  20. Research on model of tool edge preparation based on micro abrasive water jet

    Institute of Scientific and Technical Information of China (English)

    万庆丰; 雷玉勇; 刘克福; 陈忠敏; 潘峥正

    2013-01-01

    Tool edge preparation using a micro abrasive water jet was studied, and the feasibility of micro abrasive water jets for tool edge preparation was verified. The radius of the cutting edge roundness and the microstructure were obtained using a MikroCAD optical 3D measurement system, which showed that the roundness radius is unevenly distributed across different sections of the cutting edge. Based on the artificial neural network method, a back-propagation (BP) neural network model of the edge roundness radius produced by micro abrasive water jet preparation was established. The results indicate that the BP neural network can effectively predict the prepared edge roundness radius; the relative error between predicted values and experimental measurements ranges from 0.63% to 4.60%.
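
    The regression task in the abstract (jet process parameters in, edge roundness radius out) is straightforward to reproduce with any BP-style network. The sketch below uses scikit-learn's MLPRegressor; the input variables, units, and training values are invented for illustration and are not the paper's measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical training data: [jet pressure (MPa), abrasive flow (g/min),
# exposure time (s)] -> edge roundness radius (um). All values invented.
X = np.array([[10, 20, 30], [12, 25, 30], [14, 20, 45],
              [10, 30, 60], [12, 30, 45], [14, 25, 60]], dtype=float)
y = np.array([18.0, 21.5, 25.0, 27.5, 26.0, 31.0])

# a small back-propagation (BP) network, in the spirit of the paper's model
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)
model.fit(X, y)
print(model.predict([[12, 22, 40]]))  # predicted roundness radius, um
```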

  1. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco;

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling a simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behavio...

  2. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report provides a description of the wind turbine modelling, both at a component level and at a system level.

  3. Modeling Languages: metrics and assessing tools

    OpenAIRE

    Fonte, Daniela; Boas, Ismael Vilas; Azevedo, José; Peixoto, José João; Faria, Pedro; Silva, Pedro; Sá, Tiago de, 1990-; Costa, Ulisses; da Cruz, Daniela; Henriques, Pedro Rangel

    2012-01-01

    Any traditional engineering field has metrics to rigorously assess the quality of their products. Engineers know that the output must satisfy the requirements, must comply with the production and market rules, and must be competitive. Professionals in the new field of software engineering started a few years ago to define metrics to appraise their product: individual programs and software systems. This concern motivates the need to assess not only the outcome but also the process and tools em...

  4. Many-Task Computing Tools for Multiscale Modeling

    OpenAIRE

    Katz, Daniel S.; Ripeanu, Matei; Wilde, Michael

    2011-01-01

    This paper discusses the use of many-task computing tools for multiscale modeling. It defines multiscale modeling and places different examples of it on a coupling spectrum, discusses the Swift parallel scripting language, describes three multiscale modeling applications that could use Swift, and then talks about how the Swift model is being extended to cover more of the multiscale modeling coupling spectrum.

  5. Scratch as a computational modelling tool for teaching physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-05-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling programs. In this article, we briefly discuss why Scratch could be a useful tool for computational modelling in the primary or secondary physics classroom, and we present practical examples of how it can be used to build a model.

  6. Shape: A 3D Modeling Tool for Astrophysics.

    Science.gov (United States)

    Steffen, Wolfgang; Koning, Nicholas; Wenger, Stephan; Morisset, Christophe; Magnor, Marcus

    2011-04-01

    We present a flexible interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data type, and amount that is required for a reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model prediction with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been successfully used in a number of astrophysical research projects.

  7. DUK - A Fast and Efficient Kmer Based Sequence Matching Tool

    Energy Technology Data Exchange (ETDEWEB)

    Li, Mingkun; Copeland, Alex; Han, James

    2011-03-21

    A new tool, DUK, was developed to perform matching tasks. Matching means finding whether a query sequence partially or totally matches given reference sequences or not. Matching is similar to alignment; indeed, many traditional analysis tasks like contaminant removal use alignment tools. But for matching there is no need to know which bases of a query sequence match which position of a reference sequence; it only needs to be known whether a match exists. This subtle difference can make matching much faster than alignment. DUK is accurate, versatile, fast, and has efficient memory usage. It uses a k-mer hashing method to index reference sequences and a Poisson model to calculate p-values. DUK is carefully implemented in C++ in an object-oriented design, and the resulting classes can also be used to develop other tools quickly. DUK has been widely used at JGI for a wide range of applications such as contaminant removal, organelle genome separation, and assembly refinement. Many real applications and simulated datasets demonstrate its power.
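
    The distinction the abstract draws between matching and alignment can be made concrete: index the reference by its k-mers, count how many of the query's k-mers hit the index, and call a match when that count is improbable under a Poisson background. The Python below is a toy re-expression of that idea, not DUK's C++ implementation; the background hit rate is a placeholder.

```python
import math

def kmer_set(reference, k=16):
    """Index the reference by its set of k-mers (DUK hashes these)."""
    return {reference[i:i + k] for i in range(len(reference) - k + 1)}

def is_match(query, ref_kmers, k=16, max_pvalue=1e-3):
    """Call match/no-match without computing an alignment.

    Even a random query shares some k-mers with a big reference, so a match
    is called only when the hit count is improbable under a Poisson model.
    The chance-hit rate used here is a toy placeholder.
    """
    n_kmers = len(query) - k + 1
    hits = sum(query[i:i + k] in ref_kmers for i in range(n_kmers))
    lam = 0.01 * n_kmers   # assumed expected chance hits (placeholder)
    # P(X >= hits) for X ~ Poisson(lam), via the complement of the CDF
    p = 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i)
                  for i in range(hits))
    return p < max_pvalue

ref_index = kmer_set("ACGT" * 500)
print(is_match("TT" + "ACGT" * 4 + "TT", ref_index))   # True: 3 k-mer hits
```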

  8. ElEvoHI: a novel CME prediction tool for heliospheric imaging combining an elliptical front with drag-based model fitting

    CERN Document Server

    Rollett, Tanja; Isavnin, Alexey; Davies, Jackie A; Kubicka, Manuel; Amerstorfer, Ute V; Harrison, Richard A

    2016-01-01

    In this study, we present a new method for forecasting arrival times and speeds of coronal mass ejections (CMEs) at any location in the inner heliosphere. This new approach enables the adoption of a highly flexible geometrical shape for the CME front with an adjustable CME angular width and an adjustable radius of curvature of its leading edge, i.e. the assumed geometry is elliptical. Using, as input, STEREO heliospheric imager (HI) observations, a new elliptic conversion (ElCon) method is introduced and combined with the use of drag-based model (DBM) fitting to quantify the deceleration or acceleration experienced by CMEs during propagation. The result is then used as input for the Ellipse Evolution Model (ElEvo). Together, ElCon, DBM fitting, and ElEvo form the novel ElEvoHI forecasting utility. To demonstrate the applicability of ElEvoHI, we forecast the arrival times and speeds of 21 CMEs remotely observed from STEREO/HI and compare them to in situ arrival times and speeds at 1 AU. Compared to the commonl...
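
    The drag-based model at the heart of this pipeline has a closed-form solution, which makes a minimal forecasting sketch short: a CME faster than the ambient wind decelerates toward the wind speed, and a slower one accelerates. The parameter values below are typical illustrative numbers, not those fitted by ElEvoHI, and the elliptical front geometry is omitted entirely.

```python
import math

AU = 1.496e8       # km
RSUN = 6.957e5     # km

def dbm(t, r0=20 * RSUN, v0=900.0, w=400.0, gamma=0.2e-7):
    """Analytic drag-based model: distance (km) and speed (km/s) at time t (s).

    v0 = initial CME speed, w = ambient solar wind speed, gamma = drag
    parameter (1/km). Values above are typical, not fitted to any event.
    """
    s = 1.0 if v0 >= w else -1.0
    u = 1.0 + s * gamma * (v0 - w) * t
    v = w + (v0 - w) / u
    r = r0 + w * t + (s / gamma) * math.log(u)
    return r, v

t = 0.0
while dbm(t)[0] < AU:   # step forward until the front reaches 1 AU
    t += 600.0
r, v = dbm(t)
print(f"arrival after {t / 3600.0:.1f} h at {v:.0f} km/s")
```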

  9. An empirical tool to evaluate the safety of cyclists: Community based, macro-level collision prediction models using negative binomial regression.

    Science.gov (United States)

    Wei, Feng; Lovegrove, Gordon

    2013-12-01

    Today, North American governments are more willing to consider compact neighborhoods with increased use of sustainable transportation modes. Bicycling, one of the most effective modes for short trips with distances less than 5 km, is being encouraged. However, as vulnerable road users (VRUs), cyclists are more likely to be injured when involved in collisions. In order to create a safe road environment for them, evaluating cyclists' road safety at a macro level in a proactive way is necessary. In this paper, different generalized linear regression methods for collision prediction model (CPM) development are reviewed and previous studies on micro-level and macro-level bicycle-related CPMs are summarized. On the basis of insights gained in the exploration stage, this paper also reports on efforts to develop negative binomial models for bicycle-auto collisions at a community-based, macro level. Data came from the Central Okanagan Regional District (CORD) of British Columbia, Canada. The model results revealed two types of statistical associations between collisions and the explanatory variables: (1) an increase in bicycle-auto collisions is associated with an increase in total lane kilometers (TLKM), bicycle lane kilometers (BLKM), bus stops (BS), traffic signals (SIG), intersection density (INTD), and arterial-local intersection percentage (IALP); (2) a decrease in bicycle collisions was found to be associated with an increase in the number of drive commuters (DRIVE) and in the percentage of drive commuters (DRP). These results support our hypothesis that, in North America with its current low levels of bicycle use, macro-level CPMs…
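
    A macro-level CPM of this kind can be sketched as a negative binomial GLM of collision counts on exposure variables. The sketch below uses statsmodels on simulated data; the variable names echo the abstract, but the data and coefficients are invented, not the CORD dataset or the paper's fitted model.

```python
import numpy as np
import statsmodels.api as sm

# Simulated macro-level data: collision counts per neighbourhood regressed on
# exposure variables with a negative binomial GLM. Variable names echo the
# abstract; the data and coefficients are invented, not the CORD dataset.
rng = np.random.default_rng(42)
n = 200
TLKM = rng.uniform(5, 80, n)        # total lane kilometres
BLKM = rng.uniform(0, 15, n)        # bicycle lane kilometres
SIG = rng.integers(0, 40, n)        # traffic signals
mu = np.exp(0.5 + 0.02 * TLKM + 0.04 * BLKM + 0.01 * SIG)
y = rng.negative_binomial(n=2, p=2.0 / (2.0 + mu))  # overdispersed counts

X = sm.add_constant(np.column_stack([TLKM, BLKM, SIG]))
fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(fit.params)   # intercept and one coefficient per exposure variable
```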

  10. Applying Modeling Tools to Ground System Procedures

    Science.gov (United States)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  11. Panalysis: a new spreadsheet-based tool for pandemic planning.

    Science.gov (United States)

    Abramovich, Mark N; Toner, Eric S; Matheny, Jason

    2008-03-01

    Publicly available influenza modeling tools are of limited use to hospitals and local communities in planning for a severe pandemic. We developed Panalysis, a new tool to estimate the likely healthcare consequences of a pandemic and to aid hospitals in the development of mitigation and response strategies. By way of example, we demonstrate how Panalysis can be used to plan for a 1918-like flu pandemic. We discuss potential future applications of this tool.

  12. Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models

    Energy Technology Data Exchange (ETDEWEB)

    Diakov, Victor [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cole, Wesley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-11-01

    Capacity expansion models (CEM) provide a high-level, long-term view of the prospects of the evolving power system. In simulating the possibilities of long-term capacity expansion, it is important to maintain the viability of power system operation on short-term (daily, hourly and sub-hourly) scales. Production cost models (PCM) simulate routine power system operation on these shorter time scales using detailed load, transmission and generation fleet data, by minimizing production costs and following reliability requirements. When based on CEM 'predictions' about generating unit retirements and buildup, PCM provide more detailed simulation of short-term system operation and, consequently, may confirm the validity of capacity expansion predictions. Further, production cost model simulations of a system that is based on a capacity expansion model solution are 'evolutionarily' sound: the generator mix is the result of a logical sequence of unit retirement and buildup resulting from policy and incentives. The above has motivated us to bridge CEM with PCM by building a capacity-expansion-to-production-cost-model Linking Tool (CEPCoLT). The Linking Tool is built to map capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and the production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally-defined ReEDS scenarios.

  13. The Ising model as a pedagogical tool

    Science.gov (United States)

    Smith, Ryan; Hart, Gus L. W.

    2010-10-01

    Though originally developed to analyze ferromagnetic systems, the Ising model also provides an excellent framework for modeling alloys. The original Ising model represented magnetic moments (up or down) by a +1 or -1 at each point on a lattice and allowed only nearest-neighbor interactions to be non-zero. In alloy modeling, the values ±1 represent A and B atoms. The Ising Hamiltonian can be used in a Monte Carlo approach to simulate the thermodynamics of the system (e.g., an order-disorder transition occurring as the temperature is lowered). The simplicity of the model makes it an ideal starting point for a qualitative understanding of magnetism or configurational ordering in a metal. I will demonstrate the application of the Ising model in simple, two-dimensional ferromagnetic systems and alloys.
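
    A minimal Metropolis Monte Carlo implementation of the two-dimensional model is short enough to show in full. The sketch below is a generic textbook version (spins, or A/B occupations, of ±1 with nearest-neighbor coupling J and periodic boundaries), not the author's demonstration code; lattice size, temperatures, and step counts are arbitrary choices.

```python
import numpy as np

def metropolis(L=16, J=1.0, T=2.0, steps=200_000, seed=0):
    """Metropolis Monte Carlo for the 2D Ising model (periodic boundaries)."""
    rng = np.random.default_rng(seed)
    s = rng.choice([-1, 1], size=(L, L))
    for _ in range(steps):
        i, j = rng.integers(0, L, size=2)
        # energy cost of flipping spin (i, j) against its four neighbours
        nb = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
              + s[i, (j + 1) % L] + s[i, (j - 1) % L])
        dE = 2.0 * J * s[i, j] * nb
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i, j] *= -1
    return s

# magnetisation below, near, and above the critical temperature (~2.27 J/k_B)
for T in (1.5, 2.27, 3.5):
    lattice = metropolis(T=T)
    print(f"T={T}: |m| = {abs(lattice.mean()):.2f}")
```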

  14. Mathematical modelling: a tool for hospital infection control.

    Science.gov (United States)

    Grundmann, H; Hellriegel, B

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide, and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has emerged that provides solid and testable hypotheses and opens the road to a quantitative assessment of the main obstructions that undermine current efforts to control the spread of health-care-associated infections in hospitals and communities. We aim to explain to a broader audience of professionals in health care, infection control, and health systems administration some of these models that can improve the understanding of the hidden dynamics of health-care-associated infections. We also appraise their usefulness and limitations as an innovative research and decision tool for control purposes.

  15. Ergonomics applications of a mechanical model of the human operator in power hand tool operation.

    Science.gov (United States)

    Lin, Jia-Hua; Radwin, Robert; Nembhard, David

    2005-02-01

    Applications of a new model for predicting power threaded-fastener-driving tool operator response and capacity to react against impulsive torque reaction forces are explored for use in tool selection and ergonomic workplace design. The model is based on a mechanical analog of the human operator, with parameters dependent on work location (horizontal and vertical distances), work orientation (horizontal and vertical), and tool shape (in-line, pistol grip, and right angle), and is stratified by gender. This model enables prediction of group means and variances of handle displacement and force for a given tool configuration. Response percentiles can be ascertained for specific tool operations. For example, a sample pistol-grip nutrunner used on a horizontal surface at 30 cm in front of the ankles and 140 cm above the floor results in a predicted mean handle reaction displacement of 39.0 (SD = 28.1) mm for males. Consequently, 63% of the male users exceed a 30 mm handle displacement limit. When a right angle tool of similar torque output is used instead, the model predicted that only 4.6% of the male tool users exceed a 30 mm handle displacement. A method is described for interpolating individual subject model parameters at any given work location using linear combinations in relation to the range of modeled factors. Additional examples pertinent to ergonomic workstation design and tool selection are provided to demonstrate how the model can be used to aid tool selection and workstation design.
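
    The percentile claims in the abstract follow from treating handle displacement as normally distributed with the predicted group mean and standard deviation. A one-line check with SciPy, assuming exactly that normal model, reproduces the 63% figure:

```python
from scipy.stats import norm

# If male handle displacement for the pistol-grip case is ~N(39.0, 28.1) mm,
# the fraction exceeding a 30 mm limit is the upper tail at 30 mm.
exceed = norm.sf(30.0, loc=39.0, scale=28.1)
print(f"{exceed:.0%} of users exceed 30 mm")   # ~63%, matching the abstract
```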

  16. Techniques and tools for efficiently modeling multiprocessor systems

    Science.gov (United States)

    Carpenter, T.; Yalamanchili, S.

    1990-01-01

    System-level tools and methodologies associated with an integrated approach to the development of multiprocessor systems are examined. Tools for capturing initial program structure, automated program partitioning, automated resource allocation, and high-level modeling of the combined application and resource are discussed. The primary language focus of the current implementation is Ada, although the techniques should be appropriate for other programming paradigms.

  17. Scratch as a Computational Modelling Tool for Teaching Physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  19. Tools for model-independent bounds in direct dark matter searches

    DEFF Research Database (Denmark)

    Cirelli, M.; Del Nobile, E.; Panci, P.

    2013-01-01

    We discuss a framework (based on non-relativistic operators) and a self-contained set of numerical tools to derive the bounds from some current direct detection experiments on virtually any arbitrary model of Dark Matter elastically scattering on nuclei.

  20. Numerical modelling of structure and mechanical properties for medical tools

    Directory of Open Access Journals (Sweden)

    L. Jeziorski

    2007-09-01

    Full Text Available Purpose: In order to design forceps and bowl cutters properly, it is necessary to optimise many parameters and consider the functions which these medical tools should fulfil. Of course, some simplifications are necessary in respect of calculation methodology. In the paper a solution procedure concerning this problem has been presented. The presented solution allows for precise determination of the geometrical dimensions according to the functional requirements that forceps should fulfil. The presented numerical analysis covers a small range of forceps applications, but the algorithm used can be applied to any other type of forceps. Also in the paper, the numerical simulation results for the bowl cutter under load are presented. Residual stress distribution on the tool surface is presented. The position of the cutting edges and the holes carrying away the bone chips is shown as a polar diagram. Design/methodology/approach: The numerical analysis was carried out using ADINA software, based on the finite element method (FEM). In the paper some fundamental construction problems occurring during the design process of the forceps and bowl cutter have been discussed. Findings: Iteration procedures were used in order to optimize the basic construction parameters of the medical tools (forceps and bowl cutter). The calculations allow for determination of the geometrical parameters with reference to the expected spring rate. The charts elaborated on the basis of the calculations are very useful during the design process. The numerical calculations reveal an essential problem, namely a change in contact surface as a function of load; the observed phenomenon can affect the functioning of the forceps in a negative way. The numerical simulations make it possible to obtain suitable geometry, better material properties and heat-treatment instructions for these tools. Research limitations/implications: This research was carried out in order to improve ergonomics

  1. GIS-based hydrogeochemical analysis tools (QUIMET)

    Science.gov (United States)

    Velasco, V.; Tubau, I.; Vázquez-Suñè, E.; Gogu, R.; Gaitanaru, D.; Alcaraz, M.; Serrano-Juan, A.; Fernàndez-Garcia, D.; Garrido, T.; Fraile, J.; Sanchez-Vila, X.

    2014-09-01

    A software platform (QUIMET) was developed to improve the sorting, analysis, calculations, visualizations, and interpretations of hydrogeochemical data in a GIS environment. QUIMET is composed of a geospatial database plus a set of tools specially designed for graphical and statistical analysis of hydrogeochemical data. The geospatial database has been designed to include organic and inorganic chemical records, as well as relevant physical parameters (temperature, Eh, electrical conductivity). The instruments for analysis cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data. They include, among others, chemical time-series analysis, ionic balance calculations, correlation of chemical parameters, and calculation of various common hydrogeochemical diagrams (Salinity, Schöeller-Berkaloff, Piper, and Stiff). The GIS platform allows the generation of maps of the spatial distribution of parameters and diagrams. Moreover, it allows performing a complete statistical analysis of the data, including descriptive statistics and univariate and bivariate analysis, the latter including generation of correlation matrices and graphics. Finally, QUIMET offers interoperability with other external platforms. The platform is illustrated with a geochemical data set from the city of Badalona, located on the Mediterranean coast in NE Spain.
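
    One of the simplest calculations in such a toolbox is the ionic (charge) balance used to screen analyses before interpretation. The sketch below shows the standard charge-balance error formula; the concentrations are hypothetical and the 5% acceptance threshold is a common convention, not a QUIMET setting.

```python
# Charge-balance error for a groundwater analysis. Concentrations are
# hypothetical, already converted to meq/L.
cations = {"Ca": 4.2, "Mg": 1.8, "Na": 3.1, "K": 0.2}
anions = {"HCO3": 4.0, "Cl": 3.3, "SO4": 1.6, "NO3": 0.3}

sum_cat, sum_an = sum(cations.values()), sum(anions.values())
cbe = 100.0 * (sum_cat - sum_an) / (sum_cat + sum_an)
print(f"charge-balance error: {cbe:+.1f}%  (|CBE| < 5% is commonly accepted)")
```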

  2. Model atmospheres - Tool for identifying interstellar features

    Science.gov (United States)

    Frisch, P. C.; Slojkowski, S. E.; Rodriguez-Bell, T.; York, D.

    1993-01-01

    Model atmosphere parameters are derived for 14 early A stars with rotation velocities, from optical spectra, in excess of 80 km/s. The models are compared with IUE observations of the stars in regions where interstellar lines are expected. In general, with the assumption of solar abundances, excellent fits are obtained in regions longward of 2580 A, and accurate interstellar equivalent widths can be derived using models to establish the continuum. The fits are poorer at shorter wavelengths, particularly at 2026-2062 A, where the stellar model parameters seem inadequate. Features indicating mass flows are evident in stars with known infrared excesses. In gamma TrA, variability in the Mg II lines is seen over the 5-year interval of these data, and also over timescales as short as 26 days. The present technique should be useful in systematic studies of episodic mass flows in A stars and for stellar abundance studies, as well as interstellar features.

  3. DiVinE-CUDA - A Tool for GPU Accelerated LTL Model Checking

    Directory of Open Access Journals (Sweden)

    Jiří Barnat

    2009-12-01

    Full Text Available In this paper we present a tool that performs CUDA accelerated LTL Model Checking. The tool exploits the parallel algorithm MAP, adjusted to the NVIDIA CUDA architecture, in order to efficiently detect the presence of accepting cycles in a directed graph. Accepting cycle detection is the core algorithmic procedure in automata-based LTL Model Checking. We demonstrate that the tool outperforms the non-accelerated version of the algorithm, and we discuss where the limits of the tool lie and what we intend to do in the future to overcome them.

  4. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction in the plant, considering the constraints imposed by other parameters. The analysis results so obtained give a clear idea for deciding various parameter values before implementation of the actual plant in the field. They also give an idea about the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.

  5. Applying computer simulation models as learning tools in fishery management

    Science.gov (United States)

    Johnson, B.L.

    1995-01-01

    Computer models can be powerful tools for addressing many problems in fishery management, but uncertainty about how to apply models and how they should perform can lead to a cautious approach to modeling. Within this approach, we expect models to make quantitative predictions but only after all model inputs have been estimated from empirical data and after the model has been tested for agreement with an independent data set. I review the limitations to this approach and show how models can be more useful as tools for organizing data and concepts, learning about the system to be managed, and exploring management options. Fishery management requires deciding what actions to pursue to meet management objectives. Models do not make decisions for us but can provide valuable input to the decision-making process. When empirical data are lacking, preliminary modeling with parameters derived from other sources can help determine priorities for data collection. When evaluating models for management applications, we should attempt to define the conditions under which the model is a useful, analytical tool (its domain of applicability) and should focus on the decisions made using modeling results, rather than on quantitative model predictions. I describe an example of modeling used as a learning tool for the yellow perch Perca flavescens fishery in Green Bay, Lake Michigan.

  6. Component-based assistants for MEMS design tools

    Science.gov (United States)

    Hahn, Kai; Brueck, Rainer; Schneider, Christian; Schumer, Christian; Popp, Jens

    2001-04-01

    With this paper a new approach for MEMS design tools is introduced. An analysis of the design tool market leads to the result that most designers work with large and inflexible frameworks. Purchasing and maintaining these frameworks is expensive, and they give no optimal support for the MEMS design process. The concept of design assistants, carried out with the concept of interacting software components, denotes a new generation of flexible, small, semi-autonomous software systems that are used to solve specific MEMS design tasks in close interaction with the designer. The degree of interaction depends on the complexity of the design task to be performed and the possibility to formalize the respective knowledge. In this context the Internet, as one of today's most important communication media, provides support for new tool concepts on the basis of the Java programming language. These modern technologies can be used to set up distributed and platform-independent applications. Thus the idea emerged to implement design assistants using Java. According to the MEMS design model, new process sequences have to be defined anew for every specific design object. As a consequence, assistants have to be built dynamically depending on the requirements of the design process, which can be achieved with component-based software development. Componentware offers the possibility to realize design assistants, in areas like design rule checks, process consistency checks, technology definitions, graphical editors, etc., that may reside distributed over the Internet, communicating via Internet protocols. At the University of Siegen a directory of reusable MEMS components has been created, containing a process specification assistant and a layout verification assistant for lithography-based MEMS technologies.

  7. A Simple Evacuation Modeling and Simulation Tool for First Responders

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Daniel B [ORNL; Payne, Patricia W [ORNL

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including most importantly time-to-evacuate, to help first responders choose the best course of action.

  8. The Dynamic Model Based on PFC of Asphalt Concrete Cutting Process and Optimization of Tools Installation

    Institute of Scientific and Technical Information of China (English)

    周里群; 李军; 邢国

    2012-01-01

    Based on the advantages of the discrete element method in simulating the rheological properties of asphalt concrete, this paper simulates the cutting process of a milling machine on asphalt concrete. The approach overcomes the limitations of the macroscopic continuity hypothesis underlying traditional continuum mechanics models, and makes the cutting process visible from a microscopic angle. The research results show that the model, calibrated against a uniaxial compression test, yields the variation of the cutting force and friction force on the rake face of the tool at different cutting angles, and leads to the conclusion that a cutting angle of 5-20 degrees should be preferred when installing the tools, which benefits tool life and provides a reference for engineering practice.

  9. Multiway Filtering Based on Multilinear Algebra Tools

    Directory of Open Access Journals (Sweden)

    Salah Bourennane

    2010-03-01

    Full Text Available This paper presents some recent filtering methods based on the lower-rank tensor approximation approach for denoising tensor signals. In this approach, multicomponent data are represented by tensors, that is, multiway arrays, and the presented tensor filtering methods rely on multilinear algebra. First, the classical channel-by-channel SVD-based filtering method is overviewed. Then, an extension of the classical matrix filtering method is presented. It is based on the lower rank-(K1, ..., KN) truncation of the HOSVD, which performs a multimode Principal Component Analysis (PCA) and is implicitly developed for an additive white Gaussian noise. Two tensor filtering methods recently developed by the authors are also overviewed. The performances and comparative results between all these tensor filtering methods are presented for the cases of noise reduction in color images.

  10. Multiway Filtering Based on Multilinear Algebra Tools

    Science.gov (United States)

    Bourennane, Salah; Fossati, Caroline

    This paper presents some recent filtering methods based on the lower-rank tensor approximation approach for denoising tensor signals. In this approach, multicomponent data are represented by tensors, that is, multiway arrays, and the presented tensor filtering methods rely on multilinear algebra. First, the classical channel-by-channel SVD-based filtering method is overviewed. Then, an extension of the classical matrix filtering method is presented. It is based on the lower rank-(K1, ..., KN) truncation of the HOSVD, which performs a multimode Principal Component Analysis (PCA) and is implicitly developed for an additive white Gaussian noise. Two tensor filtering methods recently developed by the authors are also overviewed. The performances and comparative results between all these tensor filtering methods are presented for the cases of noise reduction in color images.
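
    The lower rank-(K1, ..., KN) truncation of the HOSVD is compact to write down with NumPy alone: compute the leading singular vectors of each mode-n unfolding, project the tensor onto those subspaces, and reconstruct. The sketch below is a generic implementation of that textbook procedure, not the authors' code, and the toy color-image denoising at the end is only a usage illustration.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move axis `mode` first, then flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def fold(mat, mode, shape):
    """Inverse of unfold for a target tensor shape."""
    full = list(shape)
    full.insert(0, full.pop(mode))
    return np.moveaxis(mat.reshape(full), 0, mode)

def truncated_hosvd(X, ranks):
    """Lower rank-(K1, ..., KN) approximation via truncated HOSVD."""
    factors = []
    for mode, k in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(X, mode), full_matrices=False)
        factors.append(U[:, :k])          # leading k singular vectors
    core = X.copy()
    for mode, U in enumerate(factors):    # project onto each mode subspace
        core = fold(U.T @ unfold(core, mode), mode,
                    core.shape[:mode] + (U.shape[1],) + core.shape[mode + 1:])
    Xhat = core                           # reconstruct from core and factors
    for mode, U in enumerate(factors):
        Xhat = fold(U @ unfold(Xhat, mode), mode,
                    Xhat.shape[:mode] + (U.shape[0],) + Xhat.shape[mode + 1:])
    return Xhat

# toy denoising of a color image treated as a 3-way tensor
img = np.random.rand(64, 64, 3)
noisy = img + 0.1 * np.random.randn(64, 64, 3)
print(np.linalg.norm(truncated_hosvd(noisy, (20, 20, 3)) - img))
```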

  11. Multidisciplinary Modelling Tools for Power Electronic Circuits

    DEFF Research Database (Denmark)

    Bahman, Amir Sajjad

    This thesis presents multidisciplinary modelling techniques in a Design For Reliability (DFR) approach for power electronic circuits. With increasing penetration of renewable energy systems, the demand for reliable power conversion systems is becoming critical. Since a large part of electricity … In reliability assessment of power modules, a three-dimensional lumped thermal network is proposed for fast, accurate and detailed temperature estimation of power modules in dynamic operation and under different boundary conditions. Since an important issue in the reliability of power electronics … The models are generic and valid for use in circuit simulators or any programming software. These models are important building blocks for the reliable design process or performance assessment of power electronic circuits. The models can save time and cost in power electronics packaging and power converter evaluation…

  12. Predictions of titanium alloy properties using thermodynamic modeling tools

    Science.gov (United States)

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.

    2005-12-01

    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.

  13. Distributing Knight. Using Type-Based Publish/Subscribe for Building Distributed Collaboration Tools

    DEFF Research Database (Denmark)

    Damm, Christian Heide; Hansen, Klaus Marius

    2002-01-01

    Distributed applications are hard to understand, build, and evolve. The need for decoupling, flexibility, and heterogeneity in distributed collaboration tools presents particular problems; for such applications, having the right abstractions and primitives for distributed communication becomes even more important. We present Distributed Knight, an extension to the Knight tool, for distributed, collaborative, and gesture-based object-oriented modelling. Distributed Knight was built using the type-based publish/subscribe paradigm. Based on this case, we argue that type-based publish/subscribe provides a natural and effective abstraction for developing distributed collaboration tools.

  14. A community diagnostic tool for chemistry climate model validation

    Directory of Open Access Journals (Sweden)

    A. Gettelman

    2012-09-01

    Full Text Available This technical note presents an overview of the Chemistry-Climate Model Validation Diagnostic (CCMVal-Diag) tool for model evaluation. The CCMVal-Diag tool is a flexible and extensible open source package that facilitates the complex evaluation of global models. Models can be compared to other models, ensemble members (simulations with the same model), and/or many types of observations. The initial construction and application is to coupled chemistry-climate models (CCMs) participating in CCMVal, but the evaluation of climate models that submitted output to the Coupled Model Intercomparison Project (CMIP) is also possible. The package has been used to assist with analysis of simulations for the 2010 WMO/UNEP Scientific Ozone Assessment and the SPARC Report on the Evaluation of CCMs. The CCMVal-Diag tool is described and examples of how it functions are presented, along with links to detailed descriptions, instructions and source code. The CCMVal-Diag tool supports model development as well as quantifying model changes, both for different versions of individual models and for different generations of community-wide collections of models used in international assessments. The code allows further extensions by different users for different applications and types, e.g. to other components of the Earth system. User modifications are encouraged and easy to perform with minimal coding.

  15. Student Model Tools Code Release and Documentation

    DEFF Research Database (Denmark)

    Johnson, Matthew; Bull, Susan; Masci, Drew

    This document contains a wealth of information about the design and implementation of the Next-TELL open learner model. Information is included about the final specification (Section 3), the interfaces and features (Section 4), its implementation and technical design (Section 5) and also a summary...

  16. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    Science.gov (United States)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2016-06-01

    Machining of steel with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning with cubic boron nitride (CBN) tools have been reviewed, in an attempt to identify appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces from oblique cutting theory, the extended Lee and Shaffer force model, chip formation, and progressive flank wear are depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that an appropriate model can be selected according to user requirements in hard turning.

  17. Fluid Survival Tool: A Model Checker for Hybrid Petri Nets

    NARCIS (Netherlands)

    Postema, Björn; Remke, Anne; Haverkort, Boudewijn R.; Ghasemieh, Hamed

    2014-01-01

    Recently, algorithms for model checking Stochastic Time Logic (STL) on Hybrid Petri nets with a single general one-shot transition (HPNG) have been introduced. This paper presents a tool for model checking HPNG models against STL formulas. A graphical user interface (GUI) not only helps to demonstra

  18. Natural Languages Processing for Building Computer-based Learning Tools

    Institute of Scientific and Technical Information of China (English)

    张颖; 李娜

    2015-01-01

    This paper outlines a framework for using computer and natural language processing techniques to help learners at various levels learn foreign languages in a computer-based learning environment. We propose some ideas for using the computer as a practical tool for learning foreign languages, where most of the courseware is generated automatically. We then describe how to build computer-based learning tools, discuss their effectiveness, and conclude with some possibilities for using on-line resources.

  19. Engineering tools for robust creep modelling

    OpenAIRE

    Holmström, Stefan

    2010-01-01

    High temperature creep is often dealt with simplified models to assess and predict the future behavior of materials and components. Also, for most applications the creep properties of interest require costly long-term testing that limits the available data to support design and life assessment. Such test data sets are even smaller for welded joints that are often the weakest links of structures. It is of considerable interest to be able to reliably predict and extrapolate long term creep beha...

  20. Theme E: disabilities: analysis models and tools

    OpenAIRE

    Vigouroux, Nadine; Gorce, Philippe; Roby-Brami, Agnès; Rémi-Néris, Olivier

    2013-01-01

    This paper presents the topics and the activity of theme E, “disabilities: analysis models and tools”, within the GDR STIC Santé. This group organized a conference and a workshop during the period 2011–2012. The conference focused on technologies for cognitive, sensory and motor impairments, assessment and use studies of assistive technologies, user-centered design methods, and the place of ethics in these research topics. The objective of “bodily integration of ...

  1. Constructing an advanced software tool for planetary atmospheric modeling

    Science.gov (United States)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  2. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.

  3. GOMA: functional enrichment analysis tool based on GO modules

    Science.gov (United States)

    Huang, Qiang; Wu, Ling-Yun; Wang, Yong; Zhang, Xiang-Sun

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results. PMID:23237213
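
    GOMA's module-finding optimization is beyond a short example, but the per-term enrichment score underlying any such tool can be sketched as a one-sided hypergeometric test. The gene counts below are hypothetical.

```python
from scipy.stats import hypergeom

def enrichment_p(study_hits, study_size, pop_hits, pop_size):
    """P(X >= study_hits): one-sided hypergeometric enrichment p-value.

    study_hits -- study-set genes annotated with the GO term
    study_size -- total genes in the study set
    pop_hits   -- population genes annotated with the term
    pop_size   -- total genes in the population
    """
    # sf(k - 1) gives P(X >= k) for the discrete hypergeometric
    return hypergeom.sf(study_hits - 1, pop_size, pop_hits, study_size)

# Hypothetical counts: 12 of 200 study genes carry a term seen in 150 of 20000
print(f"p = {enrichment_p(12, 200, 150, 20000):.3e}")
```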

  4. Knowledge base development for SAM training tools

    Energy Technology Data Exchange (ETDEWEB)

    Jae, M.S.; Yoo, W.S.; Park, S. S.; Choi, H.K. [Hansung Univ., Seoul (Korea)

    2001-03-01

    Severe accident management can be defined as the use of existing and alternative resources, systems, and actions to prevent or mitigate a core-melt accident in nuclear power plants. TRAIN (Training pRogram for AMP In NPP), developed for training control room staff and the technical group, is introduced in this report. TRAIN is composed of a phenomenological knowledge base (KB), an accident sequence KB, and accident management procedures with AM strategy control diagrams and information needs. TRAIN can contribute to training by conveying phenomenological knowledge of severe accidents, helping trainees understand plant vulnerabilities, and supporting problem solving under high stress. 24 refs., 76 figs., 102 tabs. (Author)

  5. Rasp Tool on Phoenix Robotic Arm Model

    Science.gov (United States)

    2008-01-01

    This close-up photograph taken at the Payload Interoperability Testbed at the University of Arizona, Tucson, shows the motorized rasp protruding from the bottom of the scoop on the engineering model of NASA's Phoenix Mars Lander's Robotic Arm. The rasp will be placed against the hard Martian surface to cut into the hard material and acquire an icy soil sample for analysis by Phoenix's scientific instruments. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is led by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  6. Modeling of tool path for the CNC sheet cutting machines

    Science.gov (United States)

    Petunin, Aleksandr A.

    2015-11-01

    In the paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of cutting techniques is offered. We also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for the standard cutting technique (Continuous Cutting Problem, CCP) and for one of the non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that the optimization tasks can be interpreted as discrete optimization problems (generalized travelling salesman problem with additional constraints, GTSP). Formalization of some constraints for these tasks is described. To solve the GTSP we propose using the mathematical model of Prof. Chentsov, based on the concept of a megalopolis and on dynamic programming; a toy illustration of the underlying combinatorial problem is sketched below.
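
    The sketch illustrates, under simplifying assumptions, the combinatorial core of the problem: each contour is a "megalopolis" of candidate pierce points, and the tool must visit every contour while minimizing idle travel. Coordinates are invented, and brute-force enumeration stands in for the dynamic-programming scheme cited above, which is what makes realistic instances tractable.

```python
import itertools
import math

# Toy instance: each contour ("megalopolis") offers candidate pierce points.
contours = {
    "C1": [(0, 0), (0, 2)],
    "C2": [(5, 1), (5, 3)],
    "C3": [(2, 6), (4, 6)],
}
home = (0, -2)  # tool park position

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

best = (float("inf"), None)
for order in itertools.permutations(contours):                      # contour visit order
    for picks in itertools.product(*(contours[c] for c in order)):  # one pierce point each
        tour = [home, *picks, home]
        cost = sum(dist(a, b) for a, b in zip(tour, tour[1:]))
        if cost < best[0]:
            best = (cost, (order, picks))

print("best idle-travel cost:", round(best[0], 3))
print("route:", best[1])
```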

  7. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    Science.gov (United States)

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  8. Standalone visualization tool for three-dimensional DRAGON geometrical models

    Energy Technology Data Exchange (ETDEWEB)

    Lukomski, A.; McIntee, B.; Moule, D.; Nichita, E. [Faculty of Energy Systems and Nuclear Science, Univ. of Ontario Inst. of Tech., Oshawa, Ontario (Canada)

    2008-07-01

    DRAGON is a neutron transport and depletion code able to solve one-, two- and three-dimensional problems. To date DRAGON provides two visualization modules, able to represent respectively two- and three-dimensional geometries. The two-dimensional visualization module generates a postscript file, while the three dimensional visualization module generates a MATLAB M-file with instructions for drawing the tracks in the DRAGON TRACKING data structure, which implicitly provide a representation of the geometry. The current work introduces a new, standalone, tool based on the open-source Visualization Toolkit (VTK) software package which allows the visualization of three-dimensional geometrical models by reading the DRAGON GEOMETRY data structure and generating an axonometric image which can be manipulated interactively by the user. (author)

  9. Microcantilever-based platforms as biosensing tools.

    Science.gov (United States)

    Alvarez, Mar; Lechuga, Laura M

    2010-05-01

    The fast and progressive growth of the biotechnology and pharmaceutical fields forces the development of new and powerful sensing techniques for process optimization and detection of biomolecules at very low concentrations. During the last years, the simplest MEMS structures, i.e. microcantilevers, have become an emerging and promising technology for biosensing applications, due to their small size, fast response, high sensitivity and their compatible integration into "lab-on-a-chip" devices. This article provides an overview of some of the most interesting bio-detections carried out during the last 2-3 years with the microcantilever-based platforms, which highlight the continuous expansion of this kind of sensor in the medical diagnosis field, reaching limits of detection at the single molecule level.

  10. MAST – A Mobile Agent-based Security Tool

    Directory of Open Access Journals (Sweden)

    Marco Carvalho

    2004-08-01

    Full Text Available One of the chief computer security problems is not the long list of viruses and other potential vulnerabilities, but the vast number of systems that continue to be easy prey, as their system administrators or owners simply are not able to keep up with all of the available patches, updates, or needed configuration changes in order to protect them from those known vulnerabilities. Even up-to-date systems could become vulnerable to attacks, due to inappropriate configuration or combined use of applications and services. Our mobile agent-based security tool (MAST) is designed to bridge this gap, and provide automated methods to make sure that all of the systems in a specific domain or network are secured and up-to-date with all patches and updates. The tool is also designed to check systems for misconfigurations that make them vulnerable. Additionally, the user interface is presented in a domain knowledge model known as a Concept Map that provides a continuous learning experience for the system administrator.

  11. Space Earthquake Perturbation Simulation (SEPS) an application based on Geant4 tools to model and simulate the interaction between the Earthquake and the particle trapped on the Van Allen belt

    Science.gov (United States)

    Ambroglini, Filippo; Jerome Burger, William; Battiston, Roberto; Vitale, Vincenzo; Zhang, Yu

    2014-05-01

    During the last decades, a few space experiments have revealed anomalous bursts of charged particles, mainly electrons with energies larger than a few MeV. A possible source of these bursts is low-frequency seismo-electromagnetic emission, which can cause the precipitation of electrons from the lower boundary of the inner belt. Studies of these bursts have also reported a short-term pre-seismic excess. Starting from simulation tools traditionally used in high energy physics, we developed a dedicated application, SEPS (Space Earthquake Perturbation Simulation), based on the Geant4 toolkit and the PLANETOCOSMICS program, able to model and simulate the electromagnetic interaction between an earthquake and the particles trapped in the inner Van Allen belt. With SEPS one can study the transport of particles trapped in the Van Allen belts through the Earth's magnetic field, also taking into account possible interactions with the Earth's atmosphere. SEPS provides the possibility of testing different models of interaction between electromagnetic waves and trapped particles, defining the mechanism of interaction as well as shaping the area in which it takes place, assessing the effects of perturbations in the magnetic field on particle paths, performing back-tracking analysis, and modelling the interaction with electric fields. SEPS is at an advanced stage of development, so it can already be exploited to test in detail the results of correlation analyses between particle bursts and earthquakes based on NOAA and SAMPEX data. The test was performed both with a full simulation analysis (tracing from the position of the earthquake to see whether there were paths compatible with the detected burst) and with a back-tracking analysis (tracing from the burst detection point and checking compatibility with the position of the associated earthquake).

  12. Chapter 14: Web-based Tools - WESIX

    Science.gov (United States)

    Krughoff, K. S.; Connolly, A. J.

    We present here the design and features of the Web Enabled Source Identifier with X-Matching (WESIX). With the proliferation of large imaging surveys, it has become increasingly apparent that tasks performed frequently by astronomers need to be made available in a web-aware manner. The reasons for this are twofold: First, it is no longer feasible to work with the complete data sets. Calculations are much more efficient if they can be carried out at the data center where large files can be transferred quickly. Second, exploratory science can be greatly facilitated by combining common tasks into integrated web services. WESIX addresses both of these issues. It is deployable to large data centers where source identification can be carried out at the data source. In addition, WESIX can transparently leverage the capabilities of Open SkyQuery to crossmatch with large catalogs. The result is a web-based service that integrates object detection with the ability to crossmatch against published catalog data. In this chapter we will discuss how WESIX is constructed, its functionality and some example usage. Section 1 will give a brief overview of the architecture of the service. Section 2 will introduce the features of the service through both the web browser and SOAP web service interfaces. Section 3 gives a detailed overview of the web service methods. Section 4 walks through the example client distributed with the software package.

  13. Using urban forest assessment tools to model bird habitat potential

    Science.gov (United States)

    Lerman, Susannah B.; Nislow, Keith H.; Nowak, David J.; Destefano, Stephen; King, David I.; Jones-Farrand, D. Todd

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure the urban forest provides habitat to sustain bird and other wildlife populations. The primary goal of this study was to integrate wildlife suitability indices into an existing national urban forest assessment tool, i-Tree. We quantified available habitat characteristics of urban forests for ten northeastern U.S. cities, and summarized bird habitat relationships from the literature in terms of variables that were represented in the i-Tree datasets. With these data, we generated habitat suitability equations for nine bird species, representing a range of life history traits and conservation statuses, that predict habitat suitability based on i-Tree data. We applied these equations to the urban forest datasets to calculate the overall habitat suitability for each city and the habitat suitability for different types of land-use (e.g., residential, commercial, parkland) for each bird species. The proposed habitat models will help guide wildlife managers, urban planners, and landscape designers who require specific information, such as desirable habitat conditions within an urban management project, to help improve the suitability of urban forests for birds.
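
    A habitat suitability equation of this general kind can be sketched as component suitability indices combined into one bounded score. The response curves and thresholds below are invented for illustration only; the study's actual equations are derived from the bird-habitat literature and the variables available in i-Tree.

```python
import numpy as np

def suitability(canopy_pct, large_trees_per_ha, native_frac):
    """Toy habitat-suitability score in [0, 1] for one hypothetical species."""
    si_canopy = np.clip(canopy_pct / 60.0, 0.0, 1.0)         # saturates at 60% cover
    si_trees = np.clip(large_trees_per_ha / 25.0, 0.0, 1.0)  # saturates at 25 trees/ha
    si_native = np.clip(native_frac, 0.0, 1.0)
    # Geometric mean: a poor score on any one factor drags the total down
    return float((si_canopy * si_trees * si_native) ** (1.0 / 3.0))

# e.g. a residential land-use class summarized from an i-Tree-style dataset
print(f"HSI = {suitability(canopy_pct=45, large_trees_per_ha=10, native_frac=0.8):.2f}")
```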

  14. Performance Evaluation of Java Based Object Relational Mapping Tools

    Directory of Open Access Journals (Sweden)

    Shoaib Mahmood Bhatti

    2013-04-01

    Full Text Available Object persistence is a hot issue in industry in the form of ORM (Object Relational Mapping) tools, which developers use during software development. This paper presents a performance evaluation of Java-based ORM tools. For this purpose, Hibernate, Ebean and TopLink have been selected as ORM tools that are popular and open source. Their performance has been measured from an execution point of view. The results show that ORM tools are a good option for developers when system throughput and short setup times are considered, and that they can be used efficiently and effectively for mapping objects into the relational world of databases, promising a solid future for this technology.

  15. Rapid Deployment of Optimal Control for Building HVAC Systems using Innovative Software Tools and a Hybrid Heuristic/Model Based Control Approach

    Science.gov (United States)

    2017-03-21

    [Only front-matter fragments of this report survive extraction: a list-of-figures excerpt naming Figure 1, "BrightBox Optimization Modeling Platform", and Figure 2, "BrightBox Software Architecture and Interaction with Building", plus text noting the need for a dashboard with real-time savings reports, and a program that accounts for equipment specifications, the chilled water load and flow profile, and coincident weather data when testing possible control strategies.]

  16. Model Fusion Tool - the Open Environmental Modelling Platform Concept

    Science.gov (United States)

    Kessler, H.; Giles, J. R.

    2010-12-01

    The vision of an Open Environmental Modelling Platform - seamlessly linking geoscience data, concepts and models to aid decision making in times of environmental change. Governments and their executive agencies across the world are facing increasing pressure to make decisions about the management of resources in light of population growth and environmental change. In the UK for example, groundwater is becoming a scarce resource for large parts of its most densely populated areas. At the same time river and groundwater flooding resulting from high rainfall events are increasing in scale and frequency and sea level rise is threatening the defences of coastal cities. There is also a need for affordable housing, improved transport infrastructure and waste disposal as well as sources of renewable energy and sustainable food production. These challenges can only be resolved if solutions are based on sound scientific evidence. Although we have knowledge and understanding of many individual processes in the natural sciences it is clear that a single science discipline is unable to answer the questions and their inter-relationships. Modern science increasingly employs computer models to simulate the natural, economic and human system. Management and planning requires scenario modelling, forecasts and ‘predictions’. Although the outputs are often impressive in terms of apparent accuracy and visualisation, they are inherently not suited to simulate the response to feedbacks from other models of the earth system, such as the impact of human actions. Geological Survey Organisations (GSO) are increasingly employing advances in Information Technology to visualise and improve their understanding of geological systems. Instead of 2 dimensional paper maps and reports many GSOs now produce 3 dimensional geological framework models and groundwater flow models as their standard output. Additionally the British Geological Survey have developed standard routines to link geological

  17. Astronomical data fusion tool based on PostgreSQL

    Science.gov (United States)

    Han, Bo; Zhang, Yan-Xia; Zhong, Shou-Bo; Zhao, Yong-Heng

    2016-11-01

    With the application of advanced astronomical technologies, instruments and methods all over the world, astronomical observations cover the range from radio, infrared, visible light, ultraviolet, X-ray and gamma-ray bands, and have entered the era of full-wavelength astronomy. Effectively integrating data from different ground- and space-based observing instruments, different observers, different bands and different observation times requires data fusion technology. In this paper we introduce a cross-match tool that is developed in the Python language, is based on the PostgreSQL database and uses Q3C as the core index, facilitating the cross-matching of massive astronomical data. It provides four different cross-match functions, namely: (I) cross-match with a custom error range; (II) cross-match using catalog errors; (III) cross-match based on an elliptic error range; (IV) cross-match by the nearest-neighbor algorithm. The resulting cross-matched set provides a good foundation for subsequent data mining and statistics based on multiwavelength data. The most advantageous aspect of this tool is that it is user-oriented and applied locally by users. By means of this tool, users can easily create their own databases, manage their own data and cross-match databases according to their requirements. In addition, this tool is also able to transfer data from one database into another. More importantly, it is easy to get started with the tool and it can be used by astronomers without writing any code.
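
    A minimal sketch of the kind of index-accelerated cross-match such a tool issues, assuming a PostgreSQL database with the Q3C extension and two hypothetical catalog tables cat_a and cat_b with ra/dec columns in degrees (q3c_join and q3c_dist are genuine Q3C functions; connection details and table names are invented):

```python
import psycopg2  # pip install psycopg2-binary

conn = psycopg2.connect(dbname="astro", user="user", password="secret")
cur = conn.cursor()

# Positional cross-match within 2 arcsec using Q3C's index-accelerated join
cur.execute("""
    SELECT a.id, b.id,
           q3c_dist(a.ra, a.dec, b.ra, b.dec) * 3600.0 AS sep_arcsec
    FROM   cat_a AS a
    JOIN   cat_b AS b
      ON   q3c_join(a.ra, a.dec, b.ra, b.dec, 2.0 / 3600.0)
    ORDER  BY sep_arcsec
""")
for match in cur.fetchmany(10):
    print(match)

cur.close()
conn.close()
```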

  18. Watershed modeling tools and data for prognostic and diagnostic

    Science.gov (United States)

    Chambel-Leitao, P.; Brito, D.; Neves, R.

    2009-04-01

    Watershed models are widely used throughout the world. They can be characterized by the high number of processes simulated simultaneously. The estimation of these processes is also data intensive, requiring data on topography, land use / land cover, agricultural practices, soil type, precipitation, temperature, relative humidity, wind and radiation. Every year new data become available, namely from satellites, which has allowed improvements in the quality of model input and also in the calibration of the models (Galvão et al., 2004b). Tools to cope with the vast amount of data have been developed: data formatting, data retrieval, databases, metadata bases. The high number of processes simulated in watershed models also makes their output very extensive. The SWAT model outputs were modified to produce MOHID-compliant result files (time series and HDF). These changes maintained the integrity of the original model, thus guaranteeing that results remain equal to the original version of SWAT. This makes it possible to output results in MOHID format and to immediately process them with MOHID visualization and data analysis tools (Chambel-Leitão et al., 2007; Trancoso et al., 2009). Since SWAT was modified to produce result files in HDF5 format, watershed properties (modeled by SWAT) can be visualized in animated maps using MOHID GIS. The modified version of SWAT described here has been applied to various national and European projects. Results of the application of this modified version of SWAT to estimate hydrology and nutrient loads to estuaries and water bodies will be shown (Chambel-Leitão, 2008; Yarrow & Chambel-Leitão, 2008; Chambel-Leitão et al., 2008; Yarrow & Chambel-Leitão, 2007; Coelho et al., 2008). Keywords: Watershed models, SWAT, MOHID LAND, Hydrology, Nutrient Loads. Arnold, J. G. and Fohrer, N. (2005). SWAT2000: current capabilities and research opportunities in applied watershed modeling. Hydrol. Process. 19, 563

  19. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    OpenAIRE

    2016-01-01

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes...

  20. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer's preferences, goals and processes from their interaction with a computer-aided design tool, and suggests methods and domains within game development where such a model can be applied. We describe how designer modeling could be integrated with current work on automated and mixed-initiative content creation...

  1. Labels affect preschoolers' tool-based scale errors.

    Science.gov (United States)

    Hunley, Samuel B; Hahn, Erin R

    2016-11-01

    Scale errors offer a unique context in which to examine the interdependencies between language and action. Here, we manipulated the presence of labels in a tool-based paradigm previously shown to elicit high rates of scale errors. We predicted that labels would increase children's scale errors with tools by directing attention to shape, function, and category membership. Children between the ages of 2 and 3 years were introduced to an apparatus and shown how to produce its function using a tool (e.g., scooping a toy fish from an aquarium using a net). In each of two test trials, children were asked to choose between two novel tools to complete the same task: one that was a large non-functional version of the tool presented in training and one novel functional object (different in shape). A total of four tool-apparatus sets were tested. The results indicated that without labels, scale errors decreased over the two test trials. In contrast, when labels were present, scale errors remained high in the second test trial. We interpret these findings as evidence that linguistic cues can influence children's action-based errors with tools.

  2. Unleashing spatially distributed ecohydrology modeling using Big Data tools

    Science.gov (United States)

    Miles, B.; Idaszak, R.

    2015-12-01

    Physically based spatially distributed ecohydrology models are useful for answering science and management questions related to the hydrology and biogeochemistry of prairie, savanna, forested, as well as urbanized ecosystems. However, these models can produce hundreds of gigabytes of spatial output for a single model run over decadal time scales when run at regional spatial scales and moderate spatial resolutions (~100-km2+ at 30-m spatial resolution) or when run for small watersheds at high spatial resolutions (~1-km2 at 3-m spatial resolution). Numerical data formats such as HDF5 can store arbitrarily large datasets. However even in HPC environments, there are practical limits on the size of single files that can be stored and reliably backed up. Even when such large datasets can be stored, querying and analyzing these data can suffer from poor performance due to memory limitations and I/O bottlenecks, for example on single workstations where memory and bandwidth are limited, or in HPC environments where data are stored separately from computational nodes. The difficulty of storing and analyzing spatial data from ecohydrology models limits our ability to harness these powerful tools. Big Data tools such as distributed databases have the potential to surmount the data storage and analysis challenges inherent to large spatial datasets. Distributed databases solve these problems by storing data close to computational nodes while enabling horizontal scalability and fault tolerance. Here we present the architecture of and preliminary results from PatchDB, a distributed datastore for managing spatial output from the Regional Hydro-Ecological Simulation System (RHESSys). The initial version of PatchDB uses message queueing to asynchronously write RHESSys model output to an Apache Cassandra cluster. Once stored in the cluster, these data can be efficiently queried to quickly produce both spatial visualizations for a particular variable (e.g. maps and animations), as well
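
    A minimal sketch of the write path such a datastore might use, with the Python cassandra-driver and an invented keyspace and table layout (the abstract does not spell out the actual PatchDB schema). The composite partition key keeps one patch's time series together on the same nodes, which is what makes per-patch queries fast.

```python
import datetime
from cassandra.cluster import Cluster  # pip install cassandra-driver

# Hypothetical contact point and schema for per-patch model output
cluster = Cluster(["cassandra-node1"])
session = cluster.connect()
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS rhessys WITH replication =
    {'class': 'SimpleStrategy', 'replication_factor': 3}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS rhessys.patch_output (
        run_id   text,
        patch_id int,
        ts       timestamp,
        variable text,
        value    double,
        PRIMARY KEY ((run_id, patch_id), ts, variable)
    )
""")

# Writers (e.g. consumers draining a message queue) insert patch output
insert = session.prepare(
    "INSERT INTO rhessys.patch_output (run_id, patch_id, ts, variable, value) "
    "VALUES (?, ?, ?, ?, ?)")
session.execute(insert, ("run42", 1017, datetime.datetime(2015, 6, 1), "streamflow", 0.73))

# A patch's time series comes back from a single partition
rows = session.execute(
    "SELECT ts, value FROM rhessys.patch_output "
    "WHERE run_id = 'run42' AND patch_id = 1017")
print(list(rows))
cluster.shutdown()
```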

  3. Development of Nylon Based FDM Filament for Rapid Tooling Application

    Science.gov (United States)

    Singh, R.; Singh, S.

    2014-04-01

    There has been a critical need for the development of a cost-effective nylon-based wire to be used as feedstock filament for fused deposition modelling (FDM) machines. Hitherto, however, very little work has been reported on developing an alternative to the acrylonitrile butadiene styrene (ABS) wire presently used in most FDM machines. The present research work is focused on the development of a nylon-based wire as an alternative to ABS wire, to be used as feedstock filament on FDM without changing any hardware or software of the machine. In the present study, aluminium oxide (Al2O3) was used as an additive in different proportions with nylon fibre. A single-screw extruder was used for wire preparation, and the wire thus produced was tested on FDM. The mechanical properties of the finally developed wire, i.e. tensile strength and percentage elongation, were optimized by the Taguchi L9 technique. The work represents a major development in reducing cost and time in rapid tooling applications.
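
    In a Taguchi L9 study of this kind, each orthogonal-array trial is scored with a signal-to-noise ratio; for responses such as tensile strength the "larger is better" form applies. A minimal sketch with hypothetical replicate measurements:

```python
import numpy as np

def sn_larger_is_better(values):
    """Taguchi S/N ratio for a 'larger is better' response such as
    tensile strength: S/N = -10 * log10(mean(1 / y^2))."""
    y = np.asarray(values, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Hypothetical tensile-strength replicates (MPa) for one L9 trial row
print(f"S/N = {sn_larger_is_better([31.2, 29.8, 30.5]):.2f} dB")
```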

  4. Testing validation tools on CLIPS-based expert systems

    Science.gov (United States)

    Chang, C. L.; Stachowitz, R. A.; Combs, J. B.

    1991-01-01

    The Expert Systems Validation Associate (EVA) is a validation system which was developed at the Lockheed Software Technology Center and Artificial Intelligence Center between 1986 and 1990. EVA is an integrated set of generic tools to validate any knowledge-based system written in any expert system shell such as C Language Integrated Production System (CLIPS), ART, OPS5, KEE, and others. Many validation tools have been built in the EVA system. In this paper, we describe the testing results of applying the EVA validation tools to the Manned Maneuvering Unit (MMU) Fault Diagnosis, Isolation, and Reconfiguration (FDIR) expert system, written in CLIPS, obtained from the NASA Johnson Space Center.

  5. Skill Transfer and Virtual Training for IND Response Decision-Making: Models for Government-Industry Collaboration for the Development of Game-Based Training Tools

    Science.gov (United States)

    2016-05-05

    [Only fragments of this report survive extraction.] The surviving text notes that many entertainment games played purely for “fun” can involve managing massive operations-research supply chains and balancing multi-objective trade-offs; that game-based methods are becoming more accepted and validated for augmenting and modernizing training and evaluation [1, 2, 3, 4]; and that game-based training for emergency management professionals can complement traditional training

  6. Application of Krylov Reduction Technique for a Machine Tool Multibody Modelling

    Directory of Open Access Journals (Sweden)

    M. Sulitka

    2014-02-01

    Full Text Available Quick calculation of machine tool dynamic response represents one of the major requirements for machine tool virtual modelling and virtual machining, aiming at simulating the machining process performance, quality, and precision of a workpiece. Enhanced time effectiveness in machine tool dynamic simulations may be achieved by employing model order reduction (MOR) techniques on the full finite element (FE) models. The paper provides a case study comparing the Krylov subspace-based and mode truncation techniques. The application of both reduction techniques to creating a machine tool multibody model is evaluated. The Krylov subspace reduction technique shows high quality in terms of the dynamic properties of the reduced multibody model, combined with very low time demands.
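
    The core of a Krylov-subspace reduction can be sketched in a few lines: build an orthonormal basis of a Krylov subspace of the system matrices and project the full model onto it, which matches the leading moments of the transfer function at s = 0. The toy system below stands in for a large FE machine-tool model; all sizes and matrices are invented.

```python
import numpy as np

def arnoldi(M, v, r):
    """Orthonormal basis of span{v, Mv, ..., M^(r-1) v} (one-pass Gram-Schmidt,
    adequate for a sketch; production code would re-orthogonalize)."""
    V = np.zeros((len(v), r))
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(1, r):
        w = M @ V[:, j - 1]
        w -= V[:, :j] @ (V[:, :j].T @ w)
        V[:, j] = w / np.linalg.norm(w)
    return V

# Toy full-order model x' = A x + b u, standing in for a large FE model
n, r = 500, 10
rng = np.random.default_rng(0)
A = -np.diag(rng.uniform(1.0, 100.0, n)) + 0.01 * rng.standard_normal((n, n))
b = rng.standard_normal(n)

Ainv = np.linalg.inv(A)
V = arnoldi(Ainv, Ainv @ b, r)       # moment matching about s = 0
Ar, br = V.T @ A @ V, V.T @ b        # r x r reduced model
print("reduced system size:", Ar.shape)
```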

  7. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. Taken together, the practice of managing the information available in such a data repository is known as Business Intelligence, or BI. This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.

  8. A Tool for Auditing Standards-Based Education.

    Science.gov (United States)

    Dianda, Marcella; McKeon, Denise; Kapinus, Barbara

    The National Education Association developed this audit tool for interested affiliates to use to assess standards-based education in their states by gathering and analyzing information about its implementation and to array the information they gather against a set of standards that can help ensure that standards-based education fulfills its…

  9. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    Progress report for the qualification exam of PhD student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM, and the concept of unifying an abstract syntax tree with the ability for isolated extensions, is described. The tool support includes a connection to UML and a test automation principle based on traces written as a kind of regular expression.

  10. Tool Support for Collaborative Teaching and Learning of Object-Oriented Modelling

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Ratzer, Anne Vinter

    2002-01-01

    Modeling is central to doing and learning object-oriented development. We present a new tool, Ideogramic UML, for gesture-based collaborative modeling with the Unified Modeling Language (UML), which can be used to collaboratively teach and learn modeling. Furthermore, we discuss how we have effectively used Ideogramic UML to teach object-oriented modeling and the UML to groups of students using the UML for project assignments.

  11. A New Tool Wear Monitoring Method Based on Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Qianjian Guo

    2013-06-01

    Full Text Available Tool wear is a major contributor to the dimensional errors of a workpiece in precision machining, and its prediction plays an important role in industry for higher productivity and product quality. Tool wear monitoring is an effective way to predict tool wear loss in the milling process. In this paper, a new bionic prediction model is presented based on the generation mechanism of tool wear loss. Different milling conditions are taken as the input variables and tool wear loss as the output variable; a neural network is used to establish the mapping relation, and an ant colony algorithm is used to train the weights of the BP neural network during tool wear modeling. Finally, a real-time tool wear loss estimator was developed based on the ant colony algorithm, and experiments measuring tool wear with the estimator were conducted on a milling machine. The experimental and estimated results are found to be in satisfactory agreement, with an average error lower than 6%.
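
    A compressed sketch of the idea: a small feedforward network maps cutting conditions to flank wear, and its weights are found by a population of "ants" whose sampling spread evaporates over the rounds. The data are synthetic, and the search loop is a deliberately simplified stand-in for a full ant colony optimizer.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training data: [speed, feed, depth] -> flank wear (mm)
X = rng.uniform([80, 0.05, 0.5], [250, 0.30, 3.0], size=(40, 3))
y = 1e-3 * (0.8 * X[:, 0] + 300 * X[:, 1] + 20 * X[:, 2]) + 0.01 * rng.standard_normal(40)

def predict(w, X, h=6):
    """Tiny one-hidden-layer network; w packs all weights and biases."""
    W1 = w[:3 * h].reshape(3, h); b1 = w[3 * h:4 * h]
    W2 = w[4 * h:5 * h];          b2 = w[5 * h]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def mse(w):
    return np.mean((predict(w, X) - y) ** 2)

# Ant-colony-flavoured search: ants sample around the best trail so far,
# and the "pheromone" (sampling spread) evaporates each round.
dim, spread = 3 * 6 + 6 + 6 + 1, 1.0
best = rng.standard_normal(dim)
for _ in range(200):
    ants = best + spread * rng.standard_normal((30, dim))
    cand = min(ants, key=mse)
    if mse(cand) < mse(best):
        best = cand
    spread *= 0.98  # evaporation concentrates the search near good trails
print(f"training MSE: {mse(best):.2e}")
```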

  12. Lightweight approach to model traceability in a CASE tool

    Science.gov (United States)

    Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita

    2017-07-01

    The term "model-driven" is not at all a new buzzword within the system development community. Nevertheless, the ever increasing complexity of model-driven approaches keeps fueling all kinds of discussions around this paradigm and pushes researchers to develop new and more effective approaches to system development. With increasing complexity, model traceability and model management as a whole become indispensable activities of the model-driven system development process. The main goal of this paper is to present a conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.

  13. A Suite of Tools for ROC Analysis of Spatial Models

    Directory of Open Access Journals (Sweden)

    Hermann Rodrigues

    2013-09-01

    Full Text Available The Receiver Operating Characteristic (ROC) is widely used for assessing the performance of classification algorithms. In GIScience, ROC has been applied to assess models aimed at predicting events such as land use/cover change (LUCC), species distribution and disease risk. However, GIS software packages offer few statistical tests and guidance tools for ROC analysis and interpretation. This paper presents a suite of GIS tools designed to facilitate ROC curve analysis for GIS users by applying proper statistical tests and analysis procedures. The tools are freely available as models and submodels of the Dinamica EGO freeware. The tools give the ROC curve, the area under the curve (AUC), partial AUC, lower and upper AUCs, the confidence interval of the AUC, the density of events in probability bins, and tests to evaluate the difference between the AUCs of two models. We first present the procedures and statistical tests implemented in Dinamica EGO, then the application of the tools to assess LUCC and species distribution models. Finally, we interpret and discuss the ROC-related statistics resulting from various case studies.
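
    The headline statistics are easy to reproduce outside a GIS as a sanity check. A minimal sketch with scikit-learn on synthetic per-pixel data (the arrays and the 0.1 false-positive cut-off are invented; scikit-learn's max_fpr option returns the McClish-standardized partial AUC):

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
# Synthetic stand-in for per-pixel data: predicted change probability vs
# observed change (1 = changed, 0 = persistent)
p_change = rng.uniform(size=10_000)
observed = (rng.uniform(size=10_000) < 0.8 * p_change).astype(int)

fpr, tpr, thresholds = roc_curve(observed, p_change)
print(f"AUC = {roc_auc_score(observed, p_change):.3f}")

# Partial AUC restricted to low false-positive rates (standardized form)
print(f"pAUC (FPR <= 0.1) = {roc_auc_score(observed, p_change, max_fpr=0.1):.3f}")
```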

  14. Reduction of inequalities in health: assessing evidence-based tools

    Directory of Open Access Journals (Sweden)

    Shea Beverley

    2006-09-01

    Full Text Available Abstract Background The reduction of health inequalities is a focus of many national and international health organisations. The need for pragmatic evidence-based approaches has led to the development of a number of evidence-based equity initiatives. This paper describes a new program that focuses upon evidence-based tools, which are useful for policy initiatives that reduce inequities. Methods This paper is based on a presentation that was given at the "Regional Consultation on Policy Tools: Equity in Population Health Reports," held in Toronto, Canada in June 2002. Results Five assessment tools were presented. 1. A database of systematic reviews on the effects of educational, legal, social, and health interventions to reduce unfair inequalities is being established through the Cochrane and Campbell Collaborations. 2. Decision aids and shared decision making can be facilitated in disadvantaged groups by 'health coaches' who help people become better decision makers, negotiators, and navigators of the health system; a pilot study in Chile has provided proof of this concept. 3. The CIET Cycle: combining adapted cluster survey techniques with qualitative methods, CIET's population-based applications support evidence-based decision making at local and national levels. The CIET map generates maps directly from survey or routine institutional data, to be used as evidence-based decision aids. Complex data can be displayed attractively, providing an important tool for studying and comparing health indicators among and between different populations. 4. The Ottawa Equity Gauge is applying the Global Equity Gauge Alliance framework to an industrialised-country setting. 5. The Needs-Based Health Assessment Toolkit, established to assemble information on which clinical and health policy decisions can be based, is being expanded to ensure a focus on distribution as well as average health indicators. Conclusion Evidence-based planning tools have much to offer the

  15. Modeling of Tool Wear in Vibration Assisted Nano Impact-Machining by Loose Abrasives

    Directory of Open Access Journals (Sweden)

    Sagil James

    2014-01-01

    Full Text Available Vibration assisted nano impact-machining by loose abrasives (VANILA) is a novel nanomachining process that combines the principles of vibration assisted abrasive machining and tip-based nanomachining to perform target-specific nanoabrasive machining of hard and brittle materials. An atomic force microscope (AFM) is used as a platform in this process, wherein nanoabrasives, injected in slurry between the workpiece and the vibrating AFM probe (which is the tool), impact the workpiece and cause nanoscale material removal. The VANILA process is conducted such that the tool tip does not directly contact the workpiece. The level of precision and quality of the machined features in a nanomachining process is contingent on tool wear, which is inevitable. Initial experimental studies have demonstrated reduced tool wear in the VANILA process as compared to an indentation process in which the tool directly contacts the workpiece surface. In this study, the tool wear rate during the VANILA process is analytically modeled considering impacts of abrasive grains on the tool tip surface. Experiments are conducted using several tools in order to validate the predictions of the theoretical model. It is seen that the model is capable of accurately predicting the tool wear rate, within 10% deviation.

  16. Research and design on a kind of CASE environment architectural style based on ToolBus

    Institute of Scientific and Technical Information of China (English)

    Guo Bing; Shen Yan; Xie Jun; Wang Yong; Xiong Guangze

    2005-01-01

    Because a CASE (computer aided software engineering) environment is a kind of complex system software, its software architecture is very important. From the viewpoint of software architecture, this paper first presents the TBus architectural style, a CASE environment architectural style based on ToolBus; it then describes the architectural model and the system's behavior in a formal method, and analyzes the corresponding tool structural model. Finally, the paper implements a TBus architectural instance, LambdaBridge, which demonstrates the validity of the ToolBus and TBus architectural styles.

  17. Modeling and Simulation Tools: From Systems Biology to Systems Medicine.

    Science.gov (United States)

    Olivier, Brett G; Swat, Maciej J; Moné, Martijn J

    2016-01-01

    Modeling is an integral component of modern biology. In this chapter we look into the role of the model, as it pertains to Systems Medicine, and the software that is required to instantiate and run it. We do this by comparing the development, implementation, and characteristics of tools that have been developed to work with two divergent methodologies: Systems Biology and Pharmacometrics. From the Systems Biology perspective we consider the concept of "Software as a Medical Device" and what this may imply for the migration of research-oriented simulation software into the domain of human health. In our second perspective, we see how in practice hundreds of computational tools already accompany drug discovery and development at every stage of the process. Standardized exchange formats are required to streamline model exchange between tools, which would minimize translation errors and reduce the time required. With the emergence, almost 15 years ago, of the SBML standard, a large part of the domain of interest is already covered, and models can be shared and passed from software to software without recoding them. Until recently the last stage of the process, the pharmacometric analysis used in clinical studies carried out on subject populations, lacked such an exchange medium. We describe a new emerging exchange format in Pharmacometrics which covers non-linear mixed effects models, the standard statistical model type used in this area. By interfacing these two formats the entire domain can be covered by complementary standards and, subsequently, the corresponding tools.
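
    The exchange workflow described here rests on reading and writing standard model files. A minimal sketch with the python-libsbml bindings, assuming a valid SBML file model.xml is at hand (e.g. downloaded from BioModels):

```python
import libsbml  # pip install python-libsbml

doc = libsbml.readSBML("model.xml")
if doc.getNumErrors() > 0:
    doc.printErrors()

model = doc.getModel()
print("species:  ", [s.getId() for s in model.getListOfSpecies()])
print("reactions:", [r.getId() for r in model.getListOfReactions()])

# Round-trip the document: the point of the standard is that another
# tool can pick this file up without any recoding of the model
libsbml.writeSBMLToFile(doc, "model_copy.xml")
```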

  18. Implementing the Mother-Baby Model of Nursing Care Using Models and Quality Improvement Tools.

    Science.gov (United States)

    Brockman, Vicki

    As family-centered care has become the expected standard, many facilities follow the mother-baby model, in which care is provided to both a woman and her newborn in the same room by the same nurse. My facility employed a traditional model of nursing care, which was not evidence-based or financially sustainable. After implementing the mother-baby model, we experienced an increase in exclusive breastfeeding rates at hospital discharge, increased patient satisfaction, improved staff productivity and decreased salary costs, all while the number of births increased. Our change was successful because it was guided by the use of quality improvement tools, change theory and evidence-based practice models. © 2015 AWHONN.

  19. Homology modeling: an important tool for the drug discovery.

    Science.gov (United States)

    França, Tanos Celmar Costa

    2015-01-01

    In the last decades, homology modeling has become a popular tool for accessing theoretical three-dimensional (3D) structures of molecular targets. So far several 3D models of proteins have been built by this technique and used in a great diversity of structural biology studies. But are those models consistent enough with experimental structures to make this technique an effective and reliable tool for drug discovery? Here we present, briefly, the fundamentals and current state-of-the-art of the homology modeling techniques used to build 3D structures of molecular targets whose experimental structures are not available in databases, and list some of the more important works using this technique available in the literature today. In many cases those studies have afforded successful models for the drug design of more selective agonists/antagonists to the molecular targets in focus and have guided promising experimental work, proving that, when appropriate templates are available, useful models can be built using any of the several software packages available today for this purpose. Limitations of the experimental techniques used to solve 3D structures, allied to constant improvements in homology modeling software, will maintain the need for theoretical models, establishing homology modeling as a fundamental tool for drug discovery.
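
    MODELLER is one package commonly used for this purpose; a comparative-modeling run looks roughly like the sketch below. The alignment file, template code and sequence code are hypothetical, a MODELLER installation and licence key are required, and the CamelCase class names follow recent MODELLER releases.

```python
# Requires MODELLER (salilab.org) and its licence key.
from modeller import Environ
from modeller.automodel import AutoModel, assess

env = Environ()
env.io.atom_files_directory = ["."]

a = AutoModel(env,
              alnfile="target_template.ali",  # PIR alignment (hypothetical file)
              knowns="1abcA",                 # template code in the alignment
              sequence="target")              # target code in the alignment
a.starting_model = 1
a.ending_model = 5                            # build five candidate models
a.assess_methods = (assess.DOPE,)
a.make()

# Keep the candidate with the best (lowest) DOPE score for downstream work
ok = [m for m in a.outputs if m["failure"] is None]
best = min(ok, key=lambda m: m["DOPE score"])
print(best["name"], best["DOPE score"])
```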

  20. The Culture Based Model: Constructing a Model of Culture

    Science.gov (United States)

    Young, Patricia A.

    2008-01-01

    Recent trends reveal that models of culture aid in mapping the design and analysis of information and communication technologies. Therefore, models of culture are powerful tools to guide the building of instructional products and services. This research examines the construction of the culture based model (CBM), a model of culture that evolved…

  1. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in the state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  2. Physical Modeling of Contact Processes on the Cutting Tools Surfaces of STM When Turning

    Science.gov (United States)

    Belozerov, V. A.; Uteshev, M. H.

    2016-08-01

    This article describes how to create an optimization model of the process of fine turning of superalloys and steels with cutting tools made from STM (synthetic superhard materials) on CNC machines, flexible manufacturing units (GPM), and machining centers. The optimization model makes it possible to link contact processes occurring simultaneously on the rake and flank surfaces of the STM tool, and thereby to manage the contact processes and the dynamic strength of the cutting tool at the STM tip. The optimization model for managing the dynamic strength of STM cutters in fine turning is based on a previously developed thermomechanical (physical, thermal) model, which enables a systematic thermomechanical approach to selecting STM grades (domestic and foreign) for cutting tools designed for fine turning of heat-resistant alloys and steels.

  3. A general thermal model of machine tool spindle

    Directory of Open Access Journals (Sweden)

    Yanfang Dong

    2017-01-01

    Full Text Available As the core component of a machine tool, the spindle has thermal characteristics that significantly influence the machine tool's running status. Lack of an accurate model of the spindle system, particularly of the load–deformation coefficient between the bearing rolling elements and rings, severely limits the precision of thermal error analysis for the spindle. In this article, the bearing internal loads, and especially the functional relationships between the principal curvature difference F(ρ) and the auxiliary parameter nδ, the semi-major axis a, and the semi-minor axis b, are determined; furthermore, the heat generation is calculated with high precision in combination with the heat sinks in the spindle system; finally, an accurate thermal model of the spindle is established. Moreover, a conventional spindle with embedded fiber Bragg grating temperature sensors has been developed. Comparison of the experimental results with the simulation indicates that the model has good accuracy, which verifies the reliability of the modeling process.
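
    Bearing friction heating, one of the main heat sources such a model must capture, is classically estimated with Palmgren-type relations: a viscous torque M0 and a load torque M1 (both in N·mm) give the heat H = 1.047e-4·n·(M0 + M1) in watts at n rpm. The sketch below uses these textbook relations; the coefficients f0 and f1 and the equivalent load are placeholders that a specific bearing's catalogue data would supply.

```python
def bearing_heat_w(n_rpm, nu_cst, dm_mm, f0=2.0, f1=0.0009, p1_n=1500.0):
    """Palmgren-style friction heating of one spindle bearing.

    M0 (viscous) and M1 (load) friction torques in N.mm;
    H = 1.047e-4 * n * (M0 + M1) in W. f0, f1 and the equivalent
    load p1_n are placeholders for a specific bearing's catalogue data.
    """
    x = nu_cst * n_rpm                       # viscosity [cSt] x speed [rpm]
    if x >= 2000.0:
        m0 = 1e-7 * f0 * x ** (2.0 / 3.0) * dm_mm**3
    else:
        m0 = 160e-7 * f0 * dm_mm**3
    m1 = f1 * p1_n * dm_mm
    return 1.047e-4 * n_rpm * (m0 + m1)

print(f"{bearing_heat_w(n_rpm=12000, nu_cst=32.0, dm_mm=65.0):.1f} W")
```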

  4. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. The tool model includes a new direct method for calculating the first-order sensitivity coefficients of chemical kinetics using sparse-matrix technology; it is only necessary to triangularize the matrix related to the Jacobian of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity-coefficient equations. The FORTRAN subroutines for the model equation, the sensitivity-coefficient equations, and their analytical Jacobian expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity-coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutine packages with which the program runs in conjunction, such as SLODE, a modified MA28, and the Gear package, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
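
    The essence of the direct (forward) method can be shown on a one-reaction toy problem: augment the concentration ODE with an ODE for the sensitivity s = ∂c/∂k, whose right-hand side reuses the Jacobian ∂f/∂c. A minimal sketch with SciPy standing in for the Gear/SLODE machinery:

```python
import numpy as np
from scipy.integrate import solve_ivp

k = 0.5  # rate constant of the toy reaction c' = -k c

def rhs(t, y):
    c, s = y
    # sensitivity equation: s' = (df/dc) s + df/dk = -k s - c
    return [-k * c, -k * s - c]

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], rtol=1e-8, dense_output=True)
c, s = sol.sol(4.0)
print(f"c(4) = {c:.5f}, dc/dk = {s:.5f} (exact {-4.0 * np.exp(-k * 4.0):.5f})")
```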

  5. Using the IEA ETSAP modelling tools for Denmark

    DEFF Research Database (Denmark)

    Grohnheit, Poul Erik

    ... the Centre for Energy, Environment and Health (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model... signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, "Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems", for the period 2005 to 2007. The main activity is semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project "NEEDS – New Energy Externalities Developments for Sustainability". ETSAP is contributing to a part of NEEDS that develops

  6. BIM-based deconstruction tool: Towards essential functionalities

    Directory of Open Access Journals (Sweden)

    Olugbenga O. Akinade

    2017-06-01

    Full Text Available This study discusses future directions for effective Design for Deconstruction (DfD) using a BIM-based approach to design coordination. After a review of the extant literature on existing DfD practices and tools, it became evident that none of the tools is BIM compliant and that BIM implementation has been ignored for end-of-life activities. To understand how BIM could be employed for DfD and to identify essential functionalities for a BIM-based deconstruction tool, Focus Group Interviews (FGIs) were conducted with professionals who have utilised BIM on their projects. The interview transcripts of the FGIs were analysed using descriptive interpretive analysis to identify common themes based on the experiences of the participants. The themes highlight functionalities of BIM in driving an effective DfD process, which include improved collaboration among stakeholders, visualisation of the deconstruction process, identification of recoverable materials, deconstruction plan development, performance analysis and simulation of end-of-life alternatives, improved building lifecycle management, and interoperability with existing BIM software. The results provide the needed technological support for developing BIM-compliant DfD tools.

  7. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    Energy Technology Data Exchange (ETDEWEB)

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating

  8. JTorX: A Tool for On-Line Model-Driven Test Derivation and Execution

    NARCIS (Netherlands)

    Belinfante, Axel; Esparza, Javier; Majumdar, Rupak

    We introduce JTorX, a tool for model-driven test derivation and execution, based on the ioco theory. This theory, originally presented in [Tretmans,1996], has been refined in [Tretmans,2008] with test-cases that are input-enabled. For models with underspecified traces [vdBijl+,2004] introduced

  9. MACHINING OF NICKEL BASED ALLOYS USING DIFFERENT CEMENTED CARBIDE TOOLS

    Directory of Open Access Journals (Sweden)

    BASIM A. KHIDHIR

    2010-09-01

    Full Text Available This paper presents the results of experimental work in dry turning of a nickel-based alloy (Haynes 276) using different geometries of cemented carbide tools. The turning tests were conducted at four different cutting speeds (112, 152, 201 and 269 m/min) while feed rate and depth of cut were kept constant at 0.2 mm/rev and 1.5 mm, respectively. The tool holders used were SCLCR with insert CCMT-12 and CCLNR-M12-4 with insert CNGN-12. The influence of cutting speed, tool insert type and workpiece material on the machined surface roughness was investigated. The worn parts of the cutting tools were also examined under a scanning electron microscope (SEM). The results showed that cutting speed significantly affected the machined surface finish in relation to the tool insert geometry. Insert type CCMT-12 gave a better surface finish at cutting speeds up to 201 m/min, while with insert type CNGN-12 the surface roughness increased dramatically with speed, to the point of complete damage to the insert geometry beyond 152 m/min.

  10. The synergy professional practice model and its patient characteristics tool: a staff empowerment strategy.

    Science.gov (United States)

    MacPhee, Maura; Wardrop, Andrea; Campbell, Cheryl; Wejr, Patricia

    2011-10-01

    Nurse leaders can positively influence practice environments through a number of empowerment strategies, among them professional practice models. These models encompass the philosophy, structures and processes that support nurses' control over their practice and their voice within healthcare organizations. Nurse-driven professional practice models can serve as a framework for collaborative decision-making among nursing and other staff. This paper describes a provincewide pilot project in which eight nurse-led project teams in four healthcare sectors worked with the synergy professional practice model and its patient characteristics tool. The teams learned how the model and tool can be used to classify patients' acuity levels and make staffing assignments based on a "best fit" between patient needs and staff competencies. The patient characteristics tool scores patients' acuities on eight characteristics such as stability, vulnerability and resource availability. This tool can be used to make real-time patient assessments. Other potential applications for the model and tool are presented, such as care planning, team-building and determining appropriate staffing levels. Our pilot project evidence suggests that the synergy model and its patient characteristics tool may be an empowerment strategy that nursing leaders can use to enhance their practice environments.

  11. Toxicokinetic models and related tools in environmental risk assessment of chemicals.

    Science.gov (United States)

    Grech, Audrey; Brochot, Céline; Dorne, Jean-Lou; Quignot, Nadia; Bois, Frédéric Y; Beaudouin, Rémy

    2017-02-01

    Environmental risk assessment of chemicals for the protection of ecosystems integrity is a key regulatory and scientific research field which is undergoing constant development in modelling approaches and harmonisation with human risk assessment. This review focuses on state-of-the-art toxicokinetic tools and models that have been applied to terrestrial and aquatic species relevant to environmental risk assessment of chemicals. Both empirical and mechanistic toxicokinetic models are discussed using the results of extensive literature searches, together with tools and software for their calibration and an overview of applications in environmental risk assessment. These range from simple one-compartment and multi-compartment models to physiologically-based toxicokinetic (PBTK) models, mostly available for aquatic species such as fish, and cover a number of chemical classes including plant protection products, metals, persistent organic pollutants, and nanoparticles. Data gaps and further research needs are highlighted.
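
    As a minimal sketch of the simplest model class surveyed (a one-compartment toxicokinetic model with first-order uptake and elimination), the snippet below uses the analytic solution for a constant water concentration; all parameter values are invented for illustration.

        # One-compartment toxicokinetic model for an aquatic organism:
        #   dC/dt = k_u * C_w - k_e * C,  C(0) = 0
        # First-order uptake from a constant water concentration C_w and
        # first-order elimination. Parameter values are illustrative only.
        import numpy as np

        k_u, k_e = 20.0, 0.5   # uptake (L/kg/day) and elimination (1/day) rates
        C_w = 0.01             # water concentration (mg/L), assumed constant

        t = np.linspace(0.0, 14.0, 141)                   # days
        C = (k_u * C_w / k_e) * (1.0 - np.exp(-k_e * t))  # analytic solution

        print(f"steady-state concentration: {k_u * C_w / k_e:.3f} mg/kg")
        print(f"bioconcentration factor (k_u/k_e): {k_u / k_e:.1f} L/kg")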

  12. Towards Semantically Integrated Models and Tools for Cyber-Physical Systems Design

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Fitzgerald, John; Woodcock, Jim

    2016-01-01

    We describe an approach to the model-based engineering of embedded and cyber-physical systems, based on the semantic integration of diverse discipline-specific notations and tools. Using the example of a small unmanned aerial vehicle, we explain the need for multiple notations and collaborative...

  13. Integrated landscape/hydrologic modeling tool for semiarid watersheds

    Science.gov (United States)

    Mariano Hernandez; Scott N. Miller

    2000-01-01

    An integrated hydrologic modeling/watershed assessment tool is being developed to aid in determining the susceptibility of semiarid landscapes to natural and human-induced changes across a range of scales. Watershed processes are by definition spatially distributed and are highly variable through time, and this approach is designed to account for their spatial and...

  14. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer..., and envisions future directions which focus on personalizing the processes to a designer's particular wishes.

  15. Simulation modeling: a powerful tool for process improvement.

    Science.gov (United States)

    Boxerman, S B

    1996-01-01

    Simulation modeling provides an efficient means of examining the operation of a system under a variety of alternative conditions. This tool can potentially enhance a benchmarking project by providing a means for evaluating proposed modifications to the system or process under study.
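
    As a minimal example of the kind of model meant here, the sketch below simulates waiting times in a single-server queue (a stand-in for any service process under study) using Lindley's recursion; the arrival and service rates are invented.

        # Single-server queue simulated with Lindley's recursion:
        #   W[i+1] = max(0, W[i] + S[i] - A[i+1])
        # W = waiting time, S = service time, A = interarrival time.
        # Rates are illustrative; rerun with another mu to compare scenarios.
        import numpy as np

        rng = np.random.default_rng(42)
        n, lam, mu = 100_000, 0.9, 1.0     # customers, arrival rate, service rate

        A = rng.exponential(1 / lam, n)    # interarrival times
        S = rng.exponential(1 / mu, n)     # service times

        W = np.zeros(n)
        for i in range(n - 1):
            W[i + 1] = max(0.0, W[i] + S[i] - A[i + 1])

        print(f"mean wait: {W.mean():.2f} (M/M/1 theory: {lam / (mu * (mu - lam)):.2f})")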

  16. Designing a Training Tool for Imaging Mental Models

    Science.gov (United States)

    1990-11-01

    about how to weave together their disparate fields into a seamless web of knowledge . Learners often cannot visualize how the concepts and skills they...a seamless web of knowledge ? " How does the availability of a mental modeling tool enhance the ability of instructional designers to prepare

  17. Machine Tool Modelling Design Research Based on Kansei Engineering

    Institute of Scientific and Technical Information of China (English)

    张莉

    2015-01-01

    From the perspective of kansei engineering did the research for machine tool modeling design. In the process of re-search,According to the machine tools modeling to collect relevant vocabulary and product kansei image sample pictures,con-struct the mapping model between the perceptual semantic and elements of machine tools modeling,used semantic differential method and quantification theory typeⅠto do analysis of data in the model. The results of the research will provide the basis for machine tools modeling design and design evaluation.%从感性工学的角度研究机床造型设计。在研究过程中,针对机床造型收集相关感性意向语汇和产品样本图,构建感性语意与机床造型要素之间的映射模型,并用语意差异法和数量化理论Ⅰ对模型中的数据进行处理分析。研究结果将为指导机床造型设计与设计评价提供依据。

  18. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    Full Text Available In computer science, the metamodelling approach has become increasingly popular for software systems development. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, an analysis of physical models is carried out. This analysis reveals invariant physical structures, which we propose to use as the basic abstractions of the physical metamodel. This is a system of geometrical objects, allowing a spatial structure of physical models to be built and a distribution of physical properties to be set. Different mathematical methods can then be applied to such a geometry of distributed physical properties. To validate the proposed metamodelling approach, we consider the developed prototypes of software tools.

  19. HMMEditor: a visual editing tool for profile hidden Markov model

    Directory of Open Access Journals (Sweden)

    Cheng Jianlin

    2008-03-01

    Full Text Available Abstract Background Profile Hidden Markov Model (HMM) is a powerful statistical model to represent a family of DNA, RNA, and protein sequences. Profile HMM has been widely used in bioinformatics research such as sequence alignment, gene structure prediction, motif identification, protein structure prediction, and biological database search. However, few comprehensive, visual editing tools for profile HMM are publicly available. Results We have developed a visual editor for profile Hidden Markov Models (HMMEditor). HMMEditor can visualize the profile HMM architecture, transition probabilities, and emission probabilities. Moreover, it provides functions to edit and save HMM and parameters. Furthermore, HMMEditor allows users to align a sequence against the profile HMM and to visualize the corresponding Viterbi path. Conclusion HMMEditor provides a set of unique functions to visualize and edit a profile HMM. It is a useful tool for biological sequence analysis and modeling. Both the HMMEditor software and the web service are freely available.
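
    The Viterbi path mentioned in the Results is computed by dynamic programming; a compact log-space sketch for a toy two-state HMM (not a full profile HMM with match/insert/delete states) follows, with all probabilities invented.

        # Log-space Viterbi decoding for a toy two-state HMM. This is not a
        # full profile HMM (no match/insert/delete topology); probabilities
        # and the example sequence are invented for illustration.
        import numpy as np

        seq = "AACGTA"
        idx = {c: i for i, c in enumerate("ACGT")}
        log_start = np.log([0.8, 0.2])                 # states: 0="M", 1="I"
        log_trans = np.log([[0.9, 0.1],
                            [0.5, 0.5]])
        log_emit = np.log([[0.70, 0.10, 0.10, 0.10],   # state M emissions
                           [0.25, 0.25, 0.25, 0.25]])  # state I emissions

        T, S = len(seq), 2
        V = np.zeros((T, S))               # best log-prob ending in each state
        ptr = np.zeros((T, S), dtype=int)  # backpointers
        V[0] = log_start + log_emit[:, idx[seq[0]]]
        for t in range(1, T):
            for s in range(S):
                scores = V[t - 1] + log_trans[:, s]
                ptr[t, s] = scores.argmax()
                V[t, s] = scores.max() + log_emit[s, idx[seq[t]]]

        path = [int(V[-1].argmax())]       # backtrace the Viterbi path
        for t in range(T - 1, 0, -1):
            path.append(ptr[t, path[-1]])
        print("".join("MI"[s] for s in reversed(path)))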

  20. Cutting Tools. Youth Training Scheme. Core Exemplar Work Based Project.

    Science.gov (United States)

    Further Education Staff Coll., Blagdon (England).

    This trainer's guide is intended to assist supervisors of work-based career training projects in teaching students to compare the performance of two different types of engineering cutting tools and to determine their cost-effectiveness and efficiency. The guide is one in a series of core curriculum modules that is intended for use in combination…

  1. Digital Tools and Solutions for Inquiry-Based STEM Learning

    Science.gov (United States)

    Levin, Ilya, Ed.; Tsybulsky, Dina, Ed.

    2017-01-01

    In the digital age, the integration of technology has become a ubiquitous aspect of modern society. These advancements have significantly enhanced the field of education, allowing students to receive a better learning experience. "Digital Tools and Solutions for Inquiry-Based STEM Learning" is a comprehensive source of scholarly material…

  2. TENTube: A video-based connection tool supporting competence development

    NARCIS (Netherlands)

    Angehrn, Albert; Maxwell, Katrina

    2008-01-01

    Angehrn, A. A., & Maxwell, K. (2008). TENTube: A video-based connection tool supporting competence development. In H. W. Sligte & R. Koper (Eds.), Proceedings of the 4th TENCompetence Open Workshop. Empowering Learners for Lifelong Competence Development: pedagogical, organisational and technological...

  3. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    Science.gov (United States)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.
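
    Roger's approximation fits the tabulated generalized forces with a rational function of the reduced frequency; a minimal least-squares sketch for a single scalar element is shown below, with invented lag roots and synthetic data standing in for doublet-lattice output.

        # Least-squares fit of Roger's rational function approximation for
        # one scalar generalized-force element:
        #   Q(ik) ~ A0 + A1*(ik) + A2*(ik)^2 + sum_j C_j * ik/(ik + b_j)
        # Lag roots b_j and the tabulated data are invented for illustration.
        import numpy as np

        k = np.linspace(0.05, 1.5, 20)     # reduced frequencies
        b = np.array([0.2, 0.6, 1.2])      # fixed lag roots (assumed)

        ik = 1j * k                        # synthetic "tabulated" data
        Q = 1.0 + 0.4 * ik - 0.3 * ik**2 + 0.8 * ik / (ik + 0.5)

        # Complex basis evaluated at each k, split into Re/Im rows so the
        # fitted coefficients come out real
        basis = np.column_stack([np.ones_like(ik), ik, ik**2] +
                                [ik / (ik + bj) for bj in b])
        A = np.vstack([basis.real, basis.imag])
        rhs = np.concatenate([Q.real, Q.imag])

        coef, *_ = np.linalg.lstsq(A, rhs, rcond=None)
        print("A0..A2:", np.round(coef[:3], 3), "lag coeffs:", np.round(coef[3:], 3))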

  4. Greenhouse gases from wastewater treatment - A review of modelling tools.

    Science.gov (United States)

    Mannina, Giorgio; Ekama, George; Caniani, Donatella; Cosenza, Alida; Esposito, Giovanni; Gori, Riccardo; Garrido-Baserba, Manel; Rosso, Diego; Olsson, Gustaf

    2016-05-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state-of-the-art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge on the processes related to N2O formation, especially due to autotrophic biomass, is still incomplete. The literature review also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs. Indeed, several studies have confirmed that a plant-wide view of WWTPs has to be taken in order to make them as sustainable as possible. Mechanistic dynamic models were demonstrated to be the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs due to the huge difficulties related to data availability and model complexity. For further improvement in plant-wide GHG modelling and to favour its use at full scale, knowledge of the mechanisms involved in GHG formation and release, and data acquisition, must be enhanced.

  5. AgMIP Training in Multiple Crop Models and Tools

    Science.gov (United States)

    Boote, Kenneth J.; Porter, Cheryl H.; Hargreaves, John; Hoogenboom, Gerrit; Thornburn, Peter; Mutter, Carolyn

    2015-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) has the goal of using multiple crop models to evaluate climate impacts on agricultural production and food security in developed and developing countries. There are several major limitations that must be overcome to achieve this goal, including the need to train AgMIP regional research team (RRT) crop modelers to use models other than the ones they are currently familiar with, plus the need to harmonize and interconvert the disparate input file formats used for the various models. Two activities were followed to address these shortcomings among AgMIP RRTs to enable them to use multiple models to evaluate climate impacts on crop production and food security. We designed and conducted courses in which participants trained on two different sets of crop models, with emphasis on the model of least experience. In a second activity, the AgMIP IT group created templates for inputting data on soils, management, weather, and crops into AgMIP harmonized databases, and developed translation tools for converting the harmonized data into files that are ready for multiple crop model simulations. The strategies for creating and conducting the multi-model course and developing entry and translation tools are reviewed in this chapter.

  6. Advanced Computing Tools and Models for Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  7. Tool path planning based on conformal parameterization for meshes

    Institute of Scientific and Technical Information of China (English)

    Zhao Jibin; Zou Qiang; Li Lun; Zhou Bo

    2015-01-01

    The similarity property of conformal parameterization enables it to locally preserve shapes between a surface and its parameter domain, in contrast to common parameterization methods. This paper proposes a parametric tool path planning method based on such a parameterization of triangular meshes, combined with geodesics on meshes. The parameterization's local similarity and free boundary are exploited to simplify the formulas for computing path parameters, which play a fundamentally important role in tool path planning, and to keep the path boundary-conformed and smooth. Experimental results, together with an error analysis, are given to illustrate the effectiveness of the proposed method.

  8. Emerging Network-Based Tools in Movement Ecology.

    Science.gov (United States)

    Jacoby, David M P; Freeman, Robin

    2016-04-01

    New technologies have vastly increased the available data on animal movement and behaviour. Consequently, new methods deciphering the spatial and temporal interactions between individuals and their environments are vital. Network analyses offer a powerful suite of tools to disentangle the complexity within these dynamic systems, and we review these tools, their application, and how they have generated new ecological and behavioural insights. We suggest that network theory can be used to model and predict the influence of ecological and environmental parameters on animal movement, focusing on spatial and social connectivity, with fundamental implications for conservation. Refining how we construct and randomise spatial networks at different temporal scales will help to establish network theory as a prominent, hypothesis-generating tool in movement ecology.
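
    As a small illustration of the spatial-network idea, the sketch below builds a directed movement network from one animal's sequence of site detections (sites and detections are invented) and computes a simple connectivity metric with networkx.

        # Build a directed movement network from a detection sequence: nodes
        # are sites (e.g. acoustic receivers), edge weights count observed
        # site-to-site moves. Sites and detections are invented.
        import networkx as nx

        detections = ["reef_A", "reef_A", "reef_B", "channel", "reef_B",
                      "channel", "reef_C", "reef_B", "reef_A"]

        G = nx.DiGraph()
        for src, dst in zip(detections, detections[1:]):
            if src != dst:               # ignore repeat detections at one site
                w = G.get_edge_data(src, dst, default={"weight": 0})["weight"]
                G.add_edge(src, dst, weight=w + 1)

        print("movement corridors:", sorted(G.edges(data="weight")))
        print("betweenness:", nx.betweenness_centrality(G))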

  9. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    Directory of Open Access Journals (Sweden)

    Maike Kathrin Aurich

    2016-08-01

    Full Text Available Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  10. MetaboTools: A Comprehensive Toolbox for Analysis of Genome-Scale Metabolic Models.

    Science.gov (United States)

    Aurich, Maike K; Fleming, Ronan M T; Thiele, Ines

    2016-01-01

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  11. Experiences & Tools from Modeling Instruction Applied to Earth Sciences

    Science.gov (United States)

    Cervenec, J.; Landis, C. E.

    2012-12-01

    The Framework for K-12 Science Education calls for stronger curricular connections within the sciences, greater depth in understanding, and tasks higher on Bloom's Taxonomy. Understanding atmospheric sciences draws on core knowledge traditionally taught in physics, chemistry, and in some cases, biology. If this core knowledge is not conceptually sound, well retained, and transferable to new settings, understanding the causes and consequences of climate changes become a task in memorizing seemingly disparate facts to a student. Fortunately, experiences and conceptual tools have been developed and refined in the nationwide network of Physics Modeling and Chemistry Modeling teachers to build necessary understanding of conservation of mass, conservation of energy, particulate nature of matter, kinetic molecular theory, and particle model of light. Context-rich experiences are first introduced for students to construct an understanding of these principles and then conceptual tools are deployed for students to resolve misconceptions and deepen their understanding. Using these experiences and conceptual tools takes an investment of instructional time, teacher training, and in some cases, re-envisioning the format of a science classroom. There are few financial barriers to implementation and students gain a greater understanding of the nature of science by going through successive cycles of investigation and refinement of their thinking. This presentation shows how these experiences and tools could be used in an Earth Science course to support students developing conceptually rich understanding of the atmosphere and connections happening within.

  12. Model-based geostatistics

    CERN Document Server

    Diggle, Peter J

    2007-01-01

    Model-based geostatistics refers to the application of general statistical principles of modeling and inference to geostatistical problems. This volume provides a treatment of model-based geostatistics and emphasizes statistical methods and applications. It also features analyses of datasets from a range of scientific contexts.

  13. MODELING AND COMPENSATION TECHNIQUE FOR THE GEOMETRIC ERRORS OF FIVE-AXIS CNC MACHINE TOOLS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    One of the important trends in precision machining is the development of real-time error compensation techniques. Error compensation for multi-axis CNC machine tools is difficult but attractive. A model of the geometric errors of five-axis CNC machine tools based on multi-body systems is proposed, and the key technique of the compensation, identifying the geometric error parameters, is developed. A simulation of workpiece cutting is also considered to verify the multi-body-system-based model.
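
    In the multi-body formulation, each axis contributes an ideal motion transform plus a small-error transform; the sketch below composes 4x4 homogeneous matrices for a simplified two-axis chain to obtain the tool-tip deviation. The chain and the error values are invented, not the paper's model.

        # Multi-body-style geometric error propagation: compose ideal
        # homogeneous transforms with first-order error transforms along a
        # simplified kinematic chain. Chain and error values are invented.
        import numpy as np

        def trans(x=0.0, y=0.0, z=0.0):
            T = np.eye(4)
            T[:3, 3] = (x, y, z)
            return T

        def small_error(dx, dy, dz, ex, ey, ez):
            """First-order error matrix: small rotations (rad), translations (mm)."""
            E = np.eye(4)
            E[:3, :3] += np.array([[0.0, -ez,  ey],
                                   [ ez, 0.0, -ex],
                                   [-ey,  ex, 0.0]])
            E[:3, 3] = (dx, dy, dz)
            return E

        # Simplified chain: X slide then Z slide, each with its own error matrix
        T_x, E_x = trans(x=100.0), small_error(2e-3, 1e-3, 0.0, 1e-5, 2e-5, 1e-5)
        T_z, E_z = trans(z=-50.0), small_error(0.0, 1e-3, 3e-3, 2e-5, 0.0, 1e-5)

        tip = np.array([0.0, 0.0, 0.0, 1.0])   # tool tip in the last frame
        ideal = T_x @ T_z @ tip
        actual = T_x @ E_x @ T_z @ E_z @ tip
        print("tool-tip error (mm):", np.round((actual - ideal)[:3], 4))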

  14. The Design of Tools for Sketching Sensor-Based Interaction

    DEFF Research Database (Denmark)

    Brynskov, Martin; Lunding, Rasmus; Vestergaard, Lasse Steenbock

    2012-01-01

    , flexibility and cost, aimed at wearable and ultra-mobile prototyping where fast reaction is needed (e.g. in controlling sound), and we discuss the general issues facing this category of embodied interaction design tools. We then present the platform in more detail, both regarding hardware and software... In the brief evaluation, we present our initial experiences with the platform both in design projects and in teaching. We conclude that DUL Radio does seem to be a relatively easy-to-use tool for sketching sensor-based interaction compared to other solutions, but that there are many ways to improve it. Target... users include designers, students, artists etc. with minimal programming and hardware skills, but this paper addresses the issues with designing the tools, which include technical details...

  15. DsixTools: the standard model effective field theory toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)

    2017-06-15

    We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED Renormalization group evolution below the electroweak scale. (orig.)
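
    Schematically, the SMEFTrunner step solves dC/d ln μ = (γᵀ/16π²) C; under the rough simplification of a constant anomalous-dimension matrix (leading log, couplings frozen), the evolution is a matrix exponential, as in the generic sketch below. The matrix and coefficients are placeholders, not the Warsaw-basis anomalous dimensions that DsixTools implements.

        # Schematic one-loop running of Wilson coefficients:
        #   dC/dln(mu) = (gamma^T / 16 pi^2) C
        # assuming a *constant* anomalous-dimension matrix (leading log,
        # couplings frozen). The 2x2 matrix and initial coefficients are
        # placeholders, not the Warsaw-basis ADM implemented in DsixTools.
        import numpy as np
        from scipy.linalg import expm

        gamma = np.array([[4.0, -1.0],
                          [0.5,  2.0]])   # placeholder anomalous dimensions
        C_high = np.array([1.0, 0.2])     # coefficients at the high scale

        mu_high, mu_low = 1000.0, 91.2    # GeV: matching scale down to ~M_Z
        t = np.log(mu_low / mu_high)      # negative log ratio: running down

        C_low = expm(gamma.T * t / (16 * np.pi**2)) @ C_high
        print("C(mu_low) =", np.round(C_low, 4))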

  16. A Scheduling System Based on Rules of the Machine Tools in FMS

    Institute of Scientific and Technical Information of China (English)

    LI De-xin; ZHAO Hua-qun; JIA Jie; LU Yan-jun

    2003-01-01

    In this paper, a model of the scheduling of machine tools in a flexible manufacturing line is presented, based on intensive analysis and research of the mathematical methods of traditional scheduling. The various factors relating to the machine tools in the flexible manufacturing line are fully considered in this system. Based on this model, an intelligent decision system integrating rules and simulation technology is constructed using the OOP (Object-Oriented Programming) method, and a simulation experiment analysis is carried out. The results show that the model performs well in practice.

  17. Evaluating EML Modeling Tools for Insurance Purposes: A Case Study

    Directory of Open Access Journals (Sweden)

    Mikael Gustavsson

    2010-01-01

    Full Text Available As with any situation that involves economic risk, refineries may share their risk with insurers. The decision process generally includes modelling to determine to what extent the process area can be damaged. On the extreme end of modelling, the so-called Estimated Maximum Loss (EML) scenarios are found. These scenarios predict the maximum loss a particular installation can sustain. Unfortunately, no standard model for this exists. Thus insurers reach different results due to applying different models and different assumptions. Therefore, a study has been conducted on a case in a Swedish refinery where several scenarios had previously been modelled by two different insurance brokers using two different software tools, ExTool and SLAM. This study reviews the concept of EML and analyses the models used to see which parameters are most uncertain. A third model, EFFECTS, was also employed in an attempt to reach a conclusion with higher reliability.

  18. Knowledge Based Product Configuration - a documentatio tool for configuration projects

    DEFF Research Database (Denmark)

    Hvam, Lars; Malis, Martin

    2003-01-01

    How can complex product models be documented in a formalised way that considers both development and maintenance? The need for an effective documentation tool has emerged in order to document the development of product models. The product models have become more and more complex and comprehensive... A lot of knowledge is put into these systems and many domain experts are involved. This calls for an effective documentation system in order to structure this knowledge in a way that fits the systems. Standard configuration systems do not support this kind of documentation. The chapter deals

  19. Cloud-Based Computational Tools for Earth Science Applications

    Science.gov (United States)

    Arendt, A. A.; Fatland, R.; Howe, B.

    2015-12-01

    Earth scientists are increasingly required to think across disciplines and utilize a wide range of datasets in order to solve complex environmental challenges. Although significant progress has been made in distributing data, researchers must still invest heavily in developing computational tools to accommodate their specific domain. Here we document our development of lightweight computational data systems aimed at enabling rapid data distribution, analytics and problem solving tools for Earth science applications. Our goal is for these systems to be easily deployable, scalable and flexible to accommodate new research directions. As an example we describe "Ice2Ocean", a software system aimed at predicting runoff from snow and ice in the Gulf of Alaska region. Our backend components include relational database software to handle tabular and vector datasets, Python tools (NumPy, pandas and xray) for rapid querying of gridded climate data, and an energy and mass balance hydrological simulation model (SnowModel). These components are hosted in a cloud environment for direct access across research teams, and can also be accessed via API web services using a REST interface. This API is a vital component of our system architecture, as it enables quick integration of our analytical tools across disciplines, and can be accessed by any existing data distribution centers. We will showcase several data integration and visualization examples to illustrate how our system has expanded our ability to conduct cross-disciplinary research.
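
    As a sketch of the "rapid querying of gridded climate data" component, the snippet below uses xarray (the successor to the xray package named above) to subset and average a hypothetical NetCDF temperature file; the file name, variable, and coordinate names are assumptions, not the system's actual schema.

        # Subset and average a gridded climate dataset with xarray (successor
        # to the "xray" package named in the abstract). The file, variable,
        # and coordinate names are hypothetical.
        import xarray as xr

        ds = xr.open_dataset("gulf_of_alaska_t2m.nc")   # hypothetical file

        # Lat/lon box and time window, then the mean over time; slice bounds
        # assume ascending coordinates
        box = ds["t2m"].sel(lat=slice(58.0, 62.0), lon=slice(-150.0, -140.0))
        summer = box.sel(time=slice("2014-06-01", "2014-08-31"))
        mean_map = summer.mean(dim="time")

        print(mean_map)   # 2-D DataArray: mean summer temperature per grid cell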

  20. Nephrectomized and hepatectomized animal models as tools in preclinical pharmacokinetics.

    Science.gov (United States)

    Vestergaard, Bill; Agersø, Henrik; Lykkesfeldt, Jens

    2013-08-01

    Early understanding of the pharmacokinetics and metabolic patterns of new drug candidates is essential for selection of optimal candidates to move further in to the drug development process. In vitro methodologies can be used to investigate metabolic patterns, but in general, they lack several aspects of the whole-body physiology. In contrast, the complexity of intact animals does not necessarily allow individual processes to be identified. Animal models lacking a major excretion organ can be used to investigate these individual metabolic processes. Animal models of nephrectomy and hepatectomy have considerable potential as tools in preclinical pharmacokinetics to assess organs of importance for drug clearance and thereby knowledge of potential metabolic processes to manipulate to improve pharmacokinetic properties of the molecules. Detailed knowledge of anatomy and surgical techniques is crucial to successfully establish the models, and a well-balanced anaesthesia and adequate monitoring of the animals are also of major importance. An obvious drawback of animal models lacking an organ is the disruption of normal homoeostasis and the induction of dramatic and ultimately mortal systemic changes in the animals. Refining of the surgical techniques and the post-operative supportive care of the animals can increase the value of these models by minimizing the systemic changes induced, and thorough validation of nephrectomy and hepatectomy models is needed before use of such models as a tool in preclinical pharmacokinetics. The present MiniReview discusses pros and cons of the available techniques associated with establishing nephrectomy and hepatectomy models.

  1. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report

    Science.gov (United States)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; KramerWhite, Julie A.; Labbe, Steve G.; Rotter, Hank A.

    2007-01-01

    In the spring of 2005, the NASA Engineering Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and other tools are empirically derived. Each tool was created for a specific use and timeframe, including certification and real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC team conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  2. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, Anca D.; Iov, Florin; Sørensen, Poul

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN) and it gathers and describes a whole wind turbine model database... strategies have different goals, e.g. fast response to disturbances, optimum power efficiency over a wider range of wind speeds, and voltage ride-through capability including grid support. A dynamic model of a DC connection for active stall wind farms to the grid, including the control, is also implemented...

  3. Response Surface Modeling Tool Suite, Version 1.x

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-05

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
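
    A minimal sketch of the "Automated RSM" pipeline (Latin Hypercube sample, expensive model runs, Gaussian process fit) is shown below, with a cheap analytic function standing in for the TPMC drag computation; the bounds and kernel settings are invented.

        # Response-surface sketch: Latin Hypercube sample -> model runs ->
        # Gaussian process fit. A cheap analytic function stands in for the
        # TPMC drag computation; bounds and kernel settings are invented.
        import numpy as np
        from scipy.stats import qmc
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        def fake_tpmc_drag(v, T):
            """Stand-in for one TPMC run (velocity m/s, temperature K)."""
            return 2.2 + 0.3 * np.exp(-((v - 7500.0) / 1000.0) ** 2) + 1e-4 * (T - 300.0)

        lo, hi = np.array([6000.0, 200.0]), np.array([9000.0, 1000.0])
        X = qmc.scale(qmc.LatinHypercube(d=2, seed=0).random(n=200), lo, hi)
        y = np.array([fake_tpmc_drag(v, T) for v, T in X])

        gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([500.0, 100.0]),
                                      normalize_y=True).fit(X, y)
        cd, sd = gp.predict(np.array([[7800.0, 350.0]]), return_std=True)
        print(f"predicted Cd = {cd[0]:.3f} +/- {sd[0]:.3f}")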

  4. Development of a visualization tool for integrated surface water-groundwater modeling

    Science.gov (United States)

    Tian, Yong; Zheng, Yi; Zheng, Chunmiao

    2016-01-01

    Physically-based, fully integrated surface water (SW)-groundwater (GW) models have been increasingly used in water resources research and management. The integrated modeling involves a large amount of scientific data. The use of three-dimensional (3D) visualization software to integrate all the scientific data into a comprehensive system can facilitate the interpretation and validation of modeling results. Nevertheless, at present few software tools can efficiently perform data visualization for integrated SW-GW modeling. In this study, a visualization tool named IHM3D was designed and developed specifically for integrated SW-GW modeling. In IHM3D, spatially distributed model inputs/outputs and geo-referenced data sets are visualized in a virtual globe-based 3D environment. End users can conveniently explore and validate modeling results within the 3D environment. A GSFLOW (an integrated SW-GW model developed by USGS) modeling case in the Heihe River Basin (Northwest China) was used to demonstrate the applicability of IHM3D at a large basin scale. The visualization of the modeling results significantly improved the understanding of the complex hydrologic cycle in this water-limited area, and provided insights into the regional water resources management. This study shows that visualization tools like IHM3D can promote data and model sharing in the water resources research community, and make it more practical to perform complex hydrological modeling in real-world water resources management.

  5. A Monte Carlo-based treatment-planning tool for ion beam therapy

    CERN Document Server

    Böhlen, T T; Dosanjh, M; Ferrari, A; Haberer, T; Parodi, K; Patera, V; Mairani, A

    2013-01-01

    Ion beam therapy, as an emerging radiation therapy modality, requires continuous efforts to develop and improve tools for patient treatment planning (TP) and research applications. Dose and fluence computation algorithms using the Monte Carlo (MC) technique have served for decades as reference tools for accurate dose computations for radiotherapy. In this work, a novel MC-based treatment-planning (MCTP) tool for ion beam therapy using the pencil beam scanning technique is presented. It allows single-field and simultaneous multiple-field optimization for realistic patient treatment conditions and for dosimetric quality assurance for irradiation conditions at state-of-the-art ion beam therapy facilities. It employs iterative procedures that allow for the optimization of absorbed dose and relative biological effectiveness (RBE)-weighted dose using radiobiological input tables generated by external RBE models. Using a re-implementation of the local effect model (LEM), the MCTP tool is able to perform TP studies...

  6. Using the IEA ETSAP modelling tools for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Grohnheit, Poul Erik

    2008-12-15

    An important part of the cooperation within the IEA (International Energy Agency) is organised through national contributions to 'Implementation Agreements' on energy technology and energy analyses. One of them is ETSAP (Energy Technology Systems Analysis Programme), started in 1976. Denmark has signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, 'Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems' for the period 2005 to 2007. The main activity is semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project 'NEEDS - New Energy Externalities Developments for Sustainability'. ETSAP is contributing to a part of NEEDS that develops the TIMES model for 29 European countries with assessment of future technologies. An additional project 'Monitoring and Evaluation of the RES directives: implementation in EU27 and policy recommendations for 2020' (RES2020) under Intelligent Energy Europe was added, as well as the Danish 'Centre for Energy, Environment and Health' (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model for Denmark, focusing on the tools and features that allow comparison with other countries and, particularly, evaluation of assumptions and results in international models covering Denmark. (au)

  7. Web-Based Tools for Collaborative Evaluation of Learning Resources

    Directory of Open Access Journals (Sweden)

    John C. Nesbit

    2005-10-01

    Full Text Available The emergence of large repositories of web-based learning resources has increased the need for valid and usable evaluation tools. This paper reviews current approaches to learning object evaluation and introduces eLera, a set of web-based tools we have developed for communities of teachers, learners, instructional designers and developers. Compatible with current metadata standards, eLera provides a learning object review instrument (LORI) and other features supporting collaborative evaluation. eLera provides limited translation of evaluations and subject taxonomies across communities using different languages and terminology. eLera is designed to assist researchers to gather data on evaluation processes and has been used to teach educators how to assess the quality of multimedia learning resources.

  8. The Design of Tools for Sketching Sensor-Based Interaction

    DEFF Research Database (Denmark)

    Brynskov, Martin; Lunding, Rasmus; Vestergaard, Lasse Steenbock

    2012-01-01

    In this paper we motivate, present, and give an initial evaluation of DUL Radio, a small wireless toolkit for sketching sensor-based interaction. In the motivation, we discuss the purpose of this specific platform, which aims to balance ease-of-use (learning, setup, initialization), size, speed......, flexibility and cost, aimed at wearable and ultra-mobile prototyping where fast reaction is needed (e.g. in controlling sound), and we discuss the general issues facing this category of embodied interaction design tools. We then present the platform in more detail, both regarding hard- ware and software....... In the brief evaluation, we present our initial experiences with the platform both in design projects and in teaching. We conclude that DUL Radio does seem to be a relatively easy-to-use tool for sketching sensor-based interaction compared to other solutions, but that there are many ways to improve it. Target...

  9. Application of Process Modeling Tools to Ship Design

    Science.gov (United States)

    2011-05-01

    [Briefing-slide excerpt; only fragments are recoverable] Approved for Public Release; Distribution is unlimited. Different people have different preferences: process data needs to be viewable in multiple formats, including DSM matrices, Gantt charts, IDEF diagrams, flow charts (e.g. by geography), and spreadsheets, produced by scheduling, spreadsheet, information-modeling, and DSM tools.

  10. Internet MEMS design tools based on component technology

    Science.gov (United States)

    Brueck, Rainer; Schumer, Christian

    1999-03-01

    The micro electromechanical systems (MEMS) industry in Europe is characterized by small and medium sized enterprises specialized in products that solve problems in specific domains like medicine, automotive sensor technology, etc. In this field of business the technology-driven design approach known from microelectronics is not appropriate. Instead, each design problem aims at its own specific technology to be used for the solution. The variety of technologies at hand, like Si-surface, Si-bulk, LIGA, laser, and precision engineering, requires a huge set of different design tools to be available. No single SME can afford to hold licenses for all these tools. This calls for a new and flexible way of designing, implementing and distributing design software. The Internet provides a flexible manner of offering software access along with methodologies of flexible licensing, e.g. on a pay-per-use basis. New communication technologies like ADSL, TV cable or satellites as carriers promise to offer a bandwidth sufficient even for interactive tools with graphical interfaces in the near future. INTERLIDO is an experimental tool suite for process specification and layout verification for lithography-based MEMS technologies to be accessed via the Internet. The first version provides a Java implementation including a graphical editor for process specification. Currently, a new version is being brought into operation that is based on JavaBeans component technology. JavaBeans offers the possibility to realize independent interactive design assistants, like a design rule checking assistant, a process consistency checking assistant, a technology definition assistant, a graphical editor assistant, etc., that may reside distributed over the Internet, communicating via Internet protocols. Each potential user is thus able to configure his own version of a design tool set tailored to the requirements of the current problem to be solved.

  11. Web-based Information Tools and Communication and Participation Strategies

    OpenAIRE

    2007-01-01

    The report summarizes the activities and the action-oriented research work of Pilot Project 4 (PP4) “Web-based Information Tools and Communication and Participation Strategies”. Research objectives were divided into two areas: a) communication issues, namely the development of a project homepage with interactive elements (http://www.sustainable-hyderabad.in/) and two documentary films, and b) participation issues, namely citizens' participation in India as a whole and in Hyderabad in particular...

  12. MIMOX: a web tool for phage display based epitope mapping

    Directory of Open Access Journals (Sweden)

    Honda Wataru

    2006-10-01

    Full Text Available Abstract Background Phage display is widely used in basic research such as the exploration of protein-protein interaction sites and networks, and applied research such as the development of new drugs, vaccines, and diagnostics. It has also become a promising method for epitope mapping. Research on new algorithms that assist and automate phage display based epitope mapping has attracted many groups. Until now, however, most of the existing tools have not been implemented as an online service, making it less convenient for the community to access, utilize, and evaluate them. Results We present MIMOX, a free web tool that helps to map the native epitope of an antibody based on one or more user-supplied mimotopes and the antigen structure. MIMOX was coded in Perl using modules from the BioPerl project. It has two sections. In the first section, MIMOX provides a simple interface for ClustalW to align a set of mimotopes. It also provides a simple statistical method to derive the consensus sequence and embeds JalView as a Java applet to view and manage the alignment. In the second section, MIMOX can map a single mimotope, or a consensus sequence of a set of mimotopes, onto the corresponding antigen structure and search for all of the clusters of residues that could represent the native epitope. NACCESS is used to evaluate the surface accessibility of the candidate clusters, and Jmol is embedded to view them interactively in their 3D context. Initial case studies show that MIMOX can reproduce mappings from existing tools such as FINDMAP and 3DEX, as well as providing novel, rational results. Conclusion A web-based tool called MIMOX has been developed for phage display based epitope mapping. As a publicly available online service in this area, it is convenient for the community to access, utilize, and evaluate, complementing other existing programs. MIMOX is freely available at http://web.kuicr.kyoto-u.ac.jp/~hjian/mimox.
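
    The abstract does not spell out its "simple statistical method" for the consensus; a common position-frequency approach is sketched below on invented aligned mimotopes (majority residue per column, with "X" where no residue reaches the threshold).

        # Consensus from aligned mimotopes by per-column majority vote; "X"
        # marks columns where no residue reaches the threshold. The aligned
        # peptides and the 50% threshold are invented for illustration.
        from collections import Counter

        aligned = ["SWLHPAF",
                   "SWLHPVF",
                   "TWLHPAY",
                   "SWMHPAF"]
        threshold = 0.5

        consensus = []
        for column in zip(*aligned):
            residue, count = Counter(column).most_common(1)[0]
            consensus.append(residue if count / len(aligned) >= threshold else "X")
        print("".join(consensus))   # -> SWLHPAF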

  13. Schistosomiasis japonica: modelling as a tool to explore transmission patterns.

    Science.gov (United States)

    Xu, Jun-Fang; Lv, Shan; Wang, Qing-Yun; Qian, Men-Bao; Liu, Qin; Bergquist, Robert; Zhou, Xiao-Nong

    2015-01-01

    Modelling is an important tool for the exploration of Schistosoma japonicum transmission patterns. It provides a general theoretical framework for decision-makers and lends itself specifically to assessing the progress of the national control programme by following the outcome of surveys. The challenge of keeping up with the many changes of social, ecological and environmental factors involved in control activities is greatly facilitated by modelling that can also indicate which activities are critical and which are less important. This review examines the application of modelling tools in the epidemiological study of schistosomiasis japonica during the last 20 years and explores the application of enhanced models for surveillance and response. Updated and timely information for decision-makers in the national elimination programme is provided but, in spite of the new modelling techniques introduced, many questions remain. Issues on application of modelling are discussed with the view to improve the current situation with respect to schistosomiasis japonica. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Virtual Sensor for Calibration of Thermal Models of Machine Tools

    Directory of Open Access Journals (Sweden)

    Alexander Dementjev

    2014-01-01

    strictly depends on the accuracy of these machines, but they are prone to deformation caused by their own heat. The deformation needs to be compensated in order to assure accurate production. So an adequate model of the high-dimensional thermal deformation process must be created and the parameters of this model must be evaluated. Unfortunately, such parameters are often unknown and cannot be calculated a priori. Parameter identification during real experiments is not an option for these models because of the high engineering and machine-time effort involved. The installation of additional sensors to measure these parameters directly is uneconomical. Instead, an effective calibration of thermal models can be reached by combining real and virtual measurements on a machine tool during its real operation, without installing additional sensors. In this paper, a new approach for thermal model calibration is presented. The results are very promising, and the approach can be recommended as an effective solution for this class of problems.

  15. Process Modeling In Cold Forging Considering The Process-Tool-Machine Interactions

    Science.gov (United States)

    Kroiss, Thomas; Engel, Ulf; Merklein, Marion

    2010-06-01

    In this paper, a methodical approach is presented for the determination and modeling of the axial deflection characteristic of the whole system of a stroke-controlled press and its tooling system. This is realized by a combination of experiment and FE simulation. The press characteristic is measured once in an experiment. The tooling system characteristic is determined in FE simulation to avoid experimental investigations on various tooling systems. The stiffnesses of press and tooling system are combined into a substitute stiffness that is integrated into the FE process simulation as a spring element. Non-linear initial effects of the press are modeled with a constant shift factor. The approach was applied to a full forward extrusion process on a press with a C-frame. A comparison between experiments and results of the integrated FE simulation model showed a high accuracy of the FE model. The simulation model with integrated deflection characteristic represents the entire process behavior and can be used for the calculation of a mathematical process model based on variant simulations and response surfaces. In a subsequent optimization step, an adjusted process and tool design can be determined that compensates for the influence of the deflections on the workpiece dimensions, leading to high workpiece accuracy. Using knowledge of the process behavior, the required number of variant simulations was reduced.
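
    Assuming the press and the tooling system deflect in series under the same process force (the natural reading of "substitute stiffness"; the paper's exact formula is not reproduced here), the combined stiffness follows the usual series-spring rule:

        \frac{1}{k_{\text{sub}}} = \frac{1}{k_{\text{press}}} + \frac{1}{k_{\text{tool}}}
        \quad\Longrightarrow\quad
        k_{\text{sub}} = \frac{k_{\text{press}}\, k_{\text{tool}}}{k_{\text{press}} + k_{\text{tool}}}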

  16. CREST Cost of Renewable Energy Spreadsheet Tool: A Model for Developing Cost-Based Incentives in the United States; User Manual Version 4, August 2009 - March 2011 (Updated July 2013)

    Energy Technology Data Exchange (ETDEWEB)

    Gifford, J. S.; Grace, R. C.

    2013-07-01

    The objective of this document is to help model users understand how to use the CREST model to support renewable energy incentives, FITs, and other renewable energy rate-setting processes. This user manual will walk the reader through the spreadsheet tool, including its layout and conventions, offering context on how and why it was created. This user manual will also provide instructions on how to populate the model with inputs that are appropriate for a specific jurisdiction's policymaking objectives and context. Finally, the user manual will describe the results and outline how these results may inform decisions about long-term renewable energy support programs.

  17. Alexander Meets Michotte: A Simulation Tool Based on Pattern Programming and Phenomenology

    Science.gov (United States)

    Basawapatna, Ashok

    2016-01-01

    Simulation and modeling activities, a key point of computational thinking, are currently not being integrated into the science classroom. This paper describes a new visual programming tool entitled the Simulation Creation Toolkit. The Simulation Creation Toolkit is a high level pattern-based phenomenological approach to bringing rapid simulation…

  18. Investigating Learner Attitudes toward E-Books as Learning Tools: Based on the Activity Theory Approach

    Science.gov (United States)

    Liaw, Shu-Sheng; Huang, Hsiu-Mei

    2016-01-01

    This paper investigates the use of e-books as learning tools in terms of learner satisfaction, usefulness, behavioral intention, and learning effectiveness. Based on the activity theory approach, this research develops a research model to understand learner attitudes toward e-books in two physical sizes: 10″ and 7″. Results suggest that screen…

  20. Adapting Learning Activities: a Case Study of IMS LD based Script and Tooling

    NARCIS (Netherlands)

    Miao, Yongwu

    2009-01-01

    Miao, Y. (2009). Adapting Learning Activities: a Case Study of IMS LD based Script and Tooling. Paper presented at workshop "Adapting Activities Modeled by CSCL Scripts" of the 8th International Conference “Computer Supported Collaborative Learning” (CSCL’09). June, 8-13, 2009, Rhodes, Greece.

  2. Error Model and Accuracy Calibration of 5-Axis Machine Tool

    Directory of Open Access Journals (Sweden)

    Fangyu Pan

    2013-08-01

    To improve the machining precision and reduce the geometric errors of a 5-axis machine tool, an error model and an accuracy calibration method are presented in this paper. The error model is built with the theory of multi-body systems and characteristic matrices, which establishes the theoretical relationship between the cutting tool and the workpiece. The accuracy calibration is difficult to perform, but with laser instruments, a laser interferometer and a laser tracker, the errors can be measured accurately, which is beneficial for later compensation.
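    A minimal sketch of the multi-body, characteristic-matrix idea, with invented error values: each axis is represented by a 4x4 homogeneous transform carrying small geometric error terms, and composing the chain propagates the errors to the tool tip.

        import numpy as np

        def transform(dx=0.0, dy=0.0, dz=0.0, rx=0.0, ry=0.0, rz=0.0):
            """4x4 homogeneous transform: small-angle rotations (rad), translations (mm)."""
            T = np.eye(4)
            T[:3, :3] = np.array([[1, -rz, ry],
                                  [rz, 1, -rx],
                                  [-ry, rx, 1]])   # small-angle rotation part
            T[:3, 3] = [dx, dy, dz]
            return T

        # hypothetical per-axis error transforms (identity if the axis were perfect)
        chain = [
            transform(dx=0.004, rz=1e-5),    # X axis: positioning error + yaw
            transform(dy=-0.002, rx=2e-5),   # Y axis
            transform(dz=0.003, ry=-1e-5),   # Z axis
        ]

        T_err = np.eye(4)
        for T in chain:
            T_err = T_err @ T                # compose the kinematic chain

        tip = np.array([0.0, 0.0, 150.0, 1.0])   # tool tip in the spindle frame (mm)
        print("tip deviation (mm):", (T_err @ tip - tip)[:3])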

  3. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia

    2015-01-07

    We study the application of information-theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is carried out by proposing parametrized coarse-grained dynamics and finding the optimal parameter set for which the relative entropy rate with respect to the atomistic dynamics is minimized. The minimization problem leads to a generalization of the force matching methods to non-equilibrium systems. A multiplicative noise example reveals the importance of the diffusion coefficient in the optimization problem.
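    For orientation, the fitting criterion can be written compactly; the notation below is a standard relative-entropy-rate formulation and is assumed rather than quoted from the paper:

        \theta^{\ast} = \arg\min_{\theta} \mathcal{H}\left(Q \,\middle\|\, P_{\theta}\right),
        \qquad
        \mathcal{H}\left(Q \,\middle\|\, P_{\theta}\right)
          = \lim_{T \to \infty} \frac{1}{T}\,
            \mathbb{E}_{Q}\!\left[\log \frac{dQ_{[0,T]}}{dP^{\theta}_{[0,T]}}\right],

    where Q is the path-space measure of the atomistic dynamics and P_theta that of the parametrized coarse-grained dynamics; minimizing over theta yields force-matching-type conditions generalized to non-equilibrium systems.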

  4. A Tool for Sharing Empirical Models of Climate Impacts

    Science.gov (United States)

    Rising, J.; Kopp, R. E.; Hsiang, S. M.

    2013-12-01

    Scientists, policy advisors, and the public struggle to synthesize the quickly evolving empirical work on climate change impacts. The Integrated Assessment Models (IAMs) used to estimate the impacts of climate change and the effects of adaptation and mitigation policies can also benefit greatly from recent empirical results (Kopp, Hsiang & Oppenheimer, Impacts World 2013 discussion paper). This paper details a new online tool for exploring, analyzing, combining, and communicating a wide range of impact results, and for supporting their integration into IAMs. The tool uses a new database of statistical results, which researchers can expand both in depth (by providing additional results describing existing relationships) and breadth (by adding new relationships). Scientists can use the tool to quickly perform meta-analyses of related results, using Bayesian techniques to produce pooled and partially pooled posterior distributions. Policy advisors can apply the statistical results to particular contexts and combine different kinds of results in a cost-benefit framework. For example, models of the impact of temperature changes on agricultural yields can first be aggregated to build a best estimate of the effect under given assumptions, then compared across countries using different temperature scenarios, and finally combined to estimate a social cost of carbon. The general public can better understand the many estimates of climate impacts and their range of uncertainty by exploring these results dynamically, with maps, bar charts, and dose-response-style plots. (Figure captions: front page of the climate impacts tool website, with sample "collections" of models, within which all results are estimates of the same fundamental relationship, shown on the right; a simple pooled result for Gelman's "8 schools" example, where pooled results are calculated analytically while partial pooling, i.e., Bayesian hierarchical estimation, uses posterior simulations.)
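    The analytic complete-pooling step mentioned above is reproducible in a few lines; the sketch below (a minimal illustration, not the tool's own code) applies a precision-weighted average to the published "8 schools" effect estimates and standard errors:

        import numpy as np

        # Gelman's "8 schools" data: estimated treatment effects and standard errors
        y = np.array([28., 8., -3., 7., -1., 1., 18., 12.])
        se = np.array([15., 10., 16., 11., 9., 11., 10., 18.])

        # complete pooling: precision-weighted mean, available in closed form
        w = 1.0 / se**2
        mu = np.sum(w * y) / np.sum(w)
        sd = np.sqrt(1.0 / np.sum(w))
        print(f"pooled effect: {mu:.2f} +/- {sd:.2f}")

    Partial pooling (the Bayesian hierarchical case) has no such closed form, which is why the tool falls back on posterior simulation there.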

  5. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    In this paper, we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling...

  6. ANN Based Tool Condition Monitoring System for CNC Milling Machines

    Directory of Open Access Journals (Sweden)

    Mota-Valtierra G.C.

    2011-10-01

    Most companies aim to manufacture high-quality products, which becomes possible by optimizing costs and by reducing and controlling the variations in their production processes. Within manufacturing industries, a very important issue is tool condition monitoring, since the tool state determines the quality of products; moreover, a good monitoring system protects the machinery from severe damage. For determining the state of the cutting tools in a milling machine, there is a great variety of systems on the industrial market; however, these systems are not available to all companies because of their high costs and the requirement of modifying the machine tool in order to attach the system sensors. This paper presents an intelligent classification system which determines the status of cutters in a Computer Numerical Control (CNC) milling machine. The tool state is detected mainly through the analysis of the cutting forces drawn from the spindle motor currents. This monitoring system does not need sensors, so it is not necessary to modify the machine. The correct classification is made by advanced digital signal processing techniques. Just after acquiring a signal, an FIR digital filter is applied to the data to eliminate the undesired noisy components and to extract the embedded force components. A wavelet transformation is applied to the filtered signal in order to compress the data amount and to optimize the classifier structure. Then a multilayer perceptron-type neural network carries out the classification of the signal. Achieving a reliability of 95%, the system is capable of detecting breakage and a worn cutter.
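    The processing chain described above (FIR filtering of the spindle-current signal, wavelet compression, then a multilayer-perceptron classifier) can be sketched with standard Python libraries. Everything below, including the sampling rate, cutoff, wavelet, layer sizes, and the synthetic training data, is an illustrative assumption, not the authors' configuration.

        import numpy as np
        from scipy.signal import firwin, lfilter
        import pywt
        from sklearn.neural_network import MLPClassifier

        def features(signal, fs=10_000):
            """FIR-filter a spindle-current trace, then compress it with a wavelet transform."""
            taps = firwin(numtaps=101, cutoff=500, fs=fs)    # low-pass FIR filter
            filtered = lfilter(taps, 1.0, signal)
            coeffs = pywt.wavedec(filtered, "db4", level=4)  # multilevel DWT
            return coeffs[0]                                 # approximation coefficients

        # synthetic stand-in data: 40 labeled current traces (0 good, 1 worn, 2 broken)
        rng = np.random.default_rng(0)
        X = np.vstack([features(rng.normal(size=4096)) for _ in range(40)])
        y = rng.integers(0, 3, size=40)

        clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X, y)
        print("predicted tool state:", clf.predict(X[:1]))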

  7. Suction-based grasping tool for removal of regular- and irregular-shaped intraocular foreign bodies.

    Science.gov (United States)

    Erlanger, Michael S; Velez-Montoya, Raul; Mackenzie, Douglas; Olson, Jeffrey L

    2013-01-01

    To describe a suction-based grasping tool for the surgical removal of irregular-shaped and nonferromagnetic intraocular foreign bodies. A surgical tool with suction capabilities, consisting of a stainless steel shaft with a plastic handle and a customizable, interchangeable suction tip, was designed in order to better engage and manipulate irregular-shaped intraocular foreign bodies of various sizes and physical properties. The maximal suction force and surgical capabilities were assessed in the laboratory and on a cadaveric eye vitrectomy model. The suction force of the water-tight seal between the intraocular foreign body and the suction tip was estimated to be approximately 40 mN. During an open-sky vitrectomy in a porcine model, the device was successful in engaging and firmly securing foreign bodies of different sizes and shapes. The suction-based grasping tool enables removal of irregular-shaped and nonferromagnetic foreign bodies. Copyright 2013, SLACK Incorporated.

  8. Modeling the Effects of Tool Shoulder and Probe Profile Geometries on Friction Stirred Aluminum Welds Using Response Surface Methodology

    Institute of Scientific and Technical Information of China (English)

    H.K.Mohanty; M.M.Mahapatra; P.Kumar; P.Biswas; N.R.Mandal

    2012-01-01

    The present paper discusses the modeling of tool geometry effects on friction stir aluminum welds using response surface methodology. The friction stir welding tools were designed with different shoulder and tool probe geometries based on a design matrix. The matrix for the tool design covered three types of tools, based on three types of probes, with three levels each for defining the shoulder surface type and the probe profile geometry. The effects of tool shoulder and probe geometries on friction stirred aluminum welds were then experimentally investigated with respect to weld strength, weld cross-section area, grain size of the weld, and grain size of the thermo-mechanically affected zone. These effects were modeled using multiple and response surface regression analysis. The response surface regression modeling was found to be appropriate for defining the friction stir weldment characteristics.
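    As an illustration of the response-surface step, the sketch below fits a second-order polynomial surface over two coded tool-geometry factors; the design points and strength values are invented placeholders, not the paper's measurements.

        import numpy as np
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        # coded levels (-1, 0, 1) for shoulder surface type and probe profile
        X = np.array([[-1, -1], [-1, 0], [-1, 1],
                      [ 0, -1], [ 0, 0], [ 0, 1],
                      [ 1, -1], [ 1, 0], [ 1, 1]], dtype=float)
        y = np.array([182., 195., 188., 201., 214., 205., 190., 199., 186.])  # MPa

        # linear + interaction + quadratic terms, i.e. a second-order response surface
        model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
        model.fit(X, y)
        print("predicted strength at (0.5, -0.5):", model.predict([[0.5, -0.5]]))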

  9. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    Science.gov (United States)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

    This paper describes the ongoing development of a computational tool to analyse architectural and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architectural and interior space is experienced as a dynamic entity, with spatial properties that may vary from one part of a space to another, so the representation of space through standard architectural drawings is sometimes not sufficient. The representation of space as a series of slices with certain properties in each slice therefore becomes important, so that the different characteristics in each part of the space can inform the design process. The analytical tool is developed as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool is intended to assist design development processes that apply BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. It allows the identification of how spatial properties change dynamically throughout the space and allows the prediction of potential design problems. Integrating the multi-slice analytical tool into a BIM-based design process could thereby assist architects in generating better designs and avoiding unnecessary costs that are often caused by failure to identify problems during the design development stages.

  10. Modeling as a tool for process control: alcoholic fermentation

    Energy Technology Data Exchange (ETDEWEB)

    Tayeb, A.M.; Ashour, I.A.; Mostafa, N.A. (El-Minia Univ. (EG). Faculty of Engineering)

    1991-01-01

    The results of the alcoholic fermentation of beet sugar molasses and wheat milling residues (Akalona) were fed into a computer program, and the kinetic parameters for these fermentation reactions were determined. These parameters were put into a kinetic model; the model was then tested, and the results obtained were compared with the experimental results for both beet molasses and Akalona. The deviation of the model results from the experimental results was determined, and an acceptable deviation of 1.2% for beet sugar molasses and 3.69% for Akalona was obtained. Thus, the present model could be a tool for chemical engineers working on fermentation processes, both with respect to the control of the process and the design of the fermentor. (Author).
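    The abstract does not state the kinetic form, so the sketch below assumes a simple Monod-type growth law with growth-associated ethanol formation, just to show how such a kinetic model is integrated and compared with batch data; all parameter values are illustrative.

        import numpy as np
        from scipy.integrate import solve_ivp

        mu_max, Ks, Yxs, Ypx = 0.4, 1.5, 0.1, 4.0   # 1/h, g/L, g/g, g/g (assumed)

        def batch(t, state):
            X, S, P = state                  # biomass, sugar, ethanol (g/L)
            mu = mu_max * S / (Ks + S)       # Monod specific growth rate
            dX = mu * X
            dS = -dX / Yxs                   # substrate consumed to make biomass
            dP = Ypx * dX                    # growth-associated product
            return [dX, dS, dP]

        sol = solve_ivp(batch, (0.0, 48.0), [0.2, 150.0, 0.0])
        X, S, P = sol.y[:, -1]
        print(f"after 48 h: biomass {X:.1f}, sugar {S:.1f}, ethanol {P:.1f} g/L")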

  11. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  12. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of using such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  13. A cloud based tool for knowledge exchange on local scale flood risk.

    Science.gov (United States)

    Wilkinson, M E; Mackay, E; Quinn, P F; Stutter, M; Beven, K J; MacLeod, C J A; Macklin, M G; Elkhatib, Y; Percy, B; Vitolo, C; Haygarth, P M

    2015-09-15

    There is an emerging and urgent need for new approaches to the management of environmental challenges such as flood hazard in the broad context of sustainability. This requires a new way of working which bridges disciplines and organisations and breaks down science-culture boundaries. With this, there is growing recognition that the appropriate involvement of local communities in catchment management decisions can result in multiple benefits. However, new tools are required to connect organisations and communities. The growth of cloud-based technologies offers a novel way to facilitate this process of information exchange in environmental science and management; however, stakeholders need to be engaged as part of the development process from the beginning, rather than being presented with a final product at the end. Here we present the development of a pilot Local Environmental Virtual Observatory Flooding Tool. The aim was to develop a cloud-based learning platform for stakeholders, bringing together fragmented data, models and visualisation tools to enable these stakeholders to make scientifically informed environmental management decisions at the local scale. It has been developed by engaging with different stakeholder groups in three catchment case studies in the UK, which are typical of many northern-latitude catchments, and with a panel of national experts in relevant topic areas. The tool was designed to communicate flood risk in locally impacted communities whilst engaging with landowners and farmers about the risk of runoff from the farmed landscape. It has been developed iteratively to reflect the needs, interests and capabilities of a wide range of stakeholders. The pilot tool combines cloud-based services, local catchment datasets, a hydrological model and bespoke visualisation tools to explore real-time hydrometric data and the impact of flood risk caused by future land use changes. The novel aspects of the…

  14. A unified tool for performance modelling and prediction

    Energy Technology Data Exchange (ETDEWEB)

    Gilmore, Stephen [Laboratory for Foundations of Computer Science, University of Edinburgh, King' s Buildings, Mayfield Road, Edinburgh, Scotland EH9 3JZ (United Kingdom)]. E-mail: stg@inf.ed.ac.uk; Kloul, Leila [Laboratory for Foundations of Computer Science, University of Edinburgh, King' s Buildings, Mayfield Road, Edinburgh, Scotland EH9 3JZ (United Kingdom)

    2005-07-01

    We describe a novel performability modelling approach, which facilitates the efficient solution of performance models extracted from high-level descriptions of systems. The notation which we use for our high-level designs is the Unified Modelling Language (UML) graphical modelling language. The technology which provides the efficient representation capability for the underlying performance model is the multi-terminal binary decision diagram (MTBDD)-based PRISM probabilistic model checker. The UML models are compiled through an intermediate language, the stochastic process algebra PEPA, before translation into MTBDDs for solution. We illustrate our approach on a real-world analysis problem from the domain of mobile telephony.

  15. Laser melting of carbide tool surface: Model and experimental studies

    Energy Technology Data Exchange (ETDEWEB)

    Yilbas, B.S., E-mail: bsyilbas@kfupm.edu.sa [ME Department, King Fahd University of Petroleum and Minerals, KFUPM Box 1913, Dhahran 31261 (Saudi Arabia); Shuja, S.Z.; Khan, S.M.A.; Aleem, A. [ME Department, King Fahd University of Petroleum and Minerals, KFUPM Box 1913, Dhahran 31261 (Saudi Arabia)

    2009-09-15

    Laser controlled melting is one of the methods to achieve structural integrity in the surface region of carbide tools. In the present study, laser heating of a carbide cutting tool and the temperature distribution in the irradiated region are examined. The phase change process during heating is modeled using the enthalpy-porosity method. The influence of the laser pulse intensity distribution across the irradiated surface (β) on temperature distribution and melt formation is investigated. An experiment is carried out, and the microstructural changes due to consecutive laser pulse heating are examined using the scanning electron microscope (SEM). It is found that the predicted melt depth agrees with the experimental results. The maximum depth of the melt layer moves away from the symmetry axis with increasing β.

  17. A practical tool for modeling biospecimen user fees.

    Science.gov (United States)

    Matzke, Lise; Dee, Simon; Bartlett, John; Damaraju, Sambasivarao; Graham, Kathryn; Johnston, Randal; Mes-Masson, Anne-Marie; Murphy, Leigh; Shepherd, Lois; Schacter, Brent; Watson, Peter H

    2014-08-01

    The question of how best to attribute the unit costs of the annotated biospecimen product that is provided to a research user is a common issue for many biobanks. Some of the factors influencing user fees are capital and operating costs, internal and external demand and market competition, and moral standards that dictate that fees must have an ethical basis. It is therefore important to establish a transparent and accurate costing tool that can be utilized by biobanks and aid them in establishing biospecimen user fees. To address this issue, we built a biospecimen user fee calculator tool, accessible online at www.biobanking.org. The tool was built to allow input of: i) annual operating and capital costs; ii) costs categorized by the major core biobanking operations; iii) specimen products requested by a biobank user; iv) services provided by the biobank beyond core operations (e.g., histology, tissue microarray); and v) several user-defined variables to allow the calculator to be adapted to different biobank operational designs. To establish default values for variables within the calculator, we first surveyed the members of the Canadian Tumour Repository Network (CTRNet) management committee. We then enrolled four different participants from CTRNet biobanks to test the hypothesis that the calculator tool could change approaches to user fees. Participants were first asked to estimate user fee pricing for three hypothetical user scenarios based on their biobanking experience (estimated pricing) and then to calculate fees for the same scenarios using the calculator tool (calculated pricing). Results demonstrated significant variation in estimated pricing that was reduced by calculated pricing, and showed that higher user fees are consistently derived when using the calculator. We conclude that adoption of this online calculator for user fee determination is an important first step towards harmonization and realistic user fees.
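    The cost-attribution logic behind such a calculator can be illustrated briefly; the categories, numbers, and cost-recovery rule below are assumptions for illustration, not the published CTRNet defaults.

        # amortized unit cost of a specimen, then a per-request fee
        annual_operating = 250_000.0        # staff, consumables, QA
        annual_capital = 400_000.0 / 10     # equipment amortized over 10 years
        specimens_per_year = 2_000

        unit_cost = (annual_operating + annual_capital) / specimens_per_year

        extra_services = {"histology": 35.0, "tissue_microarray": 120.0}  # per request

        def user_fee(n_specimens, services=(), cost_recovery=0.5):
            """Fee for one request; cost_recovery sets the fraction of cost passed on."""
            fee = n_specimens * unit_cost * cost_recovery
            fee += sum(extra_services[s] for s in services)
            return fee

        print(f"50 specimens + histology: {user_fee(50, ['histology']):.2f}")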

  18. Tools and Products of Real-Time Modeling: Opportunities for Space Weather Forecasting

    Science.gov (United States)

    Hesse, Michael

    2009-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aimed at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides researchers with the use of space science models, even if they are not model owners themselves. The second CCMC activity is to support Space Weather forecasting at national Space Weather Forecasting Centers. This second activity involves model evaluations, model transitions to operations, and the development of draft Space Weather forecasting tools. This presentation focuses on the last element. Specifically, we will discuss present capabilities and the potential to derive further tools. These capabilities will be interpreted in the context of a broad-based, bootstrapping activity for modern Space Weather forecasting.

  19. MGP : a tool for wide range temperature modelling

    Energy Technology Data Exchange (ETDEWEB)

    Morales, A.F. [Inst. Tecnologico Autonomo de Mexico, Mexico City (Mexico); Seisdedos, L.V. [Univ. de Oriente, Santiago de Cuba (Cuba). Dept. de Control Automatico

    2006-07-01

    This paper proposes a practical temperature modelling tool that uses genetic multivariate polynomials to determine polynomial expressions for enthalpy and empirical heat transfer equations in superheaters. The model transforms static parameter estimations from distributed into lumped parameter systems. Two dynamic regimes were explored: (1) a power dynamics regime containing the major inputs and outputs needed for overall plant control; and (2) a steam temperature dynamics scheme in which consecutive superheater sections are considered in terms of cooling water mass flow and steam mass flow. The single lumped-parameter model was developed to provide temperature control for a fossil-fuel-fired power plant. The design procedure used enthalpy to determine the plant's energy balance, with the enthalpy curve treated as a function of both temperature and steam pressure. A graphic simulation tool was used to optimize the model by comparing real and simulated plant data. The study showed that the amount of energy taken up by the steam mass flow per unit time can be calculated by measuring temperatures and pressures at both ends of the superheater. An algorithm was then developed to determine the polynomial's coefficients according to the best curve fit over the training set and the best maximum errors. It was concluded that a unified approach is now being developed to simulate and emulate the dynamics of steam temperature for each section's attemperator-superheater. 14 refs., 3 tabs., 5 figs.
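    The core fitting step, polynomial coefficients for enthalpy as a function of temperature and pressure, can be sketched with ordinary least squares standing in for the paper's genetic search; the steam-property values below are rough placeholders for table data.

        import numpy as np

        T = np.array([350., 400., 450., 500., 350., 400., 450., 500.])          # deg C
        p = np.array([ 40.,  40.,  40.,  40., 100., 100., 100., 100.])          # bar
        h = np.array([3094., 3214., 3330., 3445., 2924., 3097., 3242., 3374.])  # kJ/kg

        # bivariate basis: h(T, p) ~ c0 + c1*T + c2*p + c3*T*p + c4*T^2 + c5*p^2
        A = np.column_stack([np.ones_like(T), T, p, T * p, T**2, p**2])
        c, *_ = np.linalg.lstsq(A, h, rcond=None)

        def enthalpy(T, p):
            return np.dot([1.0, T, p, T * p, T**2, p**2], c)

        print(f"h(420 C, 60 bar) ~ {enthalpy(420.0, 60.0):.0f} kJ/kg")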

  20. Transparent Model Transformation: Turning Your Favourite Model Editor into a Transformation Tool

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald; Strüber, Daniel

    2015-01-01

    Current model transformation languages are supported by dedicated editors, often closely coupled to a single execution engine. We introduce Transparent Model Transformation, a paradigm enabling modelers to specify transformations using a familiar tool: their model editor. We also present VMTL, the first transformation language implementing the principles of Transparent Model Transformation: syntax, environment, and execution transparency. VMTL works by weaving a transformation aspect into its host modeling language. We show how our implementation of VMTL turns any model editor into a flexible…

  1. Modeling in the Classroom: An Evolving Learning Tool

    Science.gov (United States)

    Few, A. A.; Marlino, M. R.; Low, R.

    2006-12-01

    Among the early programs (early 1990s) focused on teaching Earth System Science were the Global Change Instruction Program (GCIP) funded by NSF through UCAR and the Earth System Science Education Program (ESSE) funded by NASA through USRA. These two programs introduced modeling as a learning tool from the beginning, and they provided workshops, demonstrations and lectures for their participating universities. These programs were aimed at university-level education. Recently, classroom modeling is experiencing a revival of interest. Drs. John Snow and Arthur Few conducted two workshops on modeling at the ESSE21 meeting in Fairbanks, Alaska, in August 2005. The Digital Library for Earth System Education (DLESE) at http://www.dlese.org provides web access to STELLA models and tutorials, and UCAR's Education and Outreach (EO) program holds workshops that include training in modeling. An important innovation to the STELLA modeling software by isee systems, http://www.iseesystems.com, called "isee Player", is available as a free download. The Player allows users to view and run STELLA models, change model parameters, share models with colleagues and students, and make working models available on the web. This is important because the expert can create models, and the user can learn how the modeled system works. Another aspect of this innovation is that the educational benefits of modeling concepts can be extended throughout most of the curriculum. The procedure for building a working computer model of an Earth Science system follows this general format: (1) carefully define the question(s) for which you seek the answer(s); (2) identify the interacting system components and inputs contributing to the system's behavior; (3) collect the information and data that will be required to complete the conceptual model; and (4) construct a system diagram (graphic) of the system that displays all of the system's central questions, components, relationships and required inputs. At this stage…

  2. Web based geoprocessing tool for coverage data handling

    Science.gov (United States)

    Kumar, K.; Saran, S.

    2014-11-01

    With the advancements in GIS technologies and the extensive use of OGC Web Services, geospatial resources and services are becoming progressively abundant and convenient over the network. The application of the OGC WCS (Web Coverage Service) and WFS (Web Feature Service) standards for geospatial raster and vector data has resulted in a rich pool of interoperable geodata resources waiting to be used for analytical or modelling purposes. The issue of providing geospatial data processing through standardised web services was addressed by the OGC WPS (Web Processing Service) 1.0.0 specification (Schut, 2007), which defines WPS as a standard interface for the publication of geo-processes and the consumption of those processes by clients. This paper outlines the design and implementation of a geo-processing tool utilizing coverage data. The geo-process selected for the application is the calculation of the Normalized Difference Vegetation Index (NDVI), one of the globally used indices for vegetation cover monitoring. The system is realised using the Geospatial Data Abstraction Library (GDAL) and Python. The tool accesses the WCS server using the parameters defined in the XML request. The geo-process, upon execution, performs the computations over the coverage data and generates the NDVI output. Since open source technology and standards are being used more often, especially in the field of scientific research, our implementation is also built using open source tools only.
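    Once the coverage has been fetched from the WCS server, the NDVI step itself is a short raster operation; the sketch below (the file name and band order are assumptions) shows a typical GDAL/Python implementation that preserves the georeferencing of the input.

        import numpy as np
        from osgeo import gdal

        ds = gdal.Open("coverage.tif")   # hypothetical local copy of the WCS response
        red = ds.GetRasterBand(1).ReadAsArray().astype(np.float32)
        nir = ds.GetRasterBand(2).ReadAsArray().astype(np.float32)

        np.seterr(divide="ignore", invalid="ignore")
        ndvi = (nir - red) / (nir + red)           # NDVI in [-1, 1]

        drv = gdal.GetDriverByName("GTiff")
        out = drv.Create("ndvi.tif", ds.RasterXSize, ds.RasterYSize, 1, gdal.GDT_Float32)
        out.SetGeoTransform(ds.GetGeoTransform())  # carry over georeferencing
        out.SetProjection(ds.GetProjection())
        out.GetRasterBand(1).WriteArray(ndvi)
        out.FlushCache()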

  3. Development of IFC based fire safety assesment tools

    DEFF Research Database (Denmark)

    Taciuc, Anca; Karlshøj, Jan; Dederichs, Anne

    2016-01-01

    Due to the impact that the fire safety design has on the building's layout and on other complementary systems, such as installations, it is important to continuously evaluate the safety level of the building during the conceptual design stage; if this task is carried out too late, additional… Building Information Models (BIM) are used to evaluate the safety level in the building during the conceptual design stage. The findings show that the developed tools can be useful in the AEC industry. Integrating BIM from the conceptual design stage for analyzing the fire safety level can ensure precision in further…

  4. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    System modelling using the Unified Modelling Language (UML) is a task that must be solved during software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which UML offers through the sequence diagram. To address this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors' research examines the abilities of modern UML modelling tools to offer automatic layout of the UML sequence diagram and analyses them according to criteria required for diagram perception.

  5. Bio-Logic Builder: A Non-Technical Tool for Building Dynamical, Qualitative Models

    Science.gov (United States)

    Helikar, Tomáš; Kowal, Bryan; Madrahimov, Alex; Shrestha, Manish; Pedersen, Jay; Limbu, Kahani; Thapa, Ishwor; Rowley, Thaine; Satalkar, Rahul; Kochi, Naomi; Konvalina, John; Rogers, Jim A.

    2012-01-01

    Computational modeling of biological processes is a promising tool in biomedical research. While a large part of its potential lies in the ability to integrate it with laboratory research, modeling currently generally requires a high degree of training in mathematics and/or computer science. To help address this issue, we have developed a web-based tool, Bio-Logic Builder, that enables laboratory scientists to define mathematical representations (based on a discrete formalism) of biological regulatory mechanisms in a modular and non-technical fashion. As part of the user interface, generalized “bio-logic” modules have been defined to provide users with the building blocks for many biological processes. To build/modify computational models, experimentalists provide purely qualitative information about a particular regulatory mechanism, as is generally found in the laboratory. The Bio-Logic Builder subsequently converts the provided information into a mathematical representation described with Boolean expressions/rules. We used this tool to build a number of dynamical models, including a 130-protein large-scale model of signal transduction with over 800 interactions, the influenza A replication cycle with 127 species and 200+ interactions, and the mammalian and budding yeast cell cycles. We also show that any and all qualitative regulatory mechanisms can be built using this tool. PMID:23082121
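    The discrete formalism the tool generates can be illustrated with a toy Boolean network; the three-node circuit below is hypothetical and only shows how qualitative AND/OR/NOT rules yield dynamics under synchronous updating.

        # each node's next state is a Boolean function of the current state
        rules = {
            "ligand": lambda s: s["ligand"],                         # held external input
            "receptor": lambda s: s["ligand"] and not s["inhibitor"],
            "inhibitor": lambda s: s["receptor"],                    # negative feedback
        }

        state = {"ligand": True, "receptor": False, "inhibitor": False}
        for step in range(6):
            state = {n: f(state) for n, f in rules.items()}          # synchronous update
            print(step, state)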

  7. Computer-based tools for decision support at the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools needed to provide technical information, and management support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure that supports the models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that there is much technical information available, but it is not reaching the decision-makers in a form that can be used. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the "glue" or connections to tie the components together to answer decision-makers' questions is largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission.

  9. Mahalanobis Taguchi system based criteria selection tool for agriculture crops

    Indian Academy of Sciences (India)

    N DEEPA; K GANESAN

    2016-12-01

    Agricultural crop selection cannot be formulated from one criterion but from multiple criteria. A list of criteria for crop selection was identified through a literature survey and agricultural experts. The identified criteria were grouped into seven main criteria, namely soil, water, season, input, support, facilities and threats. In this paper, a Mahalanobis Taguchi system based tool was developed for identifying a useful set of criteria, a subset of the original criteria, for taking decisions on crop selection for a given agricultural land. The combination of the Mahalanobis distance and the Taguchi method is used for the identification of important criteria. MATLAB software was used to develop the tool. After the values for each main criterion are entered into the tool, it processes them and identifies the useful sub-criteria under each main criterion for selecting the suitable crop for a given agricultural land. Instead of considering all criteria, one can use this useful set of criteria under each main criterion for taking decisions on crop selection in agriculture.
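    The core computation in a Mahalanobis Taguchi system is the Mahalanobis distance of a candidate observation from a reference ("normal") group; the sketch below uses invented criteria scores to show the calculation.

        import numpy as np

        # reference group: lands where a crop performed well; columns = criteria scores
        ref = np.array([[7., 8., 6.],
                        [6., 7., 7.],
                        [8., 8., 6.],
                        [7., 6., 7.],
                        [6., 8., 5.]])
        mu = ref.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))

        def mahalanobis(x):
            d = x - mu
            return float(np.sqrt(d @ cov_inv @ d))

        candidate = np.array([5., 9., 4.])   # a new land's criteria scores
        print(f"Mahalanobis distance: {mahalanobis(candidate):.2f}")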

  10. OPSMODEL, an on-orbit operations simulation modeling tool for Space Station

    Science.gov (United States)

    Davis, William T.; Wright, Robert L.

    1988-01-01

    The 'OPSMODEL' operations-analysis and planning tool simulates on-orbit crew operations for the NASA Space Station, furnishing a quantitative measure of the effectiveness of crew activities in various alternative Station configurations while supporting engineering and cost analyses. OPSMODEL is entirely data-driven; the top-down modeling structure of the software allows the user to control both the content and the complexity level of model definition during data base population. Illustrative simulation samples are given.

  11. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  12. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  13. PV-WEB: internet-based PV information tool

    Energy Technology Data Exchange (ETDEWEB)

    Cowley, P.

    2003-07-01

    This report gives details of a project to create a web-based information system on photovoltaic (PV) systems for the British PV Association (PV-UK), for use by decision makers in government, the utilities, and the housing and construction sectors. The project, which aims to provide an easily accessible tool for UK companies, promote PV technology, increase competitiveness, and identify market opportunities, is described. The design of the web site, its implementation and its evolution are discussed, along with the maintenance of the site by PV-UK and the opportunities offered to PV-UK members.

  14. A heuristic based practical tool for casting process design

    Energy Technology Data Exchange (ETDEWEB)

    Nanda, N.K.; Smith, K.A.; Voller, V.R.; Haberle, K.F. [Univ. of Minnesota, Minneapolis, MN (United States). Dept. of Civil Engineering

    1995-12-31

    This paper reports on a heuristic based computer tool directed at casting process design, in particular key design parameters such as part orientation, location of sprues, feeding rates, etc. The underlying principle is that a given casting can be represented by identifying and classifying its critical features. The input to the system consists of the attributes of the features, and the graphical output provides semi-quantitative information on key design parameters. Results on real castings match those of expert casting designers, and in some cases potential design improvements have been suggested by the system.

  15. Application progress on assessment tools of self-concept based on the Roy adaptation model

    Institute of Scientific and Technical Information of China (English)

    蔡婷婷; 曹梅娟

    2016-01-01

    This article introduces the basic content of self-concept in the Roy adaptation model, its value for patients with chronic diseases, and the development and application, in China and overseas, of assessment tools guided by the self-concept mode of the Roy adaptation model. It also raises some considerations about the inadequacies of the assessment tools, in order to provide a reference for nursing staff evaluating the self-concept of patients with chronic diseases and for the future improvement of self-concept assessment tools.

  16. Web Based VRML Modelling

    NARCIS (Netherlands)

    Kiss, S.

    2001-01-01

    Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling

  17. Effects of Machine Tool Configuration on Its Dynamics Based on Orthogonal Experiment Method

    Institute of Scientific and Technical Information of China (English)

    GAO Xiangsheng; ZHANG Yidu; ZHANG Hongwei; WU Qiong

    2012-01-01

    In order to analyze the influence of configuration parameters on the dynamic characteristics of machine tools in the working space, the configuration parameters have been selected based on the orthogonal experiment method. Dynamic analysis of a milling machine, newly designed for producing turbine blades, has been conducted by utilizing the modal synthesis method. The finite element model is verified and updated by experimental modal analysis (EMA) of the machine tool. The result gained by the modal synthesis method is compared with the whole-model finite element method (FEM) result as well. Following the orthogonal experiment method, four configuration parameters of the machine tool are considered as four factors for the dynamic characteristics, and the influence of the configuration parameters on the first three natural frequencies is obtained by range analysis. It is pointed out that one configuration parameter is the most important factor affecting the fundamental frequency of the machine tool, while another has less effect on the lower-order modes of the system than the others. The combination of configuration parameters which makes the fundamental frequency reach its maximum value is provided. Through this demonstration, it can be concluded that the influence of configuration parameters on the natural frequencies of machine tools can be analyzed explicitly by the orthogonal experiment method, which offers a new method for estimating the dynamic characteristics of machine tools.
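    Range analysis over an orthogonal array is a short computation: for each factor, average the response at each level and take the spread between the best and worst level means; the larger the range, the more influential the factor. The sketch below uses a standard L9(3^4) array with invented frequency values.

        import numpy as np

        # L9(3^4) array: 9 runs, 4 factors at 3 coded levels each
        levels = np.array([[0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
                           [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
                           [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0]])
        freq = np.array([41.2, 43.5, 40.8, 44.1, 42.0, 41.5, 43.0, 42.2, 40.5])  # Hz

        for f in range(levels.shape[1]):
            means = [freq[levels[:, f] == lv].mean() for lv in range(3)]
            print(f"factor {f}: level means {np.round(means, 2)}, "
                  f"range {max(means) - min(means):.2f}")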

  18. Investigation of surface finishing of carbon based coated tools for dry deep drawing of aluminium alloys

    Science.gov (United States)

    Steiner, J.; Andreas, K.; Merklein, M.

    2016-11-01

    Global trends like growing environmental awareness and the demand for resource efficiency motivate an abandonment of lubricants in metal forming. However, dry forming evokes increased friction and wear. In particular, dry deep drawing of aluminum alloys leads to intensive interaction between tool and workpiece due to the high adhesion tendency of aluminum. One approach to improving the tribological behavior is the application of carbon based coatings, which are characterized by high wear resistance. In order to investigate the potential of carbon based coatings for dry deep drawing, the friction and wear behavior of different coating compositions is evaluated in strip drawing tests. This setup is used to model the tribological conditions in the flange area of deep drawing operations. The tribological behavior of tetrahedral amorphous (ta-C) and hydrogenated amorphous carbon coatings with and without tungsten modification (a-C:H:W, a-C:H) is investigated. The influence of tool topography is analyzed by applying different surface finishes. The results show reduced friction with decreased roughness for coated tools. Besides tool topography, the coating type determines the tribological conditions. Smooth tools with ta-C and a-C:H coatings reveal low friction and prevent adhesive wear. In contrast, smooth a-C:H:W coated tools only lead to a slight improvement compared to rough, uncoated specimens.

  19. Conceptual Models as Tools for Communication Across Disciplines

    Directory of Open Access Journals (Sweden)

    Marieke Heemskerk

    2003-12-01

    To better understand and manage complex social-ecological systems, social scientists and ecologists must collaborate. However, issues related to language and research approaches can make it hard for researchers in different fields to work together. This paper suggests that researchers can improve interdisciplinary science through the use of conceptual models as a communication tool. The authors share lessons from a workshop in which interdisciplinary teams of young scientists developed conceptual models of social-ecological systems using data sets and metadata from Long-Term Ecological Research sites across the United States. Both the process of model building and the models that were created are discussed. The exercise revealed that the presence of social scientists in a group influenced the place and role of people in the models. This finding suggests that the participation of both ecologists and social scientists in the early stages of project development may produce better questions and more accurate models of interactions between humans and ecosystems. Although the participants agreed that a better understanding of human intentions and behavior would advance ecosystem science, they felt that interdisciplinary research might gain more by training strong disciplinarians than by merging ecology and social sciences into a new field. It is concluded that conceptual models can provide an inspiring point of departure and a guiding principle for interdisciplinary group discussions. Jointly developing a model not only helped the participants to formulate questions, clarify system boundaries, and identify gaps in existing data, but also revealed the thoughts and assumptions of fellow scientists. Although the use of conceptual models will not serve all purposes, the process of model building can help scientists, policy makers, and resource managers discuss applied problems and theory among themselves and with those in other areas.

  20. Aquifer characterization through an integrated GIS-based tool

    Science.gov (United States)

    Criollo, Rotman; Velasco, Violeta; Vázquez-Suñé, Enric; Serrano-Juan, Alejandro; Alcaraz, Mar; García-Gil, Alejandro

    2016-04-01

    Hydraulic parameters of the subsurface (transmissivity, hydraulic conductivity, storativity and specific storage) are important for hydrogeological studies such as environmental impact assessments, water resources evaluations and groundwater contamination remediation, among others. There are several methods to determine aquifer parameters, but the pumping test is the most commonly used and generally leads to reliable hydraulic parameters. These parameters, and the other hydraulic data available for integration into hydrogeological studies (which currently are supported by groundwater numerical models), usually have very diverse origins and formats and, therefore, carry a chance of bias in the interpretations. Consequently, it becomes necessary to have effective instruments that facilitate the pre-processing, visualization, analysis and validation (e.g., graphical analysis techniques) of this great amount of data. To achieve this in a clear and understandable manner, a GIS environment is a useful instrument. We developed software to analyze pumping tests in a GIS platform environment to support the hydraulic parameterization of groundwater flow and transport models. This novel platform provides a package of tools for collecting, managing, analyzing, processing and interpreting data derived from pumping tests in a GIS environment. Additionally, within the GIS platform, it is possible to process the hydraulic parameters obtained from the pumping tests, create spatial distribution maps, perform geostatistical analysis and export the information to an external software platform. These tools have been applied in the metropolitan area of Barcelona (Spain) to test and improve their usefulness in hydrogeological analysis.
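    As an example of the pumping-test interpretation such a platform automates, the sketch below applies the classical Cooper-Jacob straight-line method to invented drawdown data (one common method among several; not necessarily the one implemented in the tool).

        import numpy as np

        Q = 0.012                                            # pumping rate, m^3/s
        r = 30.0                                             # observation well distance, m
        t = np.array([60., 120., 300., 600., 1800., 3600.])  # time since pumping began, s
        s = np.array([0.18, 0.25, 0.34, 0.41, 0.52, 0.59])   # drawdown, m

        slope, intercept = np.polyfit(np.log10(t), s, 1)     # drawdown per log cycle
        T = 2.3 * Q / (4.0 * np.pi * slope)                  # transmissivity, m^2/s
        t0 = 10 ** (-intercept / slope)                      # zero-drawdown intercept, s
        S = 2.25 * T * t0 / r**2                             # storativity, dimensionless
        print(f"T = {T:.2e} m^2/s, S = {S:.2e}")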

  1. A compensation approach to the tool used in autoclave based on FEA

    Institute of Scientific and Technical Information of China (English)

    Zhang Ji; Li Yingguang; Li Nanya; Liao Wenhe

    2012-01-01

    Optimization of the curing process alone cannot accurately control the deformation of composite parts prepared in an autoclave, and the traditional "trial-and-error" tool surface compensation approach has low efficiency and high cost and cannot control part deformation quantitatively. In order to address these issues, a tool compensation approach based on FEA is presented. A model of the multi-field coupling relationships in the autoclave is realized, and a finite element analysis model of the composite part's curing process is developed to analyze part deformation. According to the displacement of the part surface nodes after deformation, a tool surface compensated by the FEA-predicted displacement of the composite part is used to control part deformation. A cylindrical composite part is analyzed to verify the approach, and the result proves its correctness and validity.
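    The compensation rule itself amounts to mirroring the FE-predicted deviation onto the tool surface; the sketch below shows the idea on a handful of stand-in surface nodes.

        import numpy as np

        # stand-in node coordinates (x, y, z in mm) exported from the FE run
        nominal = np.array([[0., 0., 10.00], [50., 0., 10.20], [100., 0., 10.50]])
        deformed = np.array([[0., 0., 9.90], [50., 0., 10.05], [100., 0., 10.20]])

        deviation = deformed - nominal           # FE-predicted part deviation per node
        compensated_tool = nominal - deviation   # offset the tool surface the opposite way
        print(compensated_tool)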

  2. New Pedagogy for Using Internet-Based Teaching Tools in Physics Course

    CERN Document Server

    Toback, D; Novikova, I; Toback, David; Mershin, Andreas; Novikova, Irina

    2004-01-01

    Acquiring the mathematical, conceptual, and problem-solving skills required in university-level physics courses is hard work, and the average student often lacks the knowledge and study skills needed to succeed in the introductory courses. Here we propose a new pedagogical model and a straightforwardly reproducible set of internet-based testing tools. Our work to address some of the most important student deficiencies is based on three fundamental principles: balancing skill level and challenge, providing clear goals and feedback at every stage, and allowing repetition without penalty. Our tools include an Automated Mathematics Evaluation System (AMES), a Computerized Homework Assignment Grading System (CHAGS), and a set of after-homework quizzes and mini-practice exams (QUizzes Intended to Consolidate Knowledge, or QUICK). We describe how these tools are incorporated into the course, and present some preliminary results on their effectiveness.

  3. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    With simple unified modelling language (UML) requirements models, it is not easy for the development team to gain confidence in the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow...

  4. KINEROS2 – AGWA Suite of Modeling Tools

    Science.gov (United States)

    KINEROS2 (K2) originated in the 1960s as a distributed event-based rainfall-runoff erosion model abstracting the watershed as a cascade of overland flow elements contributing to channel model elements. Development and improvement of K2 has continued for a variety of projects and ...

  5. Using a Parametric Solid Modeler as an Instructional Tool

    Science.gov (United States)

    Devine, Kevin L.

    2008-01-01

    This paper presents the results of a quasi-experimental study that brought 3D constraint-based parametric solid modeling technology into the high school mathematics classroom. This study used two intact groups; a control group and an experimental group, to measure the extent to which using a parametric solid modeler during instruction affects…

  6. Modeling Tool for Decision Support during Early Days of an Anthrax Event

    Science.gov (United States)

    Meltzer, Martin I.; Shadomy, Sean; Bower, William A.; Hupert, Nathaniel

    2017-01-01

    Health officials lack field-implementable tools for forecasting the effects that a large-scale release of Bacillus anthracis spores would have on public health and hospitals. We created a modeling tool (combining inhalational anthrax caseload projections based on initial case reports, effects of variable postexposure prophylaxis campaigns, and healthcare facility surge capacity requirements) to project hospitalizations and casualties from a newly detected inhalation anthrax event, and we examined the consequences of intervention choices. With only 3 days of case counts, the model can predict final attack sizes for simulated Sverdlovsk-like events (1979 USSR) with sufficient accuracy for decision making and confirms the value of early postexposure prophylaxis initiation. According to a baseline scenario, hospital treatment volume peaks 15 days after exposure, deaths peak earlier (day 5), and recovery peaks later (day 23). This tool gives public health, hospital, and emergency planners scenario-specific information for developing quantitative response plans for this threat. PMID:27983505

  7. Computational Tools for Modeling and Measuring Chromosome Structure

    Science.gov (United States)

    Ross, Brian Christopher

    DNA conformation within cells has many important biological implications, but there are challenges both in modeling DNA, due to the need for specialized techniques, and experimentally, since tracing out in vivo conformations is currently impossible. This thesis contributes two computational projects to these efforts. The first project is a set of online and offline calculators of conformational statistics using a variety of published and unpublished methods, addressing the current lack of DNA model-building tools intended for general use. The second project is a reconstructive analysis that could enable in vivo mapping of DNA conformation at high resolution with current experimental technology. (Copies available exclusively from MIT Libraries, libraries.mit.edu/docs - docs@mit.edu)

  8. Introducing Modeling Transition Diagrams as a Tool to Connect Mathematical Modeling to Mathematical Thinking

    Science.gov (United States)

    Czocher, Jennifer A.

    2016-01-01

    This study contributes a methodological tool to reconstruct the cognitive processes and mathematical activities carried out by mathematical modelers. Represented as Modeling Transition Diagrams (MTDs), individual modeling routes were constructed for four engineering undergraduate students. Findings stress the importance and limitations of using…

  9. Advanced prototyping tools for project- and problem-based learning

    DEFF Research Database (Denmark)

    Teodorescu, Remus; Bech, Michael Møller; Holm, Allan J.

    2002-01-01

    A new approach to prototyping for project- and problem-based learning is achieved by using the new Total Development Environment concept introduced by dSPACE, which allows full visual block-oriented programming of dynamic real-time systems within the Matlab/Simulink environment. A new laboratory called the Flexible Drives System Laboratory (FDSL), as well as a matrix-converter controller, both of which use dSPACE prototyping tools, are described in this paper.

  11. Operation reliability assessment for cutting tools by applying a proportional covariate model to condition monitoring information.

    Science.gov (United States)

    Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia

    2012-09-25

    The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools.
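
    The structure of such an assessment, a baseline failure rate scaled by a covariate function built from condition monitoring features, can be sketched as follows; the Weibull baseline, the feature-to-covariate mapping and all numbers are illustrative assumptions rather than the paper's fitted model.

    ```python
    # Minimal sketch of a proportional covariate model: the failure (hazard)
    # rate is a baseline hazard scaled by a covariate function derived from
    # condition monitoring features. All parameters here are hypothetical.
    import numpy as np

    def baseline_hazard(t, beta=2.0, eta=500.0):
        """Weibull baseline failure rate; an assumed stand-in for the baseline
        estimated from a small sample of historical data."""
        return (beta / eta) * (t / eta) ** (beta - 1.0)

    def covariate_function(features, weights):
        """Map vibration-derived features (e.g., wavelet packet energies kept
        by a distance evaluation step) to a positive multiplicative covariate."""
        return float(np.exp(weights @ features))

    t = np.linspace(1.0, 400.0, 400)              # cutting time [min]
    weights = np.array([0.8, 0.4])                # hypothetical sensitivities
    features = np.array([0.3, 0.6])               # hypothetical normalized features

    hazard = baseline_hazard(t) * covariate_function(features, weights)
    reliability = np.exp(-np.cumsum(hazard) * (t[1] - t[0]))  # crude integration
    print(f"Failure rate at t=400 min: {hazard[-1]:.4f} /min")
    print(f"Estimated reliability at t=400 min: {reliability[-1]:.3f}")
    ```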

  12. Rogeaulito: a world energy scenario modeling tool for transparent energy system thinking

    Directory of Open Access Journals (Sweden)

    Léo Benichou

    2014-01-01

    Rogeaulito is a world energy model for scenario building developed by the European think tank The Shift Project. It is a tool for exploring world energy choices from a very long-term and systematic perspective. As a key feature and novelty, it computes energy supply and demand independently from each other, revealing potentially missing energy supply by 2100. It is also simple to use, didactic and open source. As such, it targets a broad user group and advocates for reproducibility and transparency in scenario modeling, as well as for model-based learning. Rogeaulito applies an engineering approach using disaggregated data in a spreadsheet model.
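
    The model's defining feature, computing supply and demand independently and reading off the gap, reduces to a simple balance; the sketch below uses invented trajectories purely to illustrate the idea.

    ```python
    # Minimal sketch of Rogeaulito's key idea: project energy demand and supply
    # independently, then expose any missing supply. All figures are invented.
    decades   = list(range(2020, 2101, 10))
    demand_ej = [580, 620, 660, 700, 730, 750, 770, 780, 790]   # hypothetical [EJ/yr]
    supply_ej = [590, 610, 620, 620, 610, 590, 570, 550, 530]   # hypothetical [EJ/yr]

    for year, d, s in zip(decades, demand_ej, supply_ej):
        gap = d - s
        status = f"missing supply: {gap} EJ/yr" if gap > 0 else "covered"
        print(f"{year}: demand {d}, supply {s} -> {status}")
    ```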

  13. Program Suite for Conceptual Designing of Parallel Mechanism-Based Robots and Machine Tools

    Directory of Open Access Journals (Sweden)

    Slobodan Tabaković

    2013-06-01

    This paper describes the categorization of criteria for the conceptual design of parallel mechanism-based robots or machine tools, resulting from workspace analysis, as well as the procedure for defining them. Furthermore, it presents the design methodology implemented in the program for creating a robot or machine tool spatial model and optimizing the resulting solution. For verification of the criteria and the program suite, three common (conceptually different) mechanisms with a similar mechanical structure and kinematic characteristics were used.

  14. Finite Element Modeling, Simulation, Tools, and Capabilities at Superform

    Science.gov (United States)

    Raman, Hari; Barnes, A. J.

    2010-06-01

    Over the past thirty years Superform has been a pioneer in the SPF arena, having developed a keen understanding of the process and a range of unique forming techniques to meet varying market needs. Superform's high-profile list of customers includes Boeing, Airbus, Aston Martin, Ford, and Rolls Royce. One of the more recent additions to Superform's technical know-how is finite element modeling and simulation. Finite element modeling is a powerful numerical technique which, when applied to SPF, provides a host of benefits, including accurate prediction of strain levels in a part and of the presence of wrinkles, as well as pressure cycles optimized for forming time and part thickness. This paper outlines a brief history of finite element modeling applied to SPF and then reviews some of the modeling tools and techniques that Superform has applied, and continues to apply, to successfully superplastically form complex-shaped parts. The advantages of employing modeling at the design stage are discussed and illustrated with real-world examples.

  15. Modeling as a research tool in poultry science.

    Science.gov (United States)

    Gous, R M

    2014-01-01

    The World's Poultry Science Association (WPSA) is a long-established and unique organization that strives to advance knowledge and understanding of all aspects of poultry science and the poultry industry. Its 3 main aims are education, organization, and research. The WPSA Keynote Lecture, titled "Modeling as a research tool in poultry science," addresses 2 of these aims, namely, the value of modeling in research and education. The role of scientists is to put forward and then to test theories. These theories, or models, may be simple or highly complex, but they are aimed at improving our understanding of a system or the interaction between systems. In developing a model, the scientist must take into account existing knowledge, and in this process gaps in our knowledge of a system are identified. Useful ideas for research are generated in this way, and experiments may be designed specifically to address these issues. The resultant models become more accurate and more useful, and can be used in education and extension as a means of explaining many of the complex issues that arise in poultry science.

  16. Introducing BioSARN - an ecological niche model refinement tool.

    Science.gov (United States)

    Heap, Marshall J

    2016-08-01

    Environmental niche modeling outputs a biological species' potential distribution; further work is needed to arrive at the species' realized distribution. The Biological Species Approximate Realized Niche (BioSARN) application provides the ecological modeler with a toolset to refine environmental niche models (ENMs). These tools include soil and land class filtering, niche area quantification, and novelties like enhanced temporal corridor definition and output to a high spatial resolution land class model. BioSARN is exemplified with a study on Fraser fir, a tree species with strong land class and edaphic correlations. Soil and land class filtering caused the potential distribution area to decline by 17%. Enhanced temporal corridor definition permitted distinction of current, continuing, and future niches, and thus of niche change and movement. Tile quantification analysis provided further corroboration of these trends. BioSARN does not substitute for other established ENM methods; rather, it allows experimenters to work with their preferred ENM, refining it using their knowledge and experience. Output from lower spatial resolution ENMs to a high spatial resolution land class model is a pseudo high-resolution result. Still, it may be the best that can be achieved until wide-range high spatial resolution environmental data and accurate high-precision species occurrence data become generally available.
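
    The soil and land class filtering step amounts to intersecting the ENM's potential-distribution raster with suitability masks; a minimal sketch with tiny hypothetical boolean rasters:

    ```python
    # Minimal sketch of ENM refinement by soil/land-class filtering, assuming
    # co-registered boolean rasters. The arrays are tiny invented examples.
    import numpy as np

    potential = np.array([[1, 1, 0],
                          [1, 1, 1],
                          [0, 1, 1]], dtype=bool)   # ENM potential distribution
    suitable_soil = np.array([[1, 0, 0],
                              [1, 1, 0],
                              [0, 1, 1]], dtype=bool)
    suitable_landclass = np.array([[1, 1, 0],
                                   [1, 1, 0],
                                   [0, 0, 1]], dtype=bool)

    refined = potential & suitable_soil & suitable_landclass
    decline = 1.0 - refined.sum() / potential.sum()
    print(refined.astype(int))
    print(f"Potential area reduced by {decline:.0%} after filtering")
    ```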

  17. Selecting a risk-based tool to aid in decision making

    Energy Technology Data Exchange (ETDEWEB)

    Bendure, A.O.

    1995-03-01

    Selecting a risk-based tool to aid in decision making is as much of a challenge as properly using the tool once it has been selected. Failure to consider customer and stakeholder requirements and the technical bases and differences among risk-based decision-making tools will produce confounding and/or politically unacceptable results when the tool is used. Selecting a risk-based decision-making tool must therefore be undertaken with the same, if not greater, rigor as the use of the tool once it is selected. This paper presents a process for selecting a risk-based tool appropriate to a set of prioritization or resource allocation tasks, discusses the results of applying the process to four risk-based decision-making tools, and identifies the "musts" for successful selection and implementation of a risk-based tool to aid in decision making.

  18. Development of computer-based analytical tool for assessing physical protection system

    Energy Technology Data Exchange (ETDEWEB)

    Mardhi, Alim, E-mail: alim-m@batan.go.id [National Nuclear Energy Agency Indonesia, (BATAN), PUSPIPTEK area, Building 80, Serpong, Tangerang Selatan, Banten (Indonesia); Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand); Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com [Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand)

    2016-01-22

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts targeting a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way to evaluate likely threat scenarios. There are several tools readily available, such as EASI and SAPE; however, for our research purposes it is more suitable to have a tool that can be customized and further enhanced. In this work, we have developed a computer-based analytical tool that uses a network methodology to model the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.
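
    The record does not spell out the quantification, but the flavor of such path analyses can be sketched with EASI-style logic: each layer on an adversary path has a detection probability and a delay, and interruption requires detection while enough delay remains for the response force to arrive. All numbers below are hypothetical.

    ```python
    # Minimal sketch of adversary-path analysis with EASI-style logic.
    # Paths, detection probabilities, delays and the response time are invented.
    RESPONSE_TIME = 300.0  # response force arrival time [s], hypothetical

    def p_interruption(path):
        """Probability the adversary is first detected at a layer that still
        leaves more delay than the response time (a strong simplification)."""
        p_undetected, p_i = 1.0, 0.0
        for i, (p_det, _) in enumerate(path):
            remaining_delay = sum(delay for _, delay in path[i:])
            if remaining_delay > RESPONSE_TIME:
                p_i += p_undetected * p_det
            p_undetected *= 1.0 - p_det
        return p_i

    # Hypothetical adversary paths: ordered layers of (detection prob, delay [s])
    paths = {
        "fence-door-vault": [(0.5, 60.0), (0.7, 120.0), (0.9, 240.0)],
        "roof-hatch-vault": [(0.2, 30.0), (0.4, 90.0), (0.9, 240.0)],
    }

    for name, path in paths.items():
        print(f"{name}: P(interruption) = {p_interruption(path):.2f}")
    critical = min(paths, key=lambda n: p_interruption(paths[n]))
    print("Most critical path:", critical)
    ```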

  19. Development of computer-based analytical tool for assessing physical protection system

    Science.gov (United States)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts targeting a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way to evaluate likely threat scenarios. There are several tools readily available, such as EASI and SAPE; however, for our research purposes it is more suitable to have a tool that can be customized and further enhanced. In this work, we have developed a computer-based analytical tool that uses a network methodology to model the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.

  20. Knowledge-based decision support system for tool management in flexible manufacturing system

    Institute of Scientific and Technical Information of China (English)

    周炳海; 奚立峰; 蔡建国

    2004-01-01

    Tool management is not a single, simple activity; it comprises a complex set of functions, especially in a flexible manufacturing system (FMS) environment. The issues associated with tool management include tool requirement planning, tool real-time scheduling, tool crib management, tool inventory control, tool fault diagnosis, tool tracking and tool monitoring. In order to make tools flow into and out of an FMS efficiently, this work aims to design a knowledge-based decision support system (KBDSS) for tool management in FMS. First, an overview of tool management functions is given. Then the structure of the KBDSS for tool management and the essential agents in its design are presented. Finally, the design and development of the individual agents of the KBDSS are discussed.

  1. ``Tools for Astrometry": A Windows-based Research Tool for Asteroid Discovery and Measurement

    Science.gov (United States)

    Snyder, G. A.; Marschall, L. A.; Good, R. F.; Hayden, M. B.; Cooper, P. R.

    1998-12-01

    We have developed a Windows-based interactive digital astrometry package with a simple, ergonomic interface, designed for the discovery, measurement, and recording of asteroid positions by individual observers. The software, "Tools For Astrometry", will handle FITS and SBIG format images up to 2048 x 2048 (or larger, depending on RAM), and provides features for blinking images or subframes of images, and for measurement of positions and magnitudes against both the HST Guide Star Catalog and the USNO SA-1 catalog. In addition, the program can calculate ephemerides from element tables, including the Lowell Asteroid Database available online; can generate charts of star fields showing the motion of asteroids from the ephemeris superimposed against the background star field; can project the motions of measured asteroids ahead several days using linear interpolation, for purposes of reacquisition; and can calculate projected baselines for asteroid parallax measurements. Images, charts, and tables of ephemerides can be printed as well as displayed, and reports can be generated in the standard format of the IAU Minor Planet Center. The software is designed ergonomically, and one can go from raw images to a completed astrometric report in a matter of minutes. The software is an extension of software developed for introductory astronomy laboratories by Project CLEA, which is supported by grants from Gettysburg College and the National Science Foundation.

  2. TAGmapper: a web-based tool for mapping SAGE tags.

    Science.gov (United States)

    Bala, P; Georgantas, Robert W; Sudhir, D; Suresh, M; Shanker, K; Vrushabendra, B M; Civin, Curt I; Pandey, Akhilesh

    2005-12-30

    Serial Analysis of Gene Expression (SAGE) is an important means of obtaining quantitative information about expression of genes in different samples. Short SAGE tags are 10 nucleotides long and often contain enough information to uniquely identify the gene(s) corresponding to the tag. We have observed, however, that the currently available resources are not adequate for accurate mapping of all SAGE tags to genes. Here, we describe development of a web-based tool called TAGmapper (http://tagmapper.ibioinformatics.org), which provides a comprehensive and accurate mapping of SAGE tags to genes. We were able to map SAGE tags accurately in several instances where two other popular resources, SAGEmap (http://www.ncbi.nlm.nih.gov/projects/SAGE/) and SAGE Genie (http://cgap.nci.nih.gov/SAGE), provided incorrect or no assignment of tags to genes. Finally, we experimentally determined the expression of a subset of genes assigned by TAGmapper using DNA microarrays and/or quantitative PCR to confirm the reliability of the gene mappings. We anticipate that TAGmapper will be a useful tool in functional genomic approaches by providing accurate identification of genes in SAGE experiments.
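
    The mapping problem starts from how short SAGE tags are generated, conventionally the 10 nt immediately 3' of the 3'-most NlaIII site (CATG) in a transcript; a toy sketch of tag extraction and lookup with hypothetical sequences follows. TAGmapper itself resolves ambiguity with far more curation than this illustrates.

    ```python
    # Minimal sketch of short-SAGE tag extraction and gene lookup.
    # Transcripts and the resulting tag-to-gene table are hypothetical.
    def extract_tag(transcript, anchor="CATG", tag_len=10):
        """Return the 10-nt tag 3' of the 3'-most anchor site, or None."""
        pos = transcript.rfind(anchor)
        if pos == -1 or pos + len(anchor) + tag_len > len(transcript):
            return None
        start = pos + len(anchor)
        return transcript[start:start + tag_len]

    transcripts = {
        "GENE_A": "GGCATGAAACCCTTTGGGAAACATGTTTGGGCCCAAATTTTG",
        "GENE_B": "TTTTCATGCCCGGGAAATTTCCGA",
    }

    tag_to_genes = {}
    for gene, seq in transcripts.items():
        tag = extract_tag(seq)
        if tag:
            tag_to_genes.setdefault(tag, []).append(gene)

    print(tag_to_genes)  # e.g. {'TTTGGGCCCA': ['GENE_A'], 'CCCGGGAAAT': ['GENE_B']}
    ```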

  3. A Mobile Network Planning Tool Based on Data Analytics

    Directory of Open Access Journals (Sweden)

    Jessica Moysen

    2017-01-01

    Planning future mobile networks entails multiple challenges due to the high complexity of the network to be managed. Beyond-4G and 5G networks are expected to be characterized by a high densification of nodes and heterogeneity of layers, applications, and Radio Access Technologies (RAT). In this context, a network planning tool capable of dealing with this complexity is highly convenient. The objective is to exploit the information produced by and already available in the network to properly deploy, configure, and optimise network nodes. This work presents such a smart network planning tool that exploits Machine Learning (ML) techniques. The proposed approach is able to predict the Quality of Service (QoS) experienced by the users based on the measurement history of the network. We select Physical Resource Blocks (PRB) per Megabit (Mb) as our main QoS indicator to optimise, since minimizing this metric allows offering the same service to users while consuming fewer resources, and thus being more cost-effective. Two case studies are considered in order to evaluate the performance of the proposed scheme: one to smartly plan the small cell deployment in a dense indoor scenario, and a second one to react in a timely manner to a detected fault in a macrocell network.
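
    As a hedged illustration of the prediction step, learning PRB-per-Mb from past network measurements, here is a scikit-learn sketch on synthetic data; the features and values are invented and not taken from the paper.

    ```python
    # Minimal sketch: supervised regression of a QoS indicator (PRB per Mb)
    # from a network measurement history, using synthetic data.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_error

    rng = np.random.default_rng(0)
    n = 500
    # Hypothetical per-user measurements: RSRP [dBm], SINR [dB], cell load [0-1]
    X = np.column_stack([rng.uniform(-120, -70, n),
                         rng.uniform(-5, 25, n),
                         rng.uniform(0, 1, n)])
    # Synthetic ground truth: higher SINR lowers, higher load raises PRB/Mb.
    y = 2.0 - 0.03 * X[:, 1] + 1.5 * X[:, 2] + rng.normal(0, 0.1, n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("MAE [PRB/Mb]:", mean_absolute_error(y_te, model.predict(X_te)))
    ```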

  4. KNOWLEDGE MANAGEMENT TOOLS FOR THE EUROPEAN KNOWLEDGE BASED SOCIETY

    Directory of Open Access Journals (Sweden)

    Ramona-Diana Leon

    2011-12-01

    A growing body of literature mentions that, in the current competitive environment, knowledge has become the main source of competitive advantage, while recent research on economic growth and development defines knowledge as the most critical resource of emerging countries. Therefore, organizations' interest in knowledge has increased, knowledge management being defined as the process of managing knowledge in order to meet existing needs, to identify and exploit existing and/or acquired knowledge, and to develop new opportunities. In other words, knowledge management facilitates the productive use of information, the growth of intelligence, the storage of intellectual capital, strategic planning, flexible acquisition, the collection of best practices, an increased likelihood of success, as well as more productive collaboration within the company. In order to benefit from all these advantages, specific tools are required, including models and systems that stimulate the creation, dissemination and use of the knowledge held by each employee and by the organization as a whole.

  5. A Visualization-Based Tutoring Tool for Engineering Education

    Science.gov (United States)

    Nguyen, Tang-Hung; Khoo, I.-Hung

    2010-06-01

    In engineering disciplines, students usually have a hard time visualizing different aspects of engineering analysis and design, which are inherently too complex or abstract to fully understand without the aid of visual explanations or visualizations. As an example, when learning materials and sequences of the construction process, students need to visualize how all the components of a constructed facility are assembled. Such visualization cannot be achieved in a textbook or a traditional lecturing environment. In this paper, the authors present the development of computer tutoring software in which different visualization tools, including video clips, 3-dimensional models, drawings, and pictures/photos together with complementary texts, are used to assist students in deeply understanding and effectively mastering the course materials. The paper also discusses the implementation and the effectiveness evaluation of the proposed tutoring software, which was used to teach a construction engineering management course offered at California State University, Long Beach.

  6. DISQOVER the Landcover - R based tools for quantitative vegetation reconstruction

    Science.gov (United States)

    Theuerkauf, Martin; Couwenberg, John; Kuparinen, Anna; Liebscher, Volkmar

    2016-04-01

    Quantitative methods have gained increasing attention in the field of vegetation reconstruction over the past decade. The DISQOVER package implements key tools in the R programming environment for statistical computing. This implementation has three main goals: 1) provide a user-friendly, transparent, and open implementation of the methods; 2) provide full flexibility in all parameters (including the underlying pollen dispersal model); 3) provide a sandbox for testing the sensitivity of the methods. We illustrate the possibilities of the package with tests of the REVEALS model and of the extended downscaling approach (EDA). REVEALS (Sugita 2007) is designed to translate pollen data from large lakes into regional vegetation composition. We applied REVEALSinR to pollen data from Lake Tiefer See (NE Germany) and validated the results with historic land cover data. The results clearly show that REVEALS is sensitive to the underlying pollen dispersal model; REVEALS performs best when applied with the state-of-the-art Lagrangian stochastic dispersal model. REVEALS applications with the conventional Gaussian model can produce realistic results, but only if unrealistic pollen productivity estimates are used. The EDA (Theuerkauf et al. 2014) employs pollen data from many sites across a landscape to explore whether species distributions in the past were related to known stable patterns in the landscape, e.g. the distribution of soil types. The approach had so far only been implemented in simple settings with few taxa. Tests with EDAinR show that it produces sharp results in complex settings with many taxa as well. The DISQOVER package is open source software, available from disqover.uni-greifswald.de. This website can be used as a platform to discuss and improve quantitative methods in vegetation reconstruction. To introduce the tool we plan a short course in autumn of this year. This study is a contribution to the Virtual Institute of Integrated Climate and Landscape Evolution
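
    The REVEALS estimator itself is compact enough to sketch: the regional cover of taxon i is its pollen count deflated by a pollen productivity estimate (PPE) and a dispersal-deposition factor K, then renormalized. DISQOVER is an R package, but the illustration below is in Python for consistency with the other sketches here; the counts, PPEs and K values are placeholders, and, as the abstract stresses, real applications hinge on the dispersal model behind K.

    ```python
    # Minimal sketch of the REVEALS estimator (Sugita 2007) for a large lake:
    # cover_i proportional to n_i / (PPE_i * K_i). All input values are invented.
    pollen_counts = {"Pinus": 420, "Quercus": 180, "Poaceae": 95}
    ppe           = {"Pinus": 6.1, "Quercus": 2.5, "Poaceae": 1.0}  # relative to Poaceae
    K             = {"Pinus": 0.9, "Quercus": 0.6, "Poaceae": 0.5}  # dispersal-deposition

    raw = {t: pollen_counts[t] / (ppe[t] * K[t]) for t in pollen_counts}
    total = sum(raw.values())
    cover = {t: v / total for t, v in raw.items()}

    for taxon, share in cover.items():
        print(f"{taxon}: {share:.1%} estimated regional cover")
    ```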

  7. A Method to Optimize Geometric Errors of Machine Tool based on SNR Quality Loss Function and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Cai Ligang

    2017-01-01

    Instead of blindly improving machine tool accuracy by increasing the precision of key components in the production process, this work optimizes the geometric errors of a five-axis machine tool by combining an SNR quality loss function with correlation analysis of the machine tool geometric errors. First, the homogeneous transformation matrix method is used to build the geometric error model of the five-axis machine tool. Second, the SNR quality loss function is used for cost modeling. Then, an objective function for machine tool accuracy optimization is established based on the correlation analysis. Finally, ISIGHT combined with MATLAB is applied to optimize each error. The results show that this method makes it reasonable and appropriate to relax the ranges of tolerance values, thereby reducing the manufacturing cost of machine tools.
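
    The error-modeling step uses homogeneous transformation matrices: each axis contributes a nominal motion matrix perturbed by small geometric errors, and chaining them yields the tool-tip deviation. A stripped-down sketch with two axes and hypothetical error values:

    ```python
    # Minimal sketch of geometric error modeling with homogeneous transformation
    # matrices (HTMs): chain per-axis nominal motions and small-error transforms,
    # then compare the actual tool position with the ideal one. Values are invented.
    import numpy as np

    def translation(x=0.0, y=0.0, z=0.0):
        T = np.eye(4)
        T[:3, 3] = [x, y, z]
        return T

    def small_error(dx, dy, dz, ex, ey, ez):
        """HTM for small translational (dx, dy, dz) and angular (ex, ey, ez)
        errors, using the first-order small-angle approximation."""
        E = np.eye(4)
        E[:3, :3] += np.array([[0.0, -ez,  ey],
                               [ez,  0.0, -ex],
                               [-ey,  ex, 0.0]])
        E[:3, 3] = [dx, dy, dz]
        return E

    # Ideal chain: an X-axis move followed by a Z-axis move
    x_move, z_move = 100.0, 50.0
    ideal = translation(x=x_move) @ translation(z=z_move)

    # Actual chain: each axis motion followed by its (hypothetical) error HTM
    actual = (translation(x=x_move) @ small_error(2e-3, 1e-3, 0.0, 0.0, 5e-6, 8e-6)
              @ translation(z=z_move) @ small_error(0.0, 1.5e-3, 2e-3, 3e-6, 0.0, 0.0))

    tool = np.array([0.0, 0.0, 0.0, 1.0])   # tool reference point (homogeneous)
    deviation = (actual - ideal) @ tool
    print("Tool-tip deviation [mm]:", deviation[:3])
    ```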

  8. Astronomical Data Fusion Tool Based on PostgreSQL

    CERN Document Server

    Han, Bo; Zhong, Shoubo; Zhao, Yongheng

    2016-01-01

    With the application of advanced astronomical technologies, equipment and methods all over the world, astronomy now covers the radio, infrared, visible, ultraviolet, X-ray and gamma-ray bands, and has entered the era of full-wavelength astronomy. Effectively integrating data from different ground- and space-based observing equipment, different observers, different bands and different observation times requires data fusion technology. In this paper we introduce a cross-match tool that is developed in the Python language, based on the PostgreSQL database, and uses Q3C as the core index, facilitating the cross-matching of massive astronomical data. It provides four different cross-match functions, namely: I) cross-match with a custom error range; II) cross-match using catalog errors; III) cross-match based on an elliptic error range; IV) nearest-neighbor cross-match. The cross-match result set provides a good foundation for subsequent data mining and statistics based on multiwavelength data. The...
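
    A hedged sketch of how such a Q3C-indexed cross-match is typically issued from Python: the connection parameters, table and column names, and the 1-arcsecond radius are hypothetical placeholders, while q3c_join is the positional-join function shipped with the Q3C extension.

    ```python
    # Minimal sketch of a positional cross-match with the Q3C PostgreSQL
    # extension via psycopg2. Credentials and table names are invented.
    import psycopg2

    conn = psycopg2.connect(dbname="astro", user="reader", host="localhost")
    radius_deg = 1.0 / 3600.0  # 1 arcsecond match radius, hypothetical

    sql = """
        SELECT a.id, b.id, a.ra, a.dec
        FROM catalog_a AS a
        JOIN catalog_b AS b
          ON q3c_join(a.ra, a.dec, b.ra, b.dec, %s)
    """
    with conn, conn.cursor() as cur:
        cur.execute(sql, (radius_deg,))
        for row in cur.fetchmany(10):
            print(row)
    conn.close()
    ```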

  9. Spindle Thermal Error Optimization Modeling of a Five-axis Machine Tool

    Institute of Scientific and Technical Information of China (English)

    Qianjian GUO; Shuo FAN; Rufeng XU; Xiang CHENG; Guoyong ZHAO; Jianguo YANG

    2017-01-01

    Aiming at the problem of low machining accuracy and uncontrollable thermal errors of NC machine tools, spindle thermal error measurement, modeling and compensation of a two-turntable five-axis machine tool are researched. Measurement experiments on heat sources and thermal errors are carried out, and the GRA (grey relational analysis) method is introduced into the selection of temperature variables used for thermal error modeling. In order to analyze the influence of different heat sources on spindle thermal errors, an ANN (artificial neural network) model is presented, and the ABC (artificial bee colony) algorithm is introduced to train the link weights of the ANN; a new ABC-NN (artificial bee colony-based neural network) modeling method is proposed and used in the prediction of spindle thermal errors. In order to test the prediction performance of the ABC-NN model, an experiment system is developed, and the prediction results of LSR (least squares regression), ANN and ABC-NN are compared with the measurement results of the spindle thermal errors. Experiment results show that the prediction accuracy of the ABC-NN model is higher than that of LSR and ANN, and the residual error is smaller than 3 μm; the new modeling method is feasible. The proposed research provides guidance for compensating thermal errors and improving the machining accuracy of NC machine tools.
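
    The GRA step for picking temperature variables can be sketched independently of the ANN: compute grey relational coefficients between each candidate temperature series and the thermal error series, average them into grades, and keep the top-ranked sensors. The data below are invented, and using within-series extrema is a simplification of the full method.

    ```python
    # Minimal sketch of grey relational analysis (GRA) for selecting temperature
    # variables in thermal error modeling. Sensor and error series are invented;
    # rho = 0.5 is the customary distinguishing coefficient.
    import numpy as np

    def normalize(x):
        return (x - x.min()) / (x.max() - x.min())

    def grey_relational_grade(reference, series, rho=0.5):
        delta = np.abs(normalize(reference) - normalize(series))
        coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
        return coeff.mean()

    thermal_error = np.array([1.0, 2.1, 3.4, 4.2, 5.1, 5.8])   # [um], invented
    sensors = {
        "T_spindle_front": np.array([21.0, 23.2, 25.5, 27.0, 28.4, 29.1]),
        "T_spindle_rear":  np.array([20.5, 21.0, 22.8, 24.1, 25.9, 26.3]),
        "T_ambient":       np.array([20.0, 20.1, 20.3, 20.2, 20.4, 20.5]),
    }

    grades = {name: grey_relational_grade(thermal_error, s)
              for name, s in sensors.items()}
    for name, g in sorted(grades.items(), key=lambda kv: -kv[1]):
        print(f"{name}: grade = {g:.3f}")
    ```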

  10. Spindle Thermal Error Optimization Modeling of a Five-axis Machine Tool

    Science.gov (United States)

    Guo, Qianjian; Fan, Shuo; Xu, Rufeng; Cheng, Xiang; Zhao, Guoyong; Yang, Jianguo

    2017-03-01

    Aiming at the problem of low machining accuracy and uncontrollable thermal errors of NC machine tools, spindle thermal error measurement, modeling and compensation of a two-turntable five-axis machine tool are researched. Measurement experiments on heat sources and thermal errors are carried out, and the GRA (grey relational analysis) method is introduced into the selection of temperature variables used for thermal error modeling. In order to analyze the influence of different heat sources on spindle thermal errors, an ANN (artificial neural network) model is presented, and the ABC (artificial bee colony) algorithm is introduced to train the link weights of the ANN; a new ABC-NN (artificial bee colony-based neural network) modeling method is proposed and used in the prediction of spindle thermal errors. In order to test the prediction performance of the ABC-NN model, an experiment system is developed, and the prediction results of LSR (least squares regression), ANN and ABC-NN are compared with the measurement results of the spindle thermal errors. Experiment results show that the prediction accuracy of the ABC-NN model is higher than that of LSR and ANN, and the residual error is smaller than 3 μm; the new modeling method is feasible. The proposed research provides guidance for compensating thermal errors and improving the machining accuracy of NC machine tools.

  11. Cost Benefit Analysis Modeling Tool for Electric vs. ICE Airport Ground Support Equipment – Development and Results

    Energy Technology Data Exchange (ETDEWEB)

    James Francfort; Kevin Morrow; Dimitri Hochard

    2007-02-01

    This report documents efforts to develop a computer tool for modeling the economic payback of comparative airport ground support equipment (GSE) propelled by either electric motors or gasoline and diesel engines. The types of GSE modeled are pushback tractors, baggage tractors, and belt loaders. The GSE modeling tool includes an emissions module that estimates the amount of tailpipe emissions saved by replacing internal combustion engine GSE with electric GSE. This report contains modeling assumptions, methodology, a user's manual, and modeling results. The model was developed based on the operations of two airlines at four United States airports.
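
    The economic core of such a tool is a payback comparison between the higher up-front cost of electric GSE and its lower operating cost; a toy sketch in which every price and usage figure is invented rather than taken from the report:

    ```python
    # Minimal sketch of a simple-payback comparison between electric and
    # internal-combustion ground support equipment. Every number is hypothetical.
    ICE_FUEL_COST      = 12000.0   # $/yr diesel for one baggage tractor
    ICE_MAINTENANCE    = 4000.0    # $/yr
    ELEC_ENERGY_COST   = 3500.0    # $/yr electricity (incl. charger losses)
    ELEC_MAINTENANCE   = 1500.0    # $/yr
    ELEC_PRICE_PREMIUM = 25000.0   # extra purchase cost of the electric unit, $

    annual_savings = (ICE_FUEL_COST + ICE_MAINTENANCE) - (ELEC_ENERGY_COST + ELEC_MAINTENANCE)
    payback_years = ELEC_PRICE_PREMIUM / annual_savings
    print(f"Annual savings: ${annual_savings:,.0f}; simple payback: {payback_years:.1f} yr")
    ```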

  12. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  13. Interactive, open source, travel time scenario modelling: tools to facilitate participation in health service access analysis.

    Science.gov (United States)

    Fisher, Rohan; Lassa, Jonatan

    2017-04-18

    Modelling travel time to services has become a common public health tool for planning service provision, but the usefulness of these analyses is constrained by the availability of accurate input data and limitations inherent in the assumptions and parameterisation. This is particularly an issue in the developing world, where access to basic data is limited and travel is often complex and multi-modal. Improving the accuracy and relevance in this context requires greater accessibility to, and flexibility in, travel time modelling tools to facilitate the incorporation of local knowledge and the rapid exploration of multiple travel scenarios. The aim of this work was to develop simple open source, adaptable, interactive travel time modelling tools to allow greater access to and participation in service access analysis. Described are three interconnected applications designed to reduce some of the barriers to the more widespread use of GIS analysis of service access and to allow for complex spatial and temporal variations in service availability. These applications are an open source GIS tool-kit and two geo-simulation models. The development of these tools was guided by health service issues from a developing world context, but they present a general approach to enabling greater access to and flexibility in health access modelling. The tools demonstrate a method that substantially simplifies the process for conducting travel time assessments and demonstrate a dynamic, interactive approach in an open source GIS format. In addition, this paper provides examples from empirical experience where these tools have informed better policy and planning. Travel and health service access is complex and cannot be reduced to a few static modelled outputs. The approaches described in this paper use a unique set of tools to explore this complexity, promote discussion and build understanding with the goal of producing better planning outcomes. The accessible, flexible, interactive and...

  14. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability.

    Science.gov (United States)

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping; Moore, Helen M

    2015-12-01

    Biospecimens are essential resources for advancing basic and translational research. However, there are little data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, and allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting.

  15. TENTube: A Video-based Connection Tool Supporting Competence Development

    Directory of Open Access Journals (Sweden)

    Albert A Angehrn

    2008-07-01

    The vast majority of knowledge management initiatives fail because they do not sufficiently take into account the emotional, psychological and social needs of individuals. Only if users see real value for themselves will they actively use and contribute their own knowledge to the system, and engage with other users. Connection dynamics can make this easier, and even enjoyable, by connecting people and bringing them closer through shared experiences such as playing a game together. A higher connectedness of people to other people, and to relevant knowledge assets, will motivate them to participate more actively and increase system usage. In this paper, we describe the design of TENTube, a video-based connection tool we are developing to support competence development. TENTube integrates rich profiling and network visualization and navigation with agent-enhanced game-like connection dynamics.

  16. Facebook as a tool to Enhance Team Based Learning

    Directory of Open Access Journals (Sweden)

    Sami M. Alhomod

    2013-01-01

    A growing number of educators are using social networking sites (SNS) to communicate with their students. Facebook is one such example, widely used by both students and educators. Facebook has recently been used by many educational institutions, but mostly to provide information to a general audience; there has not been much study proposing Facebook as an educational tool in a classroom scenario. The aim of this paper is to propose the idea of using Facebook in a team-based learning (TBL) scenario. The paper demonstrates the use of Facebook at each level of TBL and shows how Facebook can be used by students and the teacher to communicate with each other in a TBL system. The paper also explains teacher-team and teacher-student communication via Facebook.

  17. Tools for evaluating team performance in simulation-based training.

    Science.gov (United States)

    Rosen, Michael A; Weaver, Sallie J; Lazzara, Elizabeth H; Salas, Eduardo; Wu, Teresa; Silvestri, Salvatore; Schiebel, Nicola; Almeida, Sandra; King, Heidi B

    2010-10-01

    Teamwork training constitutes one of the core approaches for moving healthcare systems toward increased levels of quality and safety, and simulation provides a powerful method of delivering this training, especially for fast-paced and dynamic specialty areas such as Emergency Medicine. Team performance measurement and evaluation plays an integral role in ensuring that simulation-based training for teams (SBTT) is systematic and effective. However, this component of SBTT systems is frequently overlooked. This article addresses this gap by providing a review and practical introduction to the process of developing and implementing evaluation systems in SBTT. First, an overview of team performance evaluation is provided. Second, best practices for measuring team performance in simulation are reviewed. Third, some of the prominent measurement tools in the literature are summarized and discussed relative to the best practices. Subsequently, implications of the review are discussed for the practice of training teamwork in Emergency Medicine.

  18. A JAVA-based multimedia tool for clinical practice guidelines.

    Science.gov (United States)

    Maojo, V; Herrero, C; Valenzuela, F; Crespo, J; Lazaro, P; Pazos, A

    1997-01-01

    We have developed a specific language for the representation of Clinical Practice Guidelines (CPGs), together with Windows C++ and platform-independent JAVA applications for the multimedia presentation and editing of electronically stored CPGs. This approach facilitates the translation of guidelines and protocols from paper to computer-based flowchart representations. Users can navigate through the algorithm with a friendly user interface and access related multimedia information within the context of each clinical problem. CPGs can be stored on a computer server and distributed over the World Wide Web, facilitating dissemination, local adaptation, and use as a reference element in medical care. We have chosen the Agency for Health Care Policy and Research's heart failure guideline to demonstrate the capabilities of our tool.

  19. Tools for evaluating team performance in simulation-based training

    Science.gov (United States)

    Rosen, Michael A; Weaver, Sallie J; Lazzara, Elizabeth H; Salas, Eduardo; Wu, Teresa; Silvestri, Salvatore; Schiebel, Nicola; Almeida, Sandra; King, Heidi B

    2010-01-01

    Teamwork training constitutes one of the core approaches for moving healthcare systems toward increased levels of quality and safety, and simulation provides a powerful method of delivering this training, especially for fast-paced and dynamic specialty areas such as Emergency Medicine. Team performance measurement and evaluation plays an integral role in ensuring that simulation-based training for teams (SBTT) is systematic and effective. However, this component of SBTT systems is frequently overlooked. This article addresses this gap by providing a review and practical introduction to the process of developing and implementing evaluation systems in SBTT. First, an overview of team performance evaluation is provided. Second, best practices for measuring team performance in simulation are reviewed. Third, some of the prominent measurement tools in the literature are summarized and discussed relative to the best practices. Subsequently, implications of the review are discussed for the practice of training teamwork in Emergency Medicine. PMID:21063558

  20. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large-scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses, so the requirements on software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates, and thus simplifies, the use of third-party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change in the code can produce collateral effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to trigger the verification cases automatically when the developer changes the PLC program. This prototype has been...

  1. An interdisciplinary framework for participatory modeling design and evaluation—What makes models effective participatory decision tools?

    Science.gov (United States)

    Falconi, Stefanie M.; Palmer, Richard N.

    2017-02-01

    Increased requirements for public involvement in water resources management (WRM) over the past century have stimulated the development of more collaborative decision-making methods. Participatory modeling (PM) uses computer models to inform and engage stakeholders in the planning process in order to influence collaborative decisions in WRM. Past evaluations of participatory models focused on process and final outcomes, yet, were hindered by diversity of purpose and inconsistent documentation. This paper presents a two-stage framework for evaluating PM based on mechanisms for improving model effectiveness as participatory tools. The five dimensions characterize the "who, when, how, and why" of each participatory effort (stage 1). Models are evaluated as "boundary objects," a concept used to describe tools that bridge understanding and translate different bodies of knowledge to improve credibility, salience, and legitimacy (stage 2). This evaluation framework is applied to five existing case studies from the literature. Though the goals of participation can be diverse, the novel contribution of the two-stage proposed framework is the flexibility it has to evaluate a wide range of cases that differ in scope, modeling approach, and participatory context. Also, the evaluation criteria provide a structured vocabulary based on clear mechanisms that extend beyond previous process-based and outcome-based evaluations. Effective models are those that take advantage of mechanisms that facilitate dialogue and resolution and improve the accessibility and applicability of technical knowledge. Furthermore, the framework can help build more complete records and systematic documentation of evidence to help standardize the field of PM.

  2. Model-Based Security Testing

    Directory of Open Access Journals (Sweden)

    Ina Schieferdecker

    2012-02-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, enable guidance on test identification and specification, and support automated test generation. Model-based security testing (MBST) is a relatively new field, dedicated especially to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes, e.g., security functional testing, model-based fuzzing, risk- and threat-oriented testing, and the usage of security test patterns. This paper provides a survey on MBST techniques and the related models, as well as samples of new methods and tools that are under development in the European ITEA2 project DIAMONDS.

  3. Genomic-based-breeding tools for tropical maize improvement.

    Science.gov (United States)

    Chakradhar, Thammineni; Hindu, Vemuri; Reddy, Palakolanu Sudhakar

    2017-09-05

    Maize has traditionally been the main staple diet in Southern Asia and Sub-Saharan Africa and is widely grown by millions of resource-poor small-scale farmers. Approximately 35.4 million hectares are sown to tropical maize, constituting around 59% of the maize area of the developing world. Tropical maize encounters tremendous challenges besides poor agro-climatic conditions, with average yields recorded at <3 tonnes/hectare, far less than the average of developed countries. Contrary to the poor yields, the demand for maize as food, feed, and fuel is continuously increasing in these regions. Heterosis breeding, introduced in the early 90s, improved maize yields significantly, but genetic gain is still a mirage, particularly for crops grown under marginal environments. The application of molecular markers has accelerated the pace of maize breeding to some extent. The availability of an array of sequencing and genotyping technologies offers an unrivalled opportunity to improve precision in maize-breeding programs through modern approaches such as genomic selection, genome-wide association studies, bulked segregant analysis-based sequencing approaches, etc. Superior alleles underlying complex traits can easily be identified and introgressed efficiently using these sequence-based approaches. Integration of genomic tools and techniques with advanced genetic resources such as nested association mapping and backcross nested association mapping could certainly address the genetic issues in maize improvement programs in developing countries. The huge diversity in tropical maize and its inherent capacity for doubled haploid technology offer an advantage in applying next-generation genomic tools to accelerate production in the marginal environments of the tropical and subtropical world. Precision in phenotyping is the key to the success of any molecular-breeding approach. In this article, genomic technologies and their application to improving agronomic traits in tropical maize breeding have been reviewed.

  4. Planning the network of gas pipelines through modeling tools

    Energy Technology Data Exchange (ETDEWEB)

    Sucupira, Marcos L.L.; Lutif Filho, Raimundo B. [Companhia de Gas do Ceara (CEGAS), Fortaleza, CE (Brazil)

    2009-07-01

    Natural gas is a non-renewable energy source used by different sectors of the economy of Ceara. Its use may be industrial, residential or commercial, as automotive fuel, for co-generation of energy, or as a source for generating electricity from heat. Thanks to its practicality, this energy source enjoys strong market acceptance and serves a broad list of clients, which makes it possible to reach diverse parts of the city. Its distribution requires a complex network of pipelines that branches throughout the city to reach all potential clients interested in this source of energy. To facilitate the design, analysis and expansion of the distribution network, and the location of bottlenecks and breaks in it, modeling software is used that allows the network manager to handle the various kinds of information about the network. This paper presents the advantages of modeling the natural gas distribution network of gas companies in Ceara, showing the tool used, the steps necessary for the implementation of the models, the advantages of using the software and the findings obtained with its use. (author)

  5. Prediction of Surface Roughness Based on Machining Condition and Tool Condition in Boring EN31 Steel

    Directory of Open Access Journals (Sweden)

    P. Mohanaraman

    2016-04-01

    Prediction of surface roughness plays a vital role in the manufacturing process. In manufacturing industries, the production of metallic components requires a high surface finish. In the present work, the effects of spindle speed, feed rate, depth of cut and tool flank wear on surface roughness have been studied. A carbide-tipped insert was used for the boring operation, and the experiments were conducted on a CNC lathe. The experimental setup comprised sixteen combinations of cutting parameters, run under two tool tip conditions in dry machining. A piezoelectric accelerometer was used to measure the vibration signals while machining, with a data acquisition card connecting the accelerometer to LabVIEW software to record the signals. Simple linear and least median of squares regression models were used for the prediction of surface roughness. The models were developed with the Weka analysis software, and the most suitable regression model was selected based on the maximum correlation coefficient and minimum error values.
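
    As a hedged sketch of the modeling step (the paper used Weka; simple linear regression is one of its two model types), an equivalent fit in Python with invented observations looks like this:

    ```python
    # Minimal sketch: predicting surface roughness Ra from spindle speed, feed
    # rate, depth of cut and flank wear with linear regression. The handful of
    # observations below are invented, not the paper's measurements.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Columns: speed [rpm], feed [mm/rev], depth of cut [mm], flank wear [mm]
    X = np.array([[800,  0.10, 0.5, 0.00],
                  [800,  0.20, 0.5, 0.15],
                  [1200, 0.10, 1.0, 0.00],
                  [1200, 0.20, 1.0, 0.15],
                  [1600, 0.15, 0.8, 0.30],
                  [2000, 0.25, 1.2, 0.30]])
    ra = np.array([1.8, 3.1, 1.4, 2.6, 2.2, 3.4])   # measured Ra [um], invented

    model = LinearRegression().fit(X, ra)
    print("R^2 on training data:", model.score(X, ra))
    print("Predicted Ra [um]:", model.predict([[1000, 0.12, 0.6, 0.10]]))
    ```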

  6. An Integrated Simulation Tool for Modeling the Human Circulatory System

    Science.gov (United States)

    Asami, Ken'ichi; Kitamura, Tadashi

    This paper presents an integrated simulation of the circulatory system in physiological movement. The large circulatory system model includes principal organs and functional units in modules in which comprehensive physiological changes such as nerve reflexes, temperature regulation, acid/base balance, O2/CO2 balance, and exercise are simulated. A beat-by-beat heart model, in which the corresponding electrical circuit problems are solved by a numerical analytic method, enables calculation of pulsatile blood flow to the major organs. The integration of different perspectives on physiological changes makes this simulation model applicable for the microscopic evaluation of blood flow under various conditions in the human body.

  7. Tool-driven Design and Automated Parameterization for Real-time Generic Drivetrain Models

    Directory of Open Access Journals (Sweden)

    Schwarz Christina

    2015-01-01

    Real-time dynamic drivetrain modeling approaches have great potential for development cost reduction in the automotive industry. Even though real-time drivetrain models are available, these solutions are specific to single transmission topologies. In this paper an environment for parameterization of a solution is proposed, based on a generic method applicable to all types of gear transmission topologies. This enables tool-guided modeling by non-experts in the fields of mechanical engineering and control theory, leading to reduced development and testing efforts. The approach is demonstrated for an exemplary automatic transmission using the environment for automated parameterization. Finally, the parameterization is validated via vehicle measurement data.

  8. Effectiveness of a Technology-Based Intervention to Teach Evidence-Based Practice: The EBR Tool.

    Science.gov (United States)

    Long, JoAnn D; Gannaway, Paula; Ford, Cindy; Doumit, Rita; Zeeni, Nadine; Sukkarieh-Haraty, Ola; Milane, Aline; Byers, Beverly; Harrison, LaNell; Hatch, Daniel; Brown, Justin; Proper, Sharlan; White, Patricia; Song, Huaxin

    2016-02-01

    As the world becomes increasingly digital, advances in technology have changed how students access evidence-based information. Research suggests that students overestimate their ability to locate quality online research and lack the skills needed to evaluate the scientific literature. Clinical nurses report relying on personal experience to answer clinical questions rather than searching evidence-based sources. To address the problem, a web-based, evidence-based research (EBR) tool that is usable from a computer, smartphone, or iPad was developed and tested. The purpose of the EBR tool is to guide students through the basic steps needed to locate and critically appraise the online scientific literature, while linking users to quality electronic resources to support evidence-based practice (EBP). Testing of the tool took place in a mixed-method, quasi-experimental, and two-population randomized controlled trial (RCT) design at a U.S. and a Middle Eastern university. A statistically significant improvement in overall research skills was found in the quasi-experimental nursing student group and the RCT nutrition student group using the EBR tool. A statistically significant proportional difference was found in the RCT nutrition and PharmD intervention groups in participants' ability to distinguish the credibility of online source materials compared with controls. The majority of participants could correctly apply PICOTS to a case study when using the tool. The data from this preliminary study suggest that the EBR tool enhanced students' overall research skills and selected EBP skills while generating data for assessment of learning outcomes. The EBR tool places evidence-based resources at the fingertips of users by addressing some of the most commonly cited barriers to research utilization, while exposing users to information and online literacy standards of practice, meeting a growing need within nursing curricula. © 2016 Sigma Theta Tau International.

  9. Simulation of surface topography of big aspheric fabrication by ultra-precision diamond turning based on tool swing feeding

    Science.gov (United States)

    Yao, Honghui; Li, Zengqiang; Sun, Tao

    2014-08-01

    For the ultra-precision manufacturing of axisymmetric surfaces, a machine tool with tool swing feeding, which has fewer interpolation error sources than the conventional ultra-precision diamond turning machine tool with a T-structure, is worth studying. Therefore, based on dynamic simulation modeling and multi-body dynamics theory, in this paper we establish the control model and the tool path for an ultra-precision machine. We then obtain a model of the surface topography for different input parameters such as spindle speed, feed rate and tool parameters. Taking a spherical optics part with a diameter of 300 mm as an example, we input the process parameters, generate its surface topography, and evaluate its surface quality by the surface roughness value (Ra) and the surface shape accuracy (PV).
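
    For reference, the two quality metrics named at the end of the abstract are easy to state in code. The sketch below computes Ra (arithmetic mean deviation) and PV (peak-to-valley height) from a simulated height profile; the sinusoidal tool-mark profile is a stand-in assumption, not the paper's topography model.

      import numpy as np

      x = np.linspace(0, 1, 500)
      z = 0.1 * np.sin(2 * np.pi * 50 * x)   # hypothetical turned-surface profile (um)

      z_centered = z - z.mean()
      Ra = np.abs(z_centered).mean()          # arithmetic mean deviation from the mean line
      PV = z.max() - z.min()                  # peak-to-valley height
      print(f"Ra = {Ra:.4f} um, PV = {PV:.4f} um")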

  10. Implementing an HL7 version 3 modeling tool from an Ecore model.

    Science.gov (United States)

    Bánfai, Balázs; Ulrich, Brandon; Török, Zsolt; Natarajan, Ravi; Ireland, Tim

    2009-01-01

    One of the main challenges of achieving interoperability using the HL7 V3 healthcare standard is the lack of clear definition and supporting tools for modeling, testing, and conformance checking. Currently, the knowledge defining the modeling is scattered around in MIF schemas, tools and specifications or simply with the domain experts. Modeling core HL7 concepts, constraints, and semantic relationships in Ecore/EMF encapsulates the domain-specific knowledge in a transparent way while unifying Java, XML, and UML in an abstract, high-level representation. Moreover, persisting and versioning the core HL7 concepts as a single Ecore context allows modelers and implementers to create, edit and validate message models against a single modeling context. The solution discussed in this paper is implemented in the new HL7 Static Model Designer as an extensible toolset integrated as a standalone Eclipse RCP application.

  11. A Hierarchical Slicing Tool Model

    Institute of Scientific and Technical Information of China (English)

    谭毅; 朱平; 李必信; 郑国梁

    2001-01-01

    Most traditional slicing methods are based on dependence graphs, but constructing a dependence graph directly for an object-oriented program is very complicated. The design and implementation of a hierarchical slicing tool model are described. By constructing the package-level dependence graph, class-level dependence graph, method-level dependence graph and statement-level dependence graph, the package-level slice, class-level slice, method-level slice and program slice are obtained step by step.
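
    A minimal sketch of the underlying idea: once a dependence graph exists at some level, a slice is just a reachability computation over it. The toy statement-level graph below is invented for illustration and does not come from the paper.

      # edges point from a statement to the statements whose values it uses
      deps = {
          4: [2, 3],   # s4 uses definitions from s2 and s3
          3: [1],
          2: [1],
          1: [],
      }

      def backward_slice(criterion):
          # a slice is the set of statements the criterion transitively depends on
          seen, stack = set(), [criterion]
          while stack:
              s = stack.pop()
              if s not in seen:
                  seen.add(s)
                  stack.extend(deps.get(s, []))
          return sorted(seen)

      print(backward_slice(4))   # -> [1, 2, 3, 4]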

  12. Modeling Constellation Virtual Missions Using the Vdot(Trademark) Process Management Tool

    Science.gov (United States)

    Hardy, Roger; ONeil, Daniel; Sturken, Ian; Nix, Michael; Yanez, Damian

    2011-01-01

    The authors have identified a software tool suite that will support NASA's Virtual Mission (VM) effort. This is accomplished by transforming a spreadsheet database of mission events, task inputs and outputs, timelines, and organizations into process visualization tools and a Vdot process management model that includes embedded analysis software as well as requirements and information related to data manipulation and transfer. This paper describes the progress to date, the application of the Virtual Mission not only to Constellation but to other architectures, and the pertinence to other aerospace applications. Vdot's intuitive visual interface brings VMs to life by turning static, paper-based processes into active, electronic processes that can be deployed, executed, managed, verified, and continuously improved. A VM can be executed using a computer-based, human-in-the-loop, real-time format, under the direction and control of the NASA VM Manager. Engineers in the various disciplines do not have to be Vdot-proficient; rather, they can fill out on-line, Excel-type databases with the mission information discussed above. The authors' tool suite converts this database into several process visualization tools for review and into Microsoft Project, which can be imported directly into Vdot. Many tools can be embedded directly into Vdot, and when the necessary data or information is received from a preceding task, the analysis can be initiated automatically. Other NASA analysis tools are too complex for this process, but Vdot automatically notifies the tool user that the data has been received and analysis can begin. The VM can be simulated from end to end using the authors' tool suite. The planned approach for the Vdot-based process simulation is to generate the process model from a database; other advantages of this semi-automated approach are that the participants can be geographically remote and that, after refining the process models via the human-in-the-loop simulation, the

  13. Automation of Global Adjoint Tomography Based on ASDF and Workflow Management Tools

    Science.gov (United States)

    Lei, W.; Ruan, Y.; Bozdag, E.; Smith, J. A.; Modrak, R. T.; Krischer, L.; Chen, Y.; Lefebvre, M. P.; Tromp, J.

    2016-12-01

    Global adjoint tomography is computationally expensive, requiring thousands of wavefield simulations and massive data processing. Through a collaboration with the Oak Ridge National Laboratory computing group and an allocation on the `Titan' GPU-accelerated supercomputer, we have begun to assimilate waveform data from more than 4,000 earthquakes, from 1995 to 2015, in our inversions. However, since conventional file formats and signal processing tools were not designed for parallel processing of massive data volumes, use of such tools in high-resolution global inversions leads to major bottlenecks. To overcome such problems and allow for continued scientific progress, we designed the Adaptive Seismic Data Format (ASDF) and developed a set of processing tools based on ASDF, covering signal processing (pytomo3d), time window selection (pyflex) and adjoint sources (pyadjoint). These new tools greatly enhance the reproducibility and accountability of our research while taking full advantage of parallel computing, showing superior scaling on modern computational platforms. The entire inversion workflow, intrinsically complex and sensitive to human errors, is carefully handled and automated by modern workflow management tools, preventing data contamination and saving a huge amount of time. Our starting model GLAD-M15 (Bozdag et al., 2016), an elastic model with a transversely isotropic upper mantle, is based on 253 earthquakes and 15 nonlinear conjugate gradient iterations. We have now completed source inversions for more than 1,000 earthquakes and have started structural inversions using a quasi-Newton optimization algorithm. We will discuss the challenges of large-scale workflows on HPC systems, the solutions offered by our new adjoint tomography tools, and the initial tomographic results obtained using the new expanded dataset.

  14. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    Science.gov (United States)

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvement of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effective business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to complex selection criteria that include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for the ranking of BPSS tools based on their technical characteristics, employing DEX and the qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results.
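
    To give a flavour of DEX-style evaluation, the toy sketch below aggregates qualitative criterion values through an explicit rule table. The criteria, value scales and rules are all invented and far simpler than the paper's hierarchical model; this is only a stand-in illustration of rule-based qualitative aggregation.

      # rule table mapping qualitative criterion values to an overall class
      RULES = {
          ("good", "good"): "suitable",
          ("good", "poor"): "conditionally suitable",
          ("poor", "good"): "conditionally suitable",
          ("poor", "poor"): "unsuitable",
      }

      tools = {                          # hypothetical BPSS tools and their ratings
          "BPSS-A": ("good", "good"),    # (visual quality, simulation capability)
          "BPSS-B": ("good", "poor"),
          "BPSS-C": ("poor", "poor"),
      }

      for name, ratings in tools.items():
          print(name, "->", RULES[ratings])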

  15. Large animal models of atherosclerosis--new tools for persistent problems in cardiovascular medicine.

    Science.gov (United States)

    Shim, J; Al-Mashhadi, R H; Sørensen, C B; Bentzon, J F

    2016-01-01

    Coronary heart disease and ischaemic stroke caused by atherosclerosis are leading causes of illness and death worldwide. Small animal models have provided insight into the fundamental mechanisms driving early atherosclerosis, but it is increasingly clear that new strategies and research tools are needed to translate these discoveries into improved prevention and treatment of symptomatic atherosclerosis in humans. Key challenges include better understanding of processes in late atherosclerosis, factors affecting atherosclerosis in the coronary bed, and the development of reliable imaging biomarker tools for risk stratification and monitoring of drug effects in humans. Efficient large animal models of atherosclerosis may help tackle these problems. Recent years have seen tremendous advances in gene-editing tools for large animals. This has made it possible to create gene-modified minipigs that develop atherosclerosis with many similarities to humans in terms of predilection for lesion sites and histopathology. Together with existing porcine models of atherosclerosis that are based on spontaneous mutations or severe diabetes, such models open new avenues for translational research in atherosclerosis. In this review, we discuss the merits of different animal models of atherosclerosis and give examples of important research problems where porcine models could prove pivotal for progress.

  16. Mathematical modelling: a tool for hospital infection control

    NARCIS (Netherlands)

    Grundmann, Hajo; Hellriegel, B.

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has emerged that provides solid and testable hypotheses and opens the road to a quantitative assessment of the main obstructions that undermine current efforts to control the spread of health-care-associated infections.

  17. Mathematical modelling : a tool for hospital infection control

    NARCIS (Netherlands)

    Grundmann, H; Hellriegel, B

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has emerged that provides solid and testable hypotheses and opens the road to a quantitative assessment of the main obstructions that undermine current efforts to control the spread of health-care-associated infections.

  18. Mathematical modelling: a tool for hospital infection control.

    NARCIS (Netherlands)

    Grundmann, Hajo; Hellriegel, B

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has emerged that provides solid and testable hypotheses and opens the road to a quantitative assessment of the main obstructions that undermine current efforts to control the spread of health-care-associated infections.

  19. Conversion of Rapid Prototyping Models into Metallic Tools by Ceramic Moulding—an Indirect Rapid Tooling Process

    Institute of Scientific and Technical Information of China (English)

    Teresa P. DUARTE; J. M. FERREIRA; F. Jorge LINO; A. BARBEDO; Rui NETO

    2002-01-01

    A process has been developed to convert models made by rapid prototyping techniques like SL (stereolithography) and LOM (laminated object manufacturing), or by conventional techniques (silicones, resins, wax, etc.), into metallic moulds or tools. The main purpose of this technique is to rapidly obtain the first prototypes of parts, for plastics injection, forging or any other manufacturing process, using the tools produced by casting a metal into a ceramic mould. Briefly, it can be said that the ceramic...

  20. Port-based modeling of mechatronic systems

    NARCIS (Netherlands)

    Breedveld, Peter C.

    2004-01-01

    Many engineering activities, including mechatronic design, require that a multidomain or 'multi-physics' system and its control system be designed as an integrated system. This contribution discusses the background and tools for a port-based approach to integrated modeling and simulation of physical systems.

  1. Homology Modeling a Fast Tool for Drug Discovery: Current Perspectives

    Science.gov (United States)

    Vyas, V. K.; Ukawala, R. D.; Ghate, M.; Chintha, C.

    2012-01-01

    A major goal of structural biology involves the formation of protein-ligand complexes, in which the protein molecules act energetically in the course of binding. Therefore, understanding protein-ligand interaction is very important for structure-based drug design. Lack of knowledge of 3D structures has hindered efforts to understand the binding specificities of ligands with proteins. With the increase in modeling software and the growing number of known protein structures, homology modeling is rapidly becoming the method of choice for obtaining 3D coordinates of proteins. Homology modeling is a representation of the similarity of environmental residues at topologically corresponding positions in the reference proteins. In the absence of experimental data, model building on the basis of a known 3D structure of a homologous protein is at present the only reliable method to obtain structural information. Knowledge of the 3D structures of proteins provides invaluable insights into the molecular basis of their functions. The recent advances in homology modeling, particularly in detecting and aligning sequences with template structures and distant homologues, in modeling loops and side chains, and in detecting errors in a model, have contributed to consistent prediction of protein structure, which was not possible even several years ago. This review focuses on the features and role of homology modeling in predicting protein structure and describes current developments in this field, with successful applications at the different stages of drug design and discovery. PMID:23204616

  2. M4AST - A Tool for Asteroid Modelling

    Science.gov (United States)

    Birlan, Mirel; Popescu, Marcel; Irimiea, Lucian; Binzel, Richard

    2016-10-01

    M4AST (Modelling for asteroids) is an online tool devoted to the analysis and interpretation of reflection spectra of asteroids in the visible and near-infrared spectral intervals. It consists of a spectral database of individual objects and a set of analysis routines addressing scientific aspects such as taxonomy, curve matching with laboratory spectra, space weathering models, and mineralogical diagnosis. Spectral data were obtained using ground-based facilities; part of these data were compiled from the literature [1]. The database is composed of permanent and temporary files. Each permanent file contains a header and two or three columns (wavelength, spectral reflectance, and the error on spectral reflectance). Temporary files can be uploaded anonymously and are purged, to protect the ownership of the submitted data. The computing routines are organized to accomplish several scientific objectives: visualize spectra, compute the asteroid taxonomic class, compare an asteroid spectrum with similar spectra of meteorites, and compute mineralogical parameters. A facility for using the Virtual Observatory protocols was also developed. A new version of the service was released in June 2016. This release of M4AST contains a database and facilities to model more than 6,000 spectra of asteroids, and a new web interface was designed. This development provides new functionality in a user-friendly environment. A bridge for accessing and exploiting the SMASS-MIT database (http://smass.mit.edu) allows these data to be treated and analysed within the M4AST environment. Reference: [1] M. Popescu, M. Birlan, and D.A. Nedelcu, "Modeling of asteroids: M4AST," Astronomy & Astrophysics 544, EDP Sciences, pp. A130, 2012.

  3. Web based educational tool for neural network robot control

    Directory of Open Access Journals (Sweden)

    Jure Čas

    2007-05-01

    Full Text Available This paper describes an application for the teleoperation of a SCARA robot via the internet. The SCARA robot is used by students of mechatronics at the University of Maribor as a remote educational tool. The developed software consists of two parts, i.e. the continuous neural network sliding mode controller (CNNSMC) and the graphical user interface (GUI). The application is based on two well-known commercially available software packages, i.e. MATLAB/Simulink and LabVIEW. MATLAB/Simulink and the DSP2 Library for Simulink are used for control algorithm development, simulation and executable code generation. While this code executes on the DSP-2 Roby controller and drives the real process through the analog and digital I/O lines, a LabVIEW virtual instrument (VI) running on the PC is used as the user front end. The LabVIEW VI provides the ability for on-line parameter tuning, signal monitoring and on-line analysis, and, via Remote Panels technology, also teleoperation. The main advantage of a CNNSMC is the exploitation of its self-learning capability. When friction or an unexpected impediment occurs, for example, the user of a remote application has no information about the changed robot dynamics and is thus unable to handle it manually. This is no longer a control problem because, when a CNNSMC is used, any change in the robot dynamics is approximated and compensated for independently of the remote user. Index terms: LabVIEW; MATLAB/Simulink; neural network control; remote educational tool; robotics

  4. Tools and Models for Integrating Multiple Cellular Networks

    Energy Technology Data Exchange (ETDEWEB)

    Gerstein, Mark [Yale Univ., New Haven, CT (United States). Gerstein Lab.

    2015-11-06

    In this grant, we have systematically investigated the integrated networks which are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open source; they are available for download from GitHub and can be incorporated in the Knowledgebase. Here we summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network. The algorithm is generally applicable to regulatory networks in prokaryotes, yeast and higher organisms. Integrated datasets are extremely beneficial in understanding the biology of a system in a compact manner due to the conflation of multiple layers of information. Therefore, for Aim 2 of this grant, we developed several tools and carried out analyses for integrating system-wide genomic information. To make use of the structural data, we developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we organized the E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies. We then correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed

  5. An open source GIS-based tool to integrate the fragmentation mechanism in rockfall propagation

    Science.gov (United States)

    Matas, Gerard; Lantada, Nieves; Gili, Josep A.; Corominas, Jordi

    2015-04-01

    Rockfalls are frequent instability processes on road cuts, in open pit mines and quarries, and on steep slopes and cliffs. Even though the stability of rock slopes can be determined using analytical approaches, the assessment of large rock cliffs requires simplifying assumptions due to the difficulty of working with a large number of joints and the scatter of both their orientations and strength parameters. The attitude and persistence of joints within the rock mass define the size of kinematically unstable rock volumes. Furthermore, a rock block will eventually split into several fragments during its propagation downhill due to its impacts with the ground surface. Knowledge of the size, energy, trajectory… of each block resulting from fragmentation is critical in determining the vulnerability of buildings and protection structures. The objective of this contribution is to present a simple and open source tool to simulate the fragmentation mechanism in rockfall propagation models and in the calculation of impact energies. The tool includes common modes of motion for falling boulders based on the previous literature, and is being implemented in a GIS (Geographic Information System) using open source Python programming. The tool under development is simple, modular, compatible with any GIS environment, open source, and able to model rockfall phenomena correctly; it could be used in any area susceptible to rockfalls after a prior adjustment of the parameters. After adjusting the model parameters to a given area, a simulation can be performed to obtain maps of kinetic energy, frequency, stopping density and passing heights. This GIS-based tool and the analysis of the fragmentation laws using data collected from recent rockfalls are being developed within the RockRisk project (2014-2016), funded by the Spanish Ministerio de Economía y Competitividad and entitled "Rockfalls in cliffs: risk quantification and its prevention" (BIA2013-42582-P).
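
    As a heavily hedged illustration of the kind of computation such a tool performs, the sketch below samples fragment volumes from an assumed power-law size distribution and derives per-fragment kinetic energies. The distribution, exponent, density and velocities are all invented for illustration and are not the RockRisk fragmentation laws.

      import numpy as np

      rng = np.random.default_rng(1)
      rho = 2700.0                        # assumed rock density, kg/m^3
      block_volume = 2.0                  # assumed parent block volume, m^3

      # assumed power-law sample of relative fragment sizes, renormalised to the parent volume
      raw = rng.pareto(1.5, size=8) + 1.0
      volumes = block_volume * raw / raw.sum()

      velocities = rng.uniform(5.0, 20.0, size=volumes.size)   # assumed post-impact speeds, m/s
      energies = 0.5 * rho * volumes * velocities**2           # kinetic energy per fragment, J
      for v, e in zip(volumes, energies):
          print(f"fragment of {v:.3f} m^3 carries {e / 1000:.0f} kJ")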

  6. Integration issues of information engineering based I-CASE tools

    OpenAIRE

    Kurbel, Karl; Schnieder, Thomas

    1994-01-01

    Problems and requirements regarding integration of methods and tools across phases of the software-development life cycle are discussed. Information engineering (IE) methodology and I-CASE (integrated CASE) tools supporting IE claim to have an integrated view across major stages of enterprise-wide information-system development: information strategy planning, business area analysis, system design, and construction. In the main part of this paper, two comprehensive I-CASE tools, ADW (Applicati...

  7. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.
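
    A toy sketch of the general idea of deriving test cases from a behavioural model: enumerate bounded event paths through a state machine, each path becoming a test skeleton. The tiny model and the breadth-first enumeration below are assumptions for illustration; the paper's framework relies on model checking and symbolic execution instead.

      from collections import deque

      transitions = {            # state -> [(event, next_state)]; an invented toy model
          "Idle":   [("arm", "Armed")],
          "Armed":  [("fire", "Firing"), ("disarm", "Idle")],
          "Firing": [("done", "Idle")],
      }

      def test_sequences(start, max_len=3):
          # all event sequences of bounded length, each a test case skeleton
          tests, queue = [], deque([(start, [])])
          while queue:
              state, path = queue.popleft()
              if path:
                  tests.append(path)
              if len(path) < max_len:
                  for event, nxt in transitions.get(state, []):
                      queue.append((nxt, path + [event]))
          return tests

      for t in test_sequences("Idle"):
          print(" -> ".join(t))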

  8. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    Science.gov (United States)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    The modelling of a curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, has as its goal the determination of the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the generating geometrical error as a component of the total error. The generation modelling allows highlighting the potential errors of the generating tool, in order to correct its profile before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new method, namely the method of "relative generating trajectories". The analytical foundations are presented, as well as some applications to known models of rack-gear type tools used on Maag teething machines.

  9. Dynamic modelling and analysis of biochemical networks: mechanism-based models and model-based experiments.

    Science.gov (United States)

    van Riel, Natal A W

    2006-12-01

    Systems biology applies quantitative, mechanistic modelling to study genetic networks, signal transduction pathways and metabolic networks. Mathematical models of biochemical networks can look very different. An important reason is that the purpose and application of a model are essential for the selection of the best mathematical framework. Fundamental aspects of selecting an appropriate modelling framework and a strategy for model building are discussed. Concepts and methods from system and control theory provide a sound basis for the further development of improved and dedicated computational tools for systems biology. Identification of the network components and rate constants that are most critical to the output behaviour of the system is one of the major problems raised in systems biology. Current approaches and methods of parameter sensitivity analysis and parameter estimation are reviewed. It is shown how these methods can be applied in the design of model-based experiments which iteratively yield models that are decreasingly wrong and increasingly gain predictive power.
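
    As a small illustration of the parameter sensitivity analysis the review discusses, the sketch below computes central finite-difference sensitivities of a toy two-reaction model's output with respect to its rate constants. The model, parameter values and output definition are assumptions, not taken from the paper.

      import numpy as np
      from scipy.integrate import solve_ivp

      def rhs(t, x, k1, k2):
          s, p = x                                   # substrate and product
          return [-k1 * s, k1 * s - k2 * p]

      def output(params):                            # model output: product level at t = 5
          sol = solve_ivp(rhs, (0, 5), [1.0, 0.0], args=tuple(params), rtol=1e-8)
          return sol.y[1, -1]

      base = np.array([0.8, 0.3])                    # assumed rate constants k1, k2
      h = 1e-5
      for i, name in enumerate(["k1", "k2"]):
          up, dn = base.copy(), base.copy()
          up[i] += h
          dn[i] -= h
          sens = (output(up) - output(dn)) / (2 * h)  # central finite difference
          print(f"d(output)/d({name}) ~ {sens:.4f}")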

  10. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation.

    Science.gov (United States)

    Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith

    2015-01-01

    Globalization of business and competitiveness in manufacturing have forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into the AHP to handle imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of the attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is presented through a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool providing effective multi-attribute decision-making for evaluating machine tools in an uncertain environment.
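
    A minimal crisp COPRAS sketch of the ranking step (the paper's version is fuzzy and takes its weights from the fuzzy AHP): the three-alternative decision matrix, the weights and the benefit/cost split below are invented for illustration.

      import numpy as np

      X = np.array([[3200., 0.02, 7.5],        # rows: machine tools; cols: criteria
                    [2800., 0.03, 6.0],        # e.g. spindle speed, positioning error, work envelope
                    [3500., 0.04, 9.0]])
      w = np.array([0.5, 0.3, 0.2])             # criteria weights (e.g. from AHP)
      benefit = np.array([True, False, True])   # the cost criterion is positioning error

      D = w * X / X.sum(axis=0)                 # weighted normalised decision matrix
      S_plus = D[:, benefit].sum(axis=1)        # benefit-criteria sums
      S_minus = D[:, ~benefit].sum(axis=1)      # cost-criteria sums
      Q = S_plus + S_minus.min() * S_minus.sum() / (S_minus * (S_minus.min() / S_minus).sum())
      utility = 100 * Q / Q.max()               # closeness to the best alternative, %
      print("ranking (best first):", np.argsort(-utility), utility.round(1))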

  11. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation.

    Directory of Open Access Journals (Sweden)

    Huu-Tho Nguyen

    Full Text Available Globalization of business and competitiveness in manufacturing have forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into the AHP to handle imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of the attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is presented through a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool providing effective multi-attribute decision-making for evaluating machine tools in an uncertain environment.

  12. The Business Model Evaluation Tool for Smart Cities: Application to SmartSantander Use Cases

    Directory of Open Access Journals (Sweden)

    Raimundo Díaz-Díaz

    2017-02-01

    Full Text Available New technologies open the door to multiple business models applied to public services in smart cities. However, there is no commonly adopted methodology for evaluating business models in smart cities that can help both practitioners and researchers to choose the best option. This paper addresses this gap by introducing the Business Model Evaluation Tool for Smart Cities. This methodology is a simple, organized, flexible and transparent system that facilitates the work of the evaluators of potential business models. It is useful for comparing two or more business models and taking strategic decisions promptly. The method follows a prior process of content analysis and is based on the widely utilized Business Model Canvas. The evaluation method has been assessed by 11 experts and subsequently validated by applying it to the case studies of Santander's waste management and street lighting systems, which take advantage of innovative technologies commonly used in smart cities.

  13. Designing and Implementing Web-Based Scaffolding Tools for Technology-Enhanced Socioscientific Inquiry

    Science.gov (United States)

    Shin, Suhkyung; Brush, Thomas A.; Glazewski, Krista D.

    2017-01-01

    This study explores how web-based scaffolding tools provide instructional support while implementing a socio-scientific inquiry (SSI) unit in a science classroom. This case study focused on how students used web-based scaffolding tools during SSI activities, and how students perceived the SSI unit and the scaffolding tools embedded in the SSI…

  14. DINAMO: a coupled sequence alignment editor/molecular graphics tool for interactive homology modeling of proteins.

    Science.gov (United States)

    Hansen, M; Bentz, J; Baucom, A; Gregoret, L

    1998-01-01

    Gaining functional information about a novel protein is a universal problem in biomedical research. With the explosive growth of the protein sequence and structural databases, it is becoming increasingly common for researchers to attempt to build a three-dimensional model of their protein of interest in order to gain information about its structure and interactions with other molecules. The two most reliable methods for predicting the structure of a protein are homology modeling, in which the novel sequence is modeled on the known three-dimensional structure of a related protein, and fold recognition (threading), where the sequence is scored against a library of fold models, and the highest scoring model is selected. The sequence alignment to a known structure can be ambiguous, and human intervention is often required to optimize the model. We describe an interactive model building and assessment tool in which a sequence alignment editor is dynamically coupled to a molecular graphics display. By means of a set of assessment tools, the user may optimize his or her alignment to satisfy the known heuristics of protein structure. Adjustments to the sequence alignment made by the user are reflected in the displayed model by color and other visual cues. For instance, residues are colored by hydrophobicity in both the three-dimensional model and in the sequence alignment. This aids the user in identifying undesirable buried polar residues. Several different evaluation metrics may be selected including residue conservation, residue properties, and visualization of predicted secondary structure. These characteristics may be mapped to the model both singly and in combination. DINAMO is a Java-based tool that may be run either over the web or installed locally. Its modular architecture also allows Java-literate users to add plug-ins of their own design.

  15. Model-Based GUI Testing Using Uppaal at Novo Nordisk

    DEFF Research Database (Denmark)

    H. Hjort, Ulrik; Rasmussen, Jacob Illum; Larsen, Kim Guldstrand

    2009-01-01

    This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML state machine model and generates

  16. A planning quality evaluation tool for prostate adaptive IMRT based on machine learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhu Xiaofeng; Ge Yaorong; Li Taoran; Thongphiew, Danthai; Yin Fangfang; Wu, Q Jackie [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina 27708 (United States); Department of Biomedical Engineering, Wake Forest University Health Sciences, Medical Center Boulevard, Winston-Salem, North Carolina 27106 (United States); Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina 27708 (United States); Department of Radiation Oncology, Brody School of Medicine, East Carolina University, Greenville, North Carolina 27834 (United States); Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina 27708 (United States)

    2011-02-15

    Purpose: To ensure plan quality for adaptive IMRT of the prostate, we developed a quantitative evaluation tool using a machine learning approach. This tool generates dose volume histograms (DVHs) of organs-at-risk (OARs) based on prior plans as a reference, to be compared with the adaptive plan derived from fluence map deformation. Methods: Under the same configuration using seven-field 15 MV photon beams, DVHs of OARs (bladder and rectum) were estimated based on anatomical information of the patient and a model learned from a database of high quality prior plans. In this study, the anatomical information was characterized by the organ volumes and the distance-to-target histogram (DTH). The database consists of 198 high quality prostate plans and was validated with 14 cases outside the training pool. Principal component analysis (PCA) was applied to the DVHs and DTHs to quantify their salient features. Then, support vector regression (SVR) was implemented to establish the correlation between the features of the DVH and the anatomical information. Results: DVH/DTH curves could be characterized sufficiently using only two or three truncated principal components; thus, patient anatomical information was quantified with a reduced number of variables. The evaluation of the model using the test data set demonstrated an accuracy of approximately 80% in prediction and effectiveness in improving ART planning quality. Conclusions: An adaptive IMRT plan quality evaluation tool based on machine learning has been developed, which estimates OAR sparing and provides a reference in evaluating ART.
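
    A minimal sketch of the method's core pipeline: PCA-compress the anatomical DTH features and regress a DVH feature with SVR. The random arrays below stand in for real DTH/DVH curves; the dimensions and the scikit-learn components are illustrative assumptions.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.svm import SVR

      rng = np.random.default_rng(0)
      n_plans, n_bins = 198, 50
      DTH = rng.random((n_plans, n_bins))        # stand-in distance-to-target histograms
      DVH_score = rng.random(n_plans)            # stand-in first-PC score of each plan's DVH

      pca = PCA(n_components=3).fit(DTH)         # two or three components suffice per the paper
      features = pca.transform(DTH)

      model = SVR(kernel="rbf").fit(features, DVH_score)
      new_plan = pca.transform(rng.random((1, n_bins)))
      print("predicted DVH feature for a new patient:", model.predict(new_plan))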

  17. Petri net-based scheduling of time constrained single-arm cluster tools with wafer revisiting

    Directory of Open Access Journals (Sweden)

    ZiCheng Liu

    2016-05-01

    Full Text Available It is very difficult to schedule a single-arm cluster tool with wafer revisiting such that wafer residency time constraints are satisfied. This article studies this challenging problem for a single-arm cluster tool with an atomic layer deposition process. With a so-called p-backward strategy applied, a Petri net model is developed to describe the dynamic behavior of the system. Based on the model, the existence of a feasible schedule is analyzed, schedulability conditions are derived, and scheduling algorithms are presented for the cases where a schedule exists. If the system is schedulable, a schedule is obtained simply by setting the robot waiting time, which is computationally very efficient, and the obtained schedule is shown to be optimal. Illustrative examples are given to demonstrate the proposed approach.

  18. Agent-Oriented Methodology and Modeling Tools

    Institute of Scientific and Technical Information of China (English)

    季强

    2002-01-01

    This paper introduces an agent-oriented methodology and modeling tools based on MAGE. The methodology supports the analysis, design and implementation of multi-agent systems. The modeling tools assist the developer in building multi-agent systems with the methodology through a set of visual model editors.

  19. Mathematical modelling: a tool for hospital infection control.

    OpenAIRE

    Grundmann, Hajo; Hellriegel, B

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has emerged that provides solid and testable hypotheses and opens the road to a quantitative assessment of the main obstructions that undermine current efforts to control the spread of health-care-associated infections.

  20. Tav4SB: integrating tools for analysis of kinetic models of biological systems

    Directory of Open Access Journals (Sweden)

    Rybiński Mikołaj

    2012-04-01

    Full Text Available Abstract Background Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases the integration of software tools for life science research and provides a common workflow-based framework for computational experiments in biology. Results The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow you to perform numerical simulations or model checking of, respectively, deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of the possibilities offered by our project, we apply multi-parameter sensitivity analysis. To visualize the results of model analysis, a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and the SBML ODE Solver. The user guide, contact information, full documentation of the available Web service operations, workflows and other additional resources can be found at the Tav4SB project's Web page: http://bioputer.mimuw.edu.pl/tav4sb/. Conclusions The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as the accessibility and usability of remote services.