WorldWideScience

Sample records for modelling tool websim-milq

  1. A new web-based modelling tool (Websim-MILQ) aimed at optimisation of thermal treatments in the dairy industry

    NARCIS (Netherlands)

    Schutyser, M.A.I.; Straatsma, J.; Keijzer, P.M.; Verschueren, M.; Jong, de P.

    2008-01-01

    In the framework of a cooperative EU research project (MILQ-QC-TOOL) a web-based modelling tool (Websim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs. ...

  2. A new web-based modelling tool (Websim-MILQ) aimed at optimisation of thermal treatments in the dairy industry.

    Science.gov (United States)

    Schutyser, M A I; Straatsma, J; Keijzer, P M; Verschueren, M; De Jong, P

    2008-11-30

    In the framework of a cooperative EU research project (MILQ-QC-TOOL) a web-based modelling tool (Websim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs. It can be applied to existing products and processes, but also to reduce time to market for new products. Important aspects of the tool are its user-friendliness and its specifications customised to the needs of small dairy companies. To challenge the web-based tool it was applied for optimisation of thermal treatments in 16 dairy companies producing yoghurt, fresh cream, chocolate milk and cheese. Optimisation with WebSim-MILQ resulted in concrete improvements with respect to risk of microbial contamination, cheese yield, fouling and production costs. In this paper we illustrate the use of WebSim-MILQ for optimisation of a cheese milk pasteurisation process, where we increased the cheese yield (1 extra cheese for each 100 cheeses produced from the same amount of milk) and reduced the risk of contamination of pasteurised cheese milk with thermoresistant streptococci from critical to negligible. In another case we demonstrate the advantage of changing from an indirect to a direct heating method for a UHT process, resulting in 80% less fouling while improving product quality and maintaining product safety.
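
    The record does not reproduce the model equations. As a hedged illustration only, thermal-treatment optimisers of this kind are commonly built on first-order microbial inactivation kinetics with an Arrhenius-type rate constant, which links a proposed heating profile T(t) to a predicted log reduction:

    ```latex
    % Illustrative first-order inactivation kinetics (assumed, not taken from the paper):
    % N(t) surviving microbial count, N_0 initial count, k(T) inactivation rate at temperature T
    \frac{dN}{dt} = -k(T)\,N,
    \qquad
    k(T) = k_{\mathrm{ref}} \exp\!\left[-\frac{E_a}{R}\left(\frac{1}{T} - \frac{1}{T_{\mathrm{ref}}}\right)\right],
    \qquad
    \log_{10}\frac{N_0}{N(t)} = \frac{1}{\ln 10}\int_0^t k\bigl(T(\tau)\bigr)\,\mathrm{d}\tau
    ```

    An optimiser can then trade the predicted log reduction (safety) against heat-induced quality losses and energy costs.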

  3. Population Density Modeling Tool

    Science.gov (United States)

    2014-02-05

    Population Density Modeling Tool, by Davy Andrew Michael Knott and David Burke, report NAWCADPAX/TR-2012/194 (Patuxent River, Maryland), 26 June 2012. [Only report documentation-page residue was extracted for this record; no abstract text is recoverable.]

  4. Graphical Modeling Language Tool

    NARCIS (Netherlands)

    Rumnit, M.

    2003-01-01

    The group of the faculty EE-Math-CS of the University of Twente is developing a graphical modeling language for specifying concurrency in software design. This graphical modeling language has a mathematical background based on the theory of CSP. This language contains the power to create trustworthy ...

  5. Tools for Model Evaluation

    DEFF Research Database (Denmark)

    Olesen, H. R.

    1998-01-01

    Proceedings of the Twenty-Second NATO/CCMS International Technical Meeting on Air Pollution Modeling and Its Application, held June 6-10, 1997, in Clermont-Ferrand, France.

  6. Integrating a Decision Management Tool with UML Modeling Tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Numerous design decisions are made while developing software systems, which influence the architecture of these systems as well as subsequent decisions. A number of decision management tools already exist for capturing, documenting, and maintaining design decisions, but also for guiding developers through the development process. In this report, we propose an integration of a decision management and a UML-based modeling tool, based on use cases we distill from a case study: the modeling tool shall show all decisions related to a model and allow its users to extend or update them; the decision management tool shall trigger the modeling tool to realize design decisions in the models. We define tool-independent concepts and architecture building blocks supporting these use cases and present how they can be implemented in the IBM Rational Software Modeler and Architectural Decision Knowledge Wiki. This seamless ...

  7. Modeling Tool Advances Rotorcraft Design

    Science.gov (United States)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  8. Alien wavelength modeling tool and field trial

    DEFF Research Database (Denmark)

    Sambo, N.; Sgambelluri, A.; Secondini, M.

    2015-01-01

    A modeling tool is presented for pre-FEC BER estimation of PM-QPSK alien wavelength signals. A field trial is demonstrated and used as validation of the tool's correctness. A very close correspondence between the performance of the field trial and the one predicted by the modeling tool has been...

  9. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new ...

  10. Fire behavior modeling-a decision tool

    Science.gov (United States)

    Jack Cohen; Bill Bradshaw

    1986-01-01

    The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...

  11. Spatial Modeling Tools for Cell Biology

    Science.gov (United States)

    2006-10-01

    [Figure and table-of-contents residue; recoverable captions: Figure 4.1, "Cell biology simulation environment" (naming the Open Microscopy Environment, Pre-CoBi Model Library, CFDRC CoBi Tools and JigCell Tools), and Figure 5.1, "Computational results for a diffusion problem on planar square thin film".]

  12. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; if a certified tools list one day exists, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product, or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a level of agreement high enough for the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. This list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions.

  13. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application-specific models that are fit for purpose. There is a range of computer-aided modelling tools available that help to define ... a taxonomy of aspects around conservation, constraints and constitutive relations. Aspects of the ICAS-MoT toolbox are given to illustrate the functionality of a computer-aided modelling tool, which incorporates an interface to MS Excel.

  14. Model Analysis ToolKit

    Energy Technology Data Exchange (ETDEWEB)

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes defining parameters, defining observations, defining the model (a Python function), and defining samplesets (sets of parameter combinations). Currently supported functionality includes: forward model runs; Latin-hypercube sampling of parameters; multi-dimensional parameter studies; parallel execution of parameter samples; model calibration using an internal Levenberg-Marquardt algorithm, the lmfit package, or the levmar package; and Markov chain Monte Carlo using the pymc package. MATK facilitates model analysis using scipy for calibration (scipy.optimize) and rpy2 as a Python interface to R.
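
    The record describes the workflow but not MATK's actual API. The sketch below reproduces the same define-parameters / define-observations / sample / calibrate loop using plain numpy and scipy instead; all names are illustrative, not MATK calls.

    ```python
    # Sketch of the workflow the record describes, written against plain
    # numpy/scipy rather than MATK's actual API. All names are illustrative.
    import numpy as np
    from scipy.optimize import least_squares
    from scipy.stats import qmc

    def model(params, t):
        """Forward model: exponential decay as a stand-in for a user's model."""
        amplitude, rate = params
        return amplitude * np.exp(-rate * t)

    # Synthetic "observations" the calibration should reproduce.
    t_obs = np.linspace(0.0, 10.0, 20)
    rng = np.random.default_rng(0)
    y_obs = model([2.0, 0.3], t_obs) + 0.01 * rng.normal(size=t_obs.size)

    # A Latin-hypercube sampleset over the two parameters (cf. MATK's samplesets).
    sampleset = qmc.LatinHypercube(d=2, seed=0).random(n=8)

    # Pick the best sample as a starting point, then Levenberg-Marquardt-calibrate
    # (the record lists an internal LM algorithm among MATK's options).
    best = min(sampleset, key=lambda p: float(np.sum((model(p, t_obs) - y_obs) ** 2)))
    fit = least_squares(lambda p: model(p, t_obs) - y_obs, x0=best, method="lm")
    print("calibrated parameters:", fit.x)
    ```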

  15. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  16. Simulation Tool for Inventory Models: SIMIN

    OpenAIRE

    Pratiksha Saxen; Tulsi Kushwaha

    2014-01-01

    In this paper, an integrated simulation optimization model for the inventory system is developed. An effective algorithm is developed to evaluate and analyze the simulation results stored at the back end. The paper proposes the simulation tool SIMIN (Inventory Simulation) to simulate inventory models. SIMIN is a tool which simulates and compares the results of different inventory models. To overcome various practical restrictive assumptions, SIMIN provides values for a number of performance measurement ...

  17. ANSYS tools in modeling tires

    Science.gov (United States)

    Ali, Ashraf; Lovell, Michael

    1995-08-01

    This presentation summarizes the capabilities in the ANSYS program that relate to the computational modeling of tires. The power and the difficulties associated with modeling nearly incompressible rubber-like materials using hyperelastic constitutive relationships are highlighted from a developer's point of view. The topics covered include a hyperelastic material constitutive model for rubber-like materials, a general overview of contact-friction capabilities, and the acoustic fluid-structure interaction problem for noise prediction. Brief theoretical development and example problems are presented for each topic.

  18. Comparison of two different modelling tools

    DEFF Research Database (Denmark)

    Brix, Wiebke; Elmegaard, Brian

    2009-01-01

    In this paper a test case is solved using two different modelling tools, Engineering Equation Solver (EES) and WinDali, in order to compare the tools. The system of equations solved is a static model of an evaporator used for refrigeration. The evaporator consists of two parallel channels, ... and it is investigated how a non-uniform airflow influences the refrigerant mass flow rate distribution and the total cooling capacity of the heat exchanger. It is shown that the cooling capacity decreases significantly with increasing maldistribution of the airflow. Comparing the two simulation tools it is found ...

  1. Cockpit System Situational Awareness Modeling Tool

    Science.gov (United States)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  2. The European Trans-Tools transport model

    NARCIS (Netherlands)

    Rooijen, T. van; Burgess, A.

    2008-01-01

    The paper presents the use of ArcGIS in the Transtools Transport Model, TRANS-TOOLS, created by an international consortium for the European Commission. The model describes passenger as well as freight transport in Europe with all medium- and long-distance modes (cars, vans, trucks, train, inland waterways, ...).

  3. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh;

    ... a design methodology, called ForSyDe. ForSyDe is available under an open-source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system-level modeling of a simple industrial use case, and we ...

  4. Tool Force Model for Diamond Turning

    Institute of Scientific and Technical Information of China (English)

    Wang Hongxiang; Sun Tao; Li Dan; Dong Shen

    2004-01-01

    A new tool force model is presented, based upon the process geometry and the characteristics of the force system, in which the forces acting on the tool rake face, the cutting edge rounding and the clearance face are considered; the size effect is also accounted for in the new model. The model is intended to be applicable to conventional diamond turning, and it may be employed as a tool in the design of diamond tools. This approach is quite different from traditional investigations based primarily on empirical studies. As the depth of cut becomes the same order as the rounded cutting edge radius, sliding along the clearance face due to elastic recovery of the workpiece material and plowing due to the rounded cutting edge may become important in micro-machining, so the forces acting on the cutting edge rounding and the clearance face cannot be neglected. For this reason, it is very important to understand the influence of process parameters on tool forces and to develop a model of the relationship between them.
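
    No equations are given in the record; the stated three-way decomposition can be written schematically as below, in assumed notation rather than the authors' own.

    ```latex
    % Schematic decomposition assumed from the abstract's description:
    % total tool force = rake-face term + edge-rounding term + clearance-face term,
    % with the last two terms non-negligible when the depth of cut a_p is of the
    % same order as the rounded cutting-edge radius r_n.
    \mathbf{F} = \mathbf{F}_{\mathrm{rake}} + \mathbf{F}_{\mathrm{edge}} + \mathbf{F}_{\mathrm{clearance}},
    \qquad
    \mathbf{F}_{\mathrm{edge}},\,\mathbf{F}_{\mathrm{clearance}} \not\approx \mathbf{0}
    \quad \text{when} \quad a_p \sim r_n
    ```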

  5. A review of electricity market modelling tools

    Directory of Open Access Journals (Sweden)

    Sandra Milena Londoño Hernández

    2010-05-01

    Deregulating electricity markets around the world in the search for efficiency has introduced competition into the electricity marketing and generation business. Studying interactions amongst the participants has thus acquired great importance for regulators and market participants for analysing market evolution and suitably defining their bidding strategies. Different tools have therefore been used for modelling competitive electricity markets during the last few years. This paper presents an analytical review of the bibliography found regarding this subject; it also presents the most used tools along with their advantages and disadvantages. Such analysis was done by comparing the models used, identifying the main market characteristics such as market structure, bid structure and kind of bidding. This analysis concluded that the kind of tool to be used mainly depends on a particular study’s goal and scope.

  6. Hydrological Processes Modelling Using Advanced Hydroinformatic Tools

    Directory of Open Access Journals (Sweden)

    Beilicci Erika

    2014-03-01

    Water has an essential role in the functioning of ecosystems by integrating the complex physical, chemical, and biological processes that sustain life. Water is a key factor in determining the productivity of ecosystems, biodiversity and species composition. Water is also essential for humanity: water supply systems for population, agriculture, fisheries, industries, and hydroelectric power depend on water supplies. The modelling of hydrological processes is an important activity for water resources management, especially now, when climate change is one of the major challenges of our century, with strong influence on the dynamics of hydrological processes. Climate change and the need for more knowledge of water resources require the use of advanced hydroinformatic tools in hydrological process modelling. The rationale and purpose of advanced hydroinformatic tools is to develop a new relationship between the stakeholders and the users and suppliers of the systems: to offer the basis (systems which supply useable results, the validity of which cannot be put in reasonable doubt by any of the stakeholders involved). Successful modelling of hydrological processes also requires specialists who are well trained and able to use advanced hydroinformatic tools. The results of modelling can be a useful tool for decision makers in taking efficient measures in the social, economic and ecological domains regarding water resources, as part of integrated water resources management.

  7. An analytical model for resistivity tools

    Energy Technology Data Exchange (ETDEWEB)

    Hovgaard, J.

    1991-04-01

    An analytical model for resistivity tools is developed. It takes into account the effect of the borehole and the actual shape of the electrodes. The model is two-dimensional, i.e. it does not deal with eccentricity. The electrical potential around a current source satisfies Poisson's equation. The method used here to solve Poisson's equation is the expansion of the potential function in terms of a complete set of functions involving one of the coordinates, with coefficients which are undetermined functions of the other coordinate. Numerical examples of the use of the model are presented. The results are compared with results given in the literature.
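
    As a sketch of the stated solution method (assumed notation, not taken from the paper): in borehole coordinates (r, z) the potential obeys Poisson's equation with a point current source, and the z-dependence can be expanded in a complete set of functions, leaving ordinary differential equations for the radial coefficients.

    ```latex
    % Assumed notation: sigma conductivity, I source current, (r_0, z_0) electrode location
    \nabla^{2}\phi(r,z) = -\frac{I}{\sigma}\,\delta(r-r_{0})\,\delta(z-z_{0}),
    \qquad
    \phi(r,z) = \sum_{n} c_{n}(r)\,\psi_{n}(z)
    ```

    Substituting the expansion turns Poisson's equation into ODEs for the coefficients c_n(r); for cosine modes psi_n(z) = cos(k_n z), the homogeneous radial solutions are the modified Bessel functions I_0(k_n r) and K_0(k_n r).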

  8. Induction generator models in dynamic simulation tools

    DEFF Research Database (Denmark)

    Knudsen, Hans; Akhmatov, Vladislav

    1999-01-01

    For an AC network with a large amount of induction generators (windmills), the paper demonstrates a significant discrepancy in the simulated voltage recovery after faults in weak networks when comparing dynamic and transient stability descriptions, and the reasons for the discrepancies are explained. It is found to be possible to include a transient model in dynamic stability tools and then obtain correct results also in dynamic tools. The representation of the rotating system influences the voltage recovery shape, which is an important observation in the case of windmills, where a heavy mill is connected ...

  9. Animal models: an important tool in mycology.

    Science.gov (United States)

    Capilla, Javier; Clemons, Karl V; Stevens, David A

    2007-12-01

    Animal models of fungal infections are, and will remain, a key tool in the advancement of medical mycology. Many different types of animal models of fungal infection have been developed, with murine models the most frequently used, for studies of pathogenesis, virulence, immunology, diagnosis, and therapy. The ability to control numerous variables when applying the model allows us to mimic human disease states and quantitatively monitor the course of the disease. However, no single model can answer all questions, and different animal species or different routes of infection can show somewhat different results. Thus, the choice of which animal model to use must be made carefully, addressing the type of human disease to mimic, the parameters to follow, and the collection of appropriate data to answer the questions being asked. This review addresses a variety of uses for animal models in medical mycology. It focuses on the most clinically important diseases affecting humans and cites various examples of the different types of studies that have been performed. Overall, animal models of fungal infection will continue to be valuable tools in addressing questions concerning fungal infections, and they contribute to our deeper understanding of how these infections occur, progress, and can be controlled and eliminated.

  10. A tool box for implementing supersymmetric models

    Science.gov (United States)

    Staub, Florian; Ohl, Thorsten; Porod, Werner; Speckner, Christian

    2012-10-01

    We present a framework for performing a comprehensive analysis of a large class of supersymmetric models, including spectrum calculation, dark matter studies and collider phenomenology. To this end, the respective model is defined in an easy and straightforward way using the Mathematica package SARAH. SARAH then generates model files for CalcHep, which can be used with micrOMEGAs, as well as model files for WHIZARD and O'Mega. In addition, Fortran source code for SPheno is created, which facilitates the determination of the particle spectrum using two-loop renormalization group equations and one-loop corrections to the masses. As an additional feature, the generated SPheno code can write out input files suitable for use with HiggsBounds to apply bounds coming from the Higgs searches to the model. Combining all programs provides a closed chain from model building to phenomenology. Program summary. Program title: SUSY Phenomenology toolbox. Catalog identifier: AEMN_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMN_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 140206. No. of bytes in distributed program, including test data, etc.: 1319681. Distribution format: tar.gz. Programming language: Autoconf, Mathematica. Computer: PC running Linux, Mac. Operating system: Linux, Mac OS. Classification: 11.6. Nature of problem: Comprehensive studies of supersymmetric models beyond the MSSM are considerably complicated by the number of different tasks that have to be accomplished, including the calculation of the mass spectrum and the implementation of the model into tools for performing collider studies, calculating the dark matter density and checking the compatibility with existing collider bounds (in particular, from the Higgs searches). Solution method: ...

  11. GridTool: A surface modeling and grid generation tool

    Science.gov (United States)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to but not on the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool is stored parametrically so that once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI 'C', the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX4.0 and above. The memory is allocated dynamically; therefore, memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked-list structure, which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects such as points, curves, patches, sources and surfaces. At any given time, there is always an active object, which is drawn in magenta or in its highlighted color as defined by the resource file, which will be discussed later.

  12. An MCMC Circumstellar Disks Modeling Tool

    Science.gov (United States)

    Wolff, Schuyler; Perrin, Marshall D.; Mazoyer, Johan; Choquet, Elodie; Soummer, Remi; Ren, Bin; Pueyo, Laurent; Debes, John H.; Duchene, Gaspard; Pinte, Christophe; Menard, Francois

    2016-01-01

    We present an enhanced software framework for Markov chain Monte Carlo modeling of circumstellar disk observations, including spectral energy distributions and multi-wavelength images from a variety of instruments (e.g. GPI, NICI, HST, WFIRST). The goal is to self-consistently and simultaneously fit a wide variety of observables in order to place constraints on the physical properties of a given disk, while also rigorously assessing the uncertainties in the derived properties. This modular code is designed to work with a collection of existing modeling tools, ranging from simple scripts that define the geometry of optically thin debris disks to full radiative transfer modeling of complex grain structures in protoplanetary disks (using the MCFOST radiative transfer modeling code). The MCMC chain relies on direct chi-squared comparison of model images/spectra to observations. We will include a discussion of how best to weight different observations in the modeling of a single disk and how to incorporate forward modeling from PCA PSF subtraction techniques. The code is open source, written in Python, and available from GitHub. Results for several disks at various evolutionary stages will be discussed.
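
    The chain's accept/reject rule is simple to state: with Gaussian errors, a likelihood proportional to exp(-chi^2/2) gives the Metropolis rule sketched below. The quadratic "disk model" here is a hypothetical stand-in for a radiative-transfer image or SED calculation, not the framework's actual code.

    ```python
    # Minimal Metropolis sketch of a chi-squared-driven fitting loop, with a
    # toy quadratic model standing in for a real disk image/SED calculation.
    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(0.0, 1.0, 50)
    sigma = 0.05
    data = 1.0 + 0.5 * x**2 + rng.normal(0.0, sigma, x.size)

    def chi2(theta):
        a, b = theta
        model = a + b * x**2
        return np.sum(((data - model) / sigma) ** 2)

    theta = np.array([0.5, 0.5])
    chain = []
    for _ in range(5000):
        proposal = theta + rng.normal(0.0, 0.02, size=2)
        # Accept with probability exp(-(chi2_new - chi2_old) / 2).
        if np.log(rng.uniform()) < 0.5 * (chi2(theta) - chi2(proposal)):
            theta = proposal
        chain.append(theta.copy())

    print("posterior mean after burn-in:", np.mean(chain[1000:], axis=0))
    ```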

  13. WMT: The CSDMS Web Modeling Tool

    Science.gov (United States)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can: Design a model from a set of components Edit component parameters Save models to a web-accessible server Share saved models with the community Submit runs to an HPC system Download simulation results The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db: database of component, model, and simulation metadata and output wmt-api: configure and connect components wmt-exe: launch simulations on remote execution servers The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged
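
    As a purely hypothetical sketch of the coupling step the abstract describes (JSON metadata with "uses" and "provides" ports), the snippet below matches ports between two components; the field names and component entries are illustrative, not the actual wmt-db schema.

    ```python
    # Hypothetical example of JSON-encoded component metadata and port matching;
    # the schema below is invented for illustration, not wmt-db's real format.
    import json

    wmt_db_response = """
    [
      {"name": "hydrotrend", "provides": ["water_discharge"], "uses": []},
      {"name": "cem",        "provides": ["shoreline"],       "uses": ["water_discharge"]}
    ]
    """

    components = json.loads(wmt_db_response)

    # Couple components by matching "uses" ports to "provides" ports.
    for component in components:
        for port in component["uses"]:
            suppliers = [c["name"] for c in components if port in c["provides"]]
            print(f'{component["name"]} gets "{port}" from {suppliers}')
    ```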

  14. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    Over the last 30 years, numerous model-driven software development systems have been offered to address problems of development productivity and resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims for Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending with a code generator (which may use a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations for the most advanced ones. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  15. Collaboro: a collaborative (meta)modeling tool

    Directory of Open Access Journals (Sweden)

    Javier Luis Cánovas Izquierdo

    2016-10-01

    Software development is becoming more and more collaborative, emphasizing the role of end-users in the development process to make sure the final product will satisfy customer needs. This is especially relevant when developing Domain-Specific Modeling Languages (DSMLs), which are modeling languages specifically designed to carry out the tasks of a particular domain. While end-users are actually the experts of the domain for which a DSML is developed, their participation in the DSML specification process is still rather limited nowadays. In this paper, we propose a more community-aware language development process by enabling the active participation of all community members (both developers and end-users) from the very beginning. Our proposal, called Collaboro, is based on a DSML itself, enabling the representation of change proposals during the language design and the discussion (and trace back) of possible solutions, comments and decisions arising during the collaboration. Collaboro also incorporates a metric-based recommender system to help community members to define high-quality notations for the DSMLs. We also show how Collaboro can be used at the model-level to facilitate the collaborative specification of software models. Tool support is available both as an Eclipse plug-in and as a web-based solution.

  16. A pandemic influenza modeling and visualization tool

    Energy Technology Data Exchange (ETDEWEB)

    Maciejewski, Ross; Livengood, Philip; Rudolph, Stephen; Collins, Timothy F.; Ebert, David S.; Brigantic, Robert T.; Corley, Courtney D.; Muller, George A.; Sanders, Stephen W.

    2011-08-01

    The National Strategy for Pandemic Influenza outlines a plan for community response to a potential pandemic. In this outline, state and local communities are charged with enhancing their preparedness. In order to help public health officials better understand these charges, we have developed a modeling and visualization toolkit (PanViz) for analyzing the effect of decision measures implemented during a simulated pandemic influenza scenario. Spread vectors based on the point of origin and distance traveled over time are calculated, and the factors of age distribution and population density are taken into account. Healthcare officials are able to explore the effects of the pandemic on the population through a spatiotemporal view, moving forward and backward through time and inserting decision points at various days to determine the impact. Linked statistical displays are also shown, providing county-level summaries of data in terms of the number of sick, hospitalized and dead as a result of the outbreak. Currently, this tool has been deployed in Indiana State Department of Health planning and preparedness exercises, and as an educational tool for demonstrating the impact of social distancing strategies during the recent H1N1 (swine flu) outbreak.

  17. Collaborative Inquiry Learning: Models, tools, and challenges

    Science.gov (United States)

    Bell, Thorsten; Urhahne, Detlef; Schanze, Sascha; Ploetzner, Rolf

    2010-02-01

    Collaborative inquiry learning is one of the most challenging and exciting ventures for today's schools. It aims at bringing a new and promising culture of teaching and learning into the classroom where students in groups engage in self-regulated learning activities supported by the teacher. It is expected that this way of learning fosters students' motivation and interest in science, that they learn to perform steps of inquiry similar to scientists and that they gain knowledge on scientific processes. Starting from general pedagogical reflections and science standards, the article reviews some prominent models of inquiry learning. This comparison results in a set of inquiry processes being the basis for cooperation in the scientific network NetCoIL. Inquiry learning is conceived in several ways with emphasis on different processes. For an illustration of the spectrum, some main conceptions of inquiry and their focuses are described. In the next step, the article describes exemplary computer tools and environments from within and outside the NetCoIL network that were designed to support processes of collaborative inquiry learning. These tools are analysed by describing their functionalities as well as effects on student learning known from the literature. The article closes with challenges for further developments elaborated by the NetCoIL network.

  18. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives developing tools for clinical information modeling identified, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  1. Data Quality Tools for Data Warehouse Models

    Directory of Open Access Journals (Sweden)

    Jaspreeti Singh

    2015-05-01

    Data quality tools aim at detecting and correcting data problems that influence the accuracy and efficiency of data analysis applications. Data warehousing activities require data quality tools to ready the data and ensure that clean data populates the warehouse, thus raising the usability of the warehouse. This research targets the problems in the data that are addressed by data quality tools. We classify data quality tools based on data warehouse stages and on tool features, examine which data quality problems they address, and describe their functionalities.

  2. General model for boring tool optimization

    Science.gov (United States)

    Moraru, G. M.; Zerbes, M. V.; Popescu, L. G.

    2016-08-01

    Optimizing a tool (and therefore a boring tool) consists in improving its performance by maximizing the objective functions chosen by the designer and/or the user. Numerous features and performance requirements demanded by tool users contribute to defining and implementing the proposed objective functions. Incorporating new features makes the cutting tool competitive in the market and able to meet user requirements.

  3. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product, and the process to manufacture it, need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial-and-error based experiments. ... The idea of the model-based framework is that in the design, development and/or manufacturing of a chemical product-process, the knowledge of the applied phenomena together with the product-process design details can be provided with diverse degrees of abstraction and detail. This would allow the experimental resources ... But are the needed models for such a framework available? Are modelling tools that can help to develop the needed models available? Can such a model-based framework provide the needed model-based work-flows matching the requirements of specific chemical product-process design problems? What types of models ...

  4. Modeling, methodologies and tools for molecular and nano-scale communications modeling, methodologies and tools

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    (Preliminary) The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and to advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront of their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  5. Modeling and Simulation Tools for Heavy Lift Airships

    Science.gov (United States)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed wing and rotary wing aircraft a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the National laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. A survey of the tools currently available will be assessed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  6. Tool for physics beyond the standard model

    Science.gov (United States)

    Newby, Christopher A.

    The standard model (SM) of particle physics is a well studied theory, but there are hints that the SM is not the final story. What the full picture is, no one knows, but this thesis looks into three methods useful for exploring a few of the possibilities. To begin I present a paper by Spencer Chang, Nirmal Raj, Chaowaroj Wanotayaroj, and me, that studies the Higgs boson. The scalar particle first seen in 2012 may be the vanilla SM version, but there is some evidence that its couplings are different than predicted. By means of increasing the Higgs' coupling to vector bosons and fermions, we can be more consistent with the data. Next, in a paper by Spencer Chang, Gabriel Barello, and me, we elaborate on a tool created to study dark matter (DM) direct detection. The original work by Anand et al. focused on elastic dark matter, whereas we extended this work to include the inelastic case, where different DM mass states enter and leave the collision. We also examine several direct detection experiments with our new framework to see if DAMA's modulation can be explained while avoiding the strong constraints imposed by the other experiments. We find that there are several operators that can do this. Finally, in a paper by Spencer Chang, Gabriel Barello, and me, we study an interesting phenomenon known as kinetic mixing, where two gauge bosons can share interactions with particles even though these particles aren't charged under both gauge groups. This, in and of itself, is not new, but we discuss a different method of obtaining this mixing where, instead of mixing between two Abelian groups, one of the groups is Nonabelian. Using this we then see that there is an inherent mass scale in the mixing strength; something that is absent in the Abelian-Abelian case. Furthermore, if the Nonabelian symmetry is the SU(2)L of the SM then the mass scale of the physics responsible for the mixing is about 1 TeV, right around the sweet spot for detection at the LHC. This dissertation ...

  7. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper;

    2003-01-01

    Experimental modelling is an important tool for study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models, and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied for improvement of the reliability of physical model results. This paper demonstrates by examples that numerical modelling benefits in various ways from experimental studies (in large and small laboratory facilities). The examples range from very general hydrodynamic descriptions of wave phenomena to specific hydrodynamic interaction with structures. The examples also show that numerical model development benefits from international co-operation and sharing of high quality results.

  8. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe. ...

  9. Storm Water Management Model Climate Adjustment Tool (SWMM-CAT)

    Science.gov (United States)

    The US EPA’s newest tool, the Stormwater Management Model (SWMM) – Climate Adjustment Tool (CAT), is meant to help municipal stormwater utilities better address potential climate change impacts affecting their operations. SWMM, first released in 1971, models hydrology and hydraulics ...

  10. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report provides a description of the wind turbine modelling, both at a component level and at a system level.

  11. Modeling Languages: metrics and assessing tools

    OpenAIRE

    Fonte, Daniela; Boas, Ismael Vilas; Azevedo, José; Peixoto, José João; Faria, Pedro; Silva, Pedro; Sá, Tiago de; Costa, Ulisses; da Cruz, Daniela; Henriques, Pedro Rangel

    2012-01-01

    Any traditional engineering field has metrics to rigorously assess the quality of its products. Engineers know that the output must satisfy the requirements, must comply with production and market rules, and must be competitive. Professionals in the new field of software engineering started a few years ago to define metrics to appraise their product: individual programs and software systems. This concern motivates the need to assess not only the outcome but also the process and tools employed ...

  12. The mathematical and computer modeling of the worm tool shaping

    Science.gov (United States)

    Panchuk, K. L.; Lyashkov, A. A.; Ayusheev, T. V.

    2017-06-01

    Traditionally, mathematical profiling of the worm tool is carried out by the first T. Olivier method, known in the theory of gearing, by obtaining an intermediate surface of the making lath. This complicates the profiling process and its realization by means of computer 3D-modeling. The purpose of the work is the improvement of the mathematical model of profiling and its realization based on methods of 3D-modeling. The research problems are: obtaining a mathematical model of profiling which excludes the presence of the making lath; realization of the received model by means of frame and surface modeling; and development and approbation of a technology of solid-state modeling for the solution of the profiling problem. The kinematic method of research of mutually enveloping surfaces is accepted as the basis. Computer research is executed by means of CAD based on the methods of 3D-modeling. We have developed a mathematical model of profiling of the worm tool; frame, surface and solid-state models of shaping of the mutually enveloping surfaces of the detail and the tool are obtained. The offered mathematical models and the technologies of 3D-modeling of shaping represent tools for theoretical and experimental profiling of the worm tool. The results of the research can be used in the design of metal-cutting tools.
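
    As a sketch of the kinematic (envelope) method mentioned above, in standard gearing notation (assumed here, not the authors' own), the tool surface is found as the envelope of the family of workpiece surfaces generated by the relative motion, i.e. the points satisfying the equation of meshing:

    ```latex
    % Assumed standard notation: r(u,v) surface parametrization, phi motion parameter,
    % n the surface normal, v^(12) the relative velocity between workpiece and tool,
    % M_21(phi) the coordinate transformation between the two bodies
    \mathbf{n}(u,v)\cdot\mathbf{v}^{(12)}(u,v,\varphi) = 0
    \quad\text{together with}\quad
    \mathbf{r}^{(2)}(u,v,\varphi) = \mathbf{M}_{21}(\varphi)\,\mathbf{r}^{(1)}(u,v)
    ```

    Solving the meshing equation for the motion parameter and substituting back yields the enveloping (tool) surface directly, which is what allows the intermediate lath surface to be bypassed.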

  13. Many-Task Computing Tools for Multiscale Modeling

    OpenAIRE

    Katz, Daniel S.; Ripeanu, Matei; Wilde, Michael

    2011-01-01

    This paper discusses the use of many-task computing tools for multiscale modeling. It defines multiscale modeling and places different examples of it on a coupling spectrum, discusses the Swift parallel scripting language, describes three multiscale modeling applications that could use Swift, and then talks about how the Swift model is being extended to cover more of the multiscale modeling coupling spectrum.

  14. Scratch as a computational modelling tool for teaching physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-05-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling programs. In this article, we briefly discuss why Scratch could be a useful tool for computational modelling in the primary or secondary physics classroom, and we present practical examples of how it can be used to build a model.
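
    A Scratch model of, say, free fall is typically a "forever" loop that updates velocity and position each tick. The following is a Python transcription of that idea (the time step, gravity value and start height are arbitrary illustrative values, not taken from the article):

    ```python
    # Python analogue of the velocity/position update loop a student would
    # build with Scratch blocks; values are arbitrary illustrative choices.
    dt = 0.1            # simulation time step, s
    g = -9.8            # gravitational acceleration, m/s^2
    y, v = 100.0, 0.0   # initial height (m) and velocity (m/s)

    t = 0.0
    while y > 0.0:      # Scratch: "repeat until touching ground"
        v += g * dt     # Scratch: "change v by g*dt"
        y += v * dt     # Scratch: "change y by v*dt"
        t += dt

    print(f"hit the ground after about {t:.1f} s")
    ```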

  15. Shape: A 3D Modeling Tool for Astrophysics.

    Science.gov (United States)

    Steffen, Wolfgang; Koning, Nicholas; Wenger, Stephan; Morisset, Christophe; Magnor, Marcus

    2011-04-01

    We present a flexible interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data type, and amount that is required for a reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model prediction with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been successfully used in a number of astrophysical research projects.

  16. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    ... a user-friendly system, which will make the model development process easier and faster and provide the way for unified and consistent model documentation. The modeller can use the template for their specific problem or to extend and/or adopt a model. This is based on the idea of model reuse, which emphasizes the use ... and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer-aided methods and tools that include procedures to perform model translation, model analysis, model verification/validation, model solution and model documentation; 4) model transfer – export/import to/from other applications for further extension and application – several types of formats, such as XML ...

  17. Applying Modeling Tools to Ground System Procedures

    Science.gov (United States)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  18. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Science.gov (United States)

    Biggs, Matthew B; Papin, Jason A

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.
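
    MatNet itself couples NetLogo agents to MATLAB; purely as an illustration of the coupling pattern described above, the Python sketch below steps a toy population of biofilm agents, each of which queries a stand-in "metabolic model" for its growth rate. The Monod-style rate function is a placeholder for a real constraint-based (FBA) calculation, and all parameters are invented:

    ```python
    # Toy analog of the hybrid ABM/metabolic coupling (not MatNet itself).
    import random

    def metabolic_growth_rate(oxygen, nitrate=0.0):
        # Placeholder for a constraint-based model call: aerobic growth plus a
        # weaker anaerobic (nitrate-respiring) term, echoing the prediction above.
        return 0.5 * oxygen / (0.2 + oxygen) + 0.1 * nitrate / (0.5 + nitrate)

    class Cell:
        def __init__(self, depth):
            self.depth = depth          # depth in the biofilm (limits oxygen)
            self.biomass = 1.0

    grid = [Cell(depth=random.uniform(0.0, 1.0)) for _ in range(100)]

    for step in range(50):              # agent-based time loop
        for cell in grid:
            oxygen = max(0.0, 1.0 - cell.depth)   # crude O2 gradient with depth
            mu = metabolic_growth_rate(oxygen, nitrate=0.2)
            cell.biomass *= 1.0 + mu * 0.1        # grow over one time step

    print(f"mean biomass: {sum(c.biomass for c in grid) / len(grid):.2f}")
    ```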

  19. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Directory of Open Access Journals (Sweden)

    Matthew B Biggs

    Full Text Available Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  20. XLISP-Stat Tools for Building Generalised Estimating Equation Models

    Directory of Open Access Journals (Sweden)

    Thomas Lumley

    1996-12-01

    Full Text Available This paper describes a set of Lisp-Stat tools for building Generalised Estimating Equation models to analyse longitudinal or clustered measurements. The user interface is based on the built-in regression and generalised linear model prototypes, with the addition of object-based error functions, correlation structures and model formula tools. Residual and deletion diagnostic plots are available on the cluster and observation level and use the dynamic graphics capabilities of Lisp-Stat.
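
    Lisp-Stat is rarely used today, but the same kind of model is easy to reproduce. As a rough modern analog (not the paper's tool), the sketch below fits a GEE with Poisson errors and an exchangeable working correlation using Python's statsmodels, on synthetic clustered data:

    ```python
    # GEE fit analogous to the Lisp-Stat tools described above; data are synthetic.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n_clusters, per_cluster = 40, 5
    groups = np.repeat(np.arange(n_clusters), per_cluster)
    x = rng.normal(size=n_clusters * per_cluster)
    cluster_effect = np.repeat(rng.normal(scale=0.3, size=n_clusters), per_cluster)
    y = rng.poisson(np.exp(0.2 + 0.5 * x + cluster_effect))

    data = pd.DataFrame({"y": y, "x": x, "group": groups})
    model = sm.GEE.from_formula(
        "y ~ x", groups="group", data=data,
        family=sm.families.Poisson(),
        cov_struct=sm.cov_struct.Exchangeable(),
    )
    result = model.fit()
    print(result.summary())     # robust (sandwich) standard errors by default
    ```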

  1. The Ising model as a pedagogical tool

    Science.gov (United States)

    Smith, Ryan; Hart, Gus L. W.

    2010-10-01

    Though originally developed to analyze ferromagnetic systems, the Ising model also provides an excellent framework for modeling alloys. The original Ising model represented magnetic moments (up or down) by a +1 or -1 at each point on a lattice and allowed only nearest-neighbor interactions to be non-zero. In alloy modeling, the values ±1 represent A and B atoms. The Ising Hamiltonian can be used in a Monte Carlo approach to simulate the thermodynamics of the system (e.g., an order-disorder transition occurring as the temperature is lowered). The simplicity of the model makes it an ideal starting point for a qualitative understanding of magnetism or configurational ordering in a metal. I will demonstrate the application of the Ising model in simple, two-dimensional ferromagnetic systems and alloys.
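
    A minimal sketch of the simulation described above, assuming a square lattice with periodic boundaries and Metropolis sampling (lattice size, coupling and temperature are arbitrary choices):

    ```python
    # 2-D Ising model with nearest-neighbour coupling J, Metropolis algorithm.
    # Spins +1/-1 can be read as magnetic moments or as A/B atoms in an alloy.
    import numpy as np

    rng = np.random.default_rng(1)
    L, J, T, steps = 32, 1.0, 2.0, 200_000   # lattice size, coupling, temperature
    spins = rng.choice([-1, 1], size=(L, L))

    for _ in range(steps):
        i, j = rng.integers(L, size=2)
        # Sum of the four nearest neighbours (periodic boundaries).
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * J * spins[i, j] * nb      # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1                # Metropolis acceptance

    print("magnetisation per site:", spins.mean())
    ```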

  2. A Components Library System Model and the Support Tool

    Institute of Scientific and Technical Information of China (English)

    MIAO Huai-kou; LIU Hui; LIU Jing; LI Xiao-bo

    2004-01-01

    Component-based development needs a well-designed components library and a set of support tools. This paper presents the design and implementation of a components library system model and its support tool UMLCASE. A set of practical CASE tools is constructed. UMLCASE can use UML to design Use Case Diagrams, Class Diagrams, etc., and it integrates with the components library system.

  3. Advanced REACH tool: A Bayesian model for occupational exposure assessment

    NARCIS (Netherlands)

    McNally, K.; Warren, N.; Fransman, W.; Entink, R.K.; Schinkel, J.; Van Tongeren, M.; Cherrie, J.W.; Kromhout, H.; Schneider, T.; Tielemans, E.

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate sources…

  4. Techniques and tools for efficiently modeling multiprocessor systems

    Science.gov (United States)

    Carpenter, T.; Yalamanchili, S.

    1990-01-01

    System-level tools and methodologies associated with an integrated approach to the development of multiprocessor systems are examined. Tools for capturing initial program structure, automated program partitioning, automated resource allocation, and high-level modeling of the combined application and resource are discussed. The primary language focus of the current implementation is Ada, although the techniques should be appropriate for other programming paradigms.

  5. Scratch as a Computational Modelling Tool for Teaching Physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  7. Aligning building information model tools and construction management methods

    NARCIS (Netherlands)

    Hartmann, Timo; van Meerveld, H.J.; Vossebeld, N.; Adriaanse, Adriaan Maria

    2012-01-01

    Few empirical studies exist that can explain how different Building Information Model (BIM) based tool implementation strategies work in practical contexts. To help overcome this gap, this paper describes the implementation of two BIM based tools, the first to support the activities at an estimating…

  8. Model atmospheres - Tool for identifying interstellar features

    Science.gov (United States)

    Frisch, P. C.; Slojkowski, S. E.; Rodriguez-Bell, T.; York, D.

    1993-01-01

    Model atmosphere parameters are derived for 14 early A stars with rotation velocities, from optical spectra, in excess of 80 km/s. The models are compared with IUE observations of the stars in regions where interstellar lines are expected. In general, with the assumption of solar abundances, excellent fits are obtained in regions longward of 2580 A, and accurate interstellar equivalent widths can be derived using models to establish the continuum. The fits are poorer at shorter wavelengths, particularly at 2026-2062 A, where the stellar model parameters seem inadequate. Features indicating mass flows are evident in stars with known infrared excesses. In gamma TrA, variability in the Mg II lines is seen over the 5-year interval of these data, and also over timescales as short as 26 days. The present technique should be useful in systematic studies of episodic mass flows in A stars and for stellar abundance studies, as well as interstellar features.

  9. Applying computer simulation models as learning tools in fishery management

    Science.gov (United States)

    Johnson, B.L.

    1995-01-01

    Computer models can be powerful tools for addressing many problems in fishery management, but uncertainty about how to apply models and how they should perform can lead to a cautious approach to modeling. Within this approach, we expect models to make quantitative predictions but only after all model inputs have been estimated from empirical data and after the model has been tested for agreement with an independent data set. I review the limitations to this approach and show how models can be more useful as tools for organizing data and concepts, learning about the system to be managed, and exploring management options. Fishery management requires deciding what actions to pursue to meet management objectives. Models do not make decisions for us but can provide valuable input to the decision-making process. When empirical data are lacking, preliminary modeling with parameters derived from other sources can help determine priorities for data collection. When evaluating models for management applications, we should attempt to define the conditions under which the model is a useful, analytical tool (its domain of applicability) and should focus on the decisions made using modeling results, rather than on quantitative model predictions. I describe an example of modeling used as a learning tool for the yellow perch Perca flavescens fishery in Green Bay, Lake Michigan.

  10. Multidisciplinary Modelling Tools for Power Electronic Circuits

    DEFF Research Database (Denmark)

    Bahman, Amir Sajjad

    This thesis presents multidisciplinary modelling techniques in a Design For Reliability (DFR) approach for power electronic circuits. With increasing penetration of renewable energy systems, the demand for reliable power conversion systems is becoming critical. Since a large part of electricity...... in reliability assessment of power modules, a three-dimensional lumped thermal network is proposed to be used for fast, accurate and detailed temperature estimation of power module in dynamic operation and different boundary conditions. Since an important issue in the reliability of power electronics...... are generic and valid to be used in circuit simulators or any programing software. These models are important building blocks for the reliable design process or performance assessment of power electronic circuits. The models can save time and cost in power electronics packaging and power converter to evaluate...

  11. Predictions of titanium alloy properties using thermodynamic modeling tools

    Science.gov (United States)

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.

    2005-12-01

    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.

  12. A community diagnostic tool for chemistry climate model validation

    Directory of Open Access Journals (Sweden)

    A. Gettelman

    2012-09-01

    Full Text Available This technical note presents an overview of the Chemistry-Climate Model Validation Diagnostic (CCMVal-Diag) tool for model evaluation. The CCMVal-Diag tool is a flexible and extensible open source package that facilitates the complex evaluation of global models. Models can be compared to other models, ensemble members (simulations with the same model), and/or many types of observations. The initial construction and application is to coupled chemistry-climate models (CCMs) participating in CCMVal, but the evaluation of climate models that submitted output to the Coupled Model Intercomparison Project (CMIP) is also possible. The package has been used to assist with analysis of simulations for the 2010 WMO/UNEP Scientific Ozone Assessment and the SPARC Report on the Evaluation of CCMs. The CCMVal-Diag tool is described and examples of how it functions are presented, along with links to detailed descriptions, instructions and source code. The CCMVal-Diag tool supports model development as well as quantifies model changes, both for different versions of individual models and for different generations of community-wide collections of models used in international assessments. The code allows further extensions by different users for different applications and types, e.g. to other components of the Earth system. User modifications are encouraged and easy to perform with minimum coding.

  13. Modular target acquisition model & visualization tool

    NARCIS (Netherlands)

    Bijl, P.; Hogervorst, M.A.; Vos, W.K.

    2008-01-01

    We developed a software framework for image-based simulation models in the chain: scene-atmosphere-sensor-image enhancement-display-human observer: EO-VISTA. The goal is to visualize the steps and to quantify (Target Acquisition) task performance. EO-VISTA provides an excellent means to systematically…

  14. Student Model Tools Code Release and Documentation

    DEFF Research Database (Denmark)

    Johnson, Matthew; Bull, Susan; Masci, Drew

    This document contains a wealth of information about the design and implementation of the Next-TELL open learner model. Information is included about the final specification (Section 3), the interfaces and features (Section 4), its implementation and technical design (Section 5) and also a summary...

  15. Fluid Survival Tool: A Model Checker for Hybrid Petri Nets

    NARCIS (Netherlands)

    Postema, Björn; Remke, Anne; Haverkort, Boudewijn R.; Ghasemieh, Hamed

    2014-01-01

    Recently, algorithms for model checking Stochastic Time Logic (STL) on Hybrid Petri nets with a single general one-shot transition (HPNG) have been introduced. This paper presents a tool for model checking HPNG models against STL formulas. A graphical user interface (GUI) not only helps to demonstrate…

  16. Engineering tools for robust creep modelling

    OpenAIRE

    Holmström, Stefan

    2010-01-01

    High temperature creep is often dealt with using simplified models to assess and predict the future behaviour of materials and components. Also, for most applications the creep properties of interest require costly long-term testing that limits the available data to support design and life assessment. Such test data sets are even smaller for welded joints, which are often the weakest links of structures. It is of considerable interest to be able to reliably predict and extrapolate long-term creep behaviour…

  17. Theme E: disabilities: analysis models and tools

    OpenAIRE

    Vigouroux, Nadine; Gorce, Philippe; Roby-Brami, Agnès; Rémi-Néris, Olivier

    2013-01-01

    This paper presents the topics and the activity of the theme E “disabilities: analysis models and tools” within the GDR STIC Santé. This group has organized a conference and a workshop during the period 2011–2012. The conference has focused on technologies for cognitive, sensory and motor impairments, assessment and use study of assistive technologies, user centered method design and the place of ethics in these research topics. The objective of “bodily integration of ...

  18. Constructing an advanced software tool for planetary atmospheric modeling

    Science.gov (United States)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  19. Modeling Tools for Drilling, Reservoir Navigation, and Formation Evaluation

    Directory of Open Access Journals (Sweden)

    Sushant Dutta

    2012-06-01

    Full Text Available The oil and gas industry routinely uses borehole tools for measuring or logging rock and fluid properties of geologic formations to locate hydrocarbons and maximize their production. Pore fluids in formations of interest are usually hydrocarbons or water. Resistivity logging is based on the fact that oil and gas have a substantially higher resistivity than water. The first resistivity log was acquired in 1927, and resistivity logging is still the foremost measurement used for drilling and evaluation. However, the acquisition and interpretation of resistivity logging data has grown in complexity over the years. Resistivity logging tools operate in a wide range of frequencies (from DC to GHz) and encounter extremely high (several orders of magnitude) conductivity contrast between the metal mandrel of the tool and the geologic formation. Typical challenges include arbitrary angles of tool inclination, full tensor electric and magnetic field measurements, and interpretation of complicated anisotropic formation properties. These challenges combine to form some of the most intractable computational electromagnetic problems in the world. Reliable, fast, and convenient numerical modeling of logging tool responses is critical for tool design, sensor optimization, virtual prototyping, and log data inversion. This spectrum of applications necessitates both depth and breadth of modeling software—from blazing fast one-dimensional (1-D) modeling codes to advanced three-dimensional (3-D) modeling software, and from in-house developed codes to commercial modeling packages. In this paper, with the help of several examples, we demonstrate our approach for using different modeling software to address different drilling and evaluation applications. In one example, fast 1-D modeling provides proactive geosteering information from a deep-reading azimuthal propagation resistivity measurement. In the second example, a 3-D model with multiple vertical resistive fractures…

  1. Rasp Tool on Phoenix Robotic Arm Model

    Science.gov (United States)

    2008-01-01

    This close-up photograph taken at the Payload Interoperability Testbed at the University of Arizona, Tucson, shows the motorized rasp protruding from the bottom of the scoop on the engineering model of NASA's Phoenix Mars Lander's Robotic Arm. The rasp will be placed against the hard Martian surface to cut into the hard material and acquire an icy soil sample for analysis by Phoenix's scientific instruments. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is led by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  2. A Hybrid Tool for User Interface Modeling and Prototyping

    Science.gov (United States)

    Trætteberg, Hallvard

    Although many methods have been proposed, model-based development methods have only to some extent been adopted for UI design. In particular, they are not easy to combine with user-centered design methods. In this paper, we present a hybrid UI modeling and GUI prototyping tool, which is designed to fit better with IS development and UI design traditions. The tool includes a diagram editor for domain and UI models and an execution engine that integrates UI behavior, live UI components and sample data. Thus, both model-based user interface design and prototyping-based iterative design are supported

  3. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user/developer identify the processing field at the top of the sequence and send into the computing module only the data related to the requested result. The remaining data are not relevant and would slow down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and costs. To do so, we must review the processing sequence by adding several modeling tools. Existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases computing time and costs. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  4. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer’s preferences, goals and processes from their interaction with a computer-aided design tool, and suggests methods and domains within game development where such a model can be applied. We describe how designer modeling could be integrated with current work on automated and mixed-initiative content creation...

  5. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. Taken together, the concept of managing the information available in such a data repository is known as Business Intelligence (BI). This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.

  6. Lightweight approach to model traceability in a CASE tool

    Science.gov (United States)

    Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita

    2017-07-01

    A term "model-driven" is not at all a new buzzword within the ranks of system development community. Nevertheless, the ever increasing complexity of model-driven approaches keeps fueling all kinds of discussions around this paradigm and pushes researchers forward to research and develop new and more effective ways to system development. With the increasing complexity, model traceability, and model management as a whole, becomes indispensable activities of model-driven system development process. The main goal of this paper is to present a conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.

  7. A Suite of Tools for ROC Analysis of Spatial Models

    Directory of Open Access Journals (Sweden)

    Hermann Rodrigues

    2013-09-01

    Full Text Available The Receiver Operating Characteristic (ROC) is widely used for assessing the performance of classification algorithms. In GIScience, ROC has been applied to assess models aimed at predicting events, such as land use/cover change (LUCC), species distribution and disease risk. However, GIS software packages offer few statistical tests and guidance tools for ROC analysis and interpretation. This paper presents a suite of GIS tools designed to facilitate ROC curve analysis for GIS users by applying proper statistical tests and analysis procedures. The tools are freely available as models and submodels of Dinamica EGO freeware. The tools give the ROC curve, the area under the curve (AUC), partial AUC, lower and upper AUCs, the confidence interval of AUC, the density of event in probability bins and tests to evaluate the difference between the AUCs of two models. We present first the procedures and statistical tests implemented in Dinamica EGO, then the application of the tools to assess LUCC and species distribution models. Finally, we interpret and discuss the ROC-related statistics resulting from various case studies.
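
    The suite itself is distributed as Dinamica EGO (sub)models; purely for illustration, the snippet below reproduces two of the listed outputs, the ROC curve and the AUC with a confidence interval, in Python with scikit-learn on toy data (the paper's tools compute the CI by their own procedures; the percentile bootstrap here is just one common choice):

    ```python
    # ROC curve points and AUC with a percentile-bootstrap 95% CI, on toy data.
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(7)
    events = rng.integers(0, 2, size=500)              # observed event map (0/1)
    score = np.clip(events * 0.4 + rng.normal(0.3, 0.25, 500), 0, 1)  # model output

    fpr, tpr, thresholds = roc_curve(events, score)
    auc = roc_auc_score(events, score)

    boot = []
    for _ in range(1000):
        idx = rng.integers(0, len(events), len(events))
        if events[idx].min() == events[idx].max():
            continue                    # resample lacks one class; skip it
        boot.append(roc_auc_score(events[idx], score[idx]))
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"AUC = {auc:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
    ```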

  8. Modeling and Simulation Tools: From Systems Biology to Systems Medicine.

    Science.gov (United States)

    Olivier, Brett G; Swat, Maciej J; Moné, Martijn J

    2016-01-01

    Modeling is an integral component of modern biology. In this chapter we look into the role of the model, as it pertains to Systems Medicine, and the software that is required to instantiate and run it. We do this by comparing the development, implementation, and characteristics of tools that have been developed to work with two divergent methodologies: Systems Biology and Pharmacometrics. From the Systems Biology perspective we consider the concept of "Software as a Medical Device" and what this may imply for the migration of research-oriented simulation software into the domain of human health. In our second perspective, we see how in practice hundreds of computational tools already accompany drug discovery and development at every stage of the process. Standardized exchange formats are required to streamline the model exchange between tools, which would minimize translation errors and reduce the required time. With the emergence, almost 15 years ago, of the SBML standard, a large part of the domain of interest is already covered, and models can be shared and passed from software to software without recoding them. Until recently the last stage of the process, the pharmacometric analysis used in clinical studies carried out on subject populations, lacked such an exchange medium. We describe a new emerging exchange format in Pharmacometrics which covers non-linear mixed effects models, the standard statistical model type used in this area. By interfacing these two formats, the entire domain can be covered by complementary standards and, in turn, by the corresponding tools.

  9. Risk Assessment in Fractured Clayey Tills - Which Modeling Tools?

    DEFF Research Database (Denmark)

    Chambon, Julie Claire Claudia; Bjerg, Poul Løgstrup; Binning, Philip John

    2012-01-01

    The article presents different tools available for risk assessment in fractured clayey tills, and their advantages and limitations are discussed. Because of the complex processes occurring during contaminant transport through fractured media, the development of simple practical tools for risk assessment is challenging and the inclusion of the relevant processes is difficult. Furthermore, the lack of long-term monitoring data prevents verification of the accuracy of the different conceptual models. Further investigations based on long-term data and numerical modeling are needed to accurately…

  10. Homology modeling: an important tool for the drug discovery.

    Science.gov (United States)

    França, Tanos Celmar Costa

    2015-01-01

    In the last decades, homology modeling has become a popular tool to access theoretical three-dimensional (3D) structures of molecular targets. So far, several 3D models of proteins have been built by this technique and used in a great diversity of structural biology studies. But are those models consistent enough with experimental structures to make this technique an effective and reliable tool for drug discovery? Here we present, briefly, the fundamentals and current state of the art of the homology modeling techniques used to build 3D structures of molecular targets whose experimental structures are not available in databases, and we list some of the more important works using this technique available in the literature today. In many cases those studies have afforded successful models for the design of more selective agonists/antagonists of the molecular targets in focus and have guided promising experimental works, proving that, when appropriate templates are available, useful models can be built using any of the several software packages available today for this purpose. Limitations of the experimental techniques used to solve 3D structures, allied to constant improvements in homology modeling software, will maintain the need for theoretical models, establishing homology modeling as a fundamental tool for drug discovery.

  11. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  12. Integrating decision management with UML modeling concepts and tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    2009-01-01

    Numerous design decisions including architectural decisions are made while developing a software system, which influence the architecture of the system as well as subsequent decisions. Several tools already exist for managing design decisions, i.e. capturing, documenting, and maintaining them…, but also for guiding the user by proposing subsequent decisions. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, the decisions are typically not connected to these models… to enforce design decisions (modify the models). We define tool-independent concepts and architecture building blocks supporting these requirements and present first ideas how this can be implemented in the IBM Rational Software Modeler and Architectural Decision Knowledge Wiki. This seamless integration…

  13. A general thermal model of machine tool spindle

    Directory of Open Access Journals (Sweden)

    Yanfang Dong

    2017-01-01

    Full Text Available As the core component of a machine tool, the spindle has thermal characteristics that significantly influence the machine tool's running status. The lack of an accurate model of the spindle system, particularly of the load–deformation coefficient between the bearing rolling elements and rings, severely limits the precision of thermal error analysis for the spindle. In this article, the bearing internal loads, and especially the functional relationships between the principal curvature difference F(ρ) and the auxiliary parameter nδ, semi-major axis a, and semi-minor axis b, have been determined; furthermore, high-precision heat generation, combined with the heat sinks in the spindle system, is calculated; finally, an accurate thermal model of the spindle is established. Moreover, a conventional spindle with embedded fiber Bragg grating temperature sensors has been developed. Comparison of the experimental results with simulation indicates that the model has good accuracy, which verifies the reliability of the modeling process.
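
    As a hedged illustration of the lumped-network idea (not the authors' spindle model; all node choices and values below are invented), steady-state node temperatures follow from a linear heat balance once conductances between nodes are chosen:

    ```python
    # Steady-state lumped thermal network: solve K @ T = q for node temperatures.
    import numpy as np

    # Nodes: 0 front bearing, 1 shaft, 2 rear bearing, 3 housing (held at 20 C).
    G = {(0, 1): 15.0, (1, 2): 12.0, (0, 3): 8.0, (2, 3): 8.0}   # conductances, W/K
    q = np.array([45.0, 0.0, 30.0])          # bearing heat generation per node, W

    K = np.zeros((3, 3))
    for (a, b), g in G.items():
        for n in (a, b):
            if n < 3:
                K[n, n] += g                 # diagonal: sum of attached conductances
        if a < 3 and b < 3:
            K[a, b] -= g                     # off-diagonal coupling
            K[b, a] -= g

    T_housing = 20.0
    rhs = q.copy()
    rhs[0] += G[(0, 3)] * T_housing          # fold the fixed-temperature node in
    rhs[2] += G[(2, 3)] * T_housing
    T = np.linalg.solve(K, rhs)
    print("node temperatures (C):", np.round(T, 1))
    ```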

  14. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. A new direct method for calculating the first-order sensitivity coefficients, applying sparse matrix technology to chemical kinetics, is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate a model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity coefficient equations, and their Jacobian analytical expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines from packages such as SLODE, modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
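
    The direct method is easy to demonstrate on a toy mechanism: for first-order decay with rate constant k, differentiating the model equation with respect to k gives a sensitivity ODE that is integrated alongside it (a minimal sketch, not the FORTRAN package described above):

    ```python
    # Direct-method sensitivity for A -> B with rate constant k:
    # model equation dc/dt = -k c; sensitivity s = dc/dk obeys ds/dt = -c - k s.
    import numpy as np
    from scipy.integrate import solve_ivp

    k = 0.5

    def rhs(t, y):
        c, s = y                    # concentration and its sensitivity to k
        return [-k * c,             # model equation
                -c - k * s]         # coupled sensitivity equation

    sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0], dense_output=True)
    c, s = sol.sol(2.0)
    print(f"C(2) = {c:.4f}, dC/dk at t=2 = {s:.4f} "
          f"(exact: {-2 * np.exp(-k * 2):.4f})")
    ```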

  15. Using the IEA ETSAP modelling tools for Denmark

    DEFF Research Database (Denmark)

    Grohnheit, Poul Erik

    , Environment and Health (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model...... signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, "Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems" for the period 2005 to 2007. The main activity is semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project ”NEEDS - New Energy Externalities Developments for Sustainability”. ETSAP is contributing to a part of NEEDS that develops

  16. Designing tools for oil exploration using nuclear modeling

    Directory of Open Access Journals (Sweden)

    Mauborgne Marie-Laure

    2017-01-01

    Full Text Available When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  17. Integrated landscape/hydrologic modeling tool for semiarid watersheds

    Science.gov (United States)

    Mariano Hernandez; Scott N. Miller

    2000-01-01

    An integrated hydrologic modeling/watershed assessment tool is being developed to aid in determining the susceptibility of semiarid landscapes to natural and human-induced changes across a range of scales. Watershed processes are by definition spatially distributed and are highly variable through time, and this approach is designed to account for their spatial and...

  18. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer......, and envision future directions which focus on personalizing the processes to a designer’s particular wishes....

  19. Simulation modeling: a powerful tool for process improvement.

    Science.gov (United States)

    Boxerman, S B

    1996-01-01

    Simulation modeling provides an efficient means of examining the operation of a system under a variety of alternative conditions. This tool can potentially enhance a benchmarking project by providing a means for evaluating proposed modifications to the system or process under study.

  20. Designing a Training Tool for Imaging Mental Models

    Science.gov (United States)

    1990-11-01

    about how to weave together their disparate fields into a seamless web of knowledge . Learners often cannot visualize how the concepts and skills they...a seamless web of knowledge ? " How does the availability of a mental modeling tool enhance the ability of instructional designers to prepare

  1. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    Full Text Available In computer science, the metamodelling approach is becoming more and more popular for the purpose of software systems development. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, an analysis of physical models is carried out. The analysis reveals the invariant physical structures that we propose to use as the basic abstractions of the physical metamodel: a system of geometrical objects that allows a spatial structure of physical models to be built and a distribution of physical properties to be set. Different mathematical methods can then be applied to such a geometry of distributed physical properties. To validate the proposed metamodelling approach, we consider the developed prototypes of software tools.

  2. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    Science.gov (United States)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities, formulated as an optimization problem, are successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulations for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.

  3. HMMEditor: a visual editing tool for profile hidden Markov model

    Directory of Open Access Journals (Sweden)

    Cheng Jianlin

    2008-03-01

    Full Text Available Background: Profile Hidden Markov Model (HMM) is a powerful statistical model to represent a family of DNA, RNA, and protein sequences. Profile HMM has been widely used in bioinformatics research such as sequence alignment, gene structure prediction, motif identification, protein structure prediction, and biological database search. However, few comprehensive, visual editing tools for profile HMM are publicly available. Results: We develop a visual editor for profile Hidden Markov Models (HMMEditor). HMMEditor can visualize the profile HMM architecture, transition probabilities, and emission probabilities. Moreover, it provides functions to edit and save HMM and parameters. Furthermore, HMMEditor allows users to align a sequence against the profile HMM and to visualize the corresponding Viterbi path. Conclusion: HMMEditor provides a set of unique functions to visualize and edit a profile HMM. It is a useful tool for biological sequence analysis and modeling. Both HMMEditor software and web service are freely available.
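
    For readers unfamiliar with the Viterbi path that HMMEditor visualizes, the toy sketch below runs the dynamic programme on a deliberately small two-state HMM (not a full profile HMM; states and probabilities are invented):

    ```python
    # Viterbi decoding on a tiny two-state HMM, in log space.
    import numpy as np

    states = ["match", "insert"]
    start = np.log([0.8, 0.2])
    trans = np.log([[0.9, 0.1],        # match  -> match/insert
                    [0.5, 0.5]])       # insert -> match/insert
    emit = {"match":  {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
            "insert": {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}}

    seq = "AACGA"
    n, m = len(seq), len(states)
    V = np.full((n, m), -np.inf)       # best log-probability ending in state j
    ptr = np.zeros((n, m), dtype=int)  # backpointers for the path

    for j in range(m):
        V[0, j] = start[j] + np.log(emit[states[j]][seq[0]])
    for t in range(1, n):
        for j in range(m):
            scores = V[t - 1] + trans[:, j]
            ptr[t, j] = scores.argmax()
            V[t, j] = scores.max() + np.log(emit[states[j]][seq[t]])

    path = [int(V[-1].argmax())]
    for t in range(n - 1, 0, -1):
        path.append(ptr[t, path[-1]])
    print("Viterbi path:", [states[j] for j in reversed(path)])
    ```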

  4. Tool Steel Heat Treatment Optimization Using Neural Network Modeling

    Science.gov (United States)

    Podgornik, Bojan; Belič, Igor; Leskovšek, Vojteh; Godec, Matjaz

    2016-11-01

    Optimization of tool steel properties and the corresponding heat treatment is mainly based on a trial-and-error approach, which requires tremendous experimental work and resources. Therefore, there is a great need for tools allowing prediction of the mechanical properties of tool steels as a function of composition and heat treatment process variables. The aim of the present work was to explore the potential and possibilities of artificial neural network-based modeling to select and optimize vacuum heat treatment conditions depending on the hot work tool steel composition and required properties. In the current case, training of the feedforward neural network, with an error backpropagation training scheme and four layers of neurons (an 8-20-20-2 architecture), was based on experimentally obtained tempering diagrams for ten different hot work tool steel compositions and at least two austenitizing temperatures. Results show that this type of modeling can be successfully used for detailed and multifunctional analysis of different influential parameters as well as to optimize the heat treatment process of hot work tool steels depending on the composition. In terms of composition, V was found to be the most beneficial alloying element, increasing hardness and fracture toughness of hot work tool steel; Si, Mn, and Cr increase hardness but lead to reduced fracture toughness, while Mo has the opposite effect. An optimum concentration providing high KIc/HRC ratios would include 0.75 pct Si, 0.4 pct Mn, 5.1 pct Cr, 1.5 pct Mo, and 0.5 pct V, with the optimum heat treatment performed at lower austenitizing and intermediate tempering temperatures.
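
    A rough sketch of the modelling set-up (not the authors' trained network): scikit-learn can stand in for the 8-20-20-2 feedforward architecture, here trained on synthetic placeholder data in place of the measured tempering diagrams:

    ```python
    # Feedforward net with two hidden layers of 20 neurons mapping 8 inputs
    # (composition + heat-treatment variables) to hardness and fracture toughness.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    # Inputs: C, Si, Mn, Cr, Mo, V, austenitizing T, tempering T  (8 features).
    X = rng.uniform([0.30, 0.2, 0.2, 4.5, 1.0, 0.3, 990, 500],
                    [0.45, 1.1, 0.6, 5.5, 3.0, 1.0, 1060, 650], size=(300, 8))
    # Synthetic targets loosely echoing the trends reported above.
    hardness = 40 + 10 * X[:, 5] - 0.02 * (X[:, 7] - 500) + rng.normal(0, 0.5, 300)
    toughness = 30 + 15 * X[:, 5] - 5 * X[:, 1] + rng.normal(0, 1.0, 300)
    y = np.column_stack([hardness, toughness])      # (HRC, K_Ic) targets

    Xs = StandardScaler().fit_transform(X)
    net = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
    net.fit(Xs, y)                                  # 8-20-20-2 architecture
    print("R^2 on training data:", round(net.score(Xs, y), 3))
    ```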

  5. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    Science.gov (United States)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.

  6. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes from the SERVQUAL method the five dimensions of requirements and three of characteristics, and from the QFD method the application methodology. The originality of the SQ model consists in computing a global index that reflects the level to which the quality characteristics fulfil the customers’ requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.

  7. Greenhouse gases from wastewater treatment - A review of modelling tools.

    Science.gov (United States)

    Mannina, Giorgio; Ekama, George; Caniani, Donatella; Cosenza, Alida; Esposito, Giovanni; Gori, Riccardo; Garrido-Baserba, Manel; Rosso, Diego; Olsson, Gustaf

    2016-05-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state-of-the-art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge on the processes related to N2O formation, especially due to autotrophic biomass, is still incomplete. The literature review also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs. Indeed, several studies have confirmed that a wide vision of the WWTPs has to be considered in order to make them as sustainable as possible. Mechanistic dynamic models have been demonstrated to be the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs due to the huge difficulties related to data availability and model complexity. For further improvement in GHG plant-wide modelling and to favour its use at large real scale, knowledge of the mechanisms involved in GHG formation and release, and data acquisition, must be enhanced.

  8. AgMIP Training in Multiple Crop Models and Tools

    Science.gov (United States)

    Boote, Kenneth J.; Porter, Cheryl H.; Hargreaves, John; Hoogenboom, Gerrit; Thornburn, Peter; Mutter, Carolyn

    2015-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) has the goal of using multiple crop models to evaluate climate impacts on agricultural production and food security in developed and developing countries. There are several major limitations that must be overcome to achieve this goal, including the need to train AgMIP regional research team (RRT) crop modelers to use models other than the ones they are currently familiar with, plus the need to harmonize and interconvert the disparate input file formats used for the various models. Two activities were followed to address these shortcomings among AgMIP RRTs to enable them to use multiple models to evaluate climate impacts on crop production and food security. We designed and conducted courses in which participants trained on two different sets of crop models, with emphasis on the model of least experience. In a second activity, the AgMIP IT group created templates for inputting data on soils, management, weather, and crops into AgMIP harmonized databases, and developed translation tools for converting the harmonized data into files that are ready for multiple crop model simulations. The strategies for creating and conducting the multi-model course and developing entry and translation tools are reviewed in this chapter.
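
    As a schematic illustration of the harmonize-and-translate step (the layouts below are hypothetical stand-ins, not the actual AgMIP, DSSAT or APSIM file formats), the pattern is simply one canonical record mapped onto each model's expected input:

    ```python
    # One harmonized daily-weather record translated into two invented
    # model-specific layouts, mimicking the role of AgMIP's translation tools.
    harmonized = {"date": "2009-07-01", "tmax_C": 31.2, "tmin_C": 19.8,
                  "rain_mm": 4.5, "srad_MJ": 22.1}

    def to_model_a(rec):
        # Hypothetical fixed-width layout: DATE TMAX TMIN RAIN SRAD.
        return (f"{rec['date'].replace('-', '')} {rec['tmax_C']:5.1f} "
                f"{rec['tmin_C']:5.1f} {rec['rain_mm']:5.1f} {rec['srad_MJ']:5.1f}")

    def to_model_b(rec):
        # Hypothetical keyword layout used by a second crop model.
        return (f"date={rec['date']} maxt={rec['tmax_C']} mint={rec['tmin_C']} "
                f"rain={rec['rain_mm']} radn={rec['srad_MJ']}")

    print(to_model_a(harmonized))
    print(to_model_b(harmonized))
    ```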

  9. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools...... of the framework. The issue of commercial simulators or software providing the necessary features for product-process synthesis-design as opposed to their development by the academic PSE community will also be discussed. An example of a successful collaboration between academia-industry for the development...

  10. Experiences & Tools from Modeling Instruction Applied to Earth Sciences

    Science.gov (United States)

    Cervenec, J.; Landis, C. E.

    2012-12-01

    The Framework for K-12 Science Education calls for stronger curricular connections within the sciences, greater depth in understanding, and tasks higher on Bloom's Taxonomy. Understanding atmospheric sciences draws on core knowledge traditionally taught in physics, chemistry, and in some cases, biology. If this core knowledge is not conceptually sound, well retained, and transferable to new settings, understanding the causes and consequences of climate changes become a task in memorizing seemingly disparate facts to a student. Fortunately, experiences and conceptual tools have been developed and refined in the nationwide network of Physics Modeling and Chemistry Modeling teachers to build necessary understanding of conservation of mass, conservation of energy, particulate nature of matter, kinetic molecular theory, and particle model of light. Context-rich experiences are first introduced for students to construct an understanding of these principles and then conceptual tools are deployed for students to resolve misconceptions and deepen their understanding. Using these experiences and conceptual tools takes an investment of instructional time, teacher training, and in some cases, re-envisioning the format of a science classroom. There are few financial barriers to implementation and students gain a greater understanding of the nature of science by going through successive cycles of investigation and refinement of their thinking. This presentation shows how these experiences and tools could be used in an Earth Science course to support students developing conceptually rich understanding of the atmosphere and connections happening within.

  11. Scenario Evaluator for Electrical Resistivity survey pre-modeling tool

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D; Halford, Keith J.; Binley, Andrew; Lane, John; Werkema, Dale

    2017-01-01

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.

  12. DsixTools: the standard model effective field theory toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)

    2017-06-15

    We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED renormalization group evolution below the electroweak scale. (orig.)
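    For orientation, the one-loop running that such a package implements solves the standard renormalization group equation for the Wilson coefficients; schematically (our notation, not copied from the package documentation),

        \frac{dC_i}{d\ln\mu} = \frac{1}{16\pi^2}\,\gamma_{ij}\,C_j ,

    where C_j are the Warsaw-basis Wilson coefficients and γ_ij is the one-loop anomalous dimension matrix.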

  13. Evaluating EML Modeling Tools for Insurance Purposes: A Case Study

    Directory of Open Access Journals (Sweden)

    Mikael Gustavsson

    2010-01-01

    As with any situation that involves economic risk, refineries may share their risk with insurers. The decision process generally includes modelling to determine to what extent the process area can be damaged. At the extreme end of modelling, the so-called Estimated Maximum Loss (EML) scenarios are found. These scenarios predict the maximum loss a particular installation can sustain. Unfortunately, no standard model for this exists, so insurers reach different results by applying different models and different assumptions. Therefore, a study has been conducted on a case in a Swedish refinery where several scenarios had previously been modelled by two different insurance brokers using two different software packages, ExTool and SLAM. This study reviews the concept of EML and analyses the models used to see which parameters are most uncertain. A third model, EFFECTS, was also employed in an attempt to reach a conclusion with higher reliability.

  14. Nephrectomized and hepatectomized animal models as tools in preclinical pharmacokinetics.

    Science.gov (United States)

    Vestergaard, Bill; Agersø, Henrik; Lykkesfeldt, Jens

    2013-08-01

    Early understanding of the pharmacokinetics and metabolic patterns of new drug candidates is essential for selecting optimal candidates to move further into the drug development process. In vitro methodologies can be used to investigate metabolic patterns, but in general they lack several aspects of whole-body physiology. In contrast, the complexity of intact animals does not necessarily allow individual processes to be identified. Animal models lacking a major excretion organ can be used to investigate these individual metabolic processes. Animal models of nephrectomy and hepatectomy have considerable potential as tools in preclinical pharmacokinetics for assessing which organs are important for drug clearance, and thereby which metabolic processes might be manipulated to improve the pharmacokinetic properties of the molecules. Detailed knowledge of anatomy and surgical techniques is crucial to successfully establish the models, and well-balanced anaesthesia and adequate monitoring of the animals are also of major importance. An obvious drawback of animal models lacking an organ is the disruption of normal homoeostasis and the induction of dramatic and ultimately mortal systemic changes in the animals. Refining the surgical techniques and the post-operative supportive care of the animals can increase the value of these models by minimizing the systemic changes induced, and thorough validation of nephrectomy and hepatectomy models is needed before such models are used as tools in preclinical pharmacokinetics. The present MiniReview discusses the pros and cons of the available techniques associated with establishing nephrectomy and hepatectomy models.

  15. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, Anca D.; Iov, Florin; Sørensen, Poul

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN) and it gathers and describes a whole wind turbine model database...... strategies have different goals e.g. fast response over disturbances, optimum power efficiency over a wider range of wind speeds, voltage ride-through capability including grid support. A dynamic model of a DC connection for active stall wind farms to the grid including the control is also implemented...

  16. Hypermedia as an experiential learning tool: a theoretical model

    Directory of Open Access Journals (Sweden)

    Jose Miguel Baptista Nunes

    1996-01-01

    The process of methodical design and development is of extreme importance in the production of educational software. However, this process will only be effective if it is based on a theoretical model that explicitly defines what educational approach is being used and how specific features of the technology can best support it. This paper proposes a theoretical model of how hypermedia can be used as an experiential learning tool. The development of the model was based on an experiential learning approach and simultaneously aims at minimising the inherent problems of hypermedia as the underlying support technology.

  17. Neural Networks for Hydrological Modeling Tool for Operational Purposes

    Science.gov (United States)

    Bhatt, Divya; Jain, Ashu

    2010-05-01

    Hydrological models are useful in many water resources applications such as flood control, irrigation and drainage, hydro power generation, water supply, erosion and sediment control, etc. Estimates of runoff are needed in many water resources planning, design, development, operation and maintenance activities. Runoff is generally computed using rainfall-runoff models. Computer-based hydrologic models have become popular for obtaining hydrological forecasts and for managing water systems. The Rainfall-Runoff Library (RRL) is computer software developed by the Cooperative Research Centre for Catchment Hydrology (CRCCH), Australia; it consists of five different conceptual rainfall-runoff models and has been in operation in many water resources applications in Australia. Recently, soft artificial intelligence tools such as Artificial Neural Networks (ANNs) have become popular for research purposes but have not been adopted in operational hydrological forecasts. There is a strong need to develop ANN models based on real catchment data and compare them with the conceptual models actually in use in real catchments. In this paper, the results from an investigation on the use of the RRL and ANNs are presented. Out of the five conceptual models in the RRL toolkit, the SimHyd model has been used. A Genetic Algorithm (GA) has been used as an optimizer in the RRL to calibrate the SimHyd model, and trial-and-error procedures were employed to arrive at the best values of the various parameters involved in the GA optimizer. The results obtained from the best configuration of the SimHyd model are presented here. A feed-forward neural network model structure trained by the back-propagation algorithm has been adopted to develop the ANN models. The daily rainfall and runoff data derived from Bird Creek Basin, Oklahoma, USA have been employed to develop all the models included here. A wide range of error statistics have been used to evaluate the performance of all the models.
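    To make the ANN structure concrete, below is a minimal sketch of a one-hidden-layer feed-forward network trained by error back-propagation; the synthetic data stand in for lagged rainfall-runoff features, and none of the dimensions or learning settings are taken from the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic stand-ins for standardized lagged rainfall/runoff features
        X = rng.normal(size=(200, 3))
        y = (0.5 * X[:, 0] + 0.3 * X[:, 1] ** 2).reshape(-1, 1)

        n_hidden, lr = 5, 0.05
        W1 = rng.normal(scale=0.1, size=(3, n_hidden)); b1 = np.zeros(n_hidden)
        W2 = rng.normal(scale=0.1, size=(n_hidden, 1)); b2 = np.zeros(1)

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        for epoch in range(5000):
            h = sigmoid(X @ W1 + b1)            # hidden layer
            y_hat = h @ W2 + b2                 # linear output unit
            err = y_hat - y
            # back-propagate the mean-squared-error gradient
            dW2 = h.T @ err / len(X); db2 = err.mean(axis=0)
            delta = (err @ W2.T) * h * (1.0 - h)
            dW1 = X.T @ delta / len(X); db1 = delta.mean(axis=0)
            W1 -= lr * dW1; b1 -= lr * db1
            W2 -= lr * dW2; b2 -= lr * db2

        print("training RMSE:", float(np.sqrt((err ** 2).mean())))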

  18. Model Fusion Tool - the Open Environmental Modelling Platform Concept

    Science.gov (United States)

    Kessler, H.; Giles, J. R.

    2010-12-01

    The vision of an Open Environmental Modelling Platform - seamlessly linking geoscience data, concepts and models to aid decision making in times of environmental change. Governments and their executive agencies across the world are facing increasing pressure to make decisions about the management of resources in light of population growth and environmental change. In the UK for example, groundwater is becoming a scarce resource for large parts of its most densely populated areas. At the same time river and groundwater flooding resulting from high rainfall events are increasing in scale and frequency and sea level rise is threatening the defences of coastal cities. There is also a need for affordable housing, improved transport infrastructure and waste disposal as well as sources of renewable energy and sustainable food production. These challenges can only be resolved if solutions are based on sound scientific evidence. Although we have knowledge and understanding of many individual processes in the natural sciences it is clear that a single science discipline is unable to answer the questions and their inter-relationships. Modern science increasingly employs computer models to simulate the natural, economic and human system. Management and planning requires scenario modelling, forecasts and ‘predictions’. Although the outputs are often impressive in terms of apparent accuracy and visualisation, they are inherently not suited to simulate the response to feedbacks from other models of the earth system, such as the impact of human actions. Geological Survey Organisations (GSO) are increasingly employing advances in Information Technology to visualise and improve their understanding of geological systems. Instead of 2 dimensional paper maps and reports many GSOs now produce 3 dimensional geological framework models and groundwater flow models as their standard output. Additionally the British Geological Survey have developed standard routines to link geological

  19. Using the IEA ETSAP modelling tools for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Grohnheit, Poul Erik

    2008-12-15

    An important part of the cooperation within the IEA (International Energy Agency) is organised through national contributions to 'Implementation Agreements' on energy technology and energy analyses. One of them is ETSAP (Energy Technology Systems Analysis Programme), started in 1976. Denmark has signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, 'Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems' for the period 2005 to 2007. The main activity is semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project 'NEEDS - New Energy Externalities Developments for Sustainability'. ETSAP is contributing to a part of NEEDS that develops the TIMES model for 29 European countries with assessment of future technologies. An additional project 'Monitoring and Evaluation of the RES directives: implementation in EU27 and policy recommendations for 2020' (RES2020) under Intelligent Energy Europe was added, as well as the Danish 'Centre for Energy, Environment and Health (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related project, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model for Denmark, focusing on the tools and features that allow comparison with other countries and, particularly, to evaluate assumptions and results in international models covering Denmark. (au)

  20. Application of Process Modeling Tools to Ship Design

    Science.gov (United States)

    2011-05-01

    [Presentation slides; only fragments of the text are recoverable.] Approved for public release; distribution is unlimited. Different people have different preferences, so process data must be viewable in multiple formats (DSM, Gantt charts, IDEF diagrams, spreadsheets), supported by scheduling, spreadsheet, information-modeling, and DSM software, including flow charts organized by geography.

  1. Schistosomiasis japonica: modelling as a tool to explore transmission patterns.

    Science.gov (United States)

    Xu, Jun-Fang; Lv, Shan; Wang, Qing-Yun; Qian, Men-Bao; Liu, Qin; Bergquist, Robert; Zhou, Xiao-Nong

    2015-01-01

    Modelling is an important tool for the exploration of Schistosoma japonicum transmission patterns. It provides a general theoretical framework for decision-makers and lends itself specifically to assessing the progress of the national control programme by following the outcome of surveys. The challenge of keeping up with the many changes of social, ecological and environmental factors involved in control activities is greatly facilitated by modelling that can also indicate which activities are critical and which are less important. This review examines the application of modelling tools in the epidemiological study of schistosomiasis japonica during the last 20 years and explores the application of enhanced models for surveillance and response. Updated and timely information for decision-makers in the national elimination programme is provided but, in spite of the new modelling techniques introduced, many questions remain. Issues on application of modelling are discussed with the view to improve the current situation with respect to schistosomiasis japonica. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Networking Sensor Observations, Forecast Models & Data Analysis Tools

    Science.gov (United States)

    Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

    2009-12-01

    This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/MAPL, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of the wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations and a data comparison tool that compares the forecasted smoke with Unmanned Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We describe the process for enabling the fire location, smoke forecast, smoke observation, and
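    As one concrete flavour of the interoperability described, a WPS Execute request can be issued as HTTP key-value pairs; the endpoint, process identifier, and input below are hypothetical, while the KVP parameter names follow OGC WPS 1.0.0.

        import requests

        # Hypothetical endpoint and process; KVP keys per OGC WPS 1.0.0
        resp = requests.get("https://example.org/wps", params={
            "service": "WPS",
            "version": "1.0.0",
            "request": "Execute",
            "identifier": "PopulationStatistics",     # hypothetical process
            "DataInputs": "polygon=high_pm_extent",   # hypothetical input
        })
        print(resp.status_code)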

  3. Virtual Sensor for Calibration of Thermal Models of Machine Tools

    Directory of Open Access Journals (Sweden)

    Alexander Dementjev

    2014-01-01

    ... strictly depends on the accuracy of these machines, but they are prone to deformation caused by their own heat. The deformation needs to be compensated in order to assure accurate production. So an adequate model of the high-dimensional thermal deformation process must be created and the parameters of this model must be evaluated. Unfortunately, such parameters are often unknown and cannot be calculated a priori. Parameter identification during real experiments is not an option for these models because of the high engineering and machine-time effort involved. The installation of additional sensors to measure these parameters directly is uneconomical. Instead, an effective calibration of thermal models can be reached by combining real and virtual measurements on a machine tool during its real operation, without installing additional sensors. In this paper, a new approach for thermal model calibration is presented. The expected results are very promising and can be recommended as an effective solution for this class of problems.

  4. Evaluation of air pollution modelling tools as environmental engineering courseware.

    Science.gov (United States)

    Souto González, J A; Bello Bugallo, P M; Casares Long, J J

    2004-01-01

    The study of phenomena related to the dispersion of pollutants usually takes advantage of mathematical models based on the description of the different processes involved. This educational approach is especially important in air pollution dispersion, where the processes follow non-linear behaviour, so it is difficult to understand the relationships between inputs and outputs, and in a 3D context, where it becomes hard to analyze alphanumeric results. In this work, three different software tools, serving as computer solvers for typical air pollution dispersion phenomena, are presented. Each software tool, developed to run on PCs, follows an approach representing one of three generations of programming languages (Fortran 77, Visual Basic and Java), applied over three different environments: MS-DOS, MS-Windows and the World Wide Web. The software tools were tested by students of environmental engineering (undergraduate) and chemical engineering (postgraduate), in order to evaluate the ability of these software tools to improve both theoretical and practical knowledge of the air pollution dispersion problem, and the impact of the different environments on the learning process in terms of content, ease of use and visualization of results.
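    The dispersion phenomena such courseware solves are commonly illustrated with the steady-state Gaussian plume solution; a minimal sketch follows (the constant dispersion coefficients are illustrative, not taken from the courseware).

        import numpy as np

        def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
            """Gaussian plume concentration with ground reflection.

            Q: emission rate (g/s); u: wind speed (m/s); H: effective stack
            height (m); sigma_y, sigma_z: dispersion coefficients (m) evaluated
            at the receptor's downwind distance.
            """
            lateral = np.exp(-y**2 / (2 * sigma_y**2))
            vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                        + np.exp(-(z + H)**2 / (2 * sigma_z**2)))
            return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

        # Illustrative values: ground-level receptor on the plume centreline
        print(gaussian_plume(Q=10.0, u=3.0, y=0.0, z=0.0, H=50.0,
                             sigma_y=35.0, sigma_z=18.0))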

  5. Programming Models and Tools for Intelligent Embedded Systems

    DEFF Research Database (Denmark)

    Sørensen, Peter Verner Bojsen

    Design automation and analysis tools targeting embedded platforms, developed using a component-based design approach, must be able to reason about the capabilities of the platforms. In the general case where nothing is assumed about the components comprising a platform or the platform topology......, analysis must be employed to determine its capabilities. This kind of analysis is the subject of this dissertation. The main contribution of this work is the Service Relation Model used to describe and analyze the flow of service in models of platforms and systems composed of re-usable components...

  6. Error Model and Accuracy Calibration of 5-Axis Machine Tool

    Directory of Open Access Journals (Sweden)

    Fangyu Pan

    2013-08-01

    To improve machining precision and reduce geometric errors for a 5-axis machine tool, an error model and a calibration procedure are presented in this paper. The error model is built using the theory of multi-body systems and characteristic matrices, which can establish the relationship between the cutting tool and the workpiece in theory. Accuracy calibration is difficult to achieve, but with a laser-based approach (laser interferometer and laser tracker) the errors can be displayed accurately, which benefits later compensation.
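    A minimal sketch of the multi-body/characteristic-matrix idea: ideal 4x4 homogeneous transforms are chained, and first-order error matrices are interleaved to propagate small geometric errors to the tool tip. The two-link chain and the error values below are hypothetical.

        import numpy as np

        def translation(x, y, z):
            T = np.eye(4)
            T[:3, 3] = [x, y, z]
            return T

        def small_error(dx, dy, dz, ex, ey, ez):
            """First-order characteristic error matrix: small translations
            dx..dz (mm) and small rotations ex..ez (rad)."""
            E = np.eye(4)
            E[:3, :3] += np.array([[0.0, -ez,  ey],
                                   [ ez, 0.0, -ex],
                                   [-ey,  ex, 0.0]])
            E[:3, 3] = [dx, dy, dz]
            return E

        # Hypothetical chain: column (z = 300 mm) then cross-slide (x = 100 mm)
        T_ideal = translation(0, 0, 300) @ translation(100, 0, 0)
        T_real = (translation(0, 0, 300)
                  @ small_error(5e-3, 0, 0, 0, 1e-5, 0)   # hypothetical errors
                  @ translation(100, 0, 0))

        tool_tip = np.array([0.0, 0.0, 50.0, 1.0])        # tip in last frame
        print("tip deviation (mm):", ((T_real - T_ideal) @ tool_tip)[:3])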

  7. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia

    2015-01-07

    We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is considered by proposing parametrized coarse grained dynamics and finding the optimal parameter set for which the relative entropy rate with respect to the atomistic dynamics is minimized. The minimization problem leads to a generalization of the force matching methods to non-equilibrium systems. A multiplicative noise example reveals the importance of the diffusion coefficient in the optimization problem.
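    Schematically, the optimization described selects the coarse-grained parameter set θ by minimizing the relative entropy rate between the path measures of the atomistic dynamics P and the parametrized coarse-grained dynamics Q^θ (our notation, not the authors'):

        \theta^{*} \;=\; \arg\min_{\theta}\; \lim_{T\to\infty}\frac{1}{T}\,
        \mathcal{R}\!\left(P_{[0,T]}\,\middle\|\,Q^{\theta}_{[0,T]}\right).

    For diffusion-type dynamics this rate amounts to a time-averaged squared mismatch of the drifts weighted by the inverse diffusion, which is why the diffusion coefficient matters in the optimization.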

  8. An ensemble model of QSAR tools for regulatory risk assessment.

    Science.gov (United States)

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

    Quantitative structure activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model allows for varying a cut-off parameter that allows for a selection in the desirable trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) to predict carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). Leave-one-out cross validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8 % and 80.4 %, and balanced accuracy: 80.6 % and 80.8 %) and highest inter-rater agreement [kappa (κ): 0
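    As an illustration of the kind of Bayesian consensus described, the sketch below combines binary tool predictions via likelihood ratios built from each tool's sensitivity and specificity, with an adjustable cut-off; the tool names, performance numbers, and prior are placeholders, not values from the paper.

        # Placeholder per-tool (sensitivity, specificity); not from the paper
        TOOLS = {"toolA": (0.85, 0.70), "toolB": (0.75, 0.80),
                 "toolC": (0.80, 0.75), "toolD": (0.70, 0.85)}

        def ensemble_posterior(predictions, prior=0.5):
            """Naive-Bayes combination of binary predictions (1 = positive)."""
            odds = prior / (1.0 - prior)
            for name, pred in predictions.items():
                sens, spec = TOOLS[name]
                # Likelihood ratio of a positive or negative call by this tool
                odds *= sens / (1.0 - spec) if pred == 1 else (1.0 - sens) / spec
            return odds / (1.0 + odds)

        p = ensemble_posterior({"toolA": 1, "toolB": 1, "toolC": 0, "toolD": 1})
        CUTOFF = 0.3  # lowering the cut-off trades specificity for sensitivity
        print(round(p, 3), "positive" if p >= CUTOFF else "negative")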

  9. A Tool for Sharing Empirical Models of Climate Impacts

    Science.gov (United States)

    Rising, J.; Kopp, R. E.; Hsiang, S. M.

    2013-12-01

    Scientists, policy advisors, and the public struggle to synthesize the quickly evolving empirical work on climate change impacts. The Integrated Assessment Models (IAMs) used to estimate the impacts of climate change and the effects of adaptation and mitigation policies can also benefit greatly from recent empirical results (Kopp, Hsiang & Oppenheimer, Impacts World 2013 discussion paper). This paper details a new online tool for exploring, analyzing, combining, and communicating a wide range of impact results, and supporting their integration into IAMs. The tool uses a new database of statistical results, which researchers can expand both in depth (by providing additional results that describe existing relationships) and breadth (by adding new relationships). Scientists can use the tool to quickly perform meta-analyses of related results, using Bayesian techniques to produce pooled and partially-pooled posterior distributions. Policy advisors can apply the statistical results to particular contexts, and combine different kinds of results in a cost-benefit framework. For example, models of the impact of temperature changes on agricultural yields can be first aggregated to build a best estimate of the effect under given assumptions, then compared across countries using different temperature scenarios, and finally combined to estimate a social cost of carbon. The general public can better understand the many estimates of climate impacts and their range of uncertainty by exploring these results dynamically, with maps, bar charts, and dose-response-style plots. [Figure captions: front page of the climate impacts tool website, with sample "collections" of models, within which all results are estimates of the same fundamental relationship; a simple pooled result for Gelman's "8 schools" example, where pooled results are calculated analytically while partial pooling (Bayesian hierarchical estimation) uses posterior simulations.]
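    The analytic pooling named in the captions is inverse-variance weighting; a minimal sketch using the published "8 schools" estimates and standard errors (Rubin 1981):

        import numpy as np

        # "8 schools" data: estimated treatment effects and standard errors
        y = np.array([28., 8., -3., 7., -1., 1., 18., 12.])
        se = np.array([15., 10., 16., 11., 9., 11., 10., 18.])

        # Complete pooling: inverse-variance weighted mean, computable analytically
        w = 1.0 / se**2
        pooled = (w * y).sum() / w.sum()
        pooled_se = (1.0 / w.sum()) ** 0.5
        print(f"pooled effect: {pooled:.2f} +/- {pooled_se:.2f}")

    Partial pooling (the Bayesian hierarchical estimate) has no closed form and is obtained by posterior simulation, as the caption notes.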

  10. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling...

  11. Right approach to 3D modeling using CAD tools

    Science.gov (United States)

    Baddam, Mounica Reddy

    The thesis provides a step-by-step methodology to enable an instructor working with CAD tools to optimally guide students through an understandable 3D modeling approach, one that will not only enhance their knowledge of the tool's usage but also enable them to achieve the desired result in comparatively less time. In practice, very little information is available on applying CAD skills to formal beginners' training sessions. Additionally, the advent of new software in the 3D domain makes staying up to date an increasingly difficult task. Keeping up with the industry's advanced requirements emphasizes the importance of more skilled hands in the field of CAD development, rather than just prioritizing manufacturing in terms of complex software features. The thesis analyses different 3D modeling approaches specific to the variety of CAD tools currently available in the market. Utilizing performance-time databases, learning curves have been generated to measure performance time, feature count, etc. Based on the results, improvement parameters have also been provided (Asperl, 2005).

  12. Modeling as a tool for process control: alcoholic fermentation

    Energy Technology Data Exchange (ETDEWEB)

    Tayeb, A.M.; Ashour, I.A.; Mostafa, N.A. (El-Minia Univ. (EG). Faculty of Engineering)

    1991-01-01

    The results of the alcoholic fermentation of beet sugar molasses and wheat milling residues (Akalona) were fed into a computer program, and the kinetic parameters for these fermentation reactions were determined. These parameters were put into a kinetic model. Next, the model was tested, and the results obtained were compared with the experimental results for both beet molasses and Akalona. The deviation of the experimental results from the results obtained from the model was determined. An acceptable deviation of 1.2% for beet sugar molasses and 3.69% for Akalona was obtained. Thus, the present model could be a tool for chemical engineers working in fermentation processes, both with respect to process control and fermentor design. (Author).
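    A generic starting point for such a fermentation kinetic model is Monod growth with growth-associated product formation; in schematic form (the paper's fitted parameter values are not reproduced here):

        \mu = \mu_{\max}\,\frac{S}{K_S + S}, \qquad
        \frac{dX}{dt} = \mu X, \qquad
        \frac{dP}{dt} = Y_{P/X}\,\frac{dX}{dt},

    with biomass X, limiting substrate S, product (ethanol) P, and yield coefficient Y_{P/X}.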

  13. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trace, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of a human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  14. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Science.gov (United States)

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  15. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  16. Laser melting of carbide tool surface: Model and experimental studies

    Energy Technology Data Exchange (ETDEWEB)

    Yilbas, B.S., E-mail: bsyilbas@kfupm.edu.sa [ME Department, King Fahd University of Petroleum and Minerals, KFUPM Box 1913, Dhahran 31261 (Saudi Arabia); Shuja, S.Z.; Khan, S.M.A.; Aleem, A. [ME Department, King Fahd University of Petroleum and Minerals, KFUPM Box 1913, Dhahran 31261 (Saudi Arabia)

    2009-09-15

    Laser controlled melting is one of the methods to achieve structural integrity in the surface region of carbide tools. In the present study, laser heating of a carbide cutting tool and the temperature distribution in the irradiated region are examined. The phase change process during the heating is modeled using the enthalpy-porosity method. The influence of the laser pulse intensity distribution across the irradiated surface (β) on temperature distribution and melt formation is investigated. An experiment is carried out and the microstructural changes due to consecutive laser pulse heating are examined using the scanning electron microscope (SEM). It is found that the predicted melt depth agrees with the experimental results. The maximum depth of the melt layer moves away from the symmetry axis with increasing β.
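    For reference, the enthalpy-porosity treatment tracks a local liquid fraction f_l and damps velocities in the mushy zone; in its common textbook form (not copied from the paper):

        H \;=\; \int c_p\,dT \;+\; f_l\,L, \qquad
        \mathbf{S}_u \;=\; -\,C\,\frac{(1-f_l)^2}{f_l^3+\epsilon}\,\mathbf{u},

    where L is the latent heat of fusion, S_u is the momentum sink that suppresses flow in solidified regions, C is a mushy-zone constant, and ε is a small number preventing division by zero.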

  17. Laser melting of carbide tool surface: Model and experimental studies

    Science.gov (United States)

    Yilbas, B. S.; Shuja, S. Z.; Khan, S. M. A.; Aleem, A.

    2009-09-01

    Laser controlled melting is one of the methods to achieve structural integrity in the surface region of carbide tools. In the present study, laser heating of a carbide cutting tool and the temperature distribution in the irradiated region are examined. The phase change process during the heating is modeled using the enthalpy-porosity method. The influence of the laser pulse intensity distribution across the irradiated surface (β) on temperature distribution and melt formation is investigated. An experiment is carried out and the microstructural changes due to consecutive laser pulse heating are examined using the scanning electron microscope (SEM). It is found that the predicted melt depth agrees with the experimental results. The maximum depth of the melt layer moves away from the symmetry axis with increasing β.

  18. MGP : a tool for wide range temperature modelling

    Energy Technology Data Exchange (ETDEWEB)

    Morales, A.F. [Inst. Tecnologico Autonomo de Mexico, Mexico City (Mexico); Seisdedos, L.V. [Univ. de Oriente, Santiago de Cuba (Cuba). Dept. de Control Automatico

    2006-07-01

    This paper proposes a practical temperature modelling tool that uses genetic multivariate polynomials to determine polynomial expressions of enthalpy and empirical heat transfer equations in superheaters. The model is designed to transform static parameter estimations from distributed into lumped parameter systems. Two dynamic regimes were explored: (1) a power dynamics regime containing the major inputs and outputs needed for overall plant control; and (2) a steam temperature dynamics scheme that treats consecutive superheater sections in terms of cooling water mass flow and steam mass flow. The single lumped-parameter model was developed to provide temperature control for a fossil fuel-fired power plant. The design procedure used enthalpy to determine the plant's energy balance, with the enthalpy curve treated as a function of both temperature and steam pressure. A graphic simulation tool was used to optimize the model by comparing real and simulated plant data. The study showed that the amount of energy taken up by the steam mass flow per unit time can be calculated by measuring temperatures and pressures at both ends of the superheater. An algorithm was then developed to determine the polynomial's coefficients according to the best curve fit over the training set and the best maximum errors. It was concluded that a unified approach is now being developed to simulate and emulate the dynamics of steam temperature for each section's attemperator-superheater. 14 refs., 3 tabs., 5 figs.

  19. Advanced Reach Tool (ART): development of the mechanistic model.

    Science.gov (United States)

    Fransman, Wouter; Van Tongeren, Martie; Cherrie, John W; Tischer, Martin; Schneider, Thomas; Schinkel, Jody; Kromhout, Hans; Warren, Nick; Goede, Henk; Tielemans, Erik

    2011-11-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe. The ART mechanistic model is based on a conceptual framework that adopts a source receptor approach, which describes the transport of a contaminant from the source to the receptor and defines seven independent principal modifying factors: substance emission potential, activity emission potential, localized controls, segregation, personal enclosure, surface contamination, and dispersion. ART currently differentiates between three different exposure types: vapours, mists, and dust (fumes, fibres, and gases are presently excluded). Various sources were used to assign numerical values to the multipliers to each modifying factor. The evidence used to underpin this assessment procedure was based on chemical and physical laws. In addition, empirical data obtained from literature were used. Where this was not possible, expert elicitation was applied for the assessment procedure. Multipliers for all modifying factors were peer reviewed by leading experts from industry, research institutes, and public authorities across the globe. In addition, several workshops with experts were organized to discuss the proposed exposure multipliers. The mechanistic model is a central part of the ART tool and with advancing knowledge on exposure, determinants will require updates and refinements on a continuous basis, such as the effect of worker behaviour on personal exposure, 'best practice' values that describe the maximum achievable effectiveness of control measures, the intrinsic emission potential of various solid objects (e.g. metal, glass, plastics, etc.), and extending the applicability domain to certain types of exposures (e.g. gas, fume, and fibre exposure).
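    Schematically, a source-receptor model of this kind scores exposure by multiplying the modifying factors along each transport path and summing over paths; in our schematic rendering (not the calibrated ART equation),

        S \;=\; \sum_{\text{paths}} E \cdot H \cdot \eta_{\mathrm{LC}} \cdot
        \eta_{\mathrm{SEG}} \cdot \eta_{\mathrm{PE}} \cdot D ,

    where E and H are the substance and activity emission potentials, the η terms are the efficiencies of localized controls, segregation, and personal enclosure, and D captures dispersion; surface contamination enters as an additional source term.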

  20. Transparent Model Transformation: Turning Your Favourite Model Editor into a Transformation Tool

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald; Strüber, Daniel

    2015-01-01

    Current model transformation languages are supported by dedicated editors, often closely coupled to a single execution engine. We introduce Transparent Model Transformation, a paradigm enabling modelers to specify transformations using a familiar tool: their model editor. We also present VMTL......, the first transformation language implementing the principles of Transparent Model Transformation: syntax, environment, and execution transparency. VMTL works by weaving a transformation aspect into its host modeling language. We show how our implementation of VMTL turns any model editor into a flexible...

  1. The ADAPT Tool: From AADL Architectural Models to Stochastic Petri Nets through Model Transformation

    CERN Document Server

    Rugina, Ana E; Kaaniche, Mohamed

    2008-01-01

    ADAPT is a tool that aims at easing the task of evaluating dependability measures in the context of modern model driven engineering processes based on AADL (Architecture Analysis and Design Language). Hence, its input is an AADL architectural model annotated with dependability-related information. Its output is a dependability evaluation model in the form of a Generalized Stochastic Petri Net (GSPN). The latter can be processed by existing dependability evaluation tools to compute quantitative measures such as reliability, availability, etc. ADAPT interfaces OSATE (the Open Source AADL Tool Environment) on the AADL side and SURF-2 on the dependability evaluation side. In addition, ADAPT provides the GSPN in XML/XMI format, which represents a gateway to other dependability evaluation tools, as the processing techniques for XML files allow it to be easily converted to a tool-specific GSPN.

  2. Modeling in the Classroom: An Evolving Learning Tool

    Science.gov (United States)

    Few, A. A.; Marlino, M. R.; Low, R.

    2006-12-01

    Among the early programs (early 1990s) focused on teaching Earth System Science were the Global Change Instruction Program (GCIP) funded by NSF through UCAR and the Earth System Science Education Program (ESSE) funded by NASA through USRA. These two programs introduced modeling as a learning tool from the beginning, and they provided workshops, demonstrations and lectures for their participating universities. These programs were aimed at university-level education. Recently, classroom modeling is experiencing a revival of interest. Drs John Snow and Arthur Few conducted two workshops on modeling at the ESSE21 meeting in Fairbanks, Alaska, in August 2005. The Digital Library for Earth System Education (DLESE) at http://www.dlese.org provides web access to STELLA models and tutorials, and UCAR's Education and Outreach (EO) program holds workshops that include training in modeling. An important innovation to the STELLA modeling software by isee systems, http://www.iseesystems.com, called "isee Player" is available as a free download. The Player allows users to view and run STELLA models, change model parameters, share models with colleagues and students, and make working models available on the web. This is important because the expert can create models, and the user can learn how the modeled system works. Another aspect of this innovation is that the educational benefits of modeling concepts can be extended throughout most of the curriculum. The procedure for building a working computer model of an Earth Science System follows this general format: (1) carefully define the question(s) for which you seek the answer(s); (2) identify the interacting system components and inputs contributing to the system's behavior; (3) collect the information and data that will be required to complete the conceptual model; (4) construct a system diagram (graphic) of the system that displays all of system's central questions, components, relationships and required inputs. At this stage

  3. A Tool for Model-Based Language Specification

    CERN Document Server

    Quesada, Luis; Cubero, Juan-Carlos

    2011-01-01

    Formal languages let us define the textual representation of data with precision. Formal grammars, typically in the form of BNF-like productions, describe the language syntax, which is then annotated for syntax-directed translation and completed with semantic actions. When, apart from the textual representation of data, an explicit representation of the corresponding data structure is required, the language designer has to devise the mapping between the suitable data model and its proper language specification, and then develop the conversion procedure from the parse tree to the data model instance. Unfortunately, whenever the format of the textual representation has to be modified, changes have to be propagated throughout the entire language processor tool chain. These updates are time-consuming, tedious, and error-prone. Moreover, if different applications use the same language, several copies of the same language specification have to be maintained. In this paper, we introduce a model-based parser generat...

  4. T:XML: A Tool Supporting User Interface Model Transformation

    Science.gov (United States)

    López-Jaquero, Víctor; Montero, Francisco; González, Pascual

    Model driven development of user interfaces is based on the transformation of an abstract specification into the final user interface the user will interact with. The design of transformation rules to carry out this transformation process is a key issue in any model-driven user interface development approach. In this paper, we introduce T:XML, an integrated development environment for managing, creating and previewing transformation rules. The tool supports the specification of transformation rules using a graphical notation that works on the basis of transforming the input model into a graph-based representation. T:XML allows the design and execution of transformation rules in an integrated development environment. Furthermore, the designer can also preview what the generated user interface looks like after the transformations have been applied. These previewing capabilities can be used to quickly create prototypes to discuss with users in user-centered design methods.

  5. Ontology-based tools to expedite predictive model construction.

    Science.gov (United States)

    Haug, Peter; Holmen, John; Wu, Xinzi; Mynam, Kumar; Ebert, Matthew; Ferraro, Jeffrey

    2014-01-01

    Large amounts of medical data are collected electronically during the course of caring for patients using modern medical information systems. This data presents an opportunity to develop clinically useful tools through data mining and observational research studies. However, the work necessary to make sense of this data and to integrate it into a research initiative can require substantial effort from medical experts as well as from experts in medical terminology, data extraction, and data analysis. This slows the process of medical research. To reduce the effort required for the construction of computable, diagnostic predictive models, we have developed a system that hybridizes a medical ontology with a large clinical data warehouse. Here we describe components of this system designed to automate the development of preliminary diagnostic models and to provide visual clues that can assist the researcher in planning for further analysis of the data behind these models.

  6. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    System modelling using the Unified Modelling Language (UML) is a task that must be solved in software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which UML offers through the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. Due to its specific structure, the UML sequence diagram is selected for a deeper analysis of element layout. The authors' research examines the abilities of modern UML modelling tools to offer automatic layout of the UML sequence diagram and analyses them according to criteria required for diagram perception.

  7. Fuzzy regression modeling for tool performance prediction and degradation detection.

    Science.gov (United States)

    Li, X; Er, M J; Lim, B S; Zhou, J H; Gan, O P; Rutkowski, L

    2010-10-01

    In this paper, the viability of using the Fuzzy-Rule-Based Regression Modeling (FRM) algorithm for tool performance and degradation detection is investigated. The FRM is developed based on a multi-layered fuzzy-rule-based hybrid system with Multiple Regression Models (MRM) embedded into a fuzzy logic inference engine that employs Self Organizing Maps (SOM) for clustering. The FRM converts a complex nonlinear problem to a simplified linear format in order to further increase the accuracy in prediction and the rate of convergence. The efficacy of the proposed FRM is tested through a case study, namely to predict the remaining useful life of a ball nose milling cutter during a dry machining process of hardened tool steel with a hardness of 52-54 HRC. A comparative study is further made between four predictive models using the same set of experimental data. It is shown that the FRM is superior to conventional MRM, Back Propagation Neural Networks (BPNN) and Radial Basis Function Networks (RBFN) in terms of prediction accuracy and learning speed.
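    A minimal sketch of the general "cluster, then fit a local linear model per cluster" structure underlying such hybrid systems; for brevity the SOM is replaced by a crude k-means loop and the fuzzy inference layer is omitted, so this is the skeleton, not the FRM itself.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical features (e.g., force statistics) vs. remaining life
        X = rng.uniform(0, 1, size=(300, 2))
        y = np.where(X[:, 0] < 0.5, 2 * X[:, 1], 5 - 3 * X[:, 1])
        y = y + rng.normal(0, 0.1, 300)

        # 1) crude k-means stand-in for the SOM clustering layer
        k = 2
        centers = X[rng.choice(len(X), k, replace=False)]
        for _ in range(20):
            labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
            centers = np.array([X[labels == j].mean(0) for j in range(k)])

        # 2) one linear regression model per cluster
        models = []
        for j in range(k):
            A = np.c_[np.ones((labels == j).sum()), X[labels == j]]
            coef, *_ = np.linalg.lstsq(A, y[labels == j], rcond=None)
            models.append(coef)

        def predict(x):
            j = int(np.argmin(((x - centers) ** 2).sum(-1)))
            return models[j] @ np.r_[1.0, x]

        print(predict(np.array([0.2, 0.5])), predict(np.array([0.8, 0.5])))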

  8. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  9. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  10. Numerical modelling of structure and mechanical properties for medical tools

    Directory of Open Access Journals (Sweden)

    L. Jeziorski

    2007-09-01

    ... structure, mechanical properties and working conditions of medical tools. For the bowl cutter, the geometrical shape and the distribution of cutting holes were improved. Originality/value: This paper presents a concept for obtaining new medical tools by optimizing the basic construction parameters through numerical calculations. The prepared model can be helpful for engineering decisions in the design and production of forceps and bowl cutters.

  11. Conceptual Models as Tools for Communication Across Disciplines

    Directory of Open Access Journals (Sweden)

    Marieke Heemskerk

    2003-12-01

    To better understand and manage complex social-ecological systems, social scientists and ecologists must collaborate. However, issues related to language and research approaches can make it hard for researchers in different fields to work together. This paper suggests that researchers can improve interdisciplinary science through the use of conceptual models as a communication tool. The authors share lessons from a workshop in which interdisciplinary teams of young scientists developed conceptual models of social-ecological systems using data sets and metadata from Long-Term Ecological Research sites across the United States. Both the process of model building and the models that were created are discussed. The exercise revealed that the presence of social scientists in a group influenced the place and role of people in the models. This finding suggests that the participation of both ecologists and social scientists in the early stages of project development may produce better questions and more accurate models of interactions between humans and ecosystems. Although the participants agreed that a better understanding of human intentions and behavior would advance ecosystem science, they felt that interdisciplinary research might gain more by training strong disciplinarians than by merging ecology and social sciences into a new field. It is concluded that conceptual models can provide an inspiring point of departure and a guiding principle for interdisciplinary group discussions. Jointly developing a model not only helped the participants to formulate questions, clarify system boundaries, and identify gaps in existing data, but also revealed the thoughts and assumptions of fellow scientists. Although the use of conceptual models will not serve all purposes, the process of model building can help scientists, policy makers, and resource managers discuss applied problems and theory among themselves and with those in other areas.

  12. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements of cryogenics and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of various components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for maximizing the liquefaction yield of the plant under the constraints of the other parameters. The analysis results give a clear idea of the parameter values to choose before implementation of the actual plant in the field, and an idea of the productivity and profitability of the given plant configuration, leading to the design of an efficient, productive plant.
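    In such an analysis, the figure of merit is typically the liquid yield from a steady-state energy balance around the cold box; for the simple Linde-Hampson cycle (a textbook relation, not output of the HYSYS model),

        y \;=\; \frac{h_1 - h_2}{h_1 - h_f},

    where h_2 is the specific enthalpy of the high-pressure gas entering the heat exchanger, h_1 that of the low-pressure gas leaving it, and h_f that of the saturated liquid; maximizing liquefaction then amounts to choosing operating conditions that lower h_2 relative to h_1.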

  13. Port performance evaluation tool based on microsimulation model

    Directory of Open Access Journals (Sweden)

    Tsavalista Burhani Jzolanda

    2017-01-01

    As port performance is becoming correlated with national competitiveness, the issue of port performance evaluation has gained significance. Port performance can be indicated simply by port service levels to the ship (e.g., throughput, waiting time for berthing, etc.), as well as by the utilization level of equipment and facilities within a certain period. The performance evaluation can then be used as a tool to develop policies for improving the port's performance, making it more effective and efficient. However, the evaluation is frequently conducted with a deterministic approach, which hardly captures the natural variations of port parameters. Therefore, this paper presents a stochastic microsimulation model for investigating the impacts of port parameter variations on port performance. The variations are derived from actual data in order to provide more realistic results. The model is developed using MATLAB and Simulink based on queuing theory.
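    A minimal sketch of the stochastic queueing core of such a microsimulation, written here in Python rather than the paper's MATLAB/Simulink; the exponential arrival/service distributions and the two-berth layout are illustrative assumptions, whereas the paper samples its distributions from actual port data.

        import heapq, random

        random.seed(42)
        N_BERTHS = 2                              # illustrative berth count
        ARRIVAL_MEAN, SERVICE_MEAN = 6.0, 10.0    # hours, illustrative

        events = [(random.expovariate(1.0 / ARRIVAL_MEAN), "arrival")]
        busy, queue, waits = 0, [], []

        while events and len(waits) < 10000:
            t, kind = heapq.heappop(events)
            if kind == "arrival":
                # schedule the next ship, then try to berth this one
                heapq.heappush(events, (t + random.expovariate(1.0 / ARRIVAL_MEAN), "arrival"))
                if busy < N_BERTHS:
                    busy += 1
                    waits.append(0.0)
                    heapq.heappush(events, (t + random.expovariate(1.0 / SERVICE_MEAN), "departure"))
                else:
                    queue.append(t)               # record arrival time, wait
            else:                                 # a departure frees a berth
                if queue:
                    arrived = queue.pop(0)
                    waits.append(t - arrived)
                    heapq.heappush(events, (t + random.expovariate(1.0 / SERVICE_MEAN), "departure"))
                else:
                    busy -= 1

        print("mean wait (h):", sum(waits) / len(waits))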

  14. Computational Tools for Modeling and Measuring Chromosome Structure

    Science.gov (United States)

    Ross, Brian Christopher

    DNA conformation within cells has many important biological implications, but there are challenges both in modeling DNA due to the need for specialized techniques, and experimentally, since tracing out in vivo conformations is currently impossible. This thesis contributes two computational projects to these efforts. The first project is a set of online and offline calculators of conformational statistics using a variety of published and unpublished methods, addressing the current lack of DNA model-building tools intended for general use. The second project is a reconstructive analysis that could enable in vivo mapping of DNA conformation at high resolution with current experimental technology. (Copies available exclusively from MIT Libraries, libraries.mit.edu/docs - docs@mit.edu)

  15. Mathematical modelling: a tool for hospital infection control.

    Science.gov (United States)

    Grundmann, H; Hellriegel, B

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide, and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has emerged that provides solid and testable hypotheses and opens the way to a quantitative assessment of the main obstructions that undermine current efforts to control the spread of health-care-associated infections in hospitals and communities. We aim to explain to a broader audience of professionals in health care, infection control, and health systems administration some of these models that can improve the understanding of the hidden dynamics of health-care-associated infections. We also appraise their usefulness and limitations as an innovative research and decision tool for control purposes.
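
    As a flavour of the class of models the authors review (a minimal sketch under assumed dynamics, not their specific model), ward-level colonization is often written as an SIS-type balance in which cross-transmission is moderated by hand-hygiene compliance and colonized patients turn over through admission and discharge; all parameter values below are hypothetical:

    ```python
    beta = 0.08        # per-day cross-transmission rate via staff contacts
    mu = 0.10          # per-day discharge rate (beds assumed refilled at once)
    sigma = 0.05       # fraction of admissions already colonized
    compliance = 0.50  # hand-hygiene compliance, reduces transmission
    N = 20             # ward size

    I, dt = 1.0, 0.1   # start with one colonized patient; Euler time step
    for _ in range(int(365 / dt)):
        S = N - I
        dI = (1 - compliance) * beta * S * I / N + sigma * mu * N - mu * I
        I = min(max(I + dI * dt, 0.0), N)

    print(f"endemic level after one year: {I:.1f} of {N} patients colonized")
    ```

    Varying `compliance` in such a model is exactly the kind of quantitative what-if assessment of control measures the review describes.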

  16. Introducing Modeling Transition Diagrams as a Tool to Connect Mathematical Modeling to Mathematical Thinking

    Science.gov (United States)

    Czocher, Jennifer A.

    2016-01-01

    This study contributes a methodological tool to reconstruct the cognitive processes and mathematical activities carried out by mathematical modelers. Represented as Modeling Transition Diagrams (MTDs), individual modeling routes were constructed for four engineering undergraduate students. Findings stress the importance and limitations of using…

  17. Finite Element Modeling, Simulation, Tools, and Capabilities at Superform

    Science.gov (United States)

    Raman, Hari; Barnes, A. J.

    2010-06-01

    Over the past thirty years Superform has been a pioneer in the SPF arena, having developed a keen understanding of the process and a range of unique forming techniques to meet varying market needs. Superform’s high-profile list of customers includes Boeing, Airbus, Aston Martin, Ford, and Rolls Royce. One of the more recent additions to Superform’s technical know-how is finite element modeling and simulation. Finite element modeling is a powerful numerical technique which when applied to SPF provides a host of benefits including accurate prediction of strain levels in a part, presence of wrinkles and predicting pressure cycles optimized for time and part thickness. This paper outlines a brief history of finite element modeling applied to SPF and then reviews some of the modeling tools and techniques that Superform have applied and continue to do so to successfully superplastically form complex-shaped parts. The advantages of employing modeling at the design stage are discussed and illustrated with real-world examples.

  18. Modeling as a research tool in poultry science.

    Science.gov (United States)

    Gous, R M

    2014-01-01

    The World's Poultry Science Association (WPSA) is a long-established and unique organization that strives to advance knowledge and understanding of all aspects of poultry science and the poultry industry. Its 3 main aims are education, organization, and research. The WPSA Keynote Lecture, titled "Modeling as a research tool in poultry science," addresses 2 of these aims, namely, the value of modeling in research and education. The role of scientists is to put forward and then to test theories. These theories, or models, may be simple or highly complex, but they are aimed at improving our understanding of a system or the interaction between systems. In developing a model, the scientist must take into account existing knowledge, and in this process gaps in our knowledge of a system are identified. Useful ideas for research are generated in this way, and experiments may be designed specifically to address these issues. The resultant models become more accurate and more useful, and can be used in education and extension as a means of explaining many of the complex issues that arise in poultry science.

  19. Introducing BioSARN - an ecological niche model refinement tool.

    Science.gov (United States)

    Heap, Marshall J

    2016-08-01

    Environmental niche modeling outputs a biological species' potential distribution; further work is needed to arrive at a species' realized distribution. The Biological Species Approximate Realized Niche (BioSARN) application provides the ecological modeler with a toolset to refine environmental niche models (ENMs). These tools include soil and land class filtering and niche area quantification, as well as novelties like enhanced temporal corridor definition and output to a high spatial resolution land class model. BioSARN is exemplified with a study on Fraser fir, a tree species with strong land class and edaphic correlations. Soil and land class filtering caused the potential distribution area to decline by 17%. Enhanced temporal corridor definition permitted distinction of current, continuing, and future niches, and thus of niche change and movement. Tile quantification analysis provided further corroboration of these trends. BioSARN does not substitute for other established ENM methods; rather, it allows experimenters to work with their preferred ENM, refining it using their knowledge and experience. Output from lower spatial resolution ENMs to a high spatial resolution land class model is a pseudo high-resolution result. Still, it may be the best that can be achieved until wide-range high spatial resolution environmental data and accurate high-precision species occurrence data become generally available.

  20. Watershed modeling tools and data for prognostic and diagnostic

    Science.gov (United States)

    Chambel-Leitao, P.; Brito, D.; Neves, R.

    2009-04-01

    When eutrophication is considered an important process to control, this can be accomplished by reducing nitrogen and phosphorus losses from both point and nonpoint sources and by assessing the effectiveness of the pollution reduction strategy. The HARP-NUT guidelines (Guidelines on Harmonized Quantification and Reporting Procedures for Nutrients) (Borgvang & Selvik, 2000) are presented by OSPAR as the best common quantification and reporting procedures for calculating the reduction of nutrient inputs. In 2000, OSPAR adopted the HARP-NUT guidelines on a trial basis. They were intended to serve as a tool for OSPAR Contracting Parties to report, in a harmonized manner, their different commitments, present or future, with regard to nutrients under the OSPAR Convention, in particular the "Strategy to Combat Eutrophication". The HARP-NUT Guidelines (Borgvang and Selvik, 2000; Schoumans, 2003) were developed to quantify and report on the individual sources of nitrogen and phosphorus discharges/losses to surface waters (Source Orientated Approach). These results can be compared, as load reconciliation, with the total riverine nitrogen and phosphorus loads measured at downstream monitoring points (Load Orientated Approach). Nitrogen and phosphorus retention in river systems represents the connecting link between the "Source Orientated Approach" and the "Load Orientated Approach". Both approaches are necessary for verification purposes and both may be needed for providing the information required for the various commitments. Guidelines 2, 3, 4 and 5 are mainly concerned with source estimation; they present a set of simple calculations that allow the estimation of the origin of loads. Guideline 6 is a particular case where the application of a model is advised, in order to estimate the nutrient loads from diffuse sources associated with land use/land cover. The model chosen for this was the SWAT model (Arnold & Fohrer, 2005), because it is suggested in Guideline 6 and because it

  1. Unleashing spatially distributed ecohydrology modeling using Big Data tools

    Science.gov (United States)

    Miles, B.; Idaszak, R.

    2015-12-01

    Physically based spatially distributed ecohydrology models are useful for answering science and management questions related to the hydrology and biogeochemistry of prairie, savanna, forested, as well as urbanized ecosystems. However, these models can produce hundreds of gigabytes of spatial output for a single model run over decadal time scales when run at regional spatial scales and moderate spatial resolutions (~100-km2+ at 30-m spatial resolution) or when run for small watersheds at high spatial resolutions (~1-km2 at 3-m spatial resolution). Numerical data formats such as HDF5 can store arbitrarily large datasets. However even in HPC environments, there are practical limits on the size of single files that can be stored and reliably backed up. Even when such large datasets can be stored, querying and analyzing these data can suffer from poor performance due to memory limitations and I/O bottlenecks, for example on single workstations where memory and bandwidth are limited, or in HPC environments where data are stored separately from computational nodes. The difficulty of storing and analyzing spatial data from ecohydrology models limits our ability to harness these powerful tools. Big Data tools such as distributed databases have the potential to surmount the data storage and analysis challenges inherent to large spatial datasets. Distributed databases solve these problems by storing data close to computational nodes while enabling horizontal scalability and fault tolerance. Here we present the architecture of and preliminary results from PatchDB, a distributed datastore for managing spatial output from the Regional Hydro-Ecological Simulation System (RHESSys). The initial version of PatchDB uses message queueing to asynchronously write RHESSys model output to an Apache Cassandra cluster. Once stored in the cluster, these data can be efficiently queried to quickly produce both spatial visualizations for a particular variable (e.g. maps and animations), as well
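
    The asynchronous write path described above can be sketched in a few lines (a minimal illustration of the pattern, not the actual PatchDB code; the keyspace, table and column names are hypothetical), using a producer-consumer queue in front of the DataStax Cassandra driver:

    ```python
    import queue
    import threading
    from cassandra.cluster import Cluster  # pip install cassandra-driver

    out_q: "queue.Queue" = queue.Queue(maxsize=10_000)

    def writer() -> None:
        session = Cluster(["127.0.0.1"]).connect("rhessys")  # hypothetical keyspace
        insert = session.prepare(
            "INSERT INTO patch_output (patch_id, ts, variable, value) "
            "VALUES (?, ?, ?, ?)"
        )
        while True:
            row = out_q.get()
            if row is None:          # sentinel: model run finished
                out_q.task_done()
                break
            session.execute(insert, row)
            out_q.task_done()

    t = threading.Thread(target=writer, daemon=True)
    t.start()

    # The model loop stays decoupled from storage latency: it only enqueues.
    for ts in range(3):
        out_q.put((42, ts, "streamflow", 1.7 + 0.1 * ts))  # hypothetical output row
    out_q.put(None)
    t.join()   # wait for the writer to drain the queue
    ```

    Decoupling the model loop from storage latency this way is what lets the simulation keep pace while the cluster absorbs writes.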

  2. Using urban forest assessment tools to model bird habitat potential

    Science.gov (United States)

    Lerman, Susannah B.; Nislow, Keith H.; Nowak, David J.; Destefano, Stephen; King, David I.; Jones-Farrand, D. Todd

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure that the urban forest provides habitat to sustain bird and other wildlife populations. The primary goal of this study was to integrate wildlife suitability indices into an existing national urban forest assessment tool, i-Tree. We quantified available habitat characteristics of urban forests for ten northeastern U.S. cities, and summarized bird habitat relationships from the literature in terms of variables that were represented in the i-Tree datasets. With these data, we generated habitat suitability equations for nine bird species representing a range of life history traits and conservation status, predicting habitat suitability from i-Tree data. We applied these equations to the urban forest datasets to calculate the overall habitat suitability for each city and the habitat suitability for different types of land use (e.g., residential, commercial, parkland) for each bird species. The proposed habitat models will help guide wildlife managers, urban planners, and landscape designers who require specific information, such as desirable habitat conditions within an urban management project, to help improve the suitability of urban forests for birds.
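
    The general shape of such a suitability equation (a hedged illustration only: the variable names, weights and scaling are hypothetical, not the published equations) is a weighted combination of i-Tree-style stand variables scaled to a 0-1 index:

    ```python
    # Toy habitat suitability index for one species and one land-use class.
    def habitat_suitability(pct_canopy: float, pct_native: float,
                            stems_per_ha: float) -> float:
        score = (0.5 * pct_canopy / 100.0            # canopy cover term
                 + 0.3 * pct_native / 100.0          # native vegetation term
                 + 0.2 * min(stems_per_ha / 400.0, 1.0))  # stem density, capped
        return round(score, 2)

    # e.g., a residential land-use class in one city (hypothetical values):
    print(habitat_suitability(pct_canopy=32.0, pct_native=55.0, stems_per_ha=180.0))
    ```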

  3. A Simple Evacuation Modeling and Simulation Tool for First Responders

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Daniel B [ORNL; Payne, Patricia W [ORNL

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including most importantly time-to-evacuate, to help first responders choose the best course of action.
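
    The agent mechanics such a module needs can be sketched compactly (a toy illustration, not ORNL's IMPACT code; all values are hypothetical): agents head for an exit at walking speed, hold position when another agent blocks the way (a crude collision avoidance), and the loop reports time-to-evacuate:

    ```python
    import math
    import random

    EXIT_X, EXIT_Y = 0.0, 0.0
    SPEED, MIN_GAP, DT = 1.3, 0.5, 0.1       # m/s, m, s (hypothetical)

    random.seed(1)
    agents = [(random.uniform(5, 50), random.uniform(5, 50)) for _ in range(100)]
    t = 0.0
    while agents and t < 600.0:              # cap simulated time at 10 minutes
        moved = []
        for i, (x, y) in enumerate(agents):
            d = math.hypot(x - EXIT_X, y - EXIT_Y)
            if d < 1.0:                      # agent has reached the exit
                continue
            nx = x - SPEED * DT * (x - EXIT_X) / d
            ny = y - SPEED * DT * (y - EXIT_Y) / d
            blocked = any(math.hypot(nx - ox, ny - oy) < MIN_GAP
                          for j, (ox, oy) in enumerate(agents) if j != i)
            moved.append((nx, ny) if not blocked else (x, y))
        agents = moved
        t += DT

    print(f"evacuated {100 - len(agents)} of 100 agents in {t:.0f} s")
    ```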

  4. Modeling of tool path for the CNC sheet cutting machines

    Science.gov (United States)

    Petunin, Aleksandr A.

    2015-11-01

    In the paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of the cutting techniques is offered, and we also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for the standard cutting technique (Continuous Cutting Problem, CCP) and for one of the non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that the optimization tasks can be interpreted as a discrete optimization problem (a generalized travelling salesman problem with additional constraints, GTSP). Formalization of some constraints for these tasks is described. To solve the GTSP we propose to use the mathematical model of Prof. Chentsov, based on the concept of a megalopolis and dynamic programming.
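
    To make the discrete structure concrete (a simple illustrative baseline only: the paper's method is a constrained GTSP solved by dynamic programming, not this greedy pass), ordering the pierce points of closed contours to shorten idle rapid moves looks like:

    ```python
    import math

    # Hypothetical pierce points of five closed contours on a sheet (mm).
    pierce_points = [(0, 0), (40, 8), (12, 30), (35, 35), (5, 18)]

    def idle_path(points, start=(0, 0)):
        """Greedy nearest-neighbour ordering of pierce points."""
        order, pos, remaining = [], start, points[:]
        while remaining:
            nxt = min(remaining, key=lambda p: math.dist(pos, p))
            order.append(nxt)
            remaining.remove(nxt)
            pos = nxt
        return order

    print(idle_path(pierce_points))
    ```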

  5. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    Science.gov (United States)

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  6. Standalone visualization tool for three-dimensional DRAGON geometrical models

    Energy Technology Data Exchange (ETDEWEB)

    Lukomski, A.; McIntee, B.; Moule, D.; Nichita, E. [Faculty of Energy Systems and Nuclear Science, Univ. of Ontario Inst. of Tech., Oshawa, Ontario (Canada)

    2008-07-01

    DRAGON is a neutron transport and depletion code able to solve one-, two- and three-dimensional problems. To date DRAGON provides two visualization modules, able to represent two- and three-dimensional geometries respectively. The two-dimensional visualization module generates a PostScript file, while the three-dimensional visualization module generates a MATLAB M-file with instructions for drawing the tracks in the DRAGON TRACKING data structure, which implicitly provide a representation of the geometry. The current work introduces a new, standalone tool based on the open-source Visualization Toolkit (VTK) software package which allows the visualization of three-dimensional geometrical models by reading the DRAGON GEOMETRY data structure and generating an axonometric image which can be manipulated interactively by the user. (author)

  7. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses, so the requirements on software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates and thus simplifies the use of third-party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs: when the developer changes the program, all the requirements should be verified again, as a change in the code can produce collateral effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...

  8. Planning the network of gas pipelines through modeling tools

    Energy Technology Data Exchange (ETDEWEB)

    Sucupira, Marcos L.L.; Lutif Filho, Raimundo B. [Companhia de Gas do Ceara (CEGAS), Fortaleza, CE (Brazil)

    2009-07-01

    Natural gas is a source of non-renewable energy used by different sectors of the economy of Ceara. Its use may be industrial, residential, commercial, as a source of automotive fuel, as a co-generation of energy and as a source for generating electricity from heat. Owing to its practicality this energy source has strong market acceptance and serves a broad range of clients, which makes it possible to reach diverse parts of the city. Its distribution requires a complex network of pipelines that branches throughout the city to meet all potential clients interested in this source of energy. To facilitate the design, analysis, expansion and location of bottlenecks and breaks in the distribution network, modeling software is used that allows the network manager to manage the various data about the network. This paper presents the advantages of modeling the gas distribution network of natural gas companies in Ceara, showing the tool used, the steps necessary for the implementation of the models, the advantages of using the software and the findings obtained with its use. (author)

  9. Development of hydrogeological modelling tools based on NAMMU

    Energy Technology Data Exchange (ETDEWEB)

    Marsic, N. [Kemakta Konsult AB, Stockholm (Sweden); Hartley, L.; Jackson, P.; Poole, M. [AEA Technology, Harwell (United Kingdom); Morvik, A. [Bergen Software Services International AS, Bergen (Norway)

    2001-09-01

    A number of relatively sophisticated hydrogeological models were developed within the SR 97 project to handle issues such as nesting of scales and the effects of salinity. However, these issues and others are considered of significant importance and generality to warrant further development of the hydrogeological methodology. Several such developments based on the NAMMU package are reported here: - Embedded grid: nesting of the regional- and site-scale models within the same numerical model has given greater consistency in the structural model representation and in the flow between scales. Since there is a continuous representation of the regional- and site-scales the modelling of pathways from the repository no longer has to be contained wholly by the site-scale region. This allows greater choice in the size of the site-scale. - Implicit Fracture Zones (IFZ): this method of incorporating the structural model is very efficient and allows changes to either the mesh or fracture zones to be implemented quickly. It also supports great flexibility in the properties of the structures and rock mass. - Stochastic fractures: new functionality has been added to IFZ to allow arbitrary combinations of stochastic or deterministic fracture zones with the rock-mass. Whether a fracture zone is modelled deterministically or stochastically its statistical properties can be defined independently. - Stochastic modelling: efficient methods for Monte-Carlo simulation of stochastic permeability fields have been implemented and tested on SKB's computers. - Visualisation: the visualisation tool Avizier for NAMMU has been enhanced such that it is efficient for checking models and presentation. - PROPER interface: NAMMU outputs pathlines in PROPER format so that it can be included in PA workflow. The developed methods are illustrated by application to stochastic nested modelling of the Beberg site using data from SR 97. The model properties were in accordance with the regional- and site
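
    The stochastic-modelling item above can be illustrated in miniature (a hedged sketch of Monte-Carlo simulation over a lognormal permeability field, not the NAMMU implementation; every value is hypothetical):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_real, n_cells = 1000, 50
    log10_mean, log10_sd = -14.0, 0.5          # log10 permeability [m^2]

    # One lognormal permeability field per Monte-Carlo realisation.
    K = 10.0 ** rng.normal(log10_mean, log10_sd, size=(n_real, n_cells))
    K_eff = n_cells / (1.0 / K).sum(axis=1)    # harmonic mean: 1-D series flow
    grad_h = 0.01                              # head gradient (dimensionless)
    mu_over_rho_g = 1.0e-7                     # converts K [m^2] to conductivity [m/s]
    q = K_eff / mu_over_rho_g * grad_h         # Darcy flux per realisation

    print(f"median flux {np.median(q):.2e} m/s, "
          f"95th percentile {np.percentile(q, 95):.2e} m/s")
    ```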

  10. Implementing an HL7 version 3 modeling tool from an Ecore model.

    Science.gov (United States)

    Bánfai, Balázs; Ulrich, Brandon; Török, Zsolt; Natarajan, Ravi; Ireland, Tim

    2009-01-01

    One of the main challenges of achieving interoperability using the HL7 V3 healthcare standard is the lack of clear definition and supporting tools for modeling, testing, and conformance checking. Currently, the knowledge defining the modeling is scattered around in MIF schemas, tools and specifications or simply with the domain experts. Modeling core HL7 concepts, constraints, and semantic relationships in Ecore/EMF encapsulates the domain-specific knowledge in a transparent way while unifying Java, XML, and UML in an abstract, high-level representation. Moreover, persisting and versioning the core HL7 concepts as a single Ecore context allows modelers and implementers to create, edit and validate message models against a single modeling context. The solution discussed in this paper is implemented in the new HL7 Static Model Designer as an extensible toolset integrated as a standalone Eclipse RCP application.

  11. Conversion of Rapid Prototyping Models into Metallic Tools by Ceramic Moulding—an Indirect Rapid Tooling Process

    Institute of Scientific and Technical Information of China (English)

    Duarte, Teresa P.; Ferreira, J.M.; Lino, F. Jorge; Barbedo, A.; Neto, Rui

    2002-01-01

    A process to convert models made by rapid prototyping techniques like SL (stereolithography) and LOM (laminated object manufacturing) or by conventional techniques (silicones, resins, wax, etc.) into metallic moulds or tools has been developed. The main purpose of this technique is to rapidly obtain the first prototypes of parts, for plastics injection, forging or any other manufacturing process, using the tools produced by casting a metal into a ceramic mould. Briefly, it can be said that the ceramic...

  12. M4AST - A Tool for Asteroid Modelling

    Science.gov (United States)

    Birlan, Mirel; Popescu, Marcel; Irimiea, Lucian; Binzel, Richard

    2016-10-01

    M4AST (Modelling for Asteroids) is an online tool devoted to the analysis and interpretation of reflection spectra of asteroids in the visible and near-infrared spectral intervals. It consists of a spectral database of individual objects and a set of analysis routines which address scientific aspects such as taxonomy, curve matching with laboratory spectra, space weathering models, and mineralogical diagnosis. Spectral data were obtained using ground-based facilities; part of these data are compiled from the literature [1]. The database is composed of permanent and temporary files. Each permanent file contains a header and two or three columns (wavelength, spectral reflectance, and the error on spectral reflectance). Temporary files can be uploaded anonymously and are purged, protecting the ownership of the submitted data. The computing routines are organized around several scientific objectives: visualize spectra, compute the asteroid taxonomic class, compare an asteroid spectrum with similar spectra of meteorites, and compute mineralogical parameters. A facility for using the Virtual Observatory protocols was also developed. A new version of the service was released in June 2016. This new release of M4AST contains a database and facilities to model more than 6,000 spectra of asteroids, and a new web interface was designed, bringing new functionality into a user-friendly environment. A bridge system for accessing and exploiting the SMASS-MIT database (http://smass.mit.edu) allows the treatment and analysis of these data in the framework of the M4AST environment. Reference: [1] M. Popescu, M. Birlan, and D.A. Nedelcu, "Modeling of asteroids: M4AST," Astronomy & Astrophysics 544, A130, 2012.

  13. Tools and Models for Integrating Multiple Cellular Networks

    Energy Technology Data Exchange (ETDEWEB)

    Gerstein, Mark [Yale Univ., New Haven, CT (United States). Gerstein Lab.

    2015-11-06

    In this grant, we have systematically investigated the integrated networks which are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open source, available for download from GitHub, and can be incorporated into the Knowledgebase. Here, we summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network, and the algorithm is generally applicable to regulatory networks in prokaryotes, yeast and higher organisms. Integrated datasets are extremely beneficial in understanding the biology of a system in a compact manner, due to the conflation of multiple layers of information. Therefore, for Aim 2 of this grant, we developed several tools and carried out analyses for integrating system-wide genomic information. To make use of the structural data, we developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we organized the E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies, and correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed

  14. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    Energy Technology Data Exchange (ETDEWEB)

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating

  15. Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models

    Energy Technology Data Exchange (ETDEWEB)

    Diakov, Victor [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cole, Wesley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-11-01

    Capacity expansion models (CEMs) provide a high-level, long-term view of the prospects of the evolving power system. In simulating the possibilities of long-term capacity expansion, it is important to maintain the viability of power system operation on short-term (daily, hourly and sub-hourly) scales. Production cost models (PCMs) simulate routine power system operation on these shorter time scales using detailed load, transmission and generation fleet data, by minimizing production costs while following reliability requirements. When based on CEM 'predictions' about generating unit retirements and buildup, PCMs provide more detailed simulation of short-term system operation and, consequently, may confirm the validity of the capacity expansion predictions. Further, production cost model simulations of a system based on a capacity expansion model solution are 'evolutionarily' sound: the generator mix is the result of a logical sequence of unit retirement and buildup resulting from policy and incentives. The above has motivated us to bridge CEM with PCM by building a capacity-expansion-to-production-cost-model Linking Tool (CEPCoLT). The Linking Tool is built to map capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally defined ReEDS scenarios.

  16. Software tools overview : process integration, modelling and optimisation for energy saving and pollution reduction

    OpenAIRE

    Lam, Hon Loong; Klemeš, Jiri; Kravanja, Zdravko; Varbanov, Petar

    2012-01-01

    This paper provides an overview of software tools based on long experience and applications in the area of process integration, modelling and optimisation. The first part reviews the current design practice and the development of supporting software tools. Those are categorised as: (1) process integration and retrofit analysis tools, (2) general mathematical modelling suites with optimisation libraries, (3) flowsheeting simulation and (4) graph-based process optimisation tools. The second part...

  17. An Executable Architecture Tool for the Modeling and Simulation of Operational Process Models

    Science.gov (United States)

    2015-03-16

    This paper includes a literature review, background information on process models and architecture, and a brief description of other available tools. Future work involves coordination with Subject Matter Experts (SMEs) and extracting data from experiments to assign more appropriate values. [Fig. 10: Snapshot of Simulation Output Results for Example 3]

  18. Response Surface Modeling Tool Suite, Version 1.x

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-05

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
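
    The first two stages of the "Automated RSM" pipeline (Latin Hypercube sampling of the parameter space, then a Gaussian-process fit) can be sketched as follows; this is a hedged illustration in which the expensive TPMC step is replaced by a cheap analytic stand-in, and the parameter names, ranges and kernel choice are hypothetical:

    ```python
    import numpy as np
    from scipy.stats import qmc
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def toy_drag(x):  # stand-in for a TPMC drag-coefficient calculation
        speed_ratio, temp_ratio = x[:, 0], x[:, 1]
        return 2.2 + 0.8 / speed_ratio + 0.1 * temp_ratio

    # Latin Hypercube ensemble over a 2-D parameter space.
    sampler = qmc.LatinHypercube(d=2, seed=0)
    unit = sampler.random(n=200)
    X = qmc.scale(unit, l_bounds=[2.0, 0.1], u_bounds=[12.0, 1.0])
    y = toy_drag(X)

    # Gaussian-process response surface fitted to the ensemble.
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=[2.0, 0.3]),
                                  normalize_y=True).fit(X, y)
    Cd, sd = gp.predict(np.array([[7.0, 0.5]]), return_std=True)
    print(f"Cd = {Cd[0]:.3f} +/- {sd[0]:.3f}")
    ```

    The MCMC evaluation of the resulting non-analytic probability distribution, the third stage described above, would then operate on the fitted `gp`.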

  19. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    Progress report for the qualification exam of PhD student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM, and the concept of unifying an abstract syntax tree with the ability for isolated extensions, is described. The tool support includes a connection to UML and a test automation principle based on traces written as a kind of regular expressions.

  20. Improved Modeling Tools For High Speed Reacting Flows

    Science.gov (United States)

    2006-09-01

    The project focused on putting the tools in place and operating them as a single system on a Beowulf cluster purpose-built by Blue Blanket LLC (BBLLC), together with a commercial tool available from the Program Development Company (PDC). An eight-processor computational cluster was leased from BBLLC; once this cluster was in place, the off-the-shelf software was installed and tested.

  1. 33 CFR 385.33 - Revisions to models and analytical tools.

    Science.gov (United States)

    2010-07-01

    ... analytical tools. 385.33 Section 385.33 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... and other analytical tools for conducting analyses for the planning, design, construction,...

  2. Hypersonic Control Modeling and Simulation Tool for Lifting Towed Ballutes Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Aerospace Corporation proposes to develop a hypersonic control modeling and simulation tool for hypersonic aeroassist vehicles. Our control and simulation...

  3. Designing a new tool for modeling and simulation of discrete-event based systems

    OpenAIRE

    2009-01-01

    This paper describes the design, development, and application of a new Petri net simulator for modeling and simulation of discrete-event based systems (e.g. information systems). The new tool is called GPenSIM (General Purpose Petri Net Simulator). Firstly, the paper presents the reason for developing a new tool, through a brief literature study. Secondly, the design and architectural issues of the tool are given. Finally, an example of the tool's application is given.
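
    The core mechanics any Petri net simulator implements can be shown in a few lines (a minimal sketch of token-game semantics in Python, not GPenSIM itself, which is a MATLAB toolbox; the net below is hypothetical): places hold tokens, and a transition fires when every input place is sufficiently marked.

    ```python
    # A two-transition net: jobs queue for a single server.
    marking = {"queue": 2, "server_idle": 1, "in_service": 0, "done": 0}
    transitions = {
        "start_service": ({"queue": 1, "server_idle": 1}, {"in_service": 1}),
        "end_service":   ({"in_service": 1}, {"server_idle": 1, "done": 1}),
    }

    def enabled(t):
        pre, _ = transitions[t]
        return all(marking[p] >= n for p, n in pre.items())

    def fire(t):
        pre, post = transitions[t]
        for p, n in pre.items():
            marking[p] -= n          # consume input tokens
        for p, n in post.items():
            marking[p] += n          # produce output tokens

    while any(enabled(t) for t in transitions):
        fire(next(t for t in transitions if enabled(t)))

    print(marking)   # both queued jobs served: 'done' ends at 2
    ```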

  4. Examining an important urban transportation management tool: subarea modeling

    Directory of Open Access Journals (Sweden)

    Xueming CHEN

    2009-12-01

    Full Text Available At present, customized subarea models are widely used in local transportation planning throughout the United States. The biggest strengths of a subarea model lie in its more detailed and accurate modeling outputs, which better meet local planning requirements. In addition, a subarea model can substantially reduce database size and model running time. In spite of these advantages, subarea models remain quite weak in maintaining consistency with a regional model, in modeling transit projects, smart growth measures and air quality conformity, and in other areas. Both opportunities and threats exist for subarea modeling. In addition to examining subarea models, this paper introduces the decision-making process for choosing a proper subarea modeling approach (windowing versus focusing) and software package. This study concludes that subarea modeling will become more popular in the future. More GIS applications, travel surveys, transit modeling, microsimulation software utilization, and other modeling improvements are expected to be incorporated into the subarea modeling process.

  5. Tool-Body Assimilation Model Based on Body Babbling and Neurodynamical System

    Directory of Open Access Journals (Sweden)

    Kuniyuki Takahashi

    2015-01-01

    Full Text Available We propose a new method of tool use based on a tool-body assimilation model, body babbling, and a neurodynamical system that enables robots to use tools. Almost all existing studies of robot tool use require predetermined motions and tool features, so the motion patterns are limited and the robots cannot use novel tools. Other studies fully search all available parameters for novel tools, but this leads to massive amounts of calculation. To solve these problems, we took the following approach: we used a humanoid robot model to generate random motions based on human body babbling. These rich motion experiences were used to train recurrent and deep neural networks for modeling a body image. Tool features were self-organized in parametric bias, modulating the body image according to the tool in use. Finally, we designed a neural network for the robot to generate motion only from the target image. Experiments were conducted with multiple tools for manipulating a cylindrical target object. The results show that the tool-body assimilation model is capable of motion generation.

  6. Ergonomics applications of a mechanical model of the human operator in power hand tool operation.

    Science.gov (United States)

    Lin, Jia-Hua; Radwin, Robert; Nembhard, David

    2005-02-01

    Applications of a new model for predicting the response and capacity of power threaded-fastener-driving tool operators to react against impulsive torque reaction forces are explored for use in tool selection and ergonomic workplace design. The model is based on a mechanical analog of the human operator, with parameters dependent on work location (horizontal and vertical distances), work orientation (horizontal and vertical), and tool shape (in-line, pistol grip, and right angle), and is stratified by gender. The model enables prediction of group means and variances of handle displacement and force for a given tool configuration, so response percentiles can be ascertained for specific tool operations. For example, a sample pistol grip nutrunner used on a horizontal surface at 30 cm in front of the ankles and 140 cm above the floor results in a predicted mean handle reaction displacement of 39.0 (SD=28.1) mm for males; consequently, 63% of male users exceed a 30 mm handle displacement limit. When a right angle tool of similar torque output is used instead, the model predicts that only 4.6% of male tool users exceed a 30 mm handle displacement. A method is described for interpolating individual subject model parameters at any given work location using linear combinations over the range of modeled factors. Additional examples pertinent to ergonomic workstation design and tool selection demonstrate how the model can aid tool selection and workstation design.
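
    The percentile arithmetic in the example above can be reproduced directly if one assumes (a hedged reading of the abstract) that handle displacement is normally distributed across operators:

    ```python
    from scipy.stats import norm

    # Pistol-grip example from the abstract: mean 39.0 mm, SD 28.1 mm, 30 mm limit.
    mean, sd, limit = 39.0, 28.1, 30.0
    exceeding = 1.0 - norm.cdf(limit, loc=mean, scale=sd)
    print(f"{exceeding:.0%} of male users exceed {limit:.0f} mm")  # ~63%, as stated
    ```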

  7. Measurement Model for Division as a Tool in Computing Applications

    Science.gov (United States)

    Abramovich, Sergei; Strock, Tracy

    2002-01-01

    The paper describes the use of a spreadsheet in a mathematics teacher education course. It shows how the tool can serve as a link between seemingly disconnected mathematical concepts. The didactical triad of using a spreadsheet as an agent, consumer, and amplifier of mathematical activities allows for an extended investigation of simple yet…

  8. MOVES - A tool for Modeling and Verification of Embedded Systems

    DEFF Research Database (Denmark)

    Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid;

    2007-01-01

    We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks on processing elements, the number and/or speed of processing elements, the size of local memories, and the operating systems (scheduling algorithm).

  9. Advanced Computing Tools and Models for Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  10. An Energy Systems Modelling Tool for the Social Simulation Community

    NARCIS (Netherlands)

    Bollinger, L. Andrew; van Blijswijk, Martti J.; Dijkema, Gerard P.J.; Nikolic, Igor

    2016-01-01

    The growing importance of links between the social and technical dimensions of the electricity infrastructure means that many research problems cannot be effectively addressed without joint consideration of social and technical dynamics. This paper motivates the need for, and introduces, a tool to facilitate this.

  11. A practical tool for modeling biospecimen user fees.

    Science.gov (United States)

    Matzke, Lise; Dee, Simon; Bartlett, John; Damaraju, Sambasivarao; Graham, Kathryn; Johnston, Randal; Mes-Masson, Anne-Marie; Murphy, Leigh; Shepherd, Lois; Schacter, Brent; Watson, Peter H

    2014-08-01

    The question of how best to attribute the unit costs of the annotated biospecimen product that is provided to a research user is a common issue for many biobanks. Among the factors influencing user fees are capital and operating costs, internal and external demand and market competition, and moral standards that dictate that fees must have an ethical basis. It is therefore important to establish a transparent and accurate costing tool that can be utilized by biobanks and aid them in establishing biospecimen user fees. To address this issue, we built a biospecimen user fee calculator tool, accessible online at www.biobanking.org. The tool was built to allow input of: i) annual operating and capital costs; ii) costs categorized by the major core biobanking operations; iii) specimen products requested by a biobank user; and iv) services provided by the biobank beyond core operations (e.g., histology, tissue microarray); as well as v) several user-defined variables to allow the calculator to be adapted to different biobank operational designs. To establish default values for variables within the calculator, we first surveyed the members of the Canadian Tumour Repository Network (CTRNet) management committee. We then enrolled four different participants from CTRNet biobanks to test the hypothesis that the calculator tool could change approaches to user fees. Participants were first asked to estimate user fee pricing for three hypothetical user scenarios based on their biobanking experience (estimated pricing) and then to calculate fees for the same scenarios using the calculator tool (calculated pricing). Results demonstrated significant variation in estimated pricing, which was reduced by calculated pricing, and showed that higher user fees are consistently derived when using the calculator. We conclude that adoption of this online calculator for user fee determination is an important first step towards harmonization and realistic user fees.
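
    The cost-attribution logic underneath such a calculator reduces to a simple allocation (a hedged toy sketch; the real tool at www.biobanking.org has many more inputs, and every number below is hypothetical):

    ```python
    # Toy biospecimen user-fee calculation: annual costs spread over throughput,
    # with only a fraction recovered from academic users.
    annual_operating = 250_000.0   # staff, consumables, QA (hypothetical)
    annual_capital = 50_000.0      # equipment amortised per year (hypothetical)
    cases_per_year = 2_000         # specimens banked annually (hypothetical)
    cost_recovery = 0.4            # fraction of cost passed on to academic users

    unit_cost = (annual_operating + annual_capital) / cases_per_year
    user_fee = cost_recovery * unit_cost
    print(f"unit cost {unit_cost:.2f}, academic user fee {user_fee:.2f} per specimen")
    ```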

  12. Evaluating the Usability of a Professional Modeling Tool Repurposed for Middle School Learning

    Science.gov (United States)

    Peters, Vanessa L.; Songer, Nancy Butler

    2013-01-01

    This paper reports the results of a three-stage usability test of a modeling tool designed to support learners' deep understanding of the impacts of climate change on ecosystems. The design process involved repurposing an existing modeling technology used by professional scientists into a learning tool specifically designed for middle school…

  13. A Decision Support Model and Tool to Assist Financial Decision-Making in Universities

    Science.gov (United States)

    Bhayat, Imtiaz; Manuguerra, Maurizio; Baldock, Clive

    2015-01-01

    In this paper, a model and tool are proposed to assist universities and other mission-based organisations in systematically ascertaining the optimal portfolio of projects, in any year, that meets the organisation's risk tolerances and available funds. The model and tool presented build on previous work on university operations and decision support systems…

  14. Teachers' Use of Computational Tools to Construct and Explore Dynamic Mathematical Models

    Science.gov (United States)

    Santos-Trigo, Manuel; Reyes-Rodriguez, Aaron

    2011-01-01

    To what extent does the use of computational tools offer teachers the possibility of constructing dynamic models to identify and explore diverse mathematical relations? What ways of reasoning or thinking about the problems emerge during the model construction process that involves the use of the tools? These research questions guided the…

  15. A Model for Developing Meta-Cognitive Tools in Teacher Apprenticeships

    Science.gov (United States)

    Bray, Paige; Schatz, Steven

    2013-01-01

    This research investigates a model for developing meta-cognitive tools to be used by pre-service teachers during apprenticeship (student teaching) experience to operationalise the epistemological model of Cook and Brown (2009). Meta-cognitive tools have proven to be effective for increasing performance and retention of undergraduate students.…

  16. Tools and data for the geochemical modeling. Thermodynamic data for sulfur species and background salts and tools for the uncertainty analysis; WEDA. Werkzeuge und Daten fuer die Geochemische Modellierung. Thermodynamische Daten fuer Schwefelspezies und Hintergrundsalze sowie Tools zur Unsicherheitsanalyse

    Energy Technology Data Exchange (ETDEWEB)

    Hagemann, Sven; Schoenwiese, Dagmar; Scharge, Tina

    2015-07-15

    The report on tools and data for the geochemical modeling covers the following issues: experimental methods and theoretical models, design of a thermodynamic model for reduced sulfur species, thermodynamic models for background salts, tools for the uncertainty and sensitivity analyses of geochemical equilibrium modeling.

  17. Modeling Heterogeneity in Networks using Uncertainty Quantification Tools

    CERN Document Server

    Rajendran, Karthikeyan; Siettos, Constantinos I; Laing, Carlo R; Kevrekidis, Ioannis G

    2015-01-01

    Using the dynamics of information propagation on a network as our illustrative example, we present and discuss a systematic approach to quantifying heterogeneity and its propagation that borrows established tools from Uncertainty Quantification. The crucial assumption underlying this mathematical and computational "technology transfer" is that the evolving states of the nodes in a network quickly become correlated with the corresponding node "identities": features of the nodes imparted by the network structure (e.g. the node degree, the node clustering coefficient). The node dynamics thus depend on heterogeneous (rather than uncertain) parameters, whose distribution over the network results from the network structure. Knowing these distributions allows us to obtain an efficient coarse-grained representation of the network state in terms of the expansion coefficients in suitable orthogonal polynomials. This representation is closely related to mathematical/computational tools for uncertainty quantification (th...

  18. MQ-2 A Tool for Prolog-based Model Querying

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald

    2012-01-01

    MQ-2 integrates a Prolog console into the MagicDraw1 modeling environment and equips this console with features targeted specifically to the task of querying models. The vision of MQ-2 is to make Prolog-based model querying accessible to both student and expert modelers by offering powerful query...

  19. A unified tool for performance modelling and prediction

    Energy Technology Data Exchange (ETDEWEB)

    Gilmore, Stephen [Laboratory for Foundations of Computer Science, University of Edinburgh, King's Buildings, Mayfield Road, Edinburgh, Scotland EH9 3JZ (United Kingdom)]. E-mail: stg@inf.ed.ac.uk; Kloul, Leila [Laboratory for Foundations of Computer Science, University of Edinburgh, King's Buildings, Mayfield Road, Edinburgh, Scotland EH9 3JZ (United Kingdom)

    2005-07-01

    We describe a novel performability modelling approach, which facilitates the efficient solution of performance models extracted from high-level descriptions of systems. The notation which we use for our high-level designs is the Unified Modelling Language (UML) graphical modelling language. The technology which provides the efficient representation capability for the underlying performance model is the multi-terminal binary decision diagram (MTBDD)-based PRISM probabilistic model checker. The UML models are compiled through an intermediate language, the stochastic process algebra PEPA, before translation into MTBDDs for solution. We illustrate our approach on a real-world analysis problem from the domain of mobile telephony.

  1. Solid-state-drives (SSDs) modeling simulation tools & strategies

    CERN Document Server

    2017-01-01

    This book introduces simulation tools and strategies for complex systems of solid-state drives (SSDs), which consist of a flash multi-core microcontroller plus NAND flash memories. It provides a broad overview of the most popular simulation tools, with special focus on open source solutions. VSSIM, NANDFlashSim and DiskSim are benchmarked against the performance of real SSDs under different traffic workloads. The pros and cons of each simulator are analyzed, and it is clearly indicated which kinds of answers each of them can give and at what price. It is explained that speed and precision do not go hand in hand, and that it is important to understand when to simulate what, and with which tool. Being able to simulate SSD performance is mandatory to meet time-to-market, together with product cost and quality. Over the last few years the authors developed an advanced simulator named “SSDExplorer” which has been used to evaluate multiple phenomena with great accuracy, from QoS (Quality Of Service) to Read Retry, fr...

  2. Implementing the Mother-Baby Model of Nursing Care Using Models and Quality Improvement Tools.

    Science.gov (United States)

    Brockman, Vicki

    As family-centered care has become the expected standard, many facilities follow the mother-baby model, in which care is provided to both a woman and her newborn in the same room by the same nurse. My facility employed a traditional model of nursing care, which was not evidence-based or financially sustainable. After implementing the mother-baby model, we experienced an increase in exclusive breastfeeding rates at hospital discharge, increased patient satisfaction, improved staff productivity and decreased salary costs, all while the number of births increased. Our change was successful because it was guided by the use of quality improvement tools, change theory and evidence-based practice models. © 2015 AWHONN.

  3. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    Science.gov (United States)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2016-06-01

    Machining of steel with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning by a cubic boron nitride tool have been reviewed, in an attempt to utilize appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces from oblique cutting theory, the extended Lee and Shaffer force model, chip formation and progressive flank wear are depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that an appropriate model can be selected according to user requirements in hard turning.
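
    For reference, Usui's wear-rate equation mentioned above is commonly written in the hard-turning literature as (symbols as usually defined; constants are fitted experimentally):

    \[
    \frac{dW}{dt} = A\,\sigma_t\,v_s\,\exp\!\left(-\frac{B}{\theta}\right)
    \]

    where W is the wear volume per unit contact area, σ_t the normal stress on the tool face, v_s the sliding velocity, θ the absolute contact temperature, and A and B experimentally determined constants.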

  4. DiVinE-CUDA - A Tool for GPU Accelerated LTL Model Checking

    Directory of Open Access Journals (Sweden)

    Jiří Barnat

    2009-12-01

    Full Text Available In this paper we present a tool that performs CUDA-accelerated LTL model checking. The tool exploits the parallel algorithm MAP, adjusted to the NVIDIA CUDA architecture, in order to efficiently detect the presence of accepting cycles in a directed graph. Accepting cycle detection is the core algorithmic procedure in automata-based LTL model checking. We demonstrate that the tool outperforms the non-accelerated version of the algorithm, and we discuss where the limits of the tool lie and what we intend to do in the future to overcome them.

  5. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco;

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling a simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behavio...

  6. The Quantum Atomic Model "Electronium": A Successful Teaching Tool.

    Science.gov (United States)

    Budde, Marion; Niedderer, Hans; Scott, Philip; Leach, John

    2002-01-01

    Focuses on the quantum atomic model Electronium. Outlines the Bremen teaching approach in which this model is used, and analyzes the learning of two students as they progress through the teaching unit. (Author/MM)

  7. Tools for modeling radioactive contaminants in chip materials

    Science.gov (United States)

    Wrobel, F.; Kaouache, A.; Saigné, F.; Touboul, A. D.; Schrimpf, R. D.; Warot, G.; Bruguier, O.

    2017-03-01

    Radioactive pollutants are naturally present in microelectronic device materials and can be an issue for the reliability of devices. The main concern is alpha emitters that produce high-energy particles (a few MeV) that ionize the semiconductor and then trigger soft errors. The key questions are which radionuclides are present in the device, where they are located in the device, and how abundant each species is. In this paper we describe tools that are required to address the issue of radioactive pollutants in electronic devices.

  8. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and provides a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
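
    The BSM tool itself is an ArcMAP add-in; as a rough standalone illustration of the frequency ratio technique it automates, the sketch below computes FR values for the classes of a single conditioning factor. The arrays are synthetic and all names are assumptions.

    # Illustrative frequency ratio (FR) computation for one conditioning factor.
    # FR = (% of hazard pixels in a class) / (% of all pixels in that class).
    import numpy as np

    factor = np.random.randint(0, 4, size=(200, 200))   # factor class map (0..3)
    hazard = np.random.rand(200, 200) < 0.05            # hazard inventory mask

    for cls in np.unique(factor):
        in_cls = factor == cls
        pct_hazard = hazard[in_cls].sum() / hazard.sum()
        pct_area = in_cls.sum() / factor.size
        print(f"class {cls}: FR = {pct_hazard / pct_area:.2f}")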

  9. Simulation Modeling and Statistical Network Tools for Improving Collaboration in Military Logistics

    Science.gov (United States)

    2008-10-01

    AFRL-RH-WP-TR-2009-0110; contract number FA8650-07-1-6848. This final technical report describes the research findings of the project Simulation Modeling and Statistical Network Tools for Improving Collaboration in Military Logistics.

  10. WeedML: a Tool for Collaborative Weed Demographic Modeling

    OpenAIRE

    Holst, Niels

    2010-01-01

    WeedML is a proposed standard to formulate models of weed demography, or maybe even complex models in general, that are both transparent and straightforward to re-use as building blocks for new models. The paper describes the design and thoughts behind WeedML which relies on XML and object-oriented systems development. Proof-of-concept software is provided as open-source C++ code and executables that can be downloaded freely.

  11. Modeling of Tool Wear in Vibration Assisted Nano Impact-Machining by Loose Abrasives

    Directory of Open Access Journals (Sweden)

    Sagil James

    2014-01-01

    Full Text Available Vibration assisted nano impact-machining by loose abrasives (VANILA) is a novel nanomachining process that combines the principles of vibration assisted abrasive machining and tip-based nanomachining to perform target-specific nanoabrasive machining of hard and brittle materials. An atomic force microscope (AFM) is used as a platform in this process, wherein nanoabrasives, injected in slurry between the workpiece and the vibrating AFM probe, which serves as the tool, impact the workpiece and cause nanoscale material removal. The VANILA process is conducted such that the tool tip does not directly contact the workpiece. The level of precision and quality of the machined features in a nanomachining process is contingent on the tool wear, which is inevitable. Initial experimental studies have demonstrated reduced tool wear in the VANILA process as compared to the indentation process, in which the tool directly contacts the workpiece surface. In this study, the tool wear rate during the VANILA process is analytically modeled considering impacts of abrasive grains on the tool tip surface. Experiments are conducted using several tools in order to validate the predictions of the theoretical model. It is seen that the model is capable of accurately predicting the tool wear rate within 10% deviation.
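
    The paper's analytical model is not reproduced in this record; as a rough illustration of impact-based wear modeling, the sketch below assumes a generic erosive-wear relation in which the volume removed per unit time scales with impact frequency and impact kinetic energy over tool hardness. Every parameter value is an assumption.

    # Back-of-the-envelope erosive tool-wear sketch (generic assumption: volume
    # removed per second ~ wear coefficient * impact rate * kinetic energy / hardness;
    # the paper's actual model is more detailed).
    m_grain = 1.2e-15        # abrasive grain mass, kg (assumed)
    v_impact = 0.5           # impact velocity, m/s (assumed)
    f_impact = 4.0e4         # impacts on the tip per second (assumed)
    H_tool = 40e9            # tool tip hardness, Pa (diamond-like, assumed)
    k_wear = 1e-2            # dimensionless wear coefficient (assumed)

    e_kin = 0.5 * m_grain * v_impact**2             # energy per impact, J
    wear_rate = k_wear * f_impact * e_kin / H_tool  # removed volume per second, m^3/s
    print(f"estimated tool wear rate: {wear_rate:.2e} m^3/s")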

  12. Towards diagnostic tools for analysing Swarm data through model retrievals

    DEFF Research Database (Denmark)

    Kotsiaros, Stavros; Plank, Gernot; Haagmans, R.

    The objective of the Swarm mission is to provide the best ever survey of the geomagnetic field and its temporal dependency, and to gain new insights into improving our knowledge of the Earth’s interior and climate. The Swarm concept consists of a constellation of three satellites in three different...... polar orbits between 300 and 550 km altitude. The goal of the current study is to build tools and to analyze datasets, in order to allow a fast diagnosis of the Swarm system performance in orbit during the commissioning phase and operations of the spacecraft. The effects on the reconstruction of the magnetic...... field resulting from various error sources are investigated. By using a specially developed software package, closed loop simulations are performed aiming at different scenarios. We start from the simple noise-free case and move on to more complex and realistic situations which include attitude errors...

  13. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow...

  14. KINEROS2 – AGWA Suite of Modeling Tools

    Science.gov (United States)

    KINEROS2 (K2) originated in the 1960s as a distributed event-based rainfall-runoff erosion model abstracting the watershed as a cascade of overland flow elements contributing to channel model elements. Development and improvement of K2 have continued for a variety of projects and ...

  15. Using a Parametric Solid Modeler as an Instructional Tool

    Science.gov (United States)

    Devine, Kevin L.

    2008-01-01

    This paper presents the results of a quasi-experimental study that brought 3D constraint-based parametric solid modeling technology into the high school mathematics classroom. This study used two intact groups; a control group and an experimental group, to measure the extent to which using a parametric solid modeler during instruction affects…

  16. Computerized models : tools for assessing the future of complex systems?

    NARCIS (Netherlands)

    Ittersum, van M.K.; Sterk, B.

    2015-01-01

    Models are commonly used to make decisions. At some point all of us will have employed a mental model, that is, a simplification of reality, in an everyday situation. For instance, when we want to make the best decision for the environment and consider whether to buy our vegetables in a large

  17. Prediction models : the right tool for the right problem

    NARCIS (Netherlands)

    Kappen, Teus H.; Peelen, Linda M.

    2016-01-01

    PURPOSE OF REVIEW: Perioperative prediction models can help to improve personalized patient care by providing individual risk predictions to both patients and providers. However, the scientific literature on prediction model development and validation can be quite technical and challenging to unders

  18. Homology Modeling a Fast Tool for Drug Discovery: Current Perspectives

    Science.gov (United States)

    Vyas, V. K.; Ukawala, R. D.; Ghate, M.; Chintha, C.

    2012-01-01

    A major goal of structural biology is the characterization of protein-ligand complexes, in which the protein molecules act energetically in the course of binding. Therefore, understanding protein-ligand interactions is very important for structure-based drug design. Lack of knowledge of 3D structures has hindered efforts to understand the binding specificities of ligands with proteins. With improvements in modeling software and the growing number of known protein structures, homology modeling is rapidly becoming the method of choice for obtaining 3D coordinates of proteins. Homology modeling is a representation of the similarity of environmental residues at topologically corresponding positions in the reference proteins. In the absence of experimental data, model building on the basis of a known 3D structure of a homologous protein is at present the only reliable method to obtain structural information. Knowledge of the 3D structures of proteins provides invaluable insights into the molecular basis of their functions. The recent advances in homology modeling, particularly in detecting and aligning sequences with template structures, distant homologues, modeling of loops and side chains, as well as detecting errors in a model, have contributed to consistent prediction of protein structure, which was not possible even several years ago. This review focuses on the features and role of homology modeling in predicting protein structure and describes current developments in this field, with successful applications at the different stages of drug design and discovery. PMID:23204616

  19. MuscleBuilder:A Modeling Tool for Human Anatomy

    Institute of Scientific and Technical Information of China (English)

    Amaury Aubel; Daniel Thalmann

    2004-01-01

    A traditional multi-layered approach is adopted to human body modeling and deformation. The model is split into three general anatomical structures: the skeleton, musculature and skin. Each of these layers is modeled and deformed by using fast, procedural, ad-hoc methods that can painlessly be reimplemented. The modeling approach is generic enough to handle muscles of varying shape, size and characteristics and does not break in extreme skeleton poses. The integrated MuscleBuilder system is also described; its main features are: i) easy and quick creation of muscle deformation models; ii) automatic deformation of an overlying skin. It is shown that visually realistic results can be obtained at interactive frame rates with very little input from the designer.

  20. Monte Carlo tools for Beyond the Standard Model Physics, April 14-16

    DEFF Research Database (Denmark)

    Badger, Simon; Christensen, Christian Holm; Dalsgaard, Hans Hjersing;

    2011-01-01

    This workshop aims to gather together theorists and experimentalists interested in developing and using Monte Carlo tools for Beyond the Standard Model Physics in an attempt to be prepared for the analysis of data focusing on the Large Hadron Collider. Since a large number of excellent tools....... To identify promising models (or processes) for which the tools have not yet been constructed and start filling up these gaps. To propose ways to streamline the process of going from models to events, i.e. to make the process more user-friendly so that more people can get involved and perform serious collider...

  1. The Use of the Articulated Total Body Model as a Robot Dynamics Simulation Tool

    Science.gov (United States)

    1988-07-01

    AARL-SR-90-512 (AD-A235 930). The Use of the Articulated Total Body Model as a Robot Dynamics Simulation Tool, Louise A... The ATB model is based on the Lagrange method. In this paper the use of the ATB model as a robot dynamics simulation tool is discussed and various simulations are demonstrated.

  2. Application of Krylov Reduction Technique for a Machine Tool Multibody Modelling

    Directory of Open Access Journals (Sweden)

    M. Sulitka

    2014-02-01

    Full Text Available Quick calculation of machine tool dynamic response represents one of the major requirements for machine tool virtual modelling and virtual machining, aiming at simulating the machining process performance, quality, and precision of a workpiece. Enhanced time effectiveness in machine tool dynamic simulations may be achieved by employing model order reduction (MOR) techniques on the full finite element (FE) models. The paper provides a case study comparing the Krylov subspace and mode truncation reduction techniques. The application of both reduction techniques for creating a machine tool multibody model is evaluated. The Krylov subspace reduction technique shows high quality in terms of both the dynamic properties of the reduced multibody model and very low computation time.
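
    As a minimal illustration of the Krylov subspace idea, the sketch below projects a randomly generated first-order state-space model onto an orthonormal Krylov basis built by Arnoldi iteration. Real machine-tool FE models are far larger and second-order, and all names here are assumptions.

    # Minimal moment-matching sketch: project a state-space model onto a Krylov
    # subspace built by Arnoldi iteration (illustrative only).
    import numpy as np

    def arnoldi_basis(A, b, m):
        """Orthonormal basis V of the Krylov subspace span{b, Ab, ..., A^(m-1) b}."""
        n = b.size
        V = np.zeros((n, m))
        V[:, 0] = b / np.linalg.norm(b)
        for j in range(1, m):
            w = A @ V[:, j - 1]
            w -= V[:, :j] @ (V[:, :j].T @ w)   # Gram-Schmidt orthogonalization
            V[:, j] = w / np.linalg.norm(w)
        return V

    n, m = 500, 10
    rng = np.random.default_rng(0)
    A = -np.eye(n) + 0.1 * rng.standard_normal((n, n)) / np.sqrt(n)  # stable-ish
    B = rng.standard_normal(n)

    V = arnoldi_basis(np.linalg.inv(A), B, m)  # match moments about s = 0
    Ar, Br = V.T @ A @ V, V.T @ B              # reduced system matrices (m x m)
    print(Ar.shape)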

  3. CPS Modeling of CNC Machine Tool Work Processes Using an Instruction-Domain Based Approach

    Directory of Open Access Journals (Sweden)

    Jihong Chen

    2015-06-01

    Full Text Available Building cyber-physical system (CPS) models of machine tools is a key technology for intelligent manufacturing. The massive electronic data from a computer numerical control (CNC) system during the work processes of a CNC machine tool is the main source of the big data on which a CPS model is established. In this work-process model, a method based on instruction domain is applied to analyze the electronic big data, and a quantitative description of the numerical control (NC) processes is built according to the G code of the processes. Utilizing the instruction domain, a work-process CPS model is established on the basis of the accurate, real-time mapping of the manufacturing tasks, resources, and status of the CNC machine tool. Using such models, case studies are conducted on intelligent-machining applications, such as the optimization of NC processing parameters and the health assurance of CNC machine tools.
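
    The essence of the instruction-domain idea is to index monitoring data by the G-code instruction being executed rather than by wall-clock time. The toy sketch below groups logged samples by the active NC line; the log format and field names are assumptions, not the paper's data model.

    # Sketch of the instruction-domain idea: group monitoring samples by the
    # G-code line being executed instead of by time (field names are assumed;
    # real CNC logs carry the active NC line number alongside sensor values).
    from collections import defaultdict

    samples = [  # (active G-code line, spindle load %) -- synthetic log
        ("N10 G01 X50 F200", 31.0), ("N10 G01 X50 F200", 33.5),
        ("N20 G01 Y20 F150", 48.2), ("N20 G01 Y20 F150", 47.1),
    ]

    domains = defaultdict(list)
    for gcode_line, load in samples:
        domains[gcode_line].append(load)

    for line, loads in domains.items():
        print(f"{line}: mean load {sum(loads)/len(loads):.1f}% over {len(loads)} samples")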

  4. Ecotoxicological mechanisms and models in an impact analysis tool for oil spills

    NARCIS (Netherlands)

    Laender, de F.; Olsen, G.H.; Frost, T.; Grosvik, B.E.; Klok, T.C.

    2011-01-01

    In an international collaborative effort, an impact analysis tool is being developed to predict the effect of accidental oil spills on recruitment and production of Atlantic cod (Gadus morhua) in the Barents Sea. The tool consisted of three coupled ecological models that describe (1) plankton biomas

  5. Ecotoxicological mechanisms and models in an impact analysis tool for oil spills

    NARCIS (Netherlands)

    Laender, de F.; Olsen, G.H.; Frost, T.; Grosvik, B.E.; Klok, T.C.

    2011-01-01

    In an international collaborative effort, an impact analysis tool is being developed to predict the effect of accidental oil spills on recruitment and production of Atlantic cod (Gadus morhua) in the Barents Sea. The tool consisted of three coupled ecological models that describe (1) plankton

  6. Static Stiffness Modeling of a Novel PKM-Machine Tool Structure

    Directory of Open Access Journals (Sweden)

    O. K. Akmaev

    2014-07-01

    Full Text Available This article presents a new configuration of a 3-dof machine tool with parallel kinematics. Elastic deformations of the machine tool have been modeled with finite elements, and stiffness coefficients at characteristic points of the working area have been calculated for different cutting forces.

  7. An Integrated Simulation Tool for Modeling the Human Circulatory System

    Science.gov (United States)

    Asami, Ken'ichi; Kitamura, Tadashi

    This paper presents an integrated simulation of the circulatory system in physiological movement. The large circulatory system model includes principal organs and functional units in modules in which comprehensive physiological changes such as nerve reflexes, temperature regulation, acid/base balance, O2/CO2 balance, and exercise are simulated. A beat-by-beat heart model, in which the corresponding electrical circuit problems are solved by a numerical analytic method, enables calculation of pulsatile blood flow to the major organs. The integration of different perspectives on physiological changes makes this simulation model applicable for the microscopic evaluation of blood flow under various conditions in the human body.
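
    The paper's beat-by-beat circuit model is not given in this record; as a textbook illustration of the electrical-circuit analogy it builds on, the sketch below integrates a classic two-element Windkessel model of pulsatile arterial flow. The inflow waveform and all parameter values are assumptions.

    # Illustrative two-element Windkessel (electrical-circuit analog of the
    # arterial system): C dP/dt = Q_in(t) - P/R. Not the paper's full model.
    import numpy as np
    from scipy.integrate import solve_ivp

    R, C, T = 1.0, 1.5, 0.8        # peripheral resistance, compliance, beat period

    def q_in(t):
        # Half-sine ejection during systole (first 35% of each beat), assumed shape.
        phase = (t % T) / T
        return 400 * np.sin(np.pi * phase / 0.35) if phase < 0.35 else 0.0

    sol = solve_ivp(lambda t, p: (q_in(t) - p / R) / C,
                    (0, 10 * T), [80.0], max_step=1e-3)
    tail = sol.y[0][sol.t > 8 * T]             # pressure over the last two beats
    print(f"pressure range: {tail.min():.0f}-{tail.max():.0f} (arbitrary units)")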

  8. Econometric Model – A Tool in Financial Management

    Directory of Open Access Journals (Sweden)

    Riana Iren RADU

    2011-06-01

    Full Text Available The economic situation in Romania requires traders to perform a rigorous analysis of the vulnerabilities and opportunities offered by the external environment, and a careful analysis of the internal environmental conditions in which the entity operates. In this context, particular attention is paid to the indicators presented in the financial statements. They often serve as the basis for economic forecasts and future plans in the businesses that use them for forecasting. In this paper we propose to analyze the comparative evolution of the main financial indicators highlighted in the financial statements (profit and loss through a multi-equation econometric model, namely a dynamic Keynesian model.

  9. An Introduction to Model Selection: Tools and Algorithms

    Directory of Open Access Journals (Sweden)

    Sébastien Hélie

    2006-03-01

    Full Text Available Model selection is a complicated matter in science, and psychology is no exception. In particular, the high variance in the object of study (i.e., humans) prevents the use of Popper’s falsification principle (which is the norm in other sciences). Therefore, the desirability of quantitative psychological models must be assessed by measuring the capacity of the model to fit empirical data. In the present paper, an error measure (likelihood), as well as five methods to compare model fits (the likelihood ratio test, Akaike’s information criterion, the Bayesian information criterion, bootstrapping and cross-validation), are presented. The use of each method is illustrated by an example, and the advantages and weaknesses of each method are also discussed.
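
    Three of the five methods named above can be shown in a few lines. The sketch below compares two nested Gaussian regression models by the likelihood ratio test, AIC and BIC; the data and model family are synthetic stand-ins, not the paper's examples.

    # Sketch: comparing two nested Gaussian regression models with the likelihood
    # ratio test, AIC and BIC.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    x = np.linspace(0, 1, 60)
    y = 1.0 + 2.0 * x + rng.normal(0, 0.3, x.size)    # data truly linear

    def fit_ll(deg):
        # Maximum log-likelihood of a polynomial model with Gaussian errors.
        resid = y - np.polyval(np.polyfit(x, y, deg), x)
        sigma2 = resid.var()                           # ML estimate of variance
        k = deg + 2                                    # coefficients + variance
        ll = -0.5 * y.size * (np.log(2 * np.pi * sigma2) + 1)
        return ll, k

    for deg in (1, 3):
        ll, k = fit_ll(deg)
        print(f"degree {deg}: AIC={2*k - 2*ll:.1f}  BIC={k*np.log(y.size) - 2*ll:.1f}")

    (ll1, k1), (ll3, k3) = fit_ll(1), fit_ll(3)
    lr = 2 * (ll3 - ll1)                               # likelihood ratio statistic
    print(f"LRT p-value: {stats.chi2.sf(lr, df=k3 - k1):.3f}")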

  10. Modeling and Calculator Tools for State and Local Transportation Resources

    Science.gov (United States)

    Air quality models, calculators, guidance and strategies are offered for estimating and projecting vehicle air pollution, including ozone or smog-forming pollutants, particulate matter and other emissions that pose public health and air quality concerns.

  11. Tools and Algorithms to Link Horizontal Hydrologic and Vertical Hydrodynamic Models and Provide a Stochastic Modeling Framework

    Science.gov (United States)

    Salah, Ahmad M.; Nelson, E. James; Williams, Gustavious P.

    2010-04-01

    We present algorithms and tools we developed to automatically link an overland flow model to a hydrodynamic water quality model with different spatial and temporal discretizations. These tools run the linked models, which provide a stochastic simulation framework. We also briefly present the tools and algorithms we developed to facilitate and analyze stochastic simulations of the linked models. We demonstrate the algorithms by linking the Gridded Surface Subsurface Hydrologic Analysis (GSSHA) model for overland flow with the CE-QUAL-W2 model for water quality and reservoir hydrodynamics. GSSHA uses a two-dimensional horizontal grid while CE-QUAL-W2 uses a two-dimensional vertical grid. We implemented the algorithms and tools in the Watershed Modeling System (WMS), which allows modelers to easily create and use models. The algorithms are general and could be used for other models. Our tools create and analyze stochastic simulations to help understand uncertainty in the model application. While a number of examples of linked models exist, the ability to perform automatic, unassisted linking is a step forward and provides the framework to easily implement stochastic modeling studies.

  12. Tools and Algorithms to Link Horizontal Hydrologic and Vertical Hydrodynamic Models and Provide a Stochastic Modeling Framework

    Directory of Open Access Journals (Sweden)

    Ahmad M Salah

    2010-12-01

    Full Text Available We present algorithms and tools we developed to automatically link an overland flow model to a hydrodynamic water quality model with different spatial and temporal discretizations. These tools run the linked models, which provide a stochastic simulation framework. We also briefly present the tools and algorithms we developed to facilitate and analyze stochastic simulations of the linked models. We demonstrate the algorithms by linking the Gridded Surface Subsurface Hydrologic Analysis (GSSHA) model for overland flow with the CE-QUAL-W2 model for water quality and reservoir hydrodynamics. GSSHA uses a two-dimensional horizontal grid while CE-QUAL-W2 uses a two-dimensional vertical grid. We implemented the algorithms and tools in the Watershed Modeling System (WMS), which allows modelers to easily create and use models. The algorithms are general and could be used for other models. Our tools create and analyze stochastic simulations to help understand uncertainty in the model application. While a number of examples of linked models exist, the ability to perform automatic, unassisted linking is a step forward and provides the framework to easily implement stochastic modeling studies.
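
    The central difficulty named above is the mismatch in discretizations. As a minimal illustration of the temporal half of the problem, the sketch below resamples a coarse overland-flow output series onto a finer hydrodynamic time grid; the step sizes and series are assumptions, and spatial aggregation would proceed analogously.

    # Sketch of the temporal part of model linking: resampling overland-flow
    # output (GSSHA-style, 15-min steps) onto a finer hydrodynamic time grid
    # (CE-QUAL-W2-style, 1-min steps).
    import numpy as np

    t_flow = np.arange(0, 24 * 3600, 900)               # 15-min overland-flow steps
    q_flow = np.clip(np.sin(t_flow / 8000.0), 0, None)  # synthetic inflow, m^3/s

    t_hydro = np.arange(0, 24 * 3600, 60)               # 1-min hydrodynamic steps
    q_hydro = np.interp(t_hydro, t_flow, q_flow)        # linked boundary inflow
    print(q_hydro.shape)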

  13. Model Verification and Validation Using Graphical Information Systems Tools

    Science.gov (United States)

    2013-07-31

    This work concerns the accuracy of model forecasts of currents in coastal areas. The MVV (model verification and validation) module is implemented as part of the Geospatial Analysis and Model Evaluation Software.

  14. Bayesian Network Webserver: a comprehensive tool for biological network modeling.

    Science.gov (United States)

    Ziebarth, Jesse D; Bhattacharya, Anindya; Cui, Yan

    2013-11-01

    The Bayesian Network Webserver (BNW) is a platform for comprehensive network modeling of systems genetics and other biological datasets. It allows users to quickly and seamlessly upload a dataset, learn the structure of the network model that best explains the data and use the model to understand relationships between network variables. Many datasets, including those used to create genetic network models, contain both discrete (e.g. genotype) and continuous (e.g. gene expression traits) variables, and BNW allows for modeling hybrid datasets. Users of BNW can incorporate prior knowledge during structure learning through an easy-to-use structural constraint interface. After structure learning, users are immediately presented with an interactive network model, which can be used to make testable hypotheses about network relationships. BNW, including a downloadable structure learning package, is available at http://compbio.uthsc.edu/BNW. (The BNW interface for adding structural constraints uses HTML5 features that are not supported by the current version of Internet Explorer. We suggest using other browsers (e.g. Google Chrome or Mozilla Firefox) when accessing BNW.) ycui2@uthsc.edu. Supplementary data are available at Bioinformatics online.

  15. SARAH 4: A tool for (not only SUSY) model builders

    CERN Document Server

    Staub, Florian

    2013-01-01

    We present the new version of the Mathematica package SARAH which provides the same features for a non-supersymmetric model as previous versions for supersymmetric models. This includes an easy and straightforward definition of the model, the calculation of all vertices, mass matrices, tadpole equations, and self-energies. Also the two-loop renormalization group equations for a general gauge theory are now included and have been validated with the independent Python code PyR@te. Model files for FeynArts, CalcHep/CompHep, WHIZARD and in the UFO format can be written, and source code for SPheno for the calculation of the mass spectrum, a set of precision observables, and the decay widths and branching ratios of all states can be generated. Furthermore, the new version includes routines to output model files for Vevacious for both supersymmetric and non-supersymmetric models. Global symmetries are also supported with this version and by linking Susyno the handling of Lie groups has been improved and extended.

  16. SARAH 4: A tool for (not only SUSY) model builders

    Science.gov (United States)

    Staub, Florian

    2014-06-01

    We present the new version of the Mathematica package SARAH which provides the same features for a non-supersymmetric model as previous versions for supersymmetric models. This includes an easy and straightforward definition of the model, the calculation of all vertices, mass matrices, tadpole equations, and self-energies. Also the two-loop renormalization group equations for a general gauge theory are now included and have been validated with the independent Python code PyR@TE. Model files for FeynArts, CalcHep/CompHep, WHIZARD and in the UFO format can be written, and source code for SPheno for the calculation of the mass spectrum, a set of precision observables, and the decay widths and branching ratios of all states can be generated. Furthermore, the new version includes routines to output model files for Vevacious for both supersymmetric and non-supersymmetric models. Global symmetries are also supported with this version and by linking Susyno the handling of Lie groups has been improved and extended.

  17. Computational Modeling as a Design Tool in Microelectronics Manufacturing

    Science.gov (United States)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    Plans to introduce pilot lines or fabs for 300 mm processing are in progress. IC technology is simultaneously moving towards 0.25/0.18 micron. The convergence of these two trends places unprecedented, stringent demands on processes and equipment. More than ever, computational modeling is called upon to play a complementary role in equipment and process design. The pace in hardware/process development needs a matching pace in software development: an aggressive move towards developing "virtual reactors" is desirable and essential to reduce design cycles and costs. This goal has three elements: a reactor scale model, a feature level model, and a database of physical/chemical properties. With these elements coupled, the complete model should function as a design aid in a CAD environment. This talk aims to describe the various elements. At the reactor level, continuum, DSMC (or particle) and hybrid models will be discussed and compared using examples of plasma and thermal process simulations. In microtopography evolution, approaches such as level set methods compete with conventional geometric models. Regardless of the approach, the reliance on empiricism is to be eliminated through coupling to the reactor model and computational surface science. This coupling poses challenging issues of orders of magnitude variation in length and time scales. Finally, database development has fallen behind; the current situation is rapidly aggravated by the ever newer chemistries emerging to meet process metrics. The virtual reactor would be a useless concept without an accompanying reliable database that consists of: thermal reaction pathways and rate constants, electron-molecule cross sections, thermochemical properties, transport properties, and finally, surface data on the interaction of radicals, atoms and ions with various surfaces. Large scale computational chemistry efforts are critical as experiments alone cannot meet database needs due to the difficulties associated with such

  18. Forest fire forecasting tool for air quality modelling systems

    Energy Technology Data Exchange (ETDEWEB)

    San Jose, R.; Perez, J. L.; Perez, L.; Gonzalez, R. M.; Pecci, J.; Palacios, M.

    2015-07-01

    Adverse effects of smoke on air quality are of great concern; however, even today the estimation of atmospheric fire emissions is a key issue. It is necessary to implement systems for predicting smoke within an air quality modelling system, and in this work a first attempt towards creating a system of this type is presented. Wildland fire spread and behavior are complex phenomena due to both the number of physico-chemical factors involved and the nonlinear relationship between variables. WRF-Fire was employed to simulate the spread and behavior of some real fires that occurred in the south-east of Spain and the north of Portugal. The use of fire behavior models requires the availability of high resolution environmental and fuel data. A new custom fuel moisture content model has been developed. The new module calculates, at each time step, the fuel moisture content of dead and live fuels. The results confirm that the use of accurate meteorological data and a custom fuel moisture content model is crucial to obtain precise simulations of fire behavior. To simulate air pollution over Europe, we use the regional meteorological-chemistry transport model WRF-Chem. In this contribution, we show the impact of using two different fire emissions inventories (FINN and IS4FIRES) and how the coupled WRF-Fire-Chem model improves the results of the forest fire emissions and smoke concentrations. The impact of the forest fire emissions on concentrations is evident, and it is quite clear from these simulations that the choice of emission inventory is very important. We conclude that using the WRF-Fire behavior model produces better results than using forest fire emission inventories, although the required computational power is much higher. (Author)
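
    The custom fuel moisture module's equations are not given in this record; a standard time-lag formulation illustrates what a per-time-step dead-fuel moisture update looks like. The relaxation form below is a common textbook choice, and all parameter values are assumptions.

    # Sketch of a per-time-step dead-fuel moisture update using the standard
    # time-lag formulation m(t+dt) = m_eq + (m - m_eq) * exp(-dt / tau)
    # (illustrative; the custom WRF-Fire module is more elaborate).
    import math

    def update_moisture(m, m_eq, tau_hours, dt_hours):
        """Relax fuel moisture m toward the equilibrium value m_eq."""
        return m_eq + (m - m_eq) * math.exp(-dt_hours / tau_hours)

    m = 0.18                      # initial moisture fraction (assumed)
    for hour in range(12):
        m = update_moisture(m, m_eq=0.08, tau_hours=10.0, dt_hours=1.0)  # 10-h fuel
    print(f"moisture after 12 h: {m:.3f}")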

  19. Forest fire forecasting tool for air quality modelling systems

    Energy Technology Data Exchange (ETDEWEB)

    San Jose, R.; Perez, J.L.; Perez, L.; Gonzalez, R.M.; Pecci, J.; Palacios, M.

    2015-07-01

    Adverse effects of smoke on air quality are of great concern; however, even today the estimation of atmospheric fire emissions is a key issue. It is necessary to implement systems for predicting smoke within an air quality modelling system, and in this work a first attempt towards creating a system of this type is presented. Wildland fire spread and behavior are complex phenomena due to both the number of physico-chemical factors involved and the nonlinear relationship between variables. WRF-Fire was employed to simulate the spread and behavior of some real fires that occurred in the south-east of Spain and the north of Portugal. The use of fire behavior models requires the availability of high resolution environmental and fuel data. A new custom fuel moisture content model has been developed. The new module calculates, at each time step, the fuel moisture content of dead and live fuels. The results confirm that the use of accurate meteorological data and a custom fuel moisture content model is crucial to obtain precise simulations of fire behavior. To simulate air pollution over Europe, we use the regional meteorological-chemistry transport model WRF-Chem. In this contribution, we show the impact of using two different fire emissions inventories (FINN and IS4FIRES) and how the coupled WRF-Fire-Chem model improves the results of the forest fire emissions and smoke concentrations. The impact of the forest fire emissions on concentrations is evident, and it is quite clear from these simulations that the choice of emission inventory is very important. We conclude that using the WRF-Fire behavior model produces better results than using forest fire emission inventories, although the required computational power is much higher. (Author)

  20. Cooperative development of logical modelling standards and tools with CoLoMoTo.

    Science.gov (United States)

    Naldi, Aurélien; Monteiro, Pedro T; Müssel, Christoph; Kestler, Hans A; Thieffry, Denis; Xenarios, Ioannis; Saez-Rodriguez, Julio; Helikar, Tomas; Chaouiya, Claudine

    2015-04-01

    The identification of large regulatory and signalling networks involved in the control of crucial cellular processes calls for proper modelling approaches. Indeed, models can help elucidate properties of these networks, understand their behaviour and provide (testable) predictions by performing in silico experiments. In this context, qualitative, logical frameworks have emerged as relevant approaches, as demonstrated by a growing number of published models, along with new methodologies and software tools. This productive activity now requires a concerted effort to ensure model reusability and interoperability between tools. Following an outline of the logical modelling framework, we present the most important achievements of the Consortium for Logical Models and Tools, along with future objectives. Our aim is to advertise this open community, which welcomes contributions from all researchers interested in logical modelling or in related mathematical and computational developments.

  1. The synergy professional practice model and its patient characteristics tool: a staff empowerment strategy.

    Science.gov (United States)

    MacPhee, Maura; Wardrop, Andrea; Campbell, Cheryl; Wejr, Patricia

    2011-10-01

    Nurse leaders can positively influence practice environments through a number of empowerment strategies, among them professional practice models. These models encompass the philosophy, structures and processes that support nurses' control over their practice and their voice within healthcare organizations. Nurse-driven professional practice models can serve as a framework for collaborative decision-making among nursing and other staff. This paper describes a provincewide pilot project in which eight nurse-led project teams in four healthcare sectors worked with the synergy professional practice model and its patient characteristics tool. The teams learned how the model and tool can be used to classify patients' acuity levels and make staffing assignments based on a "best fit" between patient needs and staff competencies. The patient characteristics tool scores patients' acuities on eight characteristics such as stability, vulnerability and resource availability. This tool can be used to make real-time patient assessments. Other potential applications for the model and tool are presented, such as care planning, team-building and determining appropriate staffing levels. Our pilot project evidence suggests that the synergy model and its patient characteristics tool may be an empowerment strategy that nursing leaders can use to enhance their practice environments.
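
    As a toy illustration of how such a tool can work in practice, the sketch below rates a patient on the synergy model's eight characteristics and maps the total to an acuity band. The 1-5 rating scale and band thresholds are assumptions for illustration only, not the tool's actual scoring rules.

    # Toy sketch of a patient characteristics scoring tool: rate eight
    # characteristics (1 = high need ... 5 = low need; scale and bands assumed)
    # and map the total to an acuity level used for nurse-patient matching.
    CHARACTERISTICS = ["stability", "vulnerability", "resource availability",
                       "complexity", "predictability", "resiliency",
                       "participation in care", "participation in decision-making"]

    def acuity(scores: dict) -> str:
        total = sum(scores[c] for c in CHARACTERISTICS)
        if total <= 16:
            return "high acuity"
        return "moderate acuity" if total <= 28 else "low acuity"

    patient = {c: 2 for c in CHARACTERISTICS}   # mostly high-need ratings
    print(acuity(patient))                      # -> high acuity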

  2. CRISPR-Cas9: A Revolutionary Tool for Cancer Modelling

    Directory of Open Access Journals (Sweden)

    Raul Torres-Ruiz

    2015-09-01

    Full Text Available The cancer-modelling field is now experiencing a conversion with the recent emergence of the RNA-programmable CRISPR-Cas9 system, a flexible methodology to produce essentially any desired modification in the genome. Cancer is a multistep process that involves many genetic mutations and other genome rearrangements. Despite their importance, it is difficult to recapitulate the degree of genetic complexity found in patient tumors. The CRISPR-Cas9 system for genome editing has been proven as a robust technology that makes it possible to generate cellular and animal models that recapitulate those cooperative alterations rapidly and at low cost. In this review, we will discuss the innovative applications of the CRISPR-Cas9 system to generate new models, providing a new way to interrogate the development and progression of cancers.

  3. CRISPR-Cas9: A Revolutionary Tool for Cancer Modelling.

    Science.gov (United States)

    Torres-Ruiz, Raul; Rodriguez-Perales, Sandra

    2015-09-14

    The cancer-modelling field is now experiencing a conversion with the recent emergence of the RNA-programmable CRISPR-Cas9 system, a flexible methodology to produce essentially any desired modification in the genome. Cancer is a multistep process that involves many genetic mutations and other genome rearrangements. Despite their importance, it is difficult to recapitulate the degree of genetic complexity found in patient tumors. The CRISPR-Cas9 system for genome editing has been proven as a robust technology that makes it possible to generate cellular and animal models that recapitulate those cooperative alterations rapidly and at low cost. In this review, we will discuss the innovative applications of the CRISPR-Cas9 system to generate new models, providing a new way to interrogate the development and progression of cancers.

  4. Model-Based Design Tools for Extending COTS Components To Extreme Environments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation in this project is model-based design (MBD) tools for predicting the performance and useful life of commercial-off-the-shelf (COTS) components and...

  5. Model-Based Design Tools for Extending COTS Components To Extreme Environments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation in this Phase I project is to prove the feasibility of using model-based design (MBD) tools to predict the performance and useful life of...

  6. Physics-based Modeling Tools for Life Prediction and Durability Assessment of Advanced Materials Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The technical objectives of this program are: (1) to develop a set of physics-based modeling tools to predict the initiation of hot corrosion and to address pit and...

  7. The Integrated Medical Model: A Decision Support Tool for In-flight Crew Health Care

    Science.gov (United States)

    Butler, Doug

    2009-01-01

    This viewgraph presentation reviews the development of an Integrated Medical Model (IMM) decision support tool for in-flight crew health care safety. Clinical methods, resources, and case scenarios are also addressed.

  8. Physical Modeling of Contact Processes on the Cutting Tools Surfaces of STM When Turning

    Science.gov (United States)

    Belozerov, V. A.; Uteshev, M. H.

    2016-08-01

    This article describes how to create an optimization model of the process of fine turning of superalloys and steels with cutting tools made from STM (superhard tool materials) on CNC machines, flexible manufacturing units (GPM) and machining centers. Creation of the optimization model makes it possible to link the contact processes occurring simultaneously on the front and back surfaces of the STM tool, and thereby to manage the contact processes and the dynamic strength of the cutting tip of the STM tool. The established optimization model for managing the dynamic strength of STM cutting tools in the process of fine turning is based on a previously developed thermomechanical (physical, heat) model, which allows a systematic thermomechanical approach to choosing STM grades (domestic and foreign) for cutting tools designed for fine turning of heat resistant alloys and steels.

  9. Multi-Physics Computational Modeling Tool for Materials Damage Assessment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is to provide a multi-physics modeling tool for materials damage assessment for application to future aircraft design. The software...

  10. The 8 Learning Events Model: a Pedagogic Conceptual Tool Supporting Diversification of Learning Methods

    NARCIS (Netherlands)

    Verpoorten, Dominique; Poumay, M; Leclercq, D

    2006-01-01

    Please, cite this publication as: Verpoorten, D., Poumay, M., & Leclercq, D. (2006). The 8 Learning Events Model: a Pedagogic Conceptual Tool Supporting Diversification of Learning Methods. Proceedings of International Workshop in Learning Networks for Lifelong Competence Development, TENCompetence

  11. High Performance Computing tools for the Integrated Tokamak Modelling project

    Energy Technology Data Exchange (ETDEWEB)

    Guillerminet, B., E-mail: bernard.guillerminet@cea.f [Association Euratom-CEA sur la Fusion, IRFM, DSM, CEA Cadarache (France); Plasencia, I. Campos [Instituto de Fisica de Cantabria (IFCA), CSIC, Santander (Spain); Haefele, M. [Universite Louis Pasteur, Strasbourg (France); Iannone, F. [EURATOM/ENEA Fusion Association, Frascati (Italy); Jackson, A. [University of Edinburgh (EPCC) (United Kingdom); Manduchi, G. [EURATOM/ENEA Fusion Association, Padova (Italy); Plociennik, M. [Poznan Supercomputing and Networking Center (PSNC) (Poland); Sonnendrucker, E. [Universite Louis Pasteur, Strasbourg (France); Strand, P. [Chalmers University of Technology (Sweden); Owsiak, M. [Poznan Supercomputing and Networking Center (PSNC) (Poland)

    2010-07-15

    Fusion modelling and simulation are very challenging, and the High Performance Computing issues are addressed here. Toolsets for job launching and scheduling, data communication and visualization have been developed by the EUFORIA project and used with a plasma edge simulation code.

  12. Cognitive Bargaining Model: An Analysis Tool for Third Party Incentives?

    Science.gov (United States)

    2009-12-01


  13. Modeling mind-wandering: a tool to better understand distraction

    NARCIS (Netherlands)

    van Vugt, Marieke; Taatgen, Niels; Sackur, Jerome; Bastian, Mikael; Taatgen, Niels; van Vugt, Marieke; Borst, Jelmer; Mehlhorn, Katja

    2015-01-01

    When we get distracted, we may engage in mind-wandering, or task-unrelated thinking, which impairs performance on cognitive tasks. Yet, we do not have cognitive models that make this process explicit. On the basis of both recent experiments that have started to investigate mind-wandering and introsp

  14. Verifying OCL specifications of UML models : tool support and compositionality

    NARCIS (Netherlands)

    Kyas, Marcel

    2006-01-01

    The Unified Modelling Language (UML) and the Object Constraint Language (OCL) serve as specification languages for embedded and real-time systems used in a safety-critical environment. In this dissertation class diagrams, object diagrams, and OCL constraints are formalised. The formalisation serve

  15. Modeling tools to Account for Ethanol Impacts on BTEX Plumes

    Science.gov (United States)

    Widespread usage of ethanol in gasoline leads to impacts at leak sites which differ from those of non-ethanol gasolines. The presentation reviews current research results on the distribution of gasoline and ethanol, biodegradation, phase separation and cosolvency. Model results f...

  16. Enhancing Technology-Mediated Communication: Tools, Analyses, and Predictive Models

    Science.gov (United States)

    2007-09-01

    Technology-mediated communication has been studied in the home (see, for example, Nagel, Hudson, & Abowd, 2004) and in social settings (see Kern, Antifakos, & Schiele, 2004).

  17. Inverse thermal history modelling as a hydrocarbon exploration tool

    Energy Technology Data Exchange (ETDEWEB)

    Gallagher, K. [Imperial College of Science, Technology and Medicine, London (United Kingdom). TH Huxley School of Environment, Earth Science and Engineering

    1998-12-31

    Thermal history modelling is a significant part of hydrocarbon exploration and resource assessment. Its primary use is to predict the volume and timing of hydrocarbon generation as a sedimentary basin evolves on timescales of 10^7-10^8 years. Forward modelling is commonly used to constrain the thermal history in sedimentary basins. Alternatively, inversion schemes may be used which have many advantages over the conventional forward modelling approach. An example of an inversion approach is presented here, wherein the preferred philosophy is to find the least complex model that fits the data. In this case, we estimate a heat flow function (of time) which provides an adequate fit to the available thermal indicator calibration data. The function is also constrained to be smooth, in either a first or second derivative sense. Extra complexity or structure is introduced into the function only where required to fit the data and the regularization stabilizes the inversion. The general formulation is presented and a real data example from the North Slope, Alaska is discussed. (author)
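
    The inversion described above estimates a smooth function by trading data fit against a derivative penalty. The sketch below shows the generic pattern with a Tikhonov-style second-derivative regularizer; the linear forward operator is a made-up stand-in, since real thermal-indicator forward models are nonlinear, and all names are assumptions.

    # Generic sketch of smoothness-regularized inversion: recover a heat-flow
    # history q from indicator data d = G q + noise, penalizing the second
    # derivative of q.
    import numpy as np

    n, m = 50, 20
    rng = np.random.default_rng(2)
    G = np.abs(rng.standard_normal((m, n))) / n          # stand-in forward operator
    q_true = 1.0 + 0.5 * np.sin(np.linspace(0, 3, n))    # "true" heat flow history
    d = G @ q_true + rng.normal(0, 0.01, m)

    D2 = np.diff(np.eye(n), 2, axis=0)                   # second-derivative operator
    lam = 1.0                                            # regularization weight
    # Solve min ||G q - d||^2 + lam ||D2 q||^2 via the normal equations.
    q_est = np.linalg.solve(G.T @ G + lam * D2.T @ D2, G.T @ d)
    print(np.round(q_est[:5], 2))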

  18. Mathematical modelling: a tool for hospital infection control

    NARCIS (Netherlands)

    Grundmann, Hajo; Hellriegel, B.

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has

  19. Mathematical modelling : a tool for hospital infection control

    NARCIS (Netherlands)

    Grundmann, H; Hellriegel, B

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has

  20. Mathematical modelling: a tool for hospital infection control.

    NARCIS (Netherlands)

    Grundmann, Hajo; Hellriegel, B

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has

  1. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach.

  2. Numerical Tools for the Bayesian Analysis of Stochastic Frontier Models

    NARCIS (Netherlands)

    Osiewalski, J.; Steel, M.F.J.

    1996-01-01

    In this paper we describe the use of modern numerical integration methods for making posterior inferences in composed error stochastic frontier models for panel data or individual cross-sections. Two Monte Carlo methods have been used in practical applications. We survey these two methods in some

  3. Interactive model evaluation tool based on IPython notebook

    Science.gov (United States)

    Balemans, Sophie; Van Hoey, Stijn; Nopens, Ingmar; Seuntjes, Piet

    2015-04-01

    In hydrological modelling, some kind of parameter optimization is usually performed. This can be the selection of a single best parameter set, a split into behavioural and non-behavioural parameter sets based on a selected threshold, or a posterior parameter distribution derived with a formal Bayesian approach. The selection of the criterion to measure the goodness of fit (likelihood or any objective function) is an essential step in all of these methodologies and will affect the final selected parameter subset. Moreover, the discriminative power of the objective function is also dependent on the time period used. In practice, the optimization process is an iterative procedure. As such, in the course of the modelling process, an increasing number of simulations is performed. However, the information carried by these simulation outputs is not always fully exploited. In this respect, we developed and present an interactive environment that enables the user to intuitively evaluate the model performance. The aim is to explore the parameter space graphically and to visualize the impact of the selected objective function on model behaviour. First, a set of model simulation results is loaded along with the corresponding parameter sets and a data set of the same variable as the model outcome (mostly discharge). The ranges of the loaded parameter sets define the parameter space. A selection of the two parameters visualised can be made by the user. Furthermore, an objective function and a time period of interest need to be selected. Based on this information, a two-dimensional parameter response surface is created, which actually just shows a scatter plot of the parameter combinations and assigns a color scale corresponding with the goodness of fit of each parameter combination. Finally, a slider is available to change the color mapping of the points. Actually, the slider provides a threshold to exclude non-behavioural parameter sets and the color scale is only attributed to the
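
    A minimal notebook sketch of the interaction described above: a scatter of two parameters colored by objective function value, with a threshold slider that greys out non-behavioural parameter sets. It assumes matplotlib and ipywidgets inside a Jupyter/IPython notebook; the parameter names and synthetic objective are placeholders, not the tool's code.

    # Notebook sketch of a 2-D parameter response surface with a behavioural
    # threshold slider (run inside a Jupyter/IPython notebook).
    import numpy as np
    import matplotlib.pyplot as plt
    from ipywidgets import interact, FloatSlider

    rng = np.random.default_rng(3)
    p1, p2 = rng.uniform(0, 1, 500), rng.uniform(0, 1, 500)   # sampled parameters
    rmse = (p1 - 0.4)**2 + (p2 - 0.6)**2 + rng.normal(0, .01, 500)  # objective

    @interact(threshold=FloatSlider(min=0.01, max=0.5, step=0.01, value=0.2))
    def plot_surface(threshold):
        ok = rmse <= threshold                  # behavioural parameter sets only
        plt.scatter(p1[~ok], p2[~ok], c="lightgrey", s=8)
        plt.scatter(p1[ok], p2[ok], c=rmse[ok], cmap="viridis_r", s=12)
        plt.colorbar(label="objective function")
        plt.xlabel("parameter 1"); plt.ylabel("parameter 2")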

  4. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    OpenAIRE

    2016-01-01

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes...

  5. Tools for model-independent bounds in direct dark matter searches

    DEFF Research Database (Denmark)

    Cirelli, M.; Del Nobile, E.; Panci, P.

    2013-01-01

    We discuss a framework (based on non-relativistic operators) and a self-contained set of numerical tools to derive the bounds from some current direct detection experiments on virtually any arbitrary model of Dark Matter elastically scattering on nuclei.

  6. Road traffic pollution monitoring and modelling tools and the UK national air quality strategy.

    OpenAIRE

    Marsden, G.R.; Bell, M.C.

    2001-01-01

    This paper provides an assessment of the tools required to fulfil the air quality management role now expected of local authorities within the UK. The use of a range of pollution monitoring tools in assessing air quality is discussed and illustrated with evidence from a number of previous studies of urban background and roadside pollution monitoring in Leicester. A number of approaches to pollution modelling currently available for deployment are examined. Subsequently, the modelling and moni...

  7. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions for strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
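
    As a minimal illustration of the descriptor idea, the sketch below compares a "measured" and a "modeled" strain field through a handful of low-order 2-D Fourier coefficients instead of pixel by pixel; Zernike moments would be used the same way. The fields and the descriptor choice are assumptions for illustration.

    # Sketch: compare synthetic "experimental" and "model" strain fields through
    # ~10^2 low-order Fourier descriptors instead of ~10^4 pixels.
    import numpy as np

    def descriptors(field, k=5):
        F = np.fft.fftshift(np.fft.fft2(field))
        c = field.shape[0] // 2
        return np.abs(F[c - k:c + k + 1, c - k:c + k + 1]).ravel()  # (2k+1)^2 values

    x = np.linspace(-1, 1, 128)
    X, Y = np.meshgrid(x, x)
    strain_exp = np.exp(-(X**2 + Y**2) * 4)                  # synthetic "experiment"
    strain_mod = np.exp(-(X**2 + Y**2) * 4.2) + 0.01 * Y     # slightly off "model"

    d_exp, d_mod = descriptors(strain_exp), descriptors(strain_mod)
    rel_diff = np.linalg.norm(d_exp - d_mod) / np.linalg.norm(d_exp)
    print(f"relative descriptor difference: {rel_diff:.3f}")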

  8. The generalized mathematical model of the failure of the cutting tool

    Science.gov (United States)

    Pasko, N. I.; Antsev, A. V.; Antseva, N. V.; Fyodorov, V. P.

    2017-02-01

    We offer a mathematical model which takes into account the following factors: the spread of the cutting properties of the tool, the parameter spread of the gear blanks, and the possibility of fracture of the cutting wedge of the tool. The reliability function, taking into account the above-mentioned factors, has five parameters, for whose assessment we propose a method based on our experience. A numerical illustration of the method is shown in the article. We suggest using the model to optimize preventive maintenance of the cutting tool.
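
    The paper's five-parameter reliability function is not given in this record. As a generic illustration of the flavour of such a model, the sketch below combines a three-parameter Weibull term for gradual wear with a one-parameter exponential term for sudden fracture (four parameters shown; the functional form and all values are assumptions, not the authors' model).

    # Assumed composite reliability function (NOT the paper's exact form):
    # R(t) = Weibull wear survival * exponential random-fracture survival.
    import math

    def reliability(t, beta=2.5, eta=40.0, gamma=5.0, lam=0.002):
        wear = math.exp(-(max(t - gamma, 0.0) / eta) ** beta)  # Weibull wear term
        fracture = math.exp(-lam * t)                          # fracture term
        return wear * fracture

    for t in (10, 30, 60):   # minutes of cutting time (illustrative)
        print(f"R({t} min) = {reliability(t):.3f}")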

  9. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    Directory of Open Access Journals (Sweden)

    Maike Kathrin Aurich

    2016-08-01

    Full Text Available Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  10. MetaboTools: A Comprehensive Toolbox for Analysis of Genome-Scale Metabolic Models.

    Science.gov (United States)

    Aurich, Maike K; Fleming, Ronan M T; Thiele, Ines

    2016-01-01

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  11. ADVISHE: A new tool to report validation of health-economic decision models

    NARCIS (Netherlands)

    Vemer, P.; Corro Ramos, I.; Van Voorn, G.; Al, M.J.; Feenstra, T.L.

    2014-01-01

    Background: Modelers and reimbursement decision makers could both profit from a more systematic reporting of the efforts to validate health-economic (HE) models. Objectives: Development of a tool to systematically report validation efforts of HE decision models and their outcomes. Methods: A gross

  12. A NUI Based Multiple Perspective Variability Modelling CASE Tool

    OpenAIRE

    Bashroush, Rabih

    2010-01-01

    With current trends towards moving variability from hardware to software, and given the increasing desire to postpone design decisions as much as is economically feasible, managing the variability from requirements elicitation to implementation is becoming a primary business requirement in the product line engineering process. One of the main challenges in variability management is the visualization and management of industry-size variability models. In this demonstrat...

  13. Mathematical modelling: a tool for hospital infection control.

    OpenAIRE

    Grundmann, Hajo; Hellriegel, B

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide, and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has emerged that provides solid and testable hypotheses and opens the road to a quantitative assessment of the main obstructions that undermine current efforts to control the spread of health-care-associate...

  14. CRISPR-Cas9: A Revolutionary Tool for Cancer Modelling

    OpenAIRE

    Raul Torres-Ruiz; Sandra Rodriguez-Perales

    2015-01-01

    The cancer-modelling field is now experiencing a revolution with the recent emergence of the RNA-programmable CRISPR-Cas9 system, a flexible methodology to produce essentially any desired modification in the genome. Cancer is a multistep process that involves many genetic mutations and other genome rearrangements. Despite their importance, it is difficult to recapitulate the degree of genetic complexity found in patient tumors. The CRISPR-Cas9 system for genome editing has been proven as a ...

  15. The Visible Signature Modelling and Evaluation ToolBox

    Science.gov (United States)

    2008-12-01


  16. 3D model tools for architecture and archaeology reconstruction

    Science.gov (United States)

    Vlad, Ioan; Herban, Ioan Sorin; Stoian, Mircea; Vilceanu, Clara-Beatrice

    2016-06-01

    The main objective of architectural and patrimonial survey is to provide precise documentation of the status quo of the surveyed objects (monuments, buildings, archaeological objects and sites) for preservation and protection, for scientific studies and restoration purposes, and for presentation to the general public. Cultural heritage documentation requires an interdisciplinary approach whose purpose is an overall understanding of the object itself and an integration of the information which characterizes it. The accuracy and precision of the model are directly influenced by the quality of the measurements realized in the field and by the quality of the software. The software is under continuous development, which brings many improvements. On the other hand, compared to aerial photogrammetry, close range photogrammetry, and particularly architectural photogrammetry, is not limited to vertical photographs with special cameras. The methodology of terrestrial photogrammetry has changed significantly, and various photographic acquisitions are widely in use. In this context, the present paper brings forward a comparative study of TLS (Terrestrial Laser Scanner) and digital photogrammetry for 3D modeling. The authors take into account the accuracy of the 3D models obtained, the overall costs involved for each technology and method, and the 4th dimension - time. The paper proves its applicability, as photogrammetric technologies are nowadays used at a large scale for obtaining 3D models of cultural heritage objects and are efficacious in their assessment and monitoring, thus contributing to historic conservation. Its importance also lies in highlighting the advantages and disadvantages of each method used - a very important issue for both the industrial and scientific segments when facing decisions such as in which technology to invest more research and funds.

  17. Toxicokinetic models and related tools in environmental risk assessment of chemicals.

    Science.gov (United States)

    Grech, Audrey; Brochot, Céline; Dorne, Jean-Lou; Quignot, Nadia; Bois, Frédéric Y; Beaudouin, Rémy

    2017-02-01

    Environmental risk assessment of chemicals for the protection of ecosystem integrity is a key regulatory and scientific research field which is undergoing constant development in modelling approaches and harmonisation with human risk assessment. This review focuses on state-of-the-art toxicokinetic tools and models that have been applied to terrestrial and aquatic species relevant to environmental risk assessment of chemicals. Both empirical and mechanistic toxicokinetic models are discussed using the results of extensive literature searches, together with tools and software for their calibration and an overview of applications in environmental risk assessment. These range from simple one-compartment and multi-compartment models to physiologically-based toxicokinetic (PBTK) models, mostly available for aquatic species such as fish, and cover a number of chemical classes including plant protection products, metals, persistent organic pollutants, and nanoparticles. Data gaps and further research needs are highlighted.
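
    As a hedged sketch of the simplest tool in the hierarchy just described, the snippet below evaluates the closed-form solution of a one-compartment toxicokinetic model under constant aqueous exposure; the rate constants are invented.

    ```python
    # One-compartment toxicokinetics: dC/dt = ku*Cw - ke*C, C(0) = 0,
    # which has the closed form C(t) = (ku/ke) * Cw * (1 - exp(-ke*t)).
    import numpy as np

    def one_compartment(t, Cw, ku, ke):
        """Internal concentration under constant water exposure Cw.
        ku: uptake rate constant [L/kg/d]; ke: elimination rate [1/d]."""
        return (ku / ke) * Cw * (1.0 - np.exp(-ke * np.asarray(t)))

    t = np.linspace(0, 30, 7)                            # days
    print(one_compartment(t, Cw=1.0, ku=20.0, ke=0.2))   # BCF = ku/ke = 100
    ```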

  18. Peer Assessment with Online Tools to Improve Student Modeling

    Science.gov (United States)

    Atkins, Leslie J.

    2012-11-01

    Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to be an aid to sense-making rather than meeting seemingly arbitrary requirements set by the instructor. By giving students the authority to develop their own models and establish requirements for their diagrams, the sense that these are arbitrary requirements diminishes and students are more likely to see modeling as a sense-making activity. The practice of peer assessment can help students take ownership; however, it can be difficult for instructors to manage. Furthermore, it is not without risk: students can be reluctant to critique their peers, they may view this as the job of the instructor, and there is no guarantee that students will employ greater rigor and precision as a result of peer assessment. In this article, we describe one approach for peer assessment that can establish norms for diagrams in a way that is student driven, where students retain agency and authority in assessing and improving their work. We show that such an approach does indeed improve students' diagrams and abilities to assess their own work, without sacrificing students' authority and agency.

  19. Tool Support for Collaborative Teaching and Learning of Object-Oriented Modelling

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Ratzer, Anne Vinter

    2002-01-01

    Modeling is central to doing and learning object-oriented development. We present a new tool, Ideogramic UML, for gesture-based collaborative modeling with the Unified Modeling Language (UML), which can be used to collaboratively teach and learn modeling. Furthermore, we discuss how we have effectively used Ideogramic UML to teach object-oriented modeling and the UML to groups of students using the UML for project assignments.

  20. Conceptual Model As The Tool For Managing Bank Services Quality

    Directory of Open Access Journals (Sweden)

    Kornelija Severović

    2009-07-01

    Full Text Available Quality has become a basic factor of economic efficiency and a basic principle of the business activities of successful organizations. Its consequence is a revolution in the area of quality that has encompassed all kinds of products and services, including bank services. Understanding the present and future needs of clients, and knowing how to meet and, where possible, exceed their expectations, is the task of every efficient economy. Therefore, banks in developed economies try to reorient their business activities organizationally, technologically and informatically, placing the client at the core of their activity. Significant indicators of the quality of the services that banks offer are the time clients wait for the desired service and the number of clients who give up entering the bank because of long queues. A dissatisfied client is the worst outcome of a bank's work and business activity. Accordingly, great effort is devoted to improving service quality, which means professionalism and good communication by the personnel with whom clients come into contact, punctual and clear information, and short waiting times in line. The aim of this work is to present and describe the functioning of the banking system under the conditions of establishing quality in offering services to clients, and to identify basic guidelines for increasing quality in the work of sub-branches. Since banking is a very dynamic and complex system, a conceptual model is developed for the purpose of optimizing the stated quality parameters of bank business activity; in further research, this model will serve for the development of a simulation model.
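
    As a hedged illustration of the waiting-time indicator discussed above (the paper's conceptual and simulation models are not reproduced here), a single teller window can be approximated by an M/M/1 queue, whose mean wait in line has a closed form.

    ```python
    # Mean waiting time in queue for an M/M/1 service point.
    def mm1_waiting_time(arrival_rate, service_rate):
        """Wq = rho / (mu - lambda); requires arrival_rate < service_rate."""
        rho = arrival_rate / service_rate
        if rho >= 1:
            raise ValueError("queue is unstable: arrivals exceed service capacity")
        return rho / (service_rate - arrival_rate)

    # 20 clients/hour arriving, 25 clients/hour served (assumed rates):
    print(mm1_waiting_time(20, 25))   # 0.16 h, i.e. about 9.6 min average wait
    ```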

  1. ENISI SDE: A New Web-Based Tool for Modeling Stochastic Processes.

    Science.gov (United States)

    Mei, Yongguo; Carbo, Adria; Hoops, Stefan; Hontecillas, Raquel; Bassaganya-Riera, Josep

    2015-01-01

    Modeling and simulation approaches have been widely used in computational biology, mathematics, bioinformatics and engineering to represent complex existing knowledge and to effectively generate novel hypotheses. While deterministic modeling strategies are widely used in computational biology, stochastic modeling techniques are not as popular due to a lack of user-friendly tools. This paper presents ENISI SDE, a novel web-based modeling tool based on stochastic differential equations. ENISI SDE provides user-friendly web user interfaces to facilitate adoption by immunologists and computational biologists. This work provides three major contributions: (1) discussion of SDE as a generic approach for stochastic modeling in computational biology; (2) development of ENISI SDE, a web-based user-friendly SDE modeling tool that highly resembles regular ODE-based modeling; (3) application of the ENISI SDE modeling tool through a use case for studying stochastic sources of cell heterogeneity in the context of CD4+ T cell differentiation. The CD4+ T cell differentiation ODE model has been published [8] and can be downloaded from biomodels.net. The case study reproduces a biological phenomenon that is not captured by the previously published ODE model and shows the effectiveness of SDE as a stochastic modeling approach in biology in general and immunology in particular, and the power of ENISI SDE.
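
    A minimal sketch of the SDE approach itself (not ENISI SDE's code): take an ODE drift term, add a diffusion (noise) term, and integrate with the Euler-Maruyama scheme. The logistic model and parameter values below are assumptions for illustration.

    ```python
    # Euler-Maruyama integration of dx = f(x) dt + g(x) dW.
    import numpy as np

    def euler_maruyama(x0, drift, diffusion, dt, n_steps, seed=0):
        rng = np.random.default_rng(seed)
        x = np.empty(n_steps + 1)
        x[0] = x0
        for k in range(n_steps):
            dw = rng.normal(0.0, np.sqrt(dt))   # Wiener increment
            x[k + 1] = x[k] + drift(x[k]) * dt + diffusion(x[k]) * dw
        return x

    # Stochastic logistic growth: dx = r x (1 - x/K) dt + sigma x dW
    r, K, sigma = 1.0, 1.0, 0.2
    path = euler_maruyama(0.05, lambda x: r * x * (1 - x / K),
                          lambda x: sigma * x, dt=0.01, n_steps=1000)
    print(path[-1])
    ```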

  2. Modelling as an indispensible research tool in the information society.

    Science.gov (United States)

    Bouma, Johan

    2016-04-01

    Science and society would be well advised to develop a different relationship as the information revolution penetrates all aspects of modern life. Rather than produce clear answers to clear questions in a top-down manner, land-use issues related to the UN Sustainable Development Goals (SDGs) present "wicked" problems involving different, strongly opinionated stakeholders with conflicting ideas and interests, and risk-averse politicians. The Dutch government has invited its citizens to develop a "science agenda" defining future research needs, implicitly suggesting that the research community is unable to do so. Time, therefore, for a pro-active approach to more convincingly define our "societal license to research". For soil science this could imply a focus on the SDGs, considering soils as living, characteristically different, dynamic bodies in a landscape, to be mapped in ways that allow generation of suitable modelling data. Models allow a dynamic characterization of water and nutrient regimes and plant growth in soils, both for actual and future conditions, reflecting e.g. effects of climate or land-use change or alternative management practices. Engaging modern stakeholders in a bottom-up manner implies continuous involvement and "joint learning" from project initiation to completion, where modelling results act as building blocks to explore alternative scenarios. Modern techniques allow very rapid calculations and innovative visualization. Everything is possible, but only modelling can articulate the economic, social and environmental consequences of each scenario, demonstrating in a pro-active manner the crucial and indispensable role of research. But choices are to be made by stakeholders and reluctant policy makers, and certainly not by scientists, who should carefully guard their independence. Only clear results in the end are convincing proof of the impact of science, requiring therefore continued involvement of scientists up to the very end of projects. To

  3. Software Support of Modelling using Ergonomic Tools in Engineering

    Directory of Open Access Journals (Sweden)

    Darina Dupláková

    2017-08-01

    Full Text Available One of the preconditions for the correct development of industrial production is the continuous interconnection of virtual reality and the real world by computer software. Computer software is used for product modelling, creation of technical documentation, scheduling, management and optimization of manufacturing processes, and increasing the efficiency of human work in manufacturing plants. This article describes frequently used ergonomic software that helps to improve human work by reducing error rates, risk factors in the working environment and workplace injuries, and by eliminating emerging occupational diseases. The tools are categorized in the field of micro-ergonomics and are applicable at the manufacturing level, with a flexible approach to solving the established problems.

  4. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report

    Science.gov (United States)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; KramerWhite, Julie A.; Labbe, Steve G.; Rotter, Hank A.

    2007-01-01

    In the spring of 2005, the NASA Engineering and Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and other tools are empirically-derived. Each tool was created for a specific use and timeframe, including certification and real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC team conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team's observations and recommendations from these reviews.

  5. Stochastic Modelling as a Tool for Seismic Signals Segmentation

    Directory of Open Access Journals (Sweden)

    Daniel Kucharczyk

    2016-01-01

    Full Text Available In order to model nonstationary real-world processes, one can find an appropriate theoretical model with properties following the analyzed data. However, in this case many trajectories of the analyzed process are required. Alternatively, one can extract parts of the signal that have a homogeneous structure via segmentation. Proper segmentation can lead to the extraction of important features of the analyzed phenomena that cannot be described without it. There is no universal method that can be applied to all phenomena; thus, novel methods should be invented for specific cases. They might address the specific character of the signal in different domains (time, frequency, time-frequency, etc.). In this paper we propose two novel segmentation methods that take into consideration the stochastic properties of the analyzed signals in the time domain. Our research is motivated by the analysis of vibration signals acquired in an underground mine. In such signals we observe seismic events which appear after mining activity, like blasting and provoked relaxation of rock, and some unexpected events, like natural rock bursts. The proposed segmentation procedures allow for the extraction of those parts of the analyzed signals which are related to the mentioned events.
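
    A hedged sketch of one such time-domain idea (not necessarily the authors' estimator): find the sample index at which splitting the record into two segments of homogeneous variance maximises the Gaussian log-likelihood. The synthetic "event" data below are invented.

    ```python
    # Single change-point detection based on a variance shift.
    import numpy as np

    def split_point(x, margin=50):
        """Index that best separates x into two homogeneous-variance parts."""
        n = len(x)
        best_k, best_ll = None, -np.inf
        for k in range(margin, n - margin):
            v1, v2 = np.var(x[:k]), np.var(x[k:])
            # Profile Gaussian log-likelihood of the two-segment model.
            ll = -0.5 * (k * np.log(v1) + (n - k) * np.log(v2))
            if ll > best_ll:
                best_k, best_ll = k, ll
        return best_k

    rng = np.random.default_rng(3)
    signal = np.concatenate([rng.normal(0, 1.0, 600),    # background noise
                             rng.normal(0, 4.0, 400)])   # seismic event
    print(split_point(signal))  # close to the true change point at 600
    ```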

  6. Visual Basic, Excel-based fish population modeling tool - The pallid sturgeon example

    Science.gov (United States)

    Moran, Edward H.; Wildhaber, Mark L.; Green, Nicholas S.; Albers, Janice L.

    2016-02-10

    The model presented in this report is a spreadsheet-based model using Visual Basic for Applications within Microsoft Excel (http://dx.doi.org/10.5066/F7057D0Z) prepared in cooperation with the U.S. Army Corps of Engineers and U.S. Fish and Wildlife Service. It uses the same model structure and, initially, parameters as used by Wildhaber and others (2015) for pallid sturgeon. The difference between the model structure used for this report and that used by Wildhaber and others (2015) is that variance is not partitioned. For the model of this report, all variance is applied at the iteration and time-step levels of the model. Wildhaber and others (2015) partition variance into parameter variance (uncertainty about the value of a parameter itself) applied at the iteration level and temporal variance (uncertainty caused by random environmental fluctuations with time) applied at the time-step level. They included implicit individual variance (uncertainty caused by differences between individuals) within the time-step level.The interface developed for the model of this report is designed to allow the user the flexibility to change population model structure and parameter values and uncertainty separately for every component of the model. This flexibility makes the modeling tool potentially applicable to any fish species; however, the flexibility inherent in this modeling tool makes it possible for the user to obtain spurious outputs. The value and reliability of the model outputs are only as good as the model inputs. Using this modeling tool with improper or inaccurate parameter values, or for species for which the structure of the model is inappropriate, could lead to untenable management decisions. By facilitating fish population modeling, this modeling tool allows the user to evaluate a range of management options and implications. The goal of this modeling tool is to be a user-friendly modeling tool for developing fish population models useful to natural resource
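
    As a hedged Python sketch of the variance structure described above (the USGS spreadsheet itself is Visual Basic and far more detailed), the snippet below applies all stochasticity at the iteration and time-step levels by redrawing vital rates for every year of every replicate; the rates and bounds are invented.

    ```python
    # Monte Carlo population projection: one random draw of each vital rate
    # per time step per iteration, so variance is not partitioned.
    import numpy as np

    def project(n0, years=20, iters=1000, s_mean=0.8, s_sd=0.05,
                f_mean=0.5, f_sd=0.1, seed=7):
        rng = np.random.default_rng(seed)
        n = np.full(iters, float(n0))
        for _ in range(years):
            s = rng.normal(s_mean, s_sd, iters).clip(0, 1)   # survival, per step
            f = rng.normal(f_mean, f_sd, iters).clip(0)      # recruitment, per step
            n = n * s * (1.0 + f)
        return n

    final = project(500)
    print(final.mean(), np.percentile(final, [5, 95]))
    ```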

  7. Dynamic wind turbine models in power system simulation tool DIgSILENT

    DEFF Research Database (Denmark)

    Hansen, A.D.; Jauch, C.; Sørensen, Poul Ejnar

    2004-01-01

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT (Version 12.0). The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report contains both the description of DIgSILENT built-in models for the electrical components of a grid connected wind turbine (e.g. induction generators, power converters, transformers) and the models developed by the user, in the dynamic simulation language DSL of DIgSILENT, for the non-electrical components of the wind turbine (wind model, aerodynamic model, mechanical model). The initialisation of the wind turbine models in the power system simulation is also presented. However, the main attention in this report is drawn to the modelling at the system level of two wind turbine concepts: 1...

  8. Software tool for the prosthetic foot modeling and stiffness optimization.

    Science.gov (United States)

    Strbac, Matija; Popović, Dejan B

    2012-01-01

    We present the procedure for the optimization of the stiffness of the prosthetic foot. The procedure allows the selection of the elements of the foot and the materials used for the design. The procedure is based on an optimization where the cost function is the minimization of the difference between the knee joint torques of healthy walking and of walking with the transfemoral prosthesis. We present a simulation environment that allows the user to interactively vary the foot geometry and track the changes in the knee torque that arise from these adjustments. The software allows the estimation of the optimal prosthetic foot elasticity and geometry. We show that altering model attributes such as the length of the elastic foot segment or its elasticity leads to significant changes in the estimated knee torque required for a given trajectory.
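
    A minimal sketch of the optimisation idea, with an invented surrogate in place of the authors' gait simulation: choose the foot-segment stiffness and length that minimise the squared deviation from a healthy knee-torque curve.

    ```python
    # Fit prosthetic-foot parameters to a reference knee-torque trajectory.
    import numpy as np
    from scipy.optimize import minimize

    t = np.linspace(0, 1, 100)                       # normalised gait cycle
    torque_healthy = 40 * np.sin(2 * np.pi * t)      # reference knee torque [Nm]

    def torque_prosthesis(k, L):
        # Placeholder surrogate for the prosthesis simulation (assumed form).
        return 40 * np.sin(2 * np.pi * t) * (k * L) / (k * L + 10.0)

    def cost(params):
        k, L = params
        return np.sum((torque_healthy - torque_prosthesis(k, L)) ** 2)

    res = minimize(cost, x0=[50.0, 0.1], bounds=[(1, 500), (0.05, 0.3)],
                   method="L-BFGS-B")
    print(res.x)   # estimated optimal stiffness and segment length
    ```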

  9. A Tool for Performance Modeling of Parallel Programs

    Directory of Open Access Journals (Sweden)

    J.A. González

    2003-01-01

    Full Text Available Current analytical performance prediction models try to characterize the performance behavior of actual machines through a small set of parameters. In practice, substantial deviations are observed. These differences are due to factors such as memory hierarchies or network latency. A natural approach is to associate a different proportionality constant with each basic block, and analogously, to associate different latencies and bandwidths with each "communication block". Unfortunately, using this approach implies that the evaluation of parameters must be done for each algorithm. This is a heavy task, implying experiment design, timing, statistics, pattern recognition and multi-parameter fitting algorithms. Software support is required. We present a compiler that takes as source a C program annotated with complexity formulas and produces as output an instrumented code. The trace files obtained from the execution of the resulting code are analyzed with an interactive interpreter, giving us, among other information, the values of those parameters.
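
    A minimal sketch of the per-block fitting step such a toolchain automates (the counts and timings below are invented): given the execution counts of each basic block over several runs and the measured runtimes, solve for one proportionality constant per block by least squares.

    ```python
    # Fit per-basic-block costs c from runs: T ~ N @ c.
    import numpy as np

    # rows: program runs; columns: execution counts of 3 basic blocks
    N = np.array([[1000,  500, 10],
                  [2000,  800, 20],
                  [4000, 1600, 35],
                  [8000, 3000, 80]], dtype=float)
    T = np.array([0.012, 0.022, 0.043, 0.086])      # measured runtimes [s]

    c, residuals, rank, _ = np.linalg.lstsq(N, T, rcond=None)
    print("per-block costs [s]:", c)
    print("predicted times:", N @ c)
    ```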

  10. Genetic Mouse Models: The Powerful Tools to Study Fat Tissues.

    Science.gov (United States)

    Kong, Xingxing; Williams, Kevin W; Liu, Tiemin

    2017-01-01

    Obesity and Type 2 diabetes (T2D) are associated with a variety of comorbidities that contribute to mortality around the world. Although significant effort has been expended in understanding mechanisms that mitigate the consequences of this epidemic, the field has experienced limited success thus far. The potential ability of brown adipose tissue (BAT) to counteract obesity and metabolic disease in rodents (and potentially in humans) has recently attracted considerable attention. There is also another type of thermogenic fat cell, the beige adipocyte, which is located among white adipocytes and shares similar activated responses to cyclic AMP as classical BAT. In this chapter, we review contemporary molecular strategies to investigate the role of adipose tissue depots in metabolism. In particular, we discuss the generation of adipose tissue-specific knockout and overexpression of target genes in various mouse models. We also discuss how to use different Cre (cyclization recombination) mouse lines to investigate diverse types of adipocytes.

  11. Software Tool for the Prosthetic Foot Modeling and Stiffness Optimization

    Directory of Open Access Journals (Sweden)

    Matija Štrbac

    2012-01-01

    Full Text Available We present the procedure for the optimization of the stiffness of the prosthetic foot. The procedure allows the selection of the elements of the foot and the materials used for the design. The procedure is based on an optimization where the cost function is the minimization of the difference between the knee joint torques of healthy walking and of walking with the transfemoral prosthesis. We present a simulation environment that allows the user to interactively vary the foot geometry and track the changes in the knee torque that arise from these adjustments. The software allows the estimation of the optimal prosthetic foot elasticity and geometry. We show that altering model attributes such as the length of the elastic foot segment or its elasticity leads to significant changes in the estimated knee torque required for a given trajectory.

  12. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, the bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are implemented in the newly proposed ArcMAP tool. This tool is programmed in Python and provided with a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia was selected and all three models were tested by using the proposed program. Area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
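
    A minimal sketch of the frequency-ratio computation such a tool automates (the rasters below are synthetic): the FR of a factor class is the percentage of hazard pixels falling in the class divided by the percentage of all pixels in the class.

    ```python
    # Frequency ratio per class of a categorical factor raster.
    import numpy as np

    def frequency_ratio(factor, hazard_mask):
        """factor: integer class raster; hazard_mask: boolean hazard raster."""
        fr = {}
        total = factor.size
        total_hazard = hazard_mask.sum()
        for cls in np.unique(factor):
            in_cls = factor == cls
            pct_hazard = hazard_mask[in_cls].sum() / total_hazard
            pct_area = in_cls.sum() / total
            fr[int(cls)] = pct_hazard / pct_area
        return fr

    rng = np.random.default_rng(5)
    slope_class = rng.integers(1, 4, size=(100, 100))          # 3 slope classes
    landslides = rng.random((100, 100)) < 0.02 * slope_class   # steeper = more
    print(frequency_ratio(slope_class, landslides))
    ```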

  13. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Directory of Open Access Journals (Sweden)

    M. N. Jebur

    2014-10-01

    Full Text Available Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and provided with a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia was selected and all three models were tested by using the proposed program. Area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  14. System capacity and economic modeling computer tool for satellite mobile communications systems

    Science.gov (United States)

    Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.

    1988-01-01

    A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain if a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 123 are used for the model in order to provide as universal an application as possible such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.

  15. Process Modeling In Cold Forging Considering The Process-Tool-Machine Interactions

    Science.gov (United States)

    Kroiss, Thomas; Engel, Ulf; Merklein, Marion

    2010-06-01

    In this paper, a methodic approach is presented for determining and modeling the axial deflection characteristic of the whole system of stroke-controlled press and tooling system. This is realized by a combination of experiment and FE simulation. The press characteristic is measured once in an experiment. The tooling system characteristic is determined in FE simulation to avoid experimental investigations of various tooling systems. The stiffnesses of the press and the tooling system are combined into a substitute stiffness that is integrated into the FE process simulation as a spring element. Non-linear initial effects of the press are modeled with a constant shift factor. The approach was applied to a full forward extrusion process on a press with a C-frame. A comparison between experiments and the results of the integrated FE simulation model showed a high accuracy of the FE model. The simulation model with integrated deflection characteristic represents the entire process behavior and can be used for the calculation of a mathematical process model based on variant simulations and response surfaces. In a subsequent optimization step, an adjusted process and tool design can be determined that compensates for the influence of the deflections on the workpiece dimensions, leading to high workpiece accuracy. Using knowledge of the process behavior, the required number of variant simulations was reduced.
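
    A small sketch of the coupling idea (stiffness values assumed): the measured press stiffness and the simulated tooling-system stiffness act as springs in series, and their combination is the substitute stiffness entered into the FE process model as a spring element.

    ```python
    # Series combination of press and tooling-system stiffnesses.
    def substitute_stiffness(k_press, k_tool):
        """1/k_sub = 1/k_press + 1/k_tool (springs in series)."""
        return 1.0 / (1.0 / k_press + 1.0 / k_tool)

    k_sub = substitute_stiffness(k_press=600.0, k_tool=900.0)  # kN/mm, assumed
    # Axial deflection under load F would be F / k_sub, plus a constant
    # shift factor for the non-linear initial effects of the press.
    print(k_sub)  # 360 kN/mm
    ```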

  16. Universal geometric error modeling of the CNC machine tools based on the screw theory

    Science.gov (United States)

    Tian, Wenjie; He, Baiyan; Huang, Tian

    2011-05-01

    The methods to improve the precision of CNC (Computerized Numerical Control) machine tools can be classified into two categories: error prevention and error compensation. Error prevention improves precision via high accuracy in manufacturing and assembly. Error compensation analyzes the source errors that affect the machining error, establishes the error model, and reaches the ideal position and orientation by modifying the trajectory in real time. Error modeling is the key to compensation, so the error modeling method is of great significance. Many researchers have focused on this topic and proposed many methods, but these can hardly describe the 6-dimensional configuration error of the machine tools. In this paper, a universal geometric error model of CNC machine tools is obtained utilizing screw theory. The 6-dimensional error vector is expressed as a twist, and the error vector transforms between different frames with the adjoint transformation matrix. This model can describe the overall position and orientation errors of the tool relative to the workpiece entirely. It provides the mathematical model for compensation, and also provides a guideline for the manufacture, assembly and precision synthesis of machine tools.
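
    A short sketch of the screw-theory bookkeeping described above: a 6-dimensional error twist expressed in one frame is re-expressed in another frame with the adjoint of the homogeneous transform between them. The frame offset and error magnitudes below are invented.

    ```python
    # Adjoint transformation of a twist (v, w) under T = [[R, p], [0, 1]].
    import numpy as np

    def skew(p):
        return np.array([[0, -p[2], p[1]],
                         [p[2], 0, -p[0]],
                         [-p[1], p[0], 0]])

    def adjoint(R, p):
        """Ad(T) acting on twists ordered as (v; w)."""
        ad = np.zeros((6, 6))
        ad[:3, :3] = R
        ad[:3, 3:] = skew(p) @ R
        ad[3:, 3:] = R
        return ad

    # Small error twist of an axis, re-expressed in a workpiece frame
    # offset by 0.5 m along x (illustrative values).
    R = np.eye(3)
    p = np.array([0.5, 0.0, 0.0])
    twist_axis = np.array([1e-6, 0, 0, 0, 2e-5, 0])   # [m, m, m, rad, rad, rad]
    print(adjoint(R, p) @ twist_axis)
    ```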

  17. Tool flank wear model and parametric optimization in end milling of metal matrix composite using carbide tool: Response surface methodology approach

    Directory of Open Access Journals (Sweden)

    R. Arokiadass

    2012-04-01

    Full Text Available Highly automated CNC end milling machines in the manufacturing industry require a reliable model for the prediction of tool flank wear. This model can later be used to predict the tool flank wear (VBmax) according to the process parameters. In this investigation an attempt was made to develop an empirical relationship to predict the tool flank wear (VBmax) of carbide tools while machining LM25 Al/SiCp, incorporating process parameters such as spindle speed (N), feed rate (f), depth of cut (d) and various % wt. of silicon carbide (S). Response surface methodology (RSM) was applied to optimize the end milling process parameters to attain minimum tool flank wear. Predicted values obtained from the developed model and experimental results are compared, and an error <5 percent is observed. In addition, it is concluded that the flank wear increases with an increase in the SiCp percentage weight in the MMC.
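
    A hedged sketch of the response-surface fitting behind such a model (the paper fits a full second-order RSM; for brevity this shows only a first-order fit, and the runs below are invented): regress the measured VBmax on N, f, d and S, then predict wear at a new parameter set.

    ```python
    # First-order response-surface fit of flank wear VBmax.
    import numpy as np

    # illustrative runs: (N rpm, f mm/tooth, d mm, S %wt, measured VBmax mm)
    runs = np.array([
        (2000, 0.04, 0.5,  5, 0.11),
        (2500, 0.06, 1.0, 10, 0.18),
        (3000, 0.04, 1.5, 15, 0.24),
        (2000, 0.08, 1.0, 15, 0.21),
        (3000, 0.08, 0.5, 10, 0.19),
        (2500, 0.06, 1.5,  5, 0.16),
    ])
    X = np.column_stack([np.ones(len(runs)), runs[:, :4]])   # intercept + N,f,d,S
    beta, *_ = np.linalg.lstsq(X, runs[:, 4], rcond=None)

    predict = lambda N, f, d, S: np.array([1, N, f, d, S]) @ beta
    print(predict(2600, 0.05, 1.2, 12))   # predicted VBmax [mm]
    ```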

  18. NREL Multiphysics Modeling Tools and ISC Device for Designing Safer Li-Ion Batteries

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, Ahmad A.; Yang, Chuanbo

    2016-03-24

    The National Renewable Energy Laboratory has developed a portfolio of multiphysics modeling tools to aid battery designers better understand the response of lithium ion batteries to abusive conditions. We will discuss this portfolio, which includes coupled electrical, thermal, chemical, electrochemical, and mechanical modeling. These models can simulate the response of a cell to overheating, overcharge, mechanical deformation, nail penetration, and internal short circuit. Cell-to-cell thermal propagation modeling will be discussed.

  19. Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool

    Science.gov (United States)

    Torlapati, Jagadish; Prabhakar Clement, T.

    2013-01-01

    We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
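
    The VBA source itself is not reproduced here; as a hedged Python sketch of the class of problem such a code solves, the snippet below integrates one-dimensional advection-dispersion with first-order decay by explicit finite differences (grid, velocity and rate constants are invented).

    ```python
    # 1-D advection-dispersion-reaction, explicit upwind finite differences.
    import numpy as np

    L, nx, v, D, k = 1.0, 101, 0.5, 1e-3, 0.1     # domain, cells, v, D, decay
    dx = L / (nx - 1)
    dt = 0.4 * min(dx / v, dx * dx / (2 * D))     # stability-limited time step
    c = np.zeros(nx)
    c[0] = 1.0                                    # constant inlet concentration

    for _ in range(int(1.0 / dt)):                # simulate to t = 1
        adv = -v * (c[1:-1] - c[:-2]) / dx        # upwind advection
        disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
        c[1:-1] += dt * (adv + disp - k * c[1:-1])
        c[-1] = c[-2]                             # zero-gradient outlet
    print(c[::10].round(3))
    ```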

  20. Error Modeling and Sensitivity Analysis of a Five-Axis Machine Tool

    Directory of Open Access Journals (Sweden)

    Wenjie Tian

    2014-01-01

    Full Text Available Geometric error modeling and its sensitivity analysis are carried out in this paper, which is helpful for precision design of machine tools. Screw theory and rigid body kinematics are used to establish the error model of an RRTTT-type five-axis machine tool, which enables the source errors affecting the compensable and uncompensable pose accuracy of the machine tool to be explicitly separated, thereby providing designers and/or field engineers with an informative guideline for the accuracy improvement by suitable measures, that is, component tolerancing in design, manufacturing, and assembly processes, and error compensation. The sensitivity analysis method is proposed, and the sensitivities of compensable and uncompensable pose accuracies are analyzed. The analysis results will be used for the precision design of the machine tool.

  1. Force sensor based tool condition monitoring using a heterogeneous ensemble learning model.

    Science.gov (United States)

    Wang, Guofeng; Yang, Yinwei; Li, Zhimeng

    2014-11-14

    Tool condition monitoring (TCM) plays an important role in improving machining efficiency and guaranteeing workpiece quality. In order to realize reliable recognition of the tool condition, a robust classifier needs to be constructed to depict the relationship between tool wear states and sensory information. However, because of the complexity of the machining process and the uncertainty of the tool wear evolution, it is hard for a single classifier to fit all the collected samples without sacrificing generalization ability. In this paper, heterogeneous ensemble learning is proposed to realize tool condition monitoring, in which the support vector machine (SVM), hidden Markov model (HMM) and radial basis function (RBF) are selected as base classifiers, and a stacking ensemble strategy is further used to reflect the relationship between the outputs of these base classifiers and the tool wear states. Based on the heterogeneous ensemble learning classifier, an online monitoring system is constructed in which harmonic features are extracted from force signals and a minimal redundancy and maximal relevance (mRMR) algorithm is utilized to select the most prominent features. To verify the effectiveness of the proposed method, a titanium alloy milling experiment was carried out and samples with different tool wear states were collected to build the proposed heterogeneous ensemble learning classifier. Moreover, a homogeneous ensemble learning model and a majority voting strategy were also adopted to make a comparison. The analysis and comparison results show that the proposed heterogeneous ensemble learning classifier performs better in both classification accuracy and stability.
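
    A minimal sketch of the stacking strategy (scikit-learn provides no HMM or RBF-network classifier, so a random forest and a k-NN stand in for two of the base learners, and synthetic data replace the harmonic force features): base-classifier predictions feed a meta-learner that outputs the wear state.

    ```python
    # Heterogeneous stacking ensemble for a 3-state wear classification task.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import StackingClassifier, RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC

    # surrogate data: feature vectors -> 3 tool-wear states
    X, y = make_classification(n_samples=600, n_features=12, n_informative=8,
                               n_classes=3, random_state=0)
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

    stack = StackingClassifier(
        estimators=[("svm", SVC(probability=True)),
                    ("rf", RandomForestClassifier()),
                    ("knn", KNeighborsClassifier())],
        final_estimator=LogisticRegression(max_iter=1000))
    print(stack.fit(Xtr, ytr).score(Xte, yte))
    ```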

  2. Numerical modeling of friction stir welding using the tools with polygonal pins

    Directory of Open Access Journals (Sweden)

    M. Mehta

    2015-09-01

    Full Text Available Friction stir welding using tools with polygonal pins is often found to improve the mechanical strength of the weld joint in comparison to tools with circular pins. However, the impacts of the pin profile on the peak temperature, tool torque and traverse force, and the resultant mechanical stresses experienced by the tool, have rarely been reported in a systematic manner. An estimation of the rate of heat generation for tools with polygonal pins is challenging due to their non-axisymmetric cross-section about the tool axis. A novel methodology is presented to analytically estimate the rate of heat generation for tools with polygonal pins. A three-dimensional heat transfer analysis of friction stir welding is carried out using the finite element method. The computed temperature field from the heat transfer model is used to estimate the torque, traverse force and mechanical stresses experienced by regular triangular, square, pentagonal and hexagonal pins following the principles of solid mechanics. The computed results show that the peak temperature experienced by the tool pin increases with the number of pin sides. However, the resultant maximum shear stress experienced by the pin decreases from the triangular to the hexagonal pin.
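
    As a hedged numerical sketch of why heat input grows with the number of pin sides (this is not the authors' analytical derivation), the snippet below integrates a simple sliding-friction heating law, Q = omega x tau x h x ∮ r ds, over the lateral surface of a regular n-gon pin and compares it with a circular pin of the same circumradius; all process values are invented.

    ```python
    # Frictional heat rate on the side surface of a polygonal FSW pin.
    import numpy as np

    def side_heat_rate(n_sides, R, omega, tau, h, n_pts=200_000):
        """Q [W] for a regular n-gon pin of circumradius R [m] and height h."""
        theta = np.linspace(0.0, 2.0 * np.pi, n_pts, endpoint=False)
        a = np.pi / n_sides
        r = R * np.cos(a) / np.cos((theta % (2.0 * a)) - a)   # boundary radius
        dr = np.gradient(r, theta)
        ds = np.sqrt(r**2 + dr**2) * (theta[1] - theta[0])    # arc-length element
        return omega * tau * h * np.sum(r * ds)               # ∮ omega*r * tau dA

    omega = 2.0 * np.pi * 1000.0 / 60.0     # 1000 rpm
    tau, h, R = 40e6, 6e-3, 3e-3            # shear stress [Pa], height/radius [m]
    for n in (3, 4, 5, 6):
        print(n, round(side_heat_rate(n, R, omega, tau, h), 1))
    print("circle", round(omega * tau * h * 2.0 * np.pi * R**2, 1))
    ```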

  3. LCA of waste management systems: Development of tools for modeling and uncertainty analysis

    DEFF Research Database (Denmark)

    Clavreul, Julie

    to be modelled rather than monitored as in classical LCA (e.g. landfilling or the application of processed waste on agricultural land). Therefore LCA-tools are needed which specifically address these issues and enable practitioners to model their systems properly. In this thesis several pieces of work are presented. First, a review was carried out on all LCA studies of waste management systems published before mid-2012. This provided a global overview of the technologies and waste fractions which have attracted focus within LCA, while enabling an analysis of methodological tendencies, the use of tools and databases, and the application of uncertainty analysis methods. The major outcome of this thesis was the development of a new LCA model, called EASETECH, building on the experience with previous LCA-tools, in particular the EASEWASTE model. Before the actual implementation phase, a design phase involved...

  4. Modelling of the Contact Condition at the Tool/Matrix Interface in Friction Stir Welding

    DEFF Research Database (Denmark)

    Schmidt, Henrik Nikolaj Blich; Hattel, Jesper; Wert, John

    2003-01-01

    generation is closely related to the friction condition at the contact interface between the FSW tool and the weld piece material as well as the material flow in the weld matrix, since the mechanisms for heat generation by frictional and plastic dissipation are different. The heat generation from the tool … a known contact condition at the contact interface, e.g. either as pure sliding or sticking. The present model uses Coulomb's law of friction for the sliding condition and the material yield shear stress for the sticking condition to model the contact forces. The model includes heat generation...

  5. Video Analysis and Modeling Tool for Physics Education: A workshop for Redesigning Pedagogy

    CERN Document Server

    Wee, Loo Kang

    2012-01-01

    This workshop aims to demonstrate how the Tracker Video Analysis and Modeling Tool engages, enables and empowers teachers to be learners, so that we can be leaders in our teaching practice. In this workshop, the kinematics of a falling ball and of projectile motion are explored using video analysis and, later, video modeling. We hope to lead and inspire other teachers by facilitating their experiences with this ICT-enabled video modeling pedagogy (Brown, 2008) and free tool for facilitating student-centered active learning, thus motivating students to be more self-directed.

  6. Development of a visualization tool for integrated surface water-groundwater modeling

    Science.gov (United States)

    Tian, Yong; Zheng, Yi; Zheng, Chunmiao

    2016-01-01

    Physically-based, fully integrated surface water (SW)-groundwater (GW) models have been increasingly used in water resources research and management. The integrated modeling involves a large amount of scientific data. The use of three-dimensional (3D) visualization software to integrate all the scientific data into a comprehensive system can facilitate the interpretation and validation of modeling results. Nevertheless, at present few software tools can efficiently perform data visualization for integrated SW-GW modeling. In this study, a visualization tool named IHM3D was designed and developed specifically for integrated SW-GW modeling. In IHM3D, spatially distributed model inputs/outputs and geo-referenced data sets are visualized in a virtual globe-based 3D environment. End users can conveniently explore and validate modeling results within the 3D environment. A GSFLOW (an integrated SW-GW model developed by the USGS) modeling case in the Heihe River Basin (Northwest China) was used to demonstrate the applicability of IHM3D at a large basin scale. The visualization of the modeling results significantly improved the understanding of the complex hydrologic cycle in this water-limited area, and provided insights into regional water resources management. This study shows that visualization tools like IHM3D can promote data and model sharing in the water resources research community, and make it more practical to perform complex hydrological modeling in real-world water resources management.

  7. Mathematical Practices in the Sciences: The Potential of Computers as a Modelling Tool.

    Science.gov (United States)

    Molyneux-Hodgson, Susan; Mochon, Simon

    This paper is concerned with the role of spreadsheets as a tool for the development of mathematical models in science, one aspect of a collaborative project which worked with two groups of pre-university students from Mexico and the United Kingdom. The purpose of the modeling activities designed was to engage students in creating an…

  8. The 3 "C" Design Model for Networked Collaborative E-Learning: A Tool for Novice Designers

    Science.gov (United States)

    Bird, Len

    2007-01-01

    This paper outlines a model for online course design aimed at the mainstream majority of university academics rather than at the early adopters of technology. It has been developed from work at Coventry Business School where tutors have been called upon to design online modules for the first time. Like many good tools, the model's key strength is…

  9. PetriCode: A Tool for Template-Based Code Generation from CPN Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Code generation is an important part of model driven methodologies. In this paper, we present PetriCode, a software tool for generating protocol software from a subclass of Coloured Petri Nets (CPNs). The CPN subclass is comprised of hierarchical CPN models describing a protocol system at different...

  10. Computer system for identification of tool wear model in hot forging

    Directory of Open Access Journals (Sweden)

    Wilkus Marek

    2016-01-01

    Full Text Available The aim of the research was to create a methodology that will enable effective and reliable prediction of tool wear. The idea of a hybrid model, which accounts for various mechanisms of tool material deterioration, is proposed in the paper. The mechanisms considered include abrasive wear, adhesive wear, thermal fatigue, mechanical fatigue, oxidation and plastic deformation. Individual models of various complexity were used for the separate phenomena, and a strategy for combining these models into one hybrid system was developed to account for the synergy of the various mechanisms. The complex hybrid model was built on the basis of these individual models for the various wear mechanisms. The individual models range from phenomenological ones for abrasive wear to multi-scale methods for modelling micro-crack initiation and propagation that utilize virtual representations of granular microstructures. The latter have been intensively developed recently, and they potentially form a powerful tool that allows modelling of thermal and mechanical fatigue, accounting explicitly for the tool material microstructure.
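
    A hedged sketch of the hybrid bookkeeping only (the paper's sub-models are far more detailed): accumulate wear contributions from several mechanisms in a single routine, here an Archard-type abrasion term plus a thermally activated term; all constants are invented.

    ```python
    # Toy hybrid wear accumulator combining two deterioration mechanisms.
    import numpy as np

    def hybrid_wear(n_cycles, p=400e6, v=0.05, t_c=0.1, H=6e9,
                    K_abr=1e-4, T=800.0, A=2e-7, Q=8000.0):
        """Tool wear depth [m] after n_cycles forging cycles (illustrative).
        p: contact pressure, v: sliding speed, t_c: contact time per cycle,
        H: hardness, K_abr: Archard coefficient, T: temperature [K]."""
        w_abr = K_abr * p * v * t_c / H     # Archard-type abrasion per cycle
        w_th = A * np.exp(-Q / T)           # thermally activated term per cycle
        return n_cycles * (w_abr + w_th)

    print(hybrid_wear(10_000))
    ```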

  11. A new validation-assessment tool for health-economic decision models

    NARCIS (Netherlands)

    Mauskopf, J.; Vemer, P.; Voorn, van G.A.K.; Corro Ramos, I.

    2014-01-01

    A validation-assessment tool is being developed for decision makers to transparently and consistently evaluate the validation status of different health-economic decision models. It is designed as a list of validation techniques covering all relevant aspects of model validation to be filled in by

  12. JTorX: A Tool for On-Line Model-Driven Test Derivation and Execution

    NARCIS (Netherlands)

    Belinfante, Axel; Esparza, Javier; Majumdar, Rupak

    We introduce JTorX, a tool for model-driven test derivation and execution, based on the ioco theory. This theory, originally presented in [Tretmans,1996], has been refined in [Tretmans,2008] with test-cases that are input-enabled. For models with underspecified traces [vdBijl+,2004] introduced

  13. Modelling and Optimization of Technological Process for Magnetron Synthesis of AlTiN Nanocomposite Films on Cutting Tools

    Science.gov (United States)

    Kozhina, T. D.

    2016-04-01

    The paper highlights the results of research on developing a mechanism to model the technological process of magnetron synthesis of nanocomposite films on cutting tools, which provides their specified physical and mechanical characteristics by controlling pulsed plasma parameters. The paper presents optimal conditions for AlTiN coating deposition on cutting tools, according to the ion energy of the sputtered atoms, in order to provide the specified physical and mechanical characteristics.

  14. Model-based calculating tool for pollen-mediated gene flow frequencies in plants.

    Science.gov (United States)

    Lei, Wang; Bao-Rong, Lu

    2016-12-30

    The potential socio-economic and environmental impacts caused by transgene flow from genetically engineered (GE) crops have stimulated worldwide biosafety concerns. Determining the transgene flow frequencies that result from pollination is the first critical step for assessing such impacts, in addition to the determination of transgene expression and fitness in crop-wild hybrid descendants. Two methods are commonly used to estimate pollen-mediated gene flow (PMGF) frequencies: field experiments and mathematical modeling. Field experiments can provide relatively accurate results but are time- and resource-consuming. Modeling offers an effective complement to experimental PMGF assessment. However, many published models describe PMGF by mathematical equations and are not easy to use in practice. To increase the application of PMGF modeling for the estimation of transgene flow, we established a tool to calculate PMGF frequencies based on a quasi-mechanistic PMGF model for wind-pollinated species. This tool includes a calculating program displayed through an easy-to-operate interface. PMGF frequencies of different plant species can be quickly calculated under different environmental conditions by including a number of biological and wind speed parameters that can be measured in the field/laboratory or obtained from published data. The tool is freely available in the public domain (http://ecology.fudan.edu.cn/userfiles/cn/files/Tool_Manual.zip). Case studies including rice, wheat, and maize demonstrated similar results between the frequencies calculated with this tool and those from published PMGF data. This PMGF calculating tool will provide useful information for assessing and monitoring socio-economic and environmental impacts caused by transgene flow from GE crops. The tool can also be applied to determine the isolation distances between GE and non-GE crops in a coexistence agro-ecosystem, and to ensure the purity of certified seeds by setting proper isolation distances
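
    A hedged sketch only: the tool's quasi-mechanistic model and parameter set are not reproduced here. A common simplification for wind pollination is an exponential decline of outcrossing frequency with distance, with the dispersal scale stretched by mean wind speed; all parameter names and values below are invented.

    ```python
    # Toy pollen-mediated gene flow (PMGF) frequency vs. distance.
    import numpy as np

    def pmgf_frequency(distance, f0=0.01, decay=5.0, wind=2.0, wind_ref=2.0):
        """Fraction of seeds at `distance` [m] sired by transgenic pollen.
        f0: frequency at the field edge; decay: dispersal scale [m];
        wind/wind_ref: stretches the dispersal scale with mean wind speed."""
        scale = decay * wind / wind_ref
        return f0 * np.exp(-np.asarray(distance) / scale)

    # e.g. screening isolation distances against a 0.1% gene-flow threshold
    d = np.arange(0, 60, 5)
    print(dict(zip(d.tolist(), pmgf_frequency(d).round(5).tolist())))
    ```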

  15. TREXMO: A Translation Tool to Support the Use of Regulatory Occupational Exposure Models.

    Science.gov (United States)

    Savic, Nenad; Racordon, Dimitri; Buchs, Didier; Gasic, Bojan; Vernez, David

    2016-10-01

    Occupational exposure models vary significantly in their complexity, purpose, and the level of expertise required from the user. Different parameters in the same model may lead to different exposure estimates for the same exposure situation. This paper presents a tool developed to deal with this concern: TREXMO, or TRanslation of EXposure MOdels. TREXMO integrates six commonly used occupational exposure models, namely, ART v.1.5, STOFFENMANAGER(®) v.5.1, ECETOC TRA v.3, MEASE v.1.02.01, EMKG-EXPO-TOOL, and EASE v.2.0. By enabling a semi-automatic translation between the parameters of these six models, TREXMO facilitates their simultaneous use. For a given exposure situation, defined by a set of parameters in one of the models, TREXMO provides the user with the most appropriate parameters to use in the other exposure models. Results showed that, once an exposure situation and parameters were set in ART, TREXMO reduced the number of possible outcomes in the other models by 1-4 orders of magnitude. The tool should reduce uncertain entry or selection of parameters in the six models, improve between-user reliability, and reduce the time required for running several models for a given exposure situation. In addition to these advantages, registrants of chemicals and authorities should benefit from more reliable exposure estimates for the risk characterization of dangerous chemicals under the Registration, Evaluation, Authorisation and restriction of CHemicals (REACH) Regulation.
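
    An illustrative sketch of the translation idea (the mapping tables and parameter names below are invented, not TREXMO's actual rules): a parameter choice in the source model constrains the admissible parameter values in each target model.

    ```python
    # Toy parameter translation between exposure models via lookup tables.
    TRANSLATION = {
        ("ART", "near_field_source"): {
            "STOFFENMANAGER": {"handling_class": ["2", "3"]},
            "ECETOC_TRA": {"proximity": ["near-field"]},
        },
        ("ART", "good_general_ventilation"): {
            "STOFFENMANAGER": {"ventilation": ["mechanical"]},
            "ECETOC_TRA": {"lev_present": ["no"]},
        },
    }

    def translate(source_model, settings):
        """Collect, per target model, the admissible parameter values."""
        out = {}
        for s in settings:
            for target, params in TRANSLATION.get((source_model, s), {}).items():
                for name, values in params.items():
                    out.setdefault(target, {}).setdefault(name, set()).update(values)
        return out

    print(translate("ART", ["near_field_source", "good_general_ventilation"]))
    ```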

  16. Modelling thermomechanical conditions at the tool/matrix interface in Friction Stir Welding

    DEFF Research Database (Denmark)

    Schmidt, Henrik Nikolaj Blich; Hattel, Jesper

    2004-01-01

    In friction stir welding the material flow is among others controlled by the contact condition at the tool interface, the thermomechanical state of the matrix and the welding parameters. The conditions under which the deposition process is successful are not fully understood, and in most models presented previously in the literature, the modelling of the material flow at the tool interface has been prescribed as boundary conditions, i.e. the material is forced to keep contact with the tool. The objective of the present work is to analyse the thermomechanical conditions under which a consolidated weld … frictional and plastic dissipation. Of special interest is the contact condition along the shoulder/matrix and probe/matrix interfaces, as especially the latter affects the efficiency of the deposition process. The thermo-mechanical state in the workpiece is established by modelling both the dwell and weld...

  17. Modelling thermomechanical conditions at the tool/matrix interface in Friction Stir Welding

    DEFF Research Database (Denmark)

    Schmidt, Henrik Nikolaj Blich; Hattel, Jesper

    2004-01-01

    In friction stir welding the material flow is among others controlled by the contact condition at the tool interface, the thermomechanical state of the matrix and the welding parameters. The conditions under which the deposition process is successful are not fully understood and in most models...... frictional and plastic dissipation. Of special interest is the contact condition along the shoulder/matrix and probe/matrix interfaces, as especially the latter affects the efficiency of the deposition process. The thermo-mechanical state in the workpiece is established by modelling both the dwell and weld...... presented previously in literature, the modelling of the material flow at the tool interface has been prescribed as boundary conditions, i.e. the material is forced to keep contact with the tool. The objective of the present work is to analyse the thermomechanical conditions under which a consolidated weld...

  18. Development of the software generation method using model driven software engineering tool

    Energy Technology Data Exchange (ETDEWEB)

    Jang, H. S.; Jeong, J. C.; Kim, J. H.; Han, H. W.; Kim, D. Y.; Jang, Y. W. [KOPEC, Taejon (Korea, Republic of); Moon, W. S. [NEXTech Inc., Seoul (Korea, Republic of)

    2003-10-01

    Methodologies to generate automated software design specifications and source code for nuclear I and C systems software using a model-driven language are developed in this work. For qualitative analysis of the algorithm, the activity diagram is modeled and generated using the Unified Modeling Language (UML), and then the sequence diagram is designed for automated source code generation. For validation of the generated code, code audits and module tests are performed using a Test and QA tool, and the code coverage and complexity of the example code are examined in this stage. The low pressure pressurizer reactor trip module of the Plant Protection System was programmed as the subject for this task. Testing showed that errors in the generated source code were easily detected using the test tool, and the accuracy of input/output processing by the execution modules was clearly confirmed.

  19. Tools and Products of Real-Time Modeling: Opportunities for Space Weather Forecasting

    Science.gov (United States)

    Hesse, Michael

    2009-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aimed at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides researchers with access to space science models, even if they are not model owners themselves. The second CCMC activity is to support Space Weather forecasting at national Space Weather Forecasting Centers. This second activity involves model evaluations, model transitions to operations, and the development of draft Space Weather forecasting tools. This presentation will focus on the last element. Specifically, we will discuss present capabilities and the potential to derive further tools. These capabilities will be interpreted in the context of a broad-based, bootstrapping activity for modern Space Weather forecasting.

  20. Optimal Vehicle Design Using the Integrated System and Cost Modeling Tool Suite

    Science.gov (United States)

    2010-08-01

    [Briefing excerpt; prose abstract not recoverable.] The slides list the integrated system and cost modeling tool suite, including CEA, an SRM model, POST, ACEIT (space vehicle costing), an inflation model, rotor blade design, Microsoft Project, ATSV, STK and SOAP (mission-specific orbit analysis), SMAD-based space vehicle design, space vehicle propulsion and orbit propagation modules, new small-sat development and production cost and O&M cost modules, radiation exposure and radiation detector response models, and reliability, availability and risk tools.

  1. MODELING AND COMPENSATION TECHNIQUE FOR THE GEOMETRIC ERRORS OF FIVE-AXIS CNC MACHINE TOOLS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    One of the important trends in precision machining is the development of real-time error compensation techniques. Error compensation for multi-axis CNC machine tools is difficult but attractive. A model of the geometric errors of five-axis CNC machine tools based on multi-body systems is proposed, and the key technique of the compensation, identifying the geometric error parameters, is developed. Simulation of workpiece cutting to verify the multi-body-system model is also considered.

  2. FLOW STRESS MODEL FOR HARD MACHINING OF AISI H13 WORK TOOL STEEL

    Institute of Scientific and Technical Information of China (English)

    H. Yan; J. Hua; R. Shivpuri

    2005-01-01

    An approach is presented to characterize the stress response of the workpiece in hard machining, accounting for the effects of initial workpiece hardness, temperature, strain and strain rate on flow stress. AISI H13 work tool steel was chosen to verify this methodology. The proposed flow stress model demonstrates good agreement with data collected from published experiments. Therefore, the proposed model can be used to predict the corresponding flow stress-strain response of AISI H13 work tool steel with variation of the initial workpiece hardness in hard machining.

  3. An axisymmetrical non-linear finite element model for induction heating in injection molding tools

    DEFF Research Database (Denmark)

    Guerrier, Patrick; Nielsen, Kaspar Kirstein; Menotti, Stefano;

    2016-01-01

    To analyze the heating and cooling phase of an induction heated injection molding tool accurately, the temperature dependent magnetic properties, namely the non-linear B-H curves, need to be accounted for in an induction heating simulation. Hence, a finite element model has been developed...... in to the injection molding tool. The model shows very good agreement with the experimental temperature measurements. It is also shown that the non-linearity can be used without the temperature dependency in some cases, and a proposed method is presented of how to estimate an effective linear permeability to use...

  4. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as cyber-physical systems, which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.
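
    To make the underlying idea concrete, here is a minimal sketch of parallelized statistical model checking on a toy model: estimate the probability that a property holds by sampling independent runs, with the sample size fixed in advance by a Hoeffding bound. This is a generic illustration only; VeStA/PVeStA use their own sequential tests and accept Maude specifications, neither of which is reproduced here.

```python
import math
import random
from concurrent.futures import ProcessPoolExecutor

def one_run(seed):
    """One sampled run of a toy Markov chain: does a biased random walk
    reach +10 before -10? Stands in for 'property holds on this run'."""
    rng = random.Random(seed)
    x = 0
    while -10 < x < 10:
        x += 1 if rng.random() < 0.55 else -1
    return x >= 10

def estimate(epsilon=0.01, delta=0.05, workers=4):
    # Hoeffding bound: n >= ln(2/delta) / (2 * epsilon^2) guarantees
    # P(|estimate - p| > epsilon) <= delta for i.i.d. Bernoulli samples.
    n = math.ceil(math.log(2 / delta) / (2 * epsilon ** 2))
    with ProcessPoolExecutor(max_workers=workers) as pool:
        hits = sum(pool.map(one_run, range(n), chunksize=256))
    return hits / n, n

if __name__ == "__main__":
    p, n = estimate()
    print(f"P(property) ~ {p:.3f} from {n} parallel runs")
```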

  5. Interactive, open source, travel time scenario modelling: tools to facilitate participation in health service access analysis.

    Science.gov (United States)

    Fisher, Rohan; Lassa, Jonatan

    2017-04-18

    Modelling travel time to services has become a common public health tool for planning service provision, but the usefulness of these analyses is constrained by the availability of accurate input data and by limitations inherent in the assumptions and parameterisation. This is particularly an issue in the developing world, where access to basic data is limited and travel is often complex and multi-modal. Improving accuracy and relevance in this context requires greater accessibility to, and flexibility in, travel time modelling tools to facilitate the incorporation of local knowledge and the rapid exploration of multiple travel scenarios. The aim of this work was to develop simple, open source, adaptable, interactive travel time modelling tools to allow greater access to and participation in service access analysis. Described are three interconnected applications designed to reduce some of the barriers to the more widespread use of GIS analysis of service access and to allow for complex spatial and temporal variations in service availability. These applications are an open source GIS tool-kit and two geo-simulation models. The development of these tools was guided by health service issues from a developing world context, but they present a general approach to enabling greater access to and flexibility in health access modelling. The tools demonstrate a method that substantially simplifies the process for conducting travel time assessments and demonstrate a dynamic, interactive approach in an open source GIS format. In addition this paper provides examples from empirical experience where these tools have informed better policy and planning. Travel and health service access is complex and cannot be reduced to a few static modelled outputs. The approaches described in this paper use a unique set of tools to explore this complexity, promote discussion and build understanding with the goal of producing better planning outcomes. The accessible, flexible, interactive and
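
    The core computation these tools automate is a least-cost travel-time surface. Below is a minimal sketch under stated assumptions: a tiny grid of per-cell crossing times (in practice derived from roads, land cover, and transport modes) and Dijkstra's algorithm accumulating the minimum travel time from service locations; the grid values are invented.

```python
import heapq

def travel_time(cost, sources):
    """cost[r][c] = minutes to cross a cell; sources = [(r, c), ...]
    (e.g. clinic locations). Returns minutes of travel to every cell."""
    rows, cols = len(cost), len(cost[0])
    best = [[float("inf")] * cols for _ in range(rows)]
    heap = [(0.0, r, c) for r, c in sources]
    heapq.heapify(heap)
    for _, r, c in heap:
        best[r][c] = 0.0
    while heap:
        t, r, c = heapq.heappop(heap)
        if t > best[r][c]:
            continue  # stale entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # Crossing cost approximated as the mean of the two cells.
                nt = t + (cost[r][c] + cost[nr][nc]) / 2.0
                if nt < best[nr][nc]:
                    best[nr][nc] = nt
                    heapq.heappush(heap, (nt, nr, nc))
    return best

grid = [[1, 1, 5],
        [1, 9, 5],
        [1, 1, 1]]
print(travel_time(grid, [(0, 0)]))  # minutes from a clinic at (0, 0)
```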

  6. Bio-Logic Builder: A Non-Technical Tool for Building Dynamical, Qualitative Models

    Science.gov (United States)

    Helikar, Tomáš; Kowal, Bryan; Madrahimov, Alex; Shrestha, Manish; Pedersen, Jay; Limbu, Kahani; Thapa, Ishwor; Rowley, Thaine; Satalkar, Rahul; Kochi, Naomi; Konvalina, John; Rogers, Jim A.

    2012-01-01

    Computational modeling of biological processes is a promising tool in biomedical research. While a large part of its potential lies in the ability to integrate it with laboratory research, modeling currently generally requires a high degree of training in mathematics and/or computer science. To help address this issue, we have developed a web-based tool, Bio-Logic Builder, that enables laboratory scientists to define mathematical representations (based on a discrete formalism) of biological regulatory mechanisms in a modular and non-technical fashion. As part of the user interface, generalized “bio-logic” modules have been defined to provide users with the building blocks for many biological processes. To build/modify computational models, experimentalists provide purely qualitative information about a particular regulatory mechanism, as is generally found in the laboratory. The Bio-Logic Builder subsequently converts the provided information into a mathematical representation described with Boolean expressions/rules. We used this tool to build a number of dynamical models, including a 130-protein large-scale model of signal transduction with over 800 interactions, the influenza A replication cycle with 127 species and 200+ interactions, and the mammalian and budding yeast cell cycles. We also show that any and all qualitative regulatory mechanisms can be built using this tool. PMID:23082121
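
    The discrete formalism behind such tools is a Boolean network: each species is ON or OFF, and a logical rule updates it from the states of its regulators. The three-node toy network below is invented for illustration; Bio-Logic Builder generates equivalent rules from the user's qualitative input rather than hand-written lambdas.

```python
# Toy Boolean network (invented): A is inhibited by C, B is activated
# by A, and C requires both A and B.
rules = {
    "A": lambda s: not s["C"],
    "B": lambda s: s["A"],
    "C": lambda s: s["A"] and s["B"],
}

def step(state):
    """One synchronous update: every rule reads the *previous* state."""
    return {node: bool(rule(state)) for node, rule in rules.items()}

state = {"A": True, "B": False, "C": False}
for t in range(6):
    print(t, state)
    state = step(state)   # this toy network cycles through 5 states
```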

  7. Bio-logic builder: a non-technical tool for building dynamical, qualitative models.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    Computational modeling of biological processes is a promising tool in biomedical research. While a large part of its potential lies in the ability to integrate it with laboratory research, modeling currently generally requires a high degree of training in mathematics and/or computer science. To help address this issue, we have developed a web-based tool, Bio-Logic Builder, that enables laboratory scientists to define mathematical representations (based on a discrete formalism) of biological regulatory mechanisms in a modular and non-technical fashion. As part of the user interface, generalized "bio-logic" modules have been defined to provide users with the building blocks for many biological processes. To build/modify computational models, experimentalists provide purely qualitative information about a particular regulatory mechanism, as is generally found in the laboratory. The Bio-Logic Builder subsequently converts the provided information into a mathematical representation described with Boolean expressions/rules. We used this tool to build a number of dynamical models, including a 130-protein large-scale model of signal transduction with over 800 interactions, the influenza A replication cycle with 127 species and 200+ interactions, and the mammalian and budding yeast cell cycles. We also show that any and all qualitative regulatory mechanisms can be built using this tool.

  8. Swine models, genomic tools and services to enhance our understanding of human health and diseases.

    Science.gov (United States)

    Walters, Eric M; Wells, Kevin D; Bryda, Elizabeth C; Schommer, Susan; Prather, Randall S

    2017-03-22

    The pig is becoming increasingly important as a biomedical model. Given the similarities between pigs and humans, a greater understanding of the underlying biology of human health and diseases may come from the pig rather than from classical rodent models. With an increasing need for swine models, it is essential that the genomic tools, models and services be readily available to the scientific community. Many of these are available through the National Swine Resource and Research Center (NSRRC), a facility funded by the US National Institutes of Health at the University of Missouri. The goal of the NSRRC is to provide high-quality biomedical swine models to the scientific community.

  9. Thermal Error Modeling of a Machine Tool Using Data Mining Scheme

    Science.gov (United States)

    Wang, Kun-Chieh; Tseng, Pai-Chang

    In this paper the knowledge discovery technique is used to build an effective and transparent mathematical thermal error model for machine tools. Our proposed thermal error modeling methodology (called KRL) integrates the schemes of K-means theory (KM), rough-set theory (RS), and linear regression (LR). First, to explore the machine tool's thermal behavior, an integrated system is designed to simultaneously measure the temperature ascents at selected characteristic points and the thermal deformations at the spindle nose under suitable real machining conditions. Second, the obtained data are classified by the KM method, further reduced by the RS scheme, and a linear thermal error model is established by the LR technique. To evaluate the performance of our proposed model, an adaptive neuro-fuzzy inference system (ANFIS) thermal error model is introduced for comparison. Finally, a verification experiment is carried out and the results reveal that the proposed KRL model is effective in predicting thermal behavior in machine tools. Our proposed KRL model is transparent, easily understood by users, and can be easily programmed or modified for different machining conditions.
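
    As a rough illustration of the pipeline's two bookend steps, the sketch below uses scikit-learn: K-means groups temperature sensors with similar time histories so one representative per cluster is kept (the rough-set reduction step is omitted), and linear regression maps the kept temperature ascents to spindle deformation. All data are synthetic; the paper's actual sensor layout and KRL details are not reproduced.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
temps = rng.normal(size=(200, 8)).cumsum(axis=0)   # 8 sensors, 200 samples
# Synthetic "thermal deformation" driven by two of the sensors plus noise.
deform = 0.8 * temps[:, 0] - 0.3 * temps[:, 5] + rng.normal(scale=0.5, size=200)

# Cluster sensors by the similarity of their time histories (rows = sensors).
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(temps.T)
keep = [int(np.where(km.labels_ == k)[0][0]) for k in range(3)]  # one per cluster

model = LinearRegression().fit(temps[:, keep], deform)
print("kept sensors:", keep,
      " R^2:", round(model.score(temps[:, keep], deform), 3))
```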

  10. Update on Small Modular Reactors Dynamics System Modeling Tool -- Molten Salt Cooled Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Richard Edward [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cetiner, Sacit M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fugate, David L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Qualls, A L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Borum, Robert C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Chaleff, Ethan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rogerson, Doug W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Batteh, John J. [Modelon Corporation (Sweden); Tiller, Michael M. [Xogeny Corporation, Canton, MI (United States)

    2014-08-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the third year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled) concepts, including the use of multiple coupled reactors at a single site. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor SMR models, ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface (ICHMI) technical area, and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the Modular Dynamic SIMulation (MoDSIM) tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the program, (2) developing a library of baseline component modules that can be assembled into full plant models using existing geometry and thermal-hydraulic data, (3) defining modeling conventions for interconnecting component models, and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  11. Implementation of Models for Building Envelope Air Flow Fields in a Whole Building Hygrothermal Simulation Tool

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2009-01-01

    Simulation tools are becoming available which predict the heat and moisture conditions in the indoor environment as well as in the envelope of buildings, and thus it has become possible to consider the important interaction between the different components of buildings and the different physical...... phenomena which occur. However, there is still room for further development of such tools. This paper will present an attempt to integrate modelling of air flows in building envelopes into a whole building hygrothermal simulation tool. Two kinds of air flows have been considered: 1. Air flow in ventilated...... cavity such as in the exterior cladding of building envelopes, i.e. a flow which is parallel to the construction plane. 2. Infiltration/exfiltration of air through the building envelope, i.e. a flow which is perpendicular to the construction plane. The new models make it possible to predict the thermal...

  12. Modeling Tool for Decision Support during Early Days of an Anthrax Event

    Science.gov (United States)

    Meltzer, Martin I.; Shadomy, Sean; Bower, William A.; Hupert, Nathaniel

    2017-01-01

    Health officials lack field-implementable tools for forecasting the effects that a large-scale release of Bacillus anthracis spores would have on public health and hospitals. We created a modeling tool (combining inhalational anthrax caseload projections based on initial case reports, effects of variable postexposure prophylaxis campaigns, and healthcare facility surge capacity requirements) to project hospitalizations and casualties from a newly detected inhalation anthrax event, and we examined the consequences of intervention choices. With only 3 days of case counts, the model can predict final attack sizes for simulated Sverdlovsk-like events (1979 USSR) with sufficient accuracy for decision making and confirms the value of early postexposure prophylaxis initiation. According to a baseline scenario, hospital treatment volume peaks 15 days after exposure, deaths peak earlier (day 5), and recovery peaks later (day 23). This tool gives public health, hospital, and emergency planners scenario-specific information for developing quantitative response plans for this threat. PMID:27983505

  13. Multiscale Multiphysics-Based Modeling and Analysis on the Tool Wear in Micro Drilling

    Science.gov (United States)

    Niu, Zhichao; Cheng, Kai

    2016-02-01

    In micro-cutting processes, process variables including cutting force, cutting temperature and drill-workpiece interfacing conditions (lubrication and interaction, etc.) significantly affect tool wear in a dynamic, interactive, in-process manner. The resultant tool life and cutting performance directly affect the component surface roughness, material removal rate and form accuracy control, etc. In this paper, a multiscale multiphysics oriented approach to modeling and analysis is presented, focusing particularly on tooling performance in micro drilling processes. Process optimization is also taken into account, based on establishing the intrinsic relationship between process parameters and cutting performance. The modeling and analysis are evaluated and validated through well-designed machining trials, and further supported by metrology measurements and simulations. The paper is concluded with a further discussion on the potential and application of the approach for broad micro manufacturing purposes.

  14. Development of the ECLSS Sizing Analysis Tool and ARS Mass Balance Model Using Microsoft Excel

    Science.gov (United States)

    McGlothlin, E. P.; Yeh, H. Y.; Lin, C. H.

    1999-01-01

    The development of a Microsoft Excel-compatible Environmental Control and Life Support System (ECLSS) sizing analysis "tool" for conceptual design of Mars human exploration missions makes it possible for a user to choose a certain technology in the corresponding subsystem. This tool estimates the mass, volume, and power requirements of every technology in a subsystem and the system as a whole. Furthermore, to verify that a design sized by the ECLSS Sizing Tool meets the mission requirements and integrates properly, mass balance models that solve for component throughputs of such ECLSS systems as the Water Recovery System (WRS) and Air Revitalization System (ARS) must be developed. The ARS Mass Balance Model will be discussed in this paper.

  15. Modelling of the Contact Condition at the Tool/Matrix Interface in Friction Stir Welding

    DEFF Research Database (Denmark)

    Schmidt, Henrik Nikolaj Blich; Hattel, Jesper; Wert, John

    2003-01-01

    The objective of the present paper is to investigate the heat generation and contact condition during Friction Stir Welding (FSW). For this purpose, an analytical model is developed for the heat generation and this is combined with a Eulerian FE-analysis of the temperature field. The heat...... generation is closely related to the friction condition at the contact interface between the FSW tool and the weld piece material as well as the material flow in the weld matrix, since the mechanisms for heat generation by frictional and plastic dissipation are different. The heat generation from the tool...... is governed by the contact condition, i.e. whether there is sliding, sticking or partial sliding/sticking. The contact condition in FSW is complex (dependent on alloy, welding parameters, tool design etc.), and previous models (both analytical and numerical) for simulation of the heat generation assume...

  16. Data Provenance as a Tool for Debugging Hydrological Models based on Python

    Science.gov (United States)

    Wombacher, A.; Huq, M.; Wada, Y.; Van Beek, R.

    2012-12-01

    There is an increase in the data volume used in hydrological modeling. The increasing data volume requires additional effort in debugging models, since a single output value is influenced by a multitude of input values and it is difficult to keep an overview of the data dependencies. Further, even knowing these dependencies, it is a tedious job to infer all the relevant data values. These data dependencies are also known as data provenance, i.e. the determination of how a particular value has been created and processed. The proposed tool infers the data provenance automatically from a Python script and visualizes the dependencies as a graph, without executing the script. To debug the model, the user specifies the value of interest in space and time; the tool infers all related data values and displays them in the graph. The tool has been evaluated by hydrologists developing a model for estimating the global water demand [1]. The model uses multiple different data sources. The script we analysed has 120 lines of code and used more than 3000 individual files, each of them representing a raster map of 360*720 cells. After importing the data of the files into a SQLite database, the data consumes around 40 GB of memory. Using the proposed tool, a modeler is able to select individual values and infer which values have been used to calculate them. Especially in cases of outliers or missing values, it is a beneficial tool providing the modeler with efficient information to investigate the unexpected behavior of the model. The proposed tool can be applied to many Python scripts and has been tested with other scripts in different contexts. In case a Python script contains an unknown function or class, the tool requests additional information about the used function or class to enable the inference. This information has to be entered only once and can be shared with colleagues or in the community. Reference [1] Y. Wada, L. P. H. van Beek, D. Viviroli, H. H. Dürr, R
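
    The key trick, inferring dependencies from a Python script without executing it, can be done statically with the standard-library ast module. Here is a minimal sketch of that idea (the described tool's actual analysis, value-level tracking, and graph visualisation are far richer); the toy script and variable names are invented.

```python
import ast

def dependencies(source):
    """Map each assigned name to the set of names its expression reads."""
    deps = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Assign):
            reads = {n.id for n in ast.walk(node.value)
                     if isinstance(n, ast.Name)}
            for target in node.targets:
                if isinstance(target, ast.Name):
                    deps.setdefault(target.id, set()).update(reads)
    return deps

script = """
demand = population * per_capita_use
supply = runoff - reservoir_loss
gap = demand - supply
"""
print(dependencies(script))
# e.g. 'gap' depends on {'demand', 'supply'}; chasing these edges
# transitively is what lets a modeler trace an outlier back to its inputs.
```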

  17. Spindle Thermal Error Optimization Modeling of a Five-axis Machine Tool

    Institute of Scientific and Technical Information of China (English)

    Qianjian GUO; Shuo FAN; Rufeng XU; Xiang CHENG; Guoyong ZHAO; Jianguo YANG

    2017-01-01

    Aiming at the problem of low machining accuracy and uncontrollable thermal errors of NC machine tools, spindle thermal error measurement, modeling and compensation of a two-turntable five-axis machine tool are researched. Measurement experiments on heat sources and thermal errors are carried out, and the GRA (grey relational analysis) method is introduced into the selection of temperature variables used for thermal error modeling. In order to analyze the influence of different heat sources on spindle thermal errors, an ANN (artificial neural network) model is presented, and the ABC (artificial bee colony) algorithm is introduced to train the link weights of the ANN; a new ABC-NN (artificial bee colony-based neural network) modeling method is proposed and used in the prediction of spindle thermal errors. In order to test the prediction performance of the ABC-NN model, an experiment system is developed, and the prediction results of LSR (least squares regression), ANN and ABC-NN are compared with the measurement results of spindle thermal errors. Experiment results show that the prediction accuracy of the ABC-NN model is higher than LSR and ANN, and the residual error is smaller than 3 μm; the new modeling method is feasible. The proposed research provides instruction to compensate thermal errors and improve the machining accuracy of NC machine tools.

  18. Spindle Thermal Error Optimization Modeling of a Five-axis Machine Tool

    Science.gov (United States)

    Guo, Qianjian; Fan, Shuo; Xu, Rufeng; Cheng, Xiang; Zhao, Guoyong; Yang, Jianguo

    2017-03-01

    Aiming at the problem of low machining accuracy and uncontrollable thermal errors of NC machine tools, spindle thermal error measurement, modeling and compensation of a two-turntable five-axis machine tool are researched. Measurement experiments on heat sources and thermal errors are carried out, and GRA (grey relational analysis) method is introduced into the selection of temperature variables used for thermal error modeling. In order to analyze the influence of different heat sources on spindle thermal errors, an ANN (artificial neural network) model is presented, and ABC (artificial bee colony) algorithm is introduced to train the link weights of ANN; a new ABC-NN (artificial bee colony-based neural network) modeling method is proposed and used in the prediction of spindle thermal errors. In order to test the prediction performance of ABC-NN model, an experiment system is developed, and the prediction results of LSR (least squares regression), ANN and ABC-NN are compared with the measurement results of spindle thermal errors. Experiment results show that the prediction accuracy of ABC-NN model is higher than LSR and ANN, and the residual error is smaller than 3 μm; the new modeling method is feasible. The proposed research provides instruction to compensate thermal errors and improve machining accuracy of NC machine tools.

  19. Digital soil mapping as a tool for quantifying state-and-transition models

    Science.gov (United States)

    Ecological sites and associated state-and-transition models (STMs) are rapidly becoming important land management tools in rangeland systems in the US and around the world. Descriptions of states and transitions are largely developed from expert knowledge and generally accepted species and community...

  20. Towards Semantically Integrated Models and Tools for Cyber-Physical Systems Design

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Fitzgerald, John; Woodcock, Jim

    2016-01-01

    We describe an approach to the model-based engineering of embedded and cyber-physical systems, based on the semantic integration of diverse discipline-specific notations and tools. Using the example of a small unmanned aerial vehicle, we explain the need for multiple notations and collaborative...

  1. Validation of Multiple Tools for Flat Plate Photovoltaic Modeling Against Measured Data

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, J.; Whitmore, J.; Blair, N.; Dobos, A. P.

    2014-08-01

    This report expands upon previous work by the same authors, published in the 40th IEEE Photovoltaic Specialists Conference. In this validation study, comprehensive analysis is performed on nine photovoltaic systems for which NREL could obtain detailed performance data and specifications, including three utility-scale systems and six commercial scale systems. Multiple photovoltaic performance modeling tools were used to model these nine systems, and the error of each tool was analyzed compared to quality-controlled measured performance data. This study shows that, excluding identified outliers, all tools achieve annual errors within +/-8% and hourly root mean squared errors less than 7% for all systems. It is further shown using SAM that module model and irradiance input choices can change the annual error with respect to measured data by as much as 6.6% for these nine systems, although all combinations examined still fall within an annual error range of +/-8.5%. Additionally, a seasonal variation in monthly error is shown for all tools. Finally, the effects of irradiance data uncertainty and the use of default loss assumptions on annual error are explored, and two approaches to reduce the error inherent in photovoltaic modeling are proposed.
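
    For reference, here is a minimal sketch of the two headline error metrics: annual (bias) error as a percentage of measured annual energy, and hourly RMSE expressed as a percentage. The normalisation of the hourly RMSE (here, mean measured power) is an assumption, and the arrays are synthetic stand-ins for a year of hourly output.

```python
import numpy as np

def annual_error_pct(modeled, measured):
    """Signed annual energy error as % of measured annual energy."""
    return 100.0 * (modeled.sum() - measured.sum()) / measured.sum()

def hourly_rmse_pct(modeled, measured):
    """Hourly RMSE, normalised here by mean measured power (assumed)."""
    rmse = np.sqrt(np.mean((modeled - measured) ** 2))
    return 100.0 * rmse / measured.mean()

rng = np.random.default_rng(1)
measured = np.clip(rng.normal(50, 30, 8760), 0, None)  # kW, toy data
modeled = measured * 1.03 + rng.normal(0, 2, 8760)     # slight over-prediction

print(f"annual error: {annual_error_pct(modeled, measured):+.1f}%")
print(f"hourly RMSE:  {hourly_rmse_pct(modeled, measured):.1f}% of mean power")
```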

  2. Recommender System and Web 2.0 Tools to Enhance a Blended Learning Model

    Science.gov (United States)

    Hoic-Bozic, Natasa; Dlab, Martina Holenko; Mornar, Vedran

    2016-01-01

    Blended learning models that combine face-to-face and online learning are of great importance in modern higher education. However, their development should be in line with the recent changes in e-learning that emphasize a student-centered approach and use tools available on the Web to support the learning process. This paper presents research on…

  3. New tools in modulating Maillard reaction from model systems to food

    NARCIS (Netherlands)

    Troise, A.D.

    2015-01-01

    New tools in modulating Maillard reaction from model systems to food
    The Maillard reaction (MR) governs the final quality of foods and occupies a prominent place in food science. The first stable compounds, the Amadori rearrangement products (

  4. Simulation of Forming Process as an Educational Tool Using Physical Modeling

    Science.gov (United States)

    Abdullah, A. B.; Muda, M. R.; Samad, Z.

    2008-01-01

    Metal forming process simulation entails very high costs, including the cost of dies, machines and material, and requires tight process control, since the process involves very high pressures. A physical modeling technique is developed, initiating a new era of educational tools for simulating the process effectively. Several publications and findings have…

  5. Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions

    OpenAIRE

    R. Fernandes; F. Braunschweig; Lourenço, F.; R. Neves

    2015-01-01

    The technological evolution in terms of computational capacity, data acquisition systems, numerical modelling and operational oceanography is supplying opportunities for designing and building holistic approaches and complex tools for newer and more efficient management (planning, prevention and response) of coastal water pollution risk events. A combined methodology to dynamically estimate time and space variable shoreline risk levels from ships has b...

  6. Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions

    OpenAIRE

    R. Fernandes; F. Braunschweig; Lourenço, F.; R. Neves

    2016-01-01

    The technological evolution in terms of computational capacity, data acquisition systems, numerical modelling and operational oceanography is supplying opportunities for designing and building holistic approaches and complex tools for newer and more efficient management (planning, prevention and response) of coastal water pollution risk events. A combined methodology to dynamically estimate time and space variable individual vessel accident risk levels and shoreline cont...

  7. Facebook and the Cognitive Model: A Tool for Promoting Adolescent Self-Awareness

    Science.gov (United States)

    Lewis, Lucy; Wahesh, Edward

    2012-01-01

    A homework activity incorporating the social networking site Facebook is presented as a tool for teaching adolescent clients about the cognitive model and increasing their ability to identify and modify problematic thinking. The authors describe how a worksheet developed to help clients examine information presented on their Facebook profile can…

  8. New tools in modulating Maillard reaction from model systems to food

    NARCIS (Netherlands)

    Troise, A.D.

    2015-01-01

    New tools in modulating Maillard reaction from model systems to food
    The Maillard reaction (MR) governs the final quality of foods and occupies a prominent place in food science. The first stable compounds, the Amadori rearrangement products (

  9. Facebook and the Cognitive Model: A Tool for Promoting Adolescent Self-Awareness

    Science.gov (United States)

    Lewis, Lucy; Wahesh, Edward

    2012-01-01

    A homework activity incorporating the social networking site Facebook is presented as a tool for teaching adolescent clients about the cognitive model and increasing their ability to identify and modify problematic thinking. The authors describe how a worksheet developed to help clients examine information presented on their Facebook profile can…

  10. Realistic tool-tissue interaction models for surgical simulation and planning

    NARCIS (Netherlands)

    Misra, Sarthak

    2009-01-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in pre- and intra-operative surgical planning. Realistic modeling of medical interventions involving tool-tissue interactions has been considered to be a key requirement in the development

  11. PLASMA PROTEIN PROFILING AS A HIGH THROUGHPUT TOOL FOR CHEMICAL SCREENING USING A SMALL FISH MODEL

    Science.gov (United States)

    Hudson, R. Tod, Michael J. Hemmer, Kimberly A. Salinas, Sherry S. Wilkinson, James Watts, James T. Winstead, Peggy S. Harris, Amy Kirkpatrick and Calvin C. Walker. In press. Plasma Protein Profiling as a High Throughput Tool for Chemical Screening Using a Small Fish Model (Abstra...

  12. TENCompetence Assessment Model and Related Tools for Non Traditional Methods of Assessment

    NARCIS (Netherlands)

    Petrov, Milen; Aleksieva-Petrova, Adelina; Stefanov, Krassen; Schoonenboom, Judith; Miao, Yongwu

    2008-01-01

    Petrov, M., Aleksieva-Petrova, A., Stefanov, K., Schoonenboom, J., & Miao, Y. (2008). TENCompetence Assessment Model and Related Tools for Non Traditional Methods of Assessment. In H. W. Sligte & R. Koper (Eds). Proceedings of the 4th TENCompetence Open Workshop. Empowering Learners for Lifelong Com

  13. GAMBIT: The Global and Modular Beyond-the-Standard-Model Inference Tool arXiv

    CERN Document Server

    Athron, Peter; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.

  14. Modeling movements of a long hand-held tool with effects of moments of inertia.

    Science.gov (United States)

    Lin, Chiuhsiang Joe; Chen, Hung-Jen

    2014-04-01

    The current experiment aimed to investigate the effects of weight position on movement time in target acquisition tasks. Subsequently, a simple mathematical model was developed to describe the movement time with the moments of inertia. Ten right-handed participants conducted continuous Fitts pointing tasks using a laparoscopic instrument as a long hand-held tool. The results showed significant effects of weight position on movement time. Furthermore, an extended Fitts' law model is proposed for the moments of inertia produced by the hand, instrument, and a constant mass in different positions. This predictive model accounted for 63% of the variance in movement time. The predictive model proposed in the present study can be applied not only to estimate movement time given a particular target width, instrument movement amplitude, and weight position of a long hand-held tool but also to standardize movement time and establish training standards.
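
    As a purely illustrative sketch of the modelling idea: classic Fitts' law plus an additive moment-of-inertia term for the hand-held tool. The coefficients and the exact form of the inertia term are invented; the paper reports only that its extended model explained 63% of movement-time variance.

```python
import math

def index_of_difficulty(amplitude, width):
    """Classic Fitts index of difficulty, in bits."""
    return math.log2(2 * amplitude / width)

def movement_time(amplitude, width, inertia, a=0.2, b=0.15, c=0.05):
    """Hypothetical extended model: MT (s) = a + b*ID + c*I, where I is
    the moment of inertia (kg*m^2) about the grip for the mass position.
    a, b, c are illustrative constants, not the paper's fitted values."""
    return a + b * index_of_difficulty(amplitude, width) + c * inertia

# Point mass m at distance d along the tool contributes I = m * d^2,
# so moving the same 0.5 kg mass outward lengthens predicted MT.
for d in (0.1, 0.3, 0.5):                 # mass position along tool (m)
    inertia = 0.5 * d ** 2
    print(d, round(movement_time(0.2, 0.02, inertia), 3))
```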

  15. ESMValTool (v1.0) - a community diagnostic and performance metrics tool for routine evaluation of Earth system models in CMIP

    Science.gov (United States)

    Eyring, Veronika; Righi, Mattia; Lauer, Axel; Evaldsson, Martin; Wenzel, Sabrina; Jones, Colin; Anav, Alessandro; Andrews, Oliver; Cionni, Irene; Davin, Edouard L.; Deser, Clara; Ehbrecht, Carsten; Friedlingstein, Pierre; Gleckler, Peter; Gottschaldt, Klaus-Dirk; Hagemann, Stefan; Juckes, Martin; Kindermann, Stephan; Krasting, John; Kunert, Dominik; Levine, Richard; Loew, Alexander; Mäkelä, Jarmo; Martin, Gill; Mason, Erik; Phillips, Adam S.; Read, Simon; Rio, Catherine; Roehrig, Romain; Senftleben, Daniel; Sterl, Andreas; van Ulft, Lambertus H.; Walton, Jeremy; Wang, Shiyu; Williams, Keith D.

    2016-05-01

    A community diagnostics and performance metrics tool for the evaluation of Earth system models (ESMs) has been developed that allows for routine comparison of single or multiple models, either against predecessor versions or against observations. The priority of the effort so far has been to target specific scientific themes focusing on selected essential climate variables (ECVs), a range of known systematic biases common to ESMs, such as coupled tropical climate variability, monsoons, Southern Ocean processes, continental dry biases, and soil hydrology-climate interactions, as well as atmospheric CO2 budgets, tropospheric and stratospheric ozone, and tropospheric aerosols. The tool is being developed in such a way that additional analyses can easily be added. A set of standard namelists for each scientific topic reproduces specific sets of diagnostics or performance metrics that have demonstrated their importance in ESM evaluation in the peer-reviewed literature. The Earth System Model Evaluation Tool (ESMValTool) is a community effort open to both users and developers encouraging open exchange of diagnostic source code and evaluation results from the Coupled Model Intercomparison Project (CMIP) ensemble. This will facilitate and improve ESM evaluation beyond the state-of-the-art and aims at supporting such activities within CMIP and at individual modelling centres. Ultimately, we envisage running the ESMValTool alongside the Earth System Grid Federation (ESGF) as part of a more routine evaluation of CMIP model simulations while utilizing observations available in standard formats (obs4MIPs) or provided by the user.

  16. ESMValTool (v1.0) – a community diagnostic and performance metrics tool for routine evaluation of Earth System Models in CMIP

    Directory of Open Access Journals (Sweden)

    V. Eyring

    2015-09-01

    A community diagnostics and performance metrics tool for the evaluation of Earth System Models (ESMs) has been developed that allows for routine comparison of single or multiple models, either against predecessor versions or against observations. The priority of the effort so far has been to target specific scientific themes focusing on selected Essential Climate Variables (ECVs), a range of known systematic biases common to ESMs, such as coupled tropical climate variability, monsoons, Southern Ocean processes, continental dry biases and soil hydrology-climate interactions, as well as atmospheric CO2 budgets, tropospheric and stratospheric ozone, and tropospheric aerosols. The tool is being developed in such a way that additional analyses can easily be added. A set of standard namelists for each scientific topic reproduces specific sets of diagnostics or performance metrics that have demonstrated their importance in ESM evaluation in the peer-reviewed literature. The Earth System Model Evaluation Tool (ESMValTool) is a community effort open to both users and developers encouraging open exchange of diagnostic source code and evaluation results from the CMIP ensemble. This will facilitate and improve ESM evaluation beyond the state-of-the-art and aims at supporting such activities within the Coupled Model Intercomparison Project (CMIP) and at individual modelling centres. Ultimately, we envisage running the ESMValTool alongside the Earth System Grid Federation (ESGF) as part of a more routine evaluation of CMIP model simulations while utilizing observations available in standard formats (obs4MIPs) or provided by the user.

  17. Modelling of tunnelling processes and rock cutting tool wear with the particle finite element method

    Science.gov (United States)

    Carbonell, Josep Maria; Oñate, Eugenio; Suárez, Benjamín

    2013-09-01

    Underground construction involves all sorts of challenges in the analysis, design, project and execution phases. The dimensions of tunnels and their structural requirements are growing, and so do safety and security demands. New engineering tools are needed to perform safer planning and design. This work presents advances in the particle finite element method (PFEM) for the modelling and analysis of tunnelling processes, including the wear of the cutting tools. The PFEM has its foundation in the Lagrangian description of the motion of a continuum built from a set of particles with known physical properties. The method uses a remeshing process combined with the alpha-shape technique to detect the contacting surfaces, and a finite element method for the mechanical computations. A contact procedure has been developed for the PFEM which is combined with a constitutive model for predicting the excavation front and the wear of cutting tools. The material parameters govern the coupling of frictional contact and wear between the interacting domains at the excavation front. The PFEM allows predicting several parameters which are relevant for estimating the performance of a tunnelling boring machine, such as wear in the cutting tools, the pressure distribution on the face of the boring machine and the vibrations produced in the machinery and the adjacent soil/rock. The final aim is to help in the design of the excavating tools and in the planning of the tunnelling operations. The applications presented show that the PFEM is a promising technique for the analysis of tunnelling problems.

  18. Thermal modelling of cooling tool cutting when milling by electrical analogy

    Directory of Open Access Journals (Sweden)

    Benmoussa H.

    2010-06-01

    Temperatures measured by some devices are recorded immediately after shut-down and may be corrected for the temperature drop that occurs in the interval between shut-down and measurement. This paper presents a new procedure for thermal modelling of the cutting tool just after machining, when the tool is out of the chip, in order to extrapolate the cutting temperature from the temperature measured when the tool is at standstill. A fin approximation enhancing heat loss (by conduction and convection) to the air stream is used. In the modelling we introduce an equivalent thermal network to estimate the cutting temperature as a function of specific energy. On the other hand, a locally modified lumped-element conduction equation is used to predict the temperature variation with time as the tool cools, with appropriate initial and boundary conditions. These predictions provide a detailed view of the global heat transfer coefficient as a function of cutting speed, because the heat loss for the tool in an air stream is an order of magnitude larger than in a normal environment. Finally, we deduce the cutting temperature by an inverse method.
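
    The inverse step can be illustrated with a simple lumped-capacitance cooling law, T(t) = T_air + (T_cut - T_air)·exp(-t/τ): two temperatures measured after shut-down fix the time constant τ, and the curve is extrapolated back to t = 0 to recover the cutting temperature. This is a generic sketch of the idea, not the paper's thermal-network model, and the numbers are invented.

```python
import math

def extrapolate_cutting_temp(t1, T1, t2, T2, T_air):
    """Recover tau and the t=0 (cutting) temperature from two
    post-shut-down readings (t1, T1) and (t2, T2), assuming
    exponential lumped-capacitance cooling toward T_air."""
    tau = (t2 - t1) / math.log((T1 - T_air) / (T2 - T_air))
    T_cut = T_air + (T1 - T_air) * math.exp(t1 / tau)
    return T_cut, tau

# Tool read 180 C at 2 s and 150 C at 5 s after shut-down, air at 25 C.
T_cut, tau = extrapolate_cutting_temp(2.0, 180.0, 5.0, 150.0, 25.0)
print(f"estimated cutting temperature: {T_cut:.0f} C (tau = {tau:.1f} s)")
```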

  19. Operation reliability assessment for cutting tools by applying a proportional covariate model to condition monitoring information.

    Science.gov (United States)

    Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia

    2012-09-25

    The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools.

  20. Thermal modelling of cooling tool cutting when milling by electrical analogy

    Science.gov (United States)

    Benabid, F.; Arrouf, M.; Assas, M.; Benmoussa, H.

    2010-06-01

    Temperatures measured by some devices are recorded immediately after shut-down and may be corrected for the temperature drop that occurs in the interval between shut-down and measurement. This paper presents a new procedure for thermal modelling of the cutting tool just after machining, when the tool is out of the chip, in order to extrapolate the cutting temperature from the temperature measured when the tool is at standstill. A fin approximation enhancing heat loss (by conduction and convection) to the air stream is used. In the modelling we introduce an equivalent thermal network to estimate the cutting temperature as a function of specific energy. On the other hand, a locally modified lumped-element conduction equation is used to predict the temperature variation with time as the tool cools, with appropriate initial and boundary conditions. These predictions provide a detailed view of the global heat transfer coefficient as a function of cutting speed, because the heat loss for the tool in an air stream is an order of magnitude larger than in a normal environment. Finally, we deduce the cutting temperature by an inverse method.

  1. Coloured Petri Nets and CPN Tools for Modelling and Validation of Concurrent Systems

    DEFF Research Database (Denmark)

    Jensen, Kurt; Kristensen, Lars Michael; Wells, Lisa Marie

    2007-01-01

    Coloured Petri Nets (CPNs) is a language for the modelling and validation of systems in which concurrency, communication, and synchronisation play a major role. Coloured Petri Nets is a discrete-event modelling language combining Petri Nets with the functional programming language Standard ML. Petr...... with user-defined Standard ML functions. A license for CPN Tools can be obtained free of charge, also for commercial use....

  2. Poster and model competition: a novel interest-generating teaching tool in the subject of pharmacology

    Directory of Open Access Journals (Sweden)

    Lois James Samuel

    2014-08-01

    Conclusions: The poster-model competition did generate interest in the topic. The students had a new avenue to express themselves and, in the process, gained more knowledge in an enjoyable manner. Learning is facilitated when students themselves play an important role in the learning process. A poster-model competition can be incorporated as a teaching-learning tool to encourage and motivate students who lack intrinsic motivation. [Int J Basic Clin Pharmacol 2014; 3(4): 649-655]

  3. A System Identification Software Tool for General MISO ARX-Type of Model Structures

    OpenAIRE

    Lindskog, Peter

    1996-01-01

    The typical system identification procedure requires powerful and versatile software means. In this paper we describe and exemplify the use of a prototype identification software tool, applicable to the rather broad class of multi-input single-output (MISO) model structures with regressors that are formed by delayed inputs and outputs. Interesting special instances of this model structure category include, e.g., linear ARX and many semi-physical structures, feed-forward neural networks, radial basis fu...

  4. OPSMODEL, an on-orbit operations simulation modeling tool for Space Station

    Science.gov (United States)

    Davis, William T.; Wright, Robert L.

    1988-01-01

    The 'OPSMODEL' operations-analysis and planning tool simulates on-orbit crew operations for the NASA Space Station, furnishing a quantitative measure of the effectiveness of crew activities in various alternative Station configurations while supporting engineering and cost analyses. OPSMODEL is entirely data-driven; the top-down modeling structure of the software allows the user to control both the content and the complexity level of model definition during data base population. Illustrative simulation samples are given.

  5. TREXMO: a translation tool to support the use of regulatory occupational exposure models

    OpenAIRE

    Savic, Nenad; Racordon, Dimitri; Buchs, Didier; Gasic, Bojan; Vernez, David

    2016-01-01

    Occupational exposure models vary significantly in their complexity, purpose, and the level of expertise required from the user. Different parameters in the same model may lead to different exposure estimates for the same exposure situation. This paper presents a tool developed to deal with this concern: TREXMO, or TRanslation of EXposure MOdels. TREXMO integrates six commonly used occupational exposure models, namely, ART v.1.5, STOFFENMANAGER(®) v.5.1, ECETOC TRA v.3, MEASE v.1.02.01, EMKG-EX...

  6. Rogeaulito: a world energy scenario modeling tool for transparent energy system thinking

    Directory of Open Access Journals (Sweden)

    Léo Benichou

    2014-01-01

    Rogeaulito is a world energy model for scenario building developed by the European think tank The Shift Project. It is a tool to explore world energy choices from a very long-term and systematic perspective. As a key feature and novelty, it computes energy supply and demand independently of each other, revealing potentially missing energy supply by 2100. It is, further, simple to use, didactic and open source. As such, it targets a broad user group and advocates for reproducibility and transparency in scenario modeling as well as model-based learning. Rogeaulito applies an engineering approach using disaggregated data in a spreadsheet model.
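
    The model's central idea, computing supply and demand independently and exposing the gap, reduces to a simple comparison per horizon year. A minimal sketch follows; the figures are invented placeholders, not the model's data.

```python
# Assumed long-term projections (Mtoe/yr); purely illustrative numbers.
demand = {2050: 18_000, 2075: 21_000, 2100: 24_000}
supply = {2050: 17_000, 2075: 18_500, 2100: 19_000}

for year in sorted(demand):
    gap = demand[year] - supply[year]
    status = "MISSING SUPPLY" if gap > 0 else "surplus"
    print(f"{year}: demand {demand[year]}, supply {supply[year]}, "
          f"gap {gap:+} Mtoe ({status})")
```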

  7. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    Science.gov (United States)

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually performed by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effective business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extended with new criteria in the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with respect to currently available results.

  8. Large animal models of atherosclerosis--new tools for persistent problems in cardiovascular medicine.

    Science.gov (United States)

    Shim, J; Al-Mashhadi, R H; Sørensen, C B; Bentzon, J F

    2016-01-01

    Coronary heart disease and ischaemic stroke caused by atherosclerosis are leading causes of illness and death worldwide. Small animal models have provided insight into the fundamental mechanisms driving early atherosclerosis, but it is increasingly clear that new strategies and research tools are needed to translate these discoveries into improved prevention and treatment of symptomatic atherosclerosis in humans. Key challenges include better understanding of processes in late atherosclerosis, factors affecting atherosclerosis in the coronary bed, and the development of reliable imaging biomarker tools for risk stratification and monitoring of drug effects in humans. Efficient large animal models of atherosclerosis may help tackle these problems. Recent years have seen tremendous advances in gene-editing tools for large animals. This has made it possible to create gene-modified minipigs that develop atherosclerosis with many similarities to humans in terms of predilection for lesion sites and histopathology. Together with existing porcine models of atherosclerosis that are based on spontaneous mutations or severe diabetes, such models open new avenues for translational research in atherosclerosis. In this review, we discuss the merits of different animal models of atherosclerosis and give examples of important research problems where porcine models could prove pivotal for progress.

  9. Flexible global ocean-atmosphere-land system model. A modeling tool for the climate change research community

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Tianjun; Yu, Yongqiang; Liu, Yimin; Wang, Bin (eds.) [Chinese Academy of Sciences, Beijing, (China). Inst. of Atmospheric Physics

    2014-04-01

    Coupled climate system models are of central importance for climate studies. A new model known as FGOALS (the Flexible Global Ocean-Atmosphere-Land System model) has been developed by the State Key Laboratory of Numerical Modeling for Atmospheric Sciences and Geophysical Fluid Dynamics, Institute of Atmospheric Physics, Chinese Academy of Sciences (LASG/IAP, CAS), a first-tier national geophysical laboratory. It serves as a powerful tool, both for deepening our understanding of fundamental mechanisms of the climate system and for making decadal predictions and scenario projections of future climate change. "Flexible Global Ocean-Atmosphere-Land System Model: A Modeling Tool for the Climate Change Research Community" is the first book to offer systematic evaluations of this model's performance. It is comprehensive in scope, covering both developmental and application-oriented aspects of this climate system model. It also provides an outlook on the future development of FGOALS and offers a brief introduction on how to run the model. It represents a valuable reference work for researchers and professionals working within the related areas of climate variability and change.

  10. Statistical Tools for Fitting Models of the Population Consequences of Acoustic Disturbance to Data from Marine Mammal Populations (PCAD Tools II)

    Science.gov (United States)

    2015-09-30

    PCAD Tools II: Statistical Tools for Fitting Models of the Population Consequences of Acoustic Disturbance to Data from Marine Mammal Populations. Len Thomas, John Harwood, Catriona Harris, & Robert... The project aims to build a coherent statistical framework for modeling the effects of disturbance, particularly acoustic disturbance, on different species of marine mammals... APPROACH: the technical approach has involved building and fitting statistical models to marine mammal data in order to quantify

  11. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    Science.gov (United States)

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-01-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…

  13. An Universal Modeling Method for Enhancing the Volumetric Accuracy of CNC Machine Tools

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Volumetric error modeling is an important technique for enhancing the accuracy of CNC machine tools through error compensation. In this research field, the main question is how to find a universal kinematics modeling method for different kinds of NC machine tools. Multi-body system theory is commonly used to solve the dynamics of complex physical systems, but until now the error factors that are always present in practical systems have not been considered. In this paper, the accuracy kinematics of MB...

  14. Techniques for the construction of an elliptical-cylindrical model using circular rotating tools in non CNC machines

    Energy Technology Data Exchange (ETDEWEB)

    Villalobos Mendoza, Brenda; Cordero Davila, Alberto [Benemerita Universidad Autonoma de Puebla, 4 Sur 104 Centra Historico C.P. 72000, Puebla, Pue. (Mexico); Gonzalez Garcia, Jorge, E-mail: bvillalobosmendoza@gmail.com [Universidad Tecnologica de la Mixteca, Carretera Huajuapan-Acatlima, Km 2.5, CP. 6900, Huajuapan de Leon, Oaxaca (Mexico)

    2011-01-01

    This paper describes the construction of an elliptical-cylindrical model without spherical aberration using vertical rotating tools. The motor of the circular tool is mounted on an arm so that the tool fits on the surface, and this arm in turn is moved by an X-Y table. The test method and the computer algorithms that predict the desired wear are described.

  15. MbT-Tool: An open-access tool based on Thermodynamic Electron Equivalents Model to obtain microbial-metabolic reactions to be used in biotechnological process.

    Science.gov (United States)

    Araujo, Pablo Granda; Gras, Anna; Ginovart, Marta

    2016-01-01

    Modelling cellular metabolism is a strategic factor in investigating microbial behaviour and interactions, especially for biotechnological processes. A key factor in modelling microbial activity is the calculation of the nutrient amounts and products generated as a result of microbial metabolism. Representing metabolic pathways through balanced reactions is a complex and time-consuming task for biologists, ecologists, modellers and engineers. A new computational tool to represent microbial pathways through microbial metabolic reactions (MMRs) using the approach of the Thermodynamic Electron Equivalents Model has been designed and implemented in the open-access framework NetLogo. This computational tool, called MbT-Tool (Metabolism based on Thermodynamics), can write MMRs for different microbial functional groups, such as aerobic heterotrophs, nitrifiers, denitrifiers, methanogens, sulphate reducers, sulphide oxidizers and fermenters. The MbT-Tool's code contains eighteen organic and twenty inorganic reduction half-reactions, four N-sources (NH4+, NO3-, NO2-, N2) for biomass synthesis and twenty-four microbial empirical formulas, one of which can be determined by the user (CnHaObNc). MbT-Tool is an open-source program capable of writing MMRs based on thermodynamic concepts, applicable to a wide range of academic research aimed at designing, optimizing and modelling microbial activity without extensive chemical, microbiological or programming experience.
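
    For a flavour of the Thermodynamic Electron Equivalents idea underlying MbT-Tool, the sketch below assembles an overall metabolic reaction from donor, acceptor and cell-synthesis half-reactions weighted by the electron fractions fe and fs; the half-reaction coefficients follow standard textbook forms and the fs value is a made-up example, so none of this is taken from the MbT-Tool code itself:

      from collections import defaultdict

      # All half-reactions are written as reductions per electron equivalent,
      # with reactants negative and products positive.
      Rd = {"CO2": -1/8, "HCO3-": -1/8, "H+": -1.0, "e-": -1.0,
            "CH3COO-": 1/8, "H2O": 3/8}                      # acetate donor
      Ra = {"O2": -1/4, "H+": -1.0, "e-": -1.0, "H2O": 1/2}  # oxygen acceptor
      Rc = {"CO2": -1/5, "NH4+": -1/20, "HCO3-": -1/20, "H+": -1.0, "e-": -1.0,
            "C5H7O2N": 1/20, "H2O": 9/20}                    # cell synthesis

      def overall_reaction(fs):
          """Overall metabolism R = fe*Ra + fs*Rc - Rd, with fe + fs = 1."""
          fe = 1.0 - fs
          total = defaultdict(float)
          for sp, c in Ra.items():
              total[sp] += fe * c
          for sp, c in Rc.items():
              total[sp] += fs * c
          for sp, c in Rd.items():
              total[sp] -= c            # donor half-reaction runs in reverse
          return {sp: round(c, 4) for sp, c in total.items() if abs(c) > 1e-9}

      # fs (fraction of electrons routed to synthesis) is an example value.
      print(overall_reaction(fs=0.6))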

  16. MbT-Tool: An open-access tool based on Thermodynamic Electron Equivalents Model to obtain microbial-metabolic reactions to be used in biotechnological process

    Directory of Open Access Journals (Sweden)

    Pablo Araujo Granda

    2016-01-01

    Modelling cellular metabolism is a strategic factor in investigating microbial behaviour and interactions, especially for biotechnological processes. A key factor in modelling microbial activity is the calculation of the nutrient amounts and products generated as a result of microbial metabolism. Representing metabolic pathways through balanced reactions is a complex and time-consuming task for biologists, ecologists, modellers and engineers. A new computational tool to represent microbial pathways through microbial metabolic reactions (MMRs) using the approach of the Thermodynamic Electron Equivalents Model has been designed and implemented in the open-access framework NetLogo. This computational tool, called MbT-Tool (Metabolism based on Thermodynamics), can write MMRs for different microbial functional groups, such as aerobic heterotrophs, nitrifiers, denitrifiers, methanogens, sulphate reducers, sulphide oxidizers and fermenters. The MbT-Tool's code contains eighteen organic and twenty inorganic reduction half-reactions, four N-sources (NH4+, NO3−, NO2−, N2) for biomass synthesis and twenty-four microbial empirical formulas, one of which can be determined by the user (CnHaObNc). MbT-Tool is an open-source program capable of writing MMRs based on thermodynamic concepts, applicable to a wide range of academic research aimed at designing, optimizing and modelling microbial activity without extensive chemical, microbiological or programming experience.

  17. A Model for Air Flow in Ventilated Cavities Implemented in a Tool for Whole-Building Hygrothermal Analysis

    DEFF Research Database (Denmark)

    Grau, Karl; Rode, Carsten

    2006-01-01

    A model for calculating air flows in ventilated cavities has been implemented in the whole-building hygrothermal simulation tool BSim. The tool is able to predict indoor humidity conditions using a transient model for the moisture conditions in the building envelope.

  18. Greenhouse gases from wastewater treatment — A review of modelling tools

    Energy Technology Data Exchange (ETDEWEB)

    Mannina, Giorgio, E-mail: giorgio.mannina@unipa.it [Dipartimento di Ingegneria Civile, Ambientale, Aerospaziale, dei Materiali, Università di Palermo, Viale delle Scienze, 90100 Palermo (Italy); Ekama, George [Water Research Group, Department of Civil Engineering, University of Cape Town, Rondebosch, 7700 Cape (South Africa); Caniani, Donatella [Department of Engineering and Physics of the Environment, University of Basilicata, viale dell' Ateneo Lucano 10, 85100 Potenza (Italy); Cosenza, Alida [Dipartimento di Ingegneria Civile, Ambientale, Aerospaziale, dei Materiali, Università di Palermo, Viale delle Scienze, 90100 Palermo (Italy); Esposito, Giovanni [Department of Civil and Mechanical Engineering, University of Cassino and the Southern Lazio, Via Di Biasio, 43, 03043 Cassino, FR (Italy); Gori, Riccardo [Department of Civil and Environmental Engineering, University of Florence, Via Santa Marta 3, 50139 Florence (Italy); Garrido-Baserba, Manel [Department of Civil & Environmental Engineering, University of California, Irvine, CA 92697-2175 (United States); Rosso, Diego [Department of Civil & Environmental Engineering, University of California, Irvine, CA 92697-2175 (United States); Water-Energy Nexus Center, University of California, Irvine, CA 92697-2175 (United States); Olsson, Gustaf [Department of Industrial Electrical Engineering and Automation (IEA), Lund University, Box 118, SE-22100 Lund (Sweden)

    2016-05-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state of the art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge of the processes related to N2O formation, especially due to autotrophic biomass, is still incomplete. The literature review also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs. Indeed, several studies have confirmed that a wide view of WWTPs has to be taken in order to make them as sustainable as possible. Mechanistic dynamic models have been demonstrated to be the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs due to the great difficulties related to data availability and model complexity. For further improvement in plant-wide GHG modelling, and to favour its use at full scale, knowledge of the mechanisms involved in GHG formation and release, and data acquisition, must be enhanced. - Highlights: • The state of the art in GHG production/emission/modelling from WWTPs is outlined. • Detailed mechanisms of N2O production by AOB are still not completely known. • N2O modelling could be improved by considering the contribution of both AOB pathways. • To improve protocols, the regulatory framework among countries has to be equalized. • Plant-wide modelling can help modellers/engineers/operators to reduce GHG emissions.

  19. DINAMO: a coupled sequence alignment editor/molecular graphics tool for interactive homology modeling of proteins.

    Science.gov (United States)

    Hansen, M; Bentz, J; Baucom, A; Gregoret, L

    1998-01-01

    Gaining functional information about a novel protein is a universal problem in biomedical research. With the explosive growth of the protein sequence and structural databases, it is becoming increasingly common for researchers to attempt to build a three-dimensional model of their protein of interest in order to gain information about its structure and interactions with other molecules. The two most reliable methods for predicting the structure of a protein are homology modeling, in which the novel sequence is modeled on the known three-dimensional structure of a related protein, and fold recognition (threading), where the sequence is scored against a library of fold models, and the highest scoring model is selected. The sequence alignment to a known structure can be ambiguous, and human intervention is often required to optimize the model. We describe an interactive model building and assessment tool in which a sequence alignment editor is dynamically coupled to a molecular graphics display. By means of a set of assessment tools, the user may optimize his or her alignment to satisfy the known heuristics of protein structure. Adjustments to the sequence alignment made by the user are reflected in the displayed model by color and other visual cues. For instance, residues are colored by hydrophobicity in both the three-dimensional model and in the sequence alignment. This aids the user in identifying undesirable buried polar residues. Several different evaluation metrics may be selected including residue conservation, residue properties, and visualization of predicted secondary structure. These characteristics may be mapped to the model both singly and in combination. DINAMO is a Java-based tool that may be run either over the web or installed locally. Its modular architecture also allows Java-literate users to add plug-ins of their own design.
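
    As a toy illustration of one visual cue DINAMO describes, coloring residues by hydrophobicity, the sketch below maps residues to display colors with the published Kyte-Doolittle hydropathy scale; the binning thresholds and color names are arbitrary choices for the example, not DINAMO's:

      # Toy "color residues by hydrophobicity" using the Kyte-Doolittle scale.
      KYTE_DOOLITTLE = {
          "I": 4.5, "V": 4.2, "L": 3.8, "F": 2.8, "C": 2.5, "M": 1.9, "A": 1.8,
          "G": -0.4, "T": -0.7, "S": -0.8, "W": -0.9, "Y": -1.3, "P": -1.6,
          "H": -3.2, "E": -3.5, "Q": -3.5, "D": -3.5, "N": -3.5, "K": -3.9,
          "R": -4.5,
      }

      def residue_color(aa: str) -> str:
          """Bin a residue's hydropathy into a color (thresholds arbitrary)."""
          h = KYTE_DOOLITTLE.get(aa.upper(), 0.0)
          if h >= 2.0:
              return "orange"   # strongly hydrophobic: suspicious if exposed
          if h >= 0.0:
              return "yellow"
          return "blue"         # polar/charged: suspicious if buried

      sequence = "MKTAYIAKQR"
      print([(aa, residue_color(aa)) for aa in sequence])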

  20. Development and application of modeling tools for sodium fast reactor inspection

    Science.gov (United States)

    Le Bourdais, Florian; Marchand, Benoît; Baronian, Vahan

    2014-02-01

    To support the development of in-service inspection methods for the Advanced Sodium Test Reactor for Industrial Demonstration (ASTRID) project led by the French Atomic Energy Commission (CEA), several tools for modeling situations specific to sodium-cooled fast reactors (SFRs) have been implemented in the CIVA software and exploited. This paper details specific applications and the results obtained. For instance, a new specular reflection model allows the calculation of complex echoes from scattering structures inside the reactor vessel. EMAT transducer simulation models have been implemented to develop new transducers for sodium visualization and imaging. Guided wave analysis tools have been developed to permit defect detection in the vessel shell. Application examples and comparisons with experimental data are presented.

  1. Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control

    Science.gov (United States)

    Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.

    2016-01-01

    A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.

  2. SuperLFV: An SLHA tool for lepton flavor violating observables in supersymmetric models

    CERN Document Server

    Murakami, Brandon

    2013-01-01

    We introduce SuperLFV, a numerical tool for calculating low-energy LFV observables in the context of the minimal supersymmetric standard model (MSSM). As the Large Hadron Collider and MEG, a dedicated mu -> e gamma experiment, are presently acquiring data, there is a need for tools that provide rapid discrimination of models that exhibit lepton flavor violation (LFV). SuperLFV accepts an SLHA-compliant spectrum file that contains the MSSM couplings and masses with complex phases at the supersymmetry breaking scale. In this manner, SuperLFV is compatible with, but divorced from, existing SLHA spectrum calculators that provide the low-energy spectrum. Hence, input spectra are not confined to the LFV sources provided by established SLHA spectrum calculators. Input spectra may be generated by personal code or by hand, allowing for arbitrary models not supported by existing spectrum calculators.
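
    SLHA spectrum files are plain text organized into named blocks of indexed entries. As a rough illustration of reading such a file (this is a hand-rolled sketch, not SuperLFV's actual reader, and it ignores SLHA features such as scale annotations on block headers):

      # Minimal SLHA-style block parser (illustrative only).
      from typing import Dict, Tuple

      def parse_slha_blocks(text: str) -> Dict[str, Dict[Tuple[int, ...], float]]:
          blocks, current = {}, None
          for raw in text.splitlines():
              line = raw.split("#", 1)[0].strip()   # strip comments
              if not line:
                  continue
              tokens = line.split()
              if tokens[0].upper() == "BLOCK":
                  current = tokens[1].upper()
                  blocks[current] = {}
              elif current is not None:
                  *idx, value = tokens
                  blocks[current][tuple(int(i) for i in idx)] = float(value)
          return blocks

      sample = """
      BLOCK MSOFT  # soft SUSY-breaking masses
          1   2.0e+02   # M_1
          2   4.0e+02   # M_2
      """
      print(parse_slha_blocks(sample))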

  3. CMS Partial Releases Model, Tools, and Applications. Online and Framework-Light Releases

    CERN Document Server

    Jones, Christopher D; Meschi, Emilio; Shahzad Muzaffar; Andreas Pfeiffer; Ratnikova, Natalia; Sexton-Kennedy, Elizabeth

    2009-01-01

    The CMS Software project CMSSW embraces more than a thousand packages organized in subsystems for analysis, event display, reconstruction, simulation, detector description, data formats, framework, utilities and tools. The release integration process is highly automated by using tools developed or adopted by CMS. Packaging in rpm format is a built-in step in the software build process. For several well-defined applications it is highly desirable to have only a subset of the CMSSW full package bundle. For example, High Level Trigger algorithms that run on the Online farm, and need to be rebuilt in a special way, require no simulation, event display, or analysis packages. Physics analysis applications in Root environment require only a few core libraries and the description of CMS specific data formats. We present a model of CMS Partial Releases, used for preparation of the customized CMS software builds, including description of the tools used, the implementation, and how we deal with technical challenges, suc...

  4. Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools

    Science.gov (United States)

    Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.

    2011-12-01

    Currently available, and soon-to-be-available, sophisticated 3D models of particle acceleration and transport in solar flares require a new level of user-friendly visualization and analysis tools allowing quick and easy adjustment of model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc.). We report the current state of the art of these tools in development, already proved to be highly efficient for direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. Various tools provided allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating in user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/non-thermal particle distribution models. By default, the application integrates IDL-callable DLL and shared libraries containing fast GS emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows these default libraries to be interchanged with any user-defined IDL or externally callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the tool's capacity and generality, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by the NORH and RHESSI instruments. We discuss further anticipated developments of the tools needed to accommodate

  5. Application of the GEM Inventory Data Capture Tools for Dynamic Vulnerability Assessment and Recovery Modelling

    Science.gov (United States)

    Verrucci, Enrica; Bevington, John; Vicini, Alessandro

    2014-05-01

    A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010-13 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate data derived from remotely sensed imagery and statistically sampled in-situ field data of buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment, and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes" - statistically inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publicly available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project, which are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of 1) dynamic vulnerability assessment (pre-event) and 2) recovery monitoring and evaluation (post-event) are discussed, along with strategies for using the IDC Tools for these purposes. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Originally designed for seismic risk assessment, the IDCT tools clearly have relevance for multi-hazard risk assessment. When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated from the tools can reveal spatio-temporal patterns in the quality of recovery activities and resilience trends can be

  6. Applications and issues of GIS as tool for civil engineering modeling

    Science.gov (United States)

    Miles, S.B.; Ho, C.L.

    1999-01-01

    A tool that has proliferated within civil engineering in recent years is geographic information systems (GIS). The goal of a tool is to supplement ability and knowledge that already exist, not to serve as a replacement for that which is lacking. To secure the benefits and avoid misuse of a burgeoning tool, engineers must understand the limitations, alternatives, and context of the tool. The common benefits of using GIS as a supplement to engineering modeling are summarized. Several brief case studies of GIS modeling applications are taken from popular civil engineering literature to demonstrate the wide use and varied implementation of GIS across the discipline. Drawing from the case studies, limitations regarding traditional GIS data models and the implementation of civil engineering models within current GIS are identified and countered by discussing the direction of the next generation of GIS. The paper concludes by highlighting the potential for the misuse of GIS in the context of engineering modeling and suggests that this potential can be reduced through education and awareness. The goal of this paper is to promote awareness of the issues related to GIS-based modeling and to assist in the formulation of questions regarding the application of current GIS. The technology has experienced much publicity of late, with many engineers being perhaps too excited about the usefulness of current GIS. An undoubtedly beneficial side effect of this, however, is that engineers are becoming more aware of GIS and, hopefully, the associated subtleties. Civil engineers must stay informed of GIS issues and progress, but more importantly, civil engineers must inform the GIS community to direct the technology development optimally.

  7. Modeling Constellation Virtual Missions Using the Vdot(Trademark) Process Management Tool

    Science.gov (United States)

    Hardy, Roger; ONeil, Daniel; Sturken, Ian; Nix, Michael; Yanez, Damian

    2011-01-01

    The authors have identified a software tool suite that will support NASA's Virtual Mission (VM) effort. This is accomplished by transforming a spreadsheet database of mission events, task inputs and outputs, timelines, and organizations into process visualization tools and a Vdot process management model that includes embedded analysis software as well as requirements and information related to data manipulation and transfer. This paper describes the progress to date, the application of the Virtual Mission not only to Constellation but also to other architectures, and the pertinence to other aerospace applications. Vdot's intuitive visual interface brings VMs to life by turning static, paper-based processes into active, electronic processes that can be deployed, executed, managed, verified, and continuously improved. A VM can be executed using a computer-based, human-in-the-loop, real-time format, under the direction and control of the NASA VM Manager. Engineers in the various disciplines will not have to be Vdot-proficient but rather can fill out on-line, Excel-type databases with the mission information discussed above. The authors' tool suite converts this database into several process visualization tools for review and into Microsoft Project, which can be imported directly into Vdot. Many tools can be embedded directly into Vdot, and when the necessary data/information is received from a preceding task, the analysis can be initiated automatically. Other NASA analysis tools are too complex for this process, but Vdot automatically notifies the tool user that the data has been received and analysis can begin. The VM can be simulated from end to end using the authors' tool suite. The planned approach for the Vdot-based process simulation is to generate the process model from a database; other advantages of this semi-automated approach are that the participants can be geographically remote and after refining the process models via the human-in-the-loop simulation, the

  8. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability.

    Science.gov (United States)

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping; Moore, Helen M

    2015-12-01

    Biospecimens are essential resources for advancing basic and translational research. However, little data is available regarding the costs associated with operating a biobank, and few resources exist to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data were then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobank's specimens, products and services, establish pricing, allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting.
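
    As a simple illustration of the kind of cost-recovery arithmetic such a tool automates (the formula and all figures below are invented for the example and are not the BEMT's internal model):

      # Illustrative cost-recovery arithmetic for a biobank (made-up numbers).

      def per_specimen_fee(direct_costs: float,
                           indirect_rate: float,
                           specimens_per_year: int,
                           percent_recovery: float) -> float:
          """Fee per specimen needed to recover a fraction of annual cost.

          direct_costs       -- annual direct operating costs
          indirect_rate      -- indirect costs as a fraction of direct costs
          specimens_per_year -- expected annual specimen distribution volume
          percent_recovery   -- fraction of total cost recovered via fees (0..1)
          """
          total_cost = direct_costs * (1.0 + indirect_rate)
          return total_cost * percent_recovery / specimens_per_year

      # Example: $500k direct costs, 30% indirect rate, 8,000 specimens/year,
      # aiming to recover 60% of costs through user fees.
      print(f"${per_specimen_fee(500_000, 0.30, 8_000, 0.60):.2f} per specimen")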

  9. Tools for Resilience Management: Multidisciplinary Development of State-and-Transition Models for Northwest Colorado

    Directory of Open Access Journals (Sweden)

    Emily J. Kachergis

    2013-12-01

    Building models is an important way of integrating knowledge. Testing and updating models of social-ecological systems can inform management decisions and, ultimately, improve resilience. We report on the outcomes of a six-year, multidisciplinary model development process in the sagebrush steppe, USA. We focused on creating state-and-transition models (STMs), conceptual models of ecosystem change that represent nonlinear dynamics and are being adopted worldwide as tools for managing ecosystems. STM development occurred in four steps with four distinct sets of models: (1) local knowledge elicitation using semistructured interviews; (2) ecological data collection using an observational study; (3) model integration using participatory workshops; and (4) model simplification upon review of the literature by a multidisciplinary team. We found that different knowledge types are ultimately complementary. Many of the benefits of the STM-building process flowed from the knowledge integration steps, including improved communication, identification of uncertainties, and production of more broadly credible STMs that can be applied in diverse situations. The STM development process also generated hypotheses about sagebrush steppe dynamics that could be tested by future adaptive management and research. We conclude that multidisciplinary development of STMs has great potential for producing credible, useful tools for managing resilience of social-ecological systems. Based on this experience, we outline a streamlined, participatory STM development process that integrates multiple types of knowledge and incorporates adaptive management.

  10. Agent-based model of laser hair removal: A treatment optimization and patient education tool

    Directory of Open Access Journals (Sweden)

    Eapen Bell

    2009-01-01

    Background: Tracking the various parameters associated with laser hair removal is tedious and time consuming. The currently available mathematical models are not simple enough for physicians to use as a treatment optimization and patient education tool. Aim: The aim of the study was to develop a mathematical model for laser hair removal using agent-based modeling and to make a user-friendly simulation environment. Methods: The model was created using NetLogo. The hairs were modeled as agents oscillating between anagen and telogen. The variables were assigned based on published data whenever possible, and the various paths the agent could take were coded as conditional statements. The improvement was assessed using an arbitrary index which takes into account the mean diameter and pigmentation along with the number and length of hairs visible above the surface. A few of the commonly encountered scenarios were simulated using the model. Results: The model is made freely available online (http://www.gulfdoctor.net/model/lhr.htm). The limited number of simulations performed indicated that an eight-week gap between laser sessions may be more effective than a four-week gap. Conclusions: The simulation provides a reliable tool for treatment optimization and patient education, as obtaining relevant clinical data is slow and labor-intensive. Its visual interface and online availability make it useful for everyday use.
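
    A minimal Python rendering of the formulation the abstract describes, hairs as agents cycling between anagen and telogen with laser sessions preferentially destroying anagen follicles, might look as follows; all durations and probabilities are invented and do not come from the published NetLogo model:

      import random

      # Toy agent-based hair model (invented parameters).
      ANAGEN_WEEKS, TELOGEN_WEEKS, KILL_PROB = 6, 12, 0.7

      class Follicle:
          def __init__(self):
              self.alive = True
              self.phase = random.choice(["anagen", "telogen"])
              limit = ANAGEN_WEEKS if self.phase == "anagen" else TELOGEN_WEEKS
              self.weeks_left = random.randint(1, limit)

          def step_week(self):
              if not self.alive:
                  return
              self.weeks_left -= 1
              if self.weeks_left <= 0:   # switch growth phase
                  self.phase = "telogen" if self.phase == "anagen" else "anagen"
                  self.weeks_left = (TELOGEN_WEEKS if self.phase == "telogen"
                                     else ANAGEN_WEEKS)

      def simulate(session_gap_weeks: int, sessions: int, n_hairs: int = 1000):
          random.seed(0)
          hairs = [Follicle() for _ in range(n_hairs)]
          for week in range(session_gap_weeks * sessions):
              if week % session_gap_weeks == 0:      # laser session this week
                  for h in hairs:
                      if (h.alive and h.phase == "anagen"
                              and random.random() < KILL_PROB):
                          h.alive = False
              for h in hairs:
                  h.step_week()
          return sum(h.alive for h in hairs)

      # Compare 4-week vs 8-week gaps over the same number of sessions.
      print("4-week gap:", simulate(4, 6), "hairs remaining")
      print("8-week gap:", simulate(8, 6), "hairs remaining")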

  11. Extending the 4I Organizational Learning Model: Information Sources, Foraging Processes and Tools

    Directory of Open Access Journals (Sweden)

    Tracy A. Jenkin

    2013-08-01

    The continued importance of organizational learning has recently led to several calls for further developing the theory. This article addresses these calls by extending Crossan, Lane and White's (1999) 4I model to include a fifth process, information foraging, and a fourth level, the tool. The resulting 5I organizational learning model can be generalized to a number of learning contexts, especially those that involve understanding and making sense of data and information. Given the need for organizations to both innovate and increase productivity, and the volumes of data and information that are available to support both, the 5I model addresses an important organizational issue.

  12. Dynamic wind turbine models in power system simulation tool DIgSILENT

    DEFF Research Database (Denmark)

    Hansen, Anca Daniela; Iov, F.; Sørensen, Poul Ejnar

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN), and it gathers and describes a whole wind turbine model database... The strategies have different goals, e.g. fast response to disturbances, optimum power efficiency over a wider range of wind speeds, and voltage ride-through capability including grid support. A dynamic model of a DC connection for active stall wind farms to the grid, including its control, is also implemented...

  13. Tool-driven Design and Automated Parameterization for Real-time Generic Drivetrain Models

    Directory of Open Access Journals (Sweden)

    Schwarz Christina

    2015-01-01

    Real-time dynamic drivetrain modeling approaches have great potential for development cost reduction in the automotive industry. Even though real-time drivetrain models are available, these solutions are specific to single transmission topologies. In this paper an environment for parameterization of a solution is proposed, based on a generic method applicable to all types of gear transmission topologies. This enables tool-guided modeling by non-experts in the fields of mechanical engineering and control theory, leading to reduced development and testing efforts. The approach is demonstrated for an exemplary automatic transmission using the environment for automated parameterization. Finally, the parameterization is validated via vehicle measurement data.

  14. Catchment Models and Management Tools for diffuse Contaminants (Sediment, Phosphorus and Pesticides): DIFFUSE Project

    Science.gov (United States)

    Mockler, Eva; Reaney, Simeon; Mellander, Per-Erik; Wade, Andrew; Collins, Adrian; Arheimer, Berit; Bruen, Michael

    2017-04-01

    The agricultural sector is the most commonly suspected source of nutrient pollution in Irish rivers. However, it is also often the most difficult source to characterise due to its predominantly diffuse nature. Particulate phosphorus in surface water and dissolved phosphorus in groundwater are of particular concern in Irish water bodies. Hence, the further development of models and indices to assess diffuse sources of contaminants is required for use by the Irish Environmental Protection Agency (EPA) to provide support for river basin planning. Understanding connectivity in the landscape is a vital component of characterising the source-pathway-receptor relationships for water-borne contaminants, and hence is a priority in this research. The DIFFUSE Project will focus on connectivity modelling and the incorporation of connectivity into sediment, nutrient and pesticide risk mapping. The Irish approach to understanding and managing natural water bodies has developed substantially in recent years, assisted by outputs from multiple research projects, including modelling and analysis tools developed during the Pathways and CatchmentTools projects. These include the Pollution Impact Potential (PIP) maps, which are an example of research output that is used by the EPA to support catchment management. The PIP maps integrate an understanding of the pollution pressures and mobilisation pathways and, using the source-pathway-receptor model, provide a scientific basis for the evaluation of mitigation measures. These maps indicate the potential risk posed by nitrate and phosphate from diffuse agricultural sources to surface and groundwater receptors and delineate critical source areas (CSAs) as a means of facilitating the targeting of mitigation measures. Building on this previous research, the DIFFUSE Project will develop revised and new catchment management tools focused on connectivity, sediment, phosphorus and pesticides. The DIFFUSE project will strive to identify the state

  15. A trade-off analysis design tool. Aircraft interior noise-motion/passenger satisfaction model

    Science.gov (United States)

    Jacobson, I. D.

    1977-01-01

    A design tool was developed to enhance aircraft passenger satisfaction. The effect of aircraft interior motion and noise on passenger comfort and satisfaction was modelled. The effects of individual aircraft noise sources were accounted for, and the impact of noise on passenger activities, as well as the noise levels needed to safeguard passenger hearing, were investigated. The motion and noise effect models provide a means for trade-off analyses between noise and motion variables, and also provide a framework for optimizing noise reduction among noise sources. Data for the models were collected onboard commercial aircraft flights and in specially scheduled tests.

  16. Statistical Tools for Fitting Models of the Population Consequences of Acoustic Disturbance to Data from Marine Mammal Populations (PCAD Tools 2)

    Science.gov (United States)

    2013-09-30

    ...and A. M. Kilpatrick. 2014. A Bioenergetics Approach to Understanding the Population Consequences of Disturbance: Elephant Seals as a Model System. In... Statistical Tools for Fitting Models of the Population... S. Schick, Centre for Research into Ecological and Environmental Modelling (CREEM), University of St Andrews, St Andrews, KY16 9LZ, UK

  17. Dynamic wind turbine models in power system simulation tool DIgSILENT

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, A.C.; Jauch, C.; Soerensen, P.; Iov, F.; Blaabjerg, F.

    2003-12-01

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT (Version 12.0). The developed models are part of the results of a national research project whose overall objective is to create a model database in different simulation tools. This model database should be able to support the analysis of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The report provides a description of the wind turbine modelling, both at a component level and at a system level. The report contains both the description of DIgSILENT built-in models for the electrical components of a grid-connected wind turbine (e.g. induction generators, power converters, transformers) and the models developed by the user, in the dynamic simulation language DSL of DIgSILENT, for the non-electrical components of the wind turbine (wind model, aerodynamic model, mechanical model). Initialisation of the wind turbine models within the power system simulation is also presented. However, the main attention in this report is drawn to the modelling at the system level of two wind turbine concepts: 1. Active stall wind turbine with induction generator 2. Variable speed, variable pitch wind turbine with doubly fed induction generator. These wind turbine concept models can be used and even extended for the study of different aspects, e.g. the assessment of power quality, control strategies, connection of the wind turbine to different types of grid, and storage systems. For both of these concepts, control strategies are developed and implemented, and their performance assessed and discussed by means of simulations. (au)

  18. Modeling the milling tool wear by using an evolutionary SVM-based model from milling runs experimental data

    Science.gov (United States)

    Nieto, Paulino José García; García-Gonzalo, Esperanza; Vilán, José Antonio Vilán; Robleda, Abraham Segade

    2015-12-01

    The main aim of this research work is to build a new practical hybrid regression model to predict milling tool wear in a regular cut, as well as the entry cut and exit cut, of a milling tool. The model is based on Particle Swarm Optimization (PSO) in combination with support vector machines (SVMs). This optimization mechanism involves kernel parameter setting in the SVM training procedure, which significantly influences the regression accuracy. Bearing this in mind, a PSO-SVM-based model, grounded in statistical learning theory, was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. To accomplish the objective of this study, the experimental dataset represents runs on a milling machine under various operating conditions. Data sampled by three different types of sensors (acoustic emission sensor, vibration sensor and current sensor) were acquired at several positions. A second aim is to determine the factors with the greatest bearing on milling tool flank wear, with a view to proposing improvements to the milling machine. Firstly, this hybrid PSO-SVM-based regression model captures the main insight of statistical learning theory in order to obtain a good prediction of the dependence between the flank wear (output variable) and the input variables (time, depth of cut, feed, etc.). Indeed, regression with optimal hyperparameters was performed and a determination coefficient of 0.95 was obtained. The agreement of this model with experimental data confirmed its good performance. Secondly, the main advantages of this PSO-SVM-based model are its capacity to produce a simple, easy-to-interpret model, its ability to estimate the contributions of the input variables, and its computational efficiency. Finally, the main conclusions of this study are presented.
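
    A stripped-down version of the PSO-plus-SVM idea, tuning RBF-kernel SVR hyperparameters with a plain global-best PSO on a stand-in dataset, is sketched below; the dataset, swarm settings and search ranges are illustrative, not the paper's:

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVR

      rng = np.random.default_rng(0)

      # Stand-in data: features ~ (cutting time, depth of cut, feed),
      # target ~ flank wear. Replace with real sensor-derived features.
      X = rng.uniform([0, 0.5, 0.1], [60, 3.0, 1.0], size=(120, 3))
      y = 0.02 * X[:, 0] + 0.1 * X[:, 1] * X[:, 2] + rng.normal(0, 0.02, 120)

      def fitness(p):
          """Negative CV-MSE of an RBF SVR; p = log10(C, gamma, epsilon)."""
          C, gamma, eps = 10.0 ** p
          model = SVR(kernel="rbf", C=C, gamma=gamma, epsilon=eps)
          return cross_val_score(model, X, y, cv=5,
                                 scoring="neg_mean_squared_error").mean()

      # Plain global-best PSO over log10-hyperparameter space.
      n_particles, n_iters, dim = 12, 25, 3
      lo, hi = np.array([-2, -3, -4]), np.array([3, 1, 0])
      pos = rng.uniform(lo, hi, (n_particles, dim))
      vel = np.zeros_like(pos)
      pbest = pos.copy()
      pbest_val = np.array([fitness(p) for p in pos])
      gbest = pbest[pbest_val.argmax()].copy()

      for _ in range(n_iters):
          r1, r2 = rng.random((2, n_particles, dim))
          vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
          pos = np.clip(pos + vel, lo, hi)
          vals = np.array([fitness(p) for p in pos])
          improved = vals > pbest_val
          pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
          gbest = pbest[pbest_val.argmax()].copy()

      C, gamma, eps = 10.0 ** gbest
      print(f"best hyperparameters: C={C:.3g}, gamma={gamma:.3g}, eps={eps:.3g}")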

  19. Big Data’s Tools for Internet Data Analytics: Modelling of System Dynamics

    Directory of Open Access Journals (Sweden)

    Feldiansyah Nasution

    2017-06-01

    In this paper, an application based on Apache Hadoop is deployed to gather, store and analyse data from the internet, especially online and social media. Nowadays, such an application is a common tool for media analysis. In our case, it is used to assist in the modelling of system dynamics. Several tools are used, covering the file system, data crawling from the Internet, data indexing, data storage, and data analytics. The selection of technology follows current industry trends. This is certainly not the only possible approach, but it offers another perspective on the modelling of system dynamics. A system dynamics model is developed to study the profitability of a telecommunication company and how complaints or negative sentiment impact its profits. Clustering analytics are used to identify the components of the system. As part of the continual improvement process, the clustering analytics are not a one-time effort: they run periodically to develop a better model of the system. A sentiment analysis tool, with sentiment sourced from online and social media, provides the input for one of the components, the complaint component. Manual investigation and analysis of internet data are required to develop the relationships between the components.
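
    As a minimal illustration of the stock-and-flow structure such a system dynamics model might use (profit fed by a customer stock that complaints drain), the sketch below integrates a toy model with monthly Euler steps; every coefficient is invented:

      # Toy system-dynamics model (invented coefficients): the customer base
      # is a stock; complaints drive churn, which reduces profit.
      CHURN_PER_COMPLAINT = 0.002   # fraction of customers lost per complaint unit
      ACQUISITION_RATE = 0.01       # new customers per existing customer/month
      REVENUE_PER_CUSTOMER = 12.0   # profit contribution per customer/month

      def simulate(months: int, complaint_level: float, customers: float = 1e5):
          """Euler-integrate the stock-and-flow model month by month."""
          profit_history = []
          for _ in range(months):
              churn = CHURN_PER_COMPLAINT * complaint_level * customers
              customers += ACQUISITION_RATE * customers - churn
              profit_history.append(REVENUE_PER_CUSTOMER * customers)
          return profit_history

      low = simulate(24, complaint_level=1.0)
      high = simulate(24, complaint_level=8.0)
      print(f"profit after 2 years: low complaints {low[-1]:,.0f}, "
            f"high complaints {high[-1]:,.0f}")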

  20. FEMSA: A Finite Element Simulation Tool for Quasi-static Seismic Deformation Modeling

    Science.gov (United States)

    Volpe, M.; Melini, D.; Piersanti, A.

    2006-12-01

    Modeling postseismic deformation is an increasingly valuable tool in earthquake seismology. In particular, the Finite Element (FE) numerical method allows accurate modeling of complex faulting geometry, inhomogeneous materials and realistic viscous flow, making it an excellent tool for investigating a range of specific phenomena related to earthquakes. We developed an FE simulation tool, FEMSA (Finite Element Modeling for Seismic Applications), to model quasi-static deformation generated by faulting sources. The approach allows arbitrary faulting sources to be implemented automatically and the displacement and stress fields induced by slip on the fault to be calculated. The package makes use of the capabilities of CalculiX, non-commercial FE software designed to solve field problems, and is freely distributed. The main advantages of the method are reliability, wide diffusion and flexibility, allowing geometrical and/or rheological heterogeneities to be included in a mechanical analysis. We carried out an optimization study on boundary conditions as well as a series of benchmark simulations on test cases, and we also verified the capability of our approach to handle 3D heterogeneities within the domain. Here, we present our package and show some simple examples of application.

  1. The Business Model Evaluation Tool for Smart Cities: Application to SmartSantander Use Cases

    Directory of Open Access Journals (Sweden)

    Raimundo Díaz-Díaz

    2017-02-01

    New technologies open the door to multiple business models applied to public services in smart cities. However, there is no commonly adopted methodology for evaluating business models in smart cities that can help both practitioners and researchers to choose the best option. This paper addresses this gap by introducing the Business Model Evaluation Tool for Smart Cities. This methodology is a simple, organized, flexible and transparent system that facilitates the work of evaluators of potential business models. It is useful for comparing two or more business models and taking strategic decisions promptly. The method is part of a previous process of content analysis and is based on the widely utilized Business Model Canvas. The evaluation method has been assessed by 11 experts and subsequently validated by applying it to the case studies of Santander's waste management and street lighting systems, which take advantage of innovative technologies commonly used in smart cities.

  2. Life-cycle modelling of waste management in Europe: tools, climate change and waste prevention

    DEFF Research Database (Denmark)

    Gentil, Emmanuel

    Europe has a long history of waste management, where regulation, implementation and enforcement have been the main drivers for the development and diversification of waste management technologies since the late 70s. Despite strong engineering development to minimise impacts to human health... disposal to resources management, requiring modelling tools, such as life-cycle assessment (LCA) models, for carrying out environmental assessment, because of the complexity of the systems. A review of the key waste LCA models was performed in the present PhD project and showed that the results of these models most importantly depend on the technical assumptions and parameters defining waste management technologies. Some of these technical assumptions have evolved significantly from the early models to the more recent ones. An important purpose of waste LCA models is to perform environmental assessments...

  3. Testing and thermal modeling of radiant panels systems as commissioning tool

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca Diaz, Nestor, E-mail: njfonseca@doct.ulg.ac.b [University of Liege Belgium, Thermodynamics Laboratory, Campus du Sart Tilman, Bat: B49, P33, B-4000 Liege (Belgium); Universidad Tecnologica de Pereira, Facultad de Ingenieria Mecanica, AA 97 Pereira (Colombia); Cuevas, Cristian [Universidad de Concepcion, Facultad de Ingenieria, Departamento de Ingenieria Mecanica, Casilla 160c Concepcion (Chile)

    2010-12-15

    This paper presents the results of a study performed to develop a thermal model of radiant panel systems to be used in situ as a diagnosis tool in commissioning processes, to determine the main operating conditions of the system in cooling or heating mode. The model considers the radiant panels as a finned heat exchanger in the dry regime. Using as inputs the ceiling and room dimensions, the radiant ceiling material properties, and measurements of air and water mass flow rates and temperatures, the model is able to calculate the radiant ceiling capacity, the ceiling surface average temperature, the water exhaust temperature and the resultant temperature as a comfort indicator. The proposed model considers combined convection, the perforation effect and a detailed radiative heat exchange method for radiant ceiling systems. An example of each system considered in this study is shown, illustrating the validation of the model. A sensitivity analysis of the model is performed.

  4. A tool for urban soundscape evaluation applying Support Vector Machines for developing a soundscape classification model.

    Science.gov (United States)

    Torija, Antonio J; Ruiz, Diego P; Ramos-Ridao, Angel F

    2014-06-01

    To ensure appropriate soundscape management in urban environments, urban-planning authorities need a range of tools that enable such a task to be performed. An essential step in the management of urban areas from a sound standpoint should be the evaluation of the soundscape in the area. It has been widely acknowledged that a subjective and acoustical categorization of a soundscape is the first step in evaluating it, also providing a basis for designing or adapting it to match people's expectations. Accordingly, this work proposes a model for the automatic classification of urban soundscapes based on underlying acoustical and perceptual criteria, intended to be used as a tool for comprehensive urban soundscape evaluation. Because of the great complexity associated with the problem, two machine learning techniques, Support Vector Machines (SVM) and Support Vector Machines trained with Sequential Minimal Optimization (SMO), are implemented in developing the classification model. The results indicate that the SMO model outperforms the SVM model in the specific task of soundscape classification. With the implementation of the SMO algorithm, the classification model achieves an outstanding performance (91.3% of instances correctly classified).
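
    As a rough illustration of the classification set-up (not the authors' features, data or settings), the sketch below trains an SVM classifier on synthetic acoustic descriptors with scikit-learn, whose SVC is built on libsvm's SMO-type solver:

      # Illustrative soundscape classifier on synthetic descriptors
      # (e.g. sound level, spectral centroid, loudness); made-up data.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(42)

      # Three invented soundscape classes as Gaussian blobs in feature space.
      centers = np.array([[55, 1200, 8], [70, 2500, 20], [62, 800, 12]])
      X = np.vstack([c + rng.normal(0, [3, 200, 2], size=(100, 3))
                     for c in centers])
      y = np.repeat(["quiet", "traffic", "mixed"], 100)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                random_state=0)

      # Feature scaling matters for RBF kernels; SVC uses an SMO-type solver.
      clf = make_pipeline(StandardScaler(),
                          SVC(kernel="rbf", C=10.0, gamma="scale"))
      clf.fit(X_tr, y_tr)
      print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")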

  5. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    In this work, the DAE Tools modelling, simulation and optimisation software, its programming paradigms and its main features are presented. The current approaches to mathematical modelling, such as the use of modelling languages and general-purpose programming languages, are analysed. The common set of capabilities required by typical simulation software is discussed, and the shortcomings of the current approaches recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for runtime model generation; (2) support for runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third-party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth-generation object-oriented general-purpose programming language such as Python, are discussed. The architecture and the software implementation details, as well as the types of problems that can be solved using DAE Tools software, are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as software as a service are demonstrated.

  6. SBML-PET-MPI: a parallel parameter estimation tool for Systems Biology Markup Language based models.

    Science.gov (United States)

    Zi, Zhike

    2011-04-01

    Parameter estimation is crucial for the modeling and dynamic analysis of biological systems. However, implementing parameter estimation is time consuming and computationally demanding. Here, we introduced a parallel parameter estimation tool for Systems Biology Markup Language (SBML)-based models (SBML-PET-MPI). SBML-PET-MPI allows the user to perform parameter estimation and parameter uncertainty analysis by collectively fitting multiple experimental datasets. The tool is developed and parallelized using the message passing interface (MPI) protocol, which provides good scalability with the number of processors. SBML-PET-MPI is freely available for non-commercial use at http://www.bioss.uni-freiburg.de/cms/sbml-pet-mpi.html or http://sites.google.com/site/sbmlpetmpi/.
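
    As a generic illustration of the estimation task such tools automate (not SBML-PET-MPI's MPI-parallel algorithm, and without SBML parsing), the sketch below fits two rate constants of a toy ODE model to noisy data with SciPy:

      # Generic parameter-estimation sketch for a dynamic model.
      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import least_squares

      def model(t, x, k1, k2):
          """Toy two-species kinetics: A -> B -> (degraded)."""
          a, b = x
          return [-k1 * a, k1 * a - k2 * b]

      def simulate(params, t_obs):
          sol = solve_ivp(model, (0, t_obs[-1]), [1.0, 0.0],
                          args=tuple(params), t_eval=t_obs)
          return sol.y[1]                      # observe species B

      # Synthetic "experimental" data from known parameters plus noise.
      rng = np.random.default_rng(1)
      t_obs = np.linspace(0.5, 10, 20)
      data = simulate((0.8, 0.3), t_obs) + rng.normal(0, 0.01, t_obs.size)

      fit = least_squares(lambda p: simulate(p, t_obs) - data,
                          x0=[0.1, 0.1], bounds=(1e-6, 10.0))
      print("estimated k1, k2:", fit.x)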

  7. A Neural-FEM tool for the 2-D magnetic hysteresis modeling

    Energy Technology Data Exchange (ETDEWEB)

    Cardelli, E. [University of Perugia, Department of Engineering, Via G. Duranti 93, 06125 Perugia (Italy); Faba, A., E-mail: antonio.faba@unipg.it [University of Perugia, Department of Engineering, Via G. Duranti 93, 06125 Perugia (Italy); Laudani, A.; Lozito, G.M.; Riganti Fulginei, F.; Salvini, A. [Department of Engineering, Roma Tre University, Via V. Volterra 62, 00146 Rome (Italy)

    2016-04-01

    The aim of this work is to present a new tool for the analysis of magnetic field problems considering 2-D magnetic hysteresis. In particular, this tool makes use of the Finite Element Method to solve the magnetic field problem in a real device, and fruitfully exploits a neural network (NN) for the modeling of the 2-D magnetic hysteresis of materials. The NN takes as input the magnetic induction components B at the k-th simulation step and returns as output the corresponding values of the magnetic field H. It is trained on vector measurements performed on the magnetic material to be modeled. This input/output scheme is directly implemented in an FEM code employing the magnetic vector potential A formulation. Validation through measurements on a real device has been performed.

  8. Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education

    Science.gov (United States)

    Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki

    The current paper presents browser-based, multimedia-rich software tools and an e-learning curriculum to support the design and modeling of power electronics circuits and to explain sometimes rather sophisticated phenomena. Two projects are discussed. The so-called Inetele project is financed by the Leonardo da Vinci program of the European Union (EU). It is a collaborative project between numerous EU universities and institutes to develop a state-of-the-art curriculum in Electrical Engineering. Another cooperative project, with the participation of Japanese, European and Australian institutes, focuses especially on developing an e-learning curriculum and interactive design and modeling tools, as well as on the development of a virtual laboratory. Snapshots from these two projects are presented.

  9. Bayesian networks modeling for thermal error of numerical control machine tools

    Institute of Scientific and Technical Information of China (English)

    Xin-hua YAO; Jian-zhong FU; Zi-chen CHEN

    2008-01-01

    The interaction between the heat source location, its intensity, the thermal expansion coefficient, the machine system configuration and the running environment creates complex thermal behavior of a machine tool, and also makes thermal error prediction difficult. To address this issue, a novel prediction method for machine tool thermal error based on Bayesian networks (BNs) was presented. The method describes the causal relationships of the factors inducing thermal deformation by graph theory and estimates the thermal error by Bayesian statistical techniques. Due to the effective combination of domain knowledge and sampled data, the BN method can adapt to changes in the running state of the machine and obtain satisfactory prediction accuracy. Experiments on spindle thermal deformation were conducted to evaluate the modeling performance. Experimental results indicate that the BN method performs far better than least squares (LS) analysis in terms of estimation accuracy.
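
    To make the idea concrete, the sketch below wires up a toy discrete Bayesian network for thermal error, with illustrative probabilities rather than the paper's learned ones, and answers a query by brute-force enumeration over the hidden nodes.

      # Hedged sketch: a three-node Bayesian network for thermal error (toy CPTs).
      import itertools
      import numpy as np

      p_speed = np.array([0.5, 0.5])        # spindle speed: low/high
      p_amb = np.array([0.7, 0.3])          # ambient temperature: normal/hot
      p_heat = np.array([[0.9, 0.1],        # P(heat | speed); rows indexed by speed
                         [0.2, 0.8]])       # heat intensity: low/high
      p_err = np.zeros((2, 2, 2))           # P(error | heat, ambient); error: small/large
      p_err[0, 0] = [0.95, 0.05]
      p_err[0, 1] = [0.85, 0.15]
      p_err[1, 0] = [0.30, 0.70]
      p_err[1, 1] = [0.15, 0.85]

      def joint(s, a, h, e):
          return p_speed[s] * p_amb[a] * p_heat[s, h] * p_err[h, a, e]

      # P(error | speed = high): enumerate the hidden nodes, then normalise.
      num = np.zeros(2)
      for a, h, e in itertools.product(range(2), repeat=3):
          num[e] += joint(1, a, h, e)
      print("P(error | high speed) =", num / num.sum())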

  10. Robust optimal design of experiments for model discrimination using an interactive software tool.

    Directory of Open Access Journals (Sweden)

    Johannes Stegmaier

    Full Text Available Mathematical modeling of biochemical processes significantly contributes to a better understanding of biological functionality and underlying dynamic mechanisms. To support time-consuming and costly lab experiments, kinetic reaction equations can be formulated as a set of ordinary differential equations, which in turn makes it possible to simulate and compare hypothetical models in silico. To identify new experimental designs that are able to discriminate between investigated models, the approach used in this work solves a semi-infinite constrained nonlinear optimization problem using derivative-based numerical algorithms. The method takes parameter variabilities into account such that new experimental designs are robust against parameter changes while maintaining the optimal potential to discriminate between hypothetical models. In this contribution we present a newly developed software tool that offers a convenient graphical user interface for model discrimination. We demonstrate the beneficial operation of the discrimination approach and the usefulness of the software tool by analyzing a realistic benchmark experiment from the literature. New robust optimal designs that allow discrimination between the investigated model hypotheses of the benchmark experiment are successfully calculated and yield promising results. The robustification approach provides maximally discriminating experiments for the worst parameter configurations, which can be used to estimate the meaningfulness of upcoming experiments. A major benefit of the graphical user interface is the ability to interactively investigate the model behavior and the clear arrangement of numerous variables. In addition to a brief theoretical overview of the discrimination method and the functionality of the software tool, the importance of robustness of experimental designs against parameter variability is demonstrated on a biochemical benchmark problem. The software is licensed under the GNU
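
    The flavor of such a robust discrimination criterion can be captured in a few lines. The toy sketch below, which is not the paper's semi-infinite solver, picks a single measurement time that maximises the worst-case squared separation between two rival decay models over a parameter box, using a coarse grid for the inner (worst-case) problem and SciPy for the outer search.

      # Hedged toy version of robust design for model discrimination.
      import numpy as np
      from scipy.optimize import minimize_scalar

      def m1(t, k):                 # rival model 1: first-order decay
          return np.exp(-k * t)

      def m2(t, k):                 # rival model 2: second-order decay
          return 1.0 / (1.0 + k * t)

      k_grid = np.linspace(0.5, 2.0, 31)    # parameter uncertainty box

      def worst_case_separation(t):
          # inner problem: least-favourable parameters minimise the separation
          return min((m1(t, k1) - m2(t, k2)) ** 2
                     for k1 in k_grid for k2 in k_grid)

      res = minimize_scalar(lambda t: -worst_case_separation(t),
                            bounds=(0.1, 10.0), method="bounded")
      print(f"robust measurement time t* = {res.x:.3f}")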

  11. Modeling tools for the assessment of microbiological risks during floods: a review

    Science.gov (United States)

    Collender, Philip; Yang, Wen; Stieglitz, Marc; Remais, Justin

    2015-04-01

    Floods are a major, recurring source of harm to global economies and public health. Projected increases in the frequency and intensity of heavy precipitation events under future climate change, coupled with continued urbanization in areas with high risk of floods, may exacerbate future impacts of flooding. Improved flood risk management is essential to support global development, poverty reduction and public health, and is likely to be a crucial aspect of climate change adaptation. Importantly, floods can facilitate the transmission of waterborne pathogens by changing social conditions (overcrowding among displaced populations, interruption of public health services), imposing physical challenges to infrastructure (sewerage overflow, reduced capacity to treat drinking water), and altering fate and transport of pathogens (transport into waterways from overland flow, resuspension of settled contaminants) during and after flood conditions. Hydrological and hydrodynamic models are capable of generating quantitative characterizations of microbiological risks associated with flooding, while accounting for these diverse and at times competing physical and biological processes. Despite a few applications of such models to the quantification of microbiological risks associated with floods, there exists limited guidance as to the relative capabilities, and limitations, of existing modeling platforms when used for this purpose. Here, we review 17 commonly used flood and water quality modeling tools that have demonstrated or implicit capabilities of mechanistically representing and quantifying microbial risk during flood conditions. We compare models with respect to their capabilities of generating outputs that describe physical and microbial conditions during floods, such as concentration or load of non-cohesive sediments or pathogens, and the dynamics of high flow conditions. Recommendations are presented for the application of specific modeling tools for assessing

  12. A Hierarchical Slicing Tool Model

    Institute of Scientific and Technical Information of China (English)

    谭毅; 朱平; 李必信; 郑国梁

    2001-01-01

    Most traditional slicing methods are based on dependence graphs, but constructing dependence graphs directly for object-oriented programs is very complicated. The design and implementation of a hierarchical slicing tool model are described. By constructing the package-level, class-level, method-level and statement-level dependence graphs, package-level slices, class-level slices, method-level slices and program slices are obtained step by step.
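
    Once such a dependence graph exists at any level, computing a backward slice reduces to graph reachability. The hypothetical statement-level sketch below uses networkx in place of the tool's own graph layers; edges point from a statement to the statements it depends on.

      # Hedged sketch: a backward slice as reachability in a dependence graph.
      import networkx as nx

      g = nx.DiGraph()
      g.add_edges_from([
          ("s4:print(z)", "s3:z=x+y"),       # s4 uses z defined at s3
          ("s3:z=x+y", "s1:x=input()"),      # s3 uses x defined at s1
          ("s3:z=x+y", "s2:y=2"),            # s3 uses y defined at s2
      ])
      g.add_node("s5:w=7")                   # unrelated statement

      criterion = "s4:print(z)"
      slice_ = {criterion} | nx.descendants(g, criterion)
      print(sorted(slice_))                  # s1, s2, s3, s4; s5 is excluded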

  13. Bayesian network as a modelling tool for risk management in agriculture

    OpenAIRE

    Svend Rasmussen; Madsen, Anders L.; Mogens Lund

    2013-01-01

    The importance of risk management increases as farmers become more exposed to risk. But risk management is a difficult topic because income risk is the result of the complex interaction of multiple risk factors combined with the effect of an increasing array of possible risk management tools. In this paper we use Bayesian networks as an integrated modelling approach for representing uncertainty and analysing risk management in agriculture. It is shown how historical farm account data may be e...

  14. Model-based fault diagnosis techniques design schemes, algorithms, and tools

    CERN Document Server

    Ding, Steven

    2008-01-01

    The objective of this book is to introduce basic model-based FDI schemes, advanced analysis and design algorithms, and the needed mathematical and control theory tools at a level for graduate students and researchers as well as for engineers. This is a textbook with extensive examples and references. Most methods are given in the form of an algorithm that enables a direct implementation in a programme. Comparisons among different methods are included when possible.

  15. Spatially-explicit LCIA model for marine eutrophication as a tool for sustainability assessment

    DEFF Research Database (Denmark)

    Cosme, Nuno Miguel Dias; Hauschild, Michael Zwicky

    2014-01-01

    degradation. Life Cycle Assessment (LCA) is a tool to comparatively quantify the environmental impacts from product systems throughout their life cycle. Marine eutrophication is one of the LC Impact Assessment (LCIA) categories and it is still lacking an overall model linking nutrients over...... into impact assessment methods in LCA to help characterize the eutrophication impact of product systems related to agricultural production or involving combustion processes, and ultimately to assess the environmental sustainability of human activities....

  16. Diamond and cBN hybrid and nanomodified cutting tools with enhanced performance: Development, testing and modelling

    DEFF Research Database (Denmark)

    Loginov, Pavel; Mishnaevsky, Leon; Levashov, Evgeny

    2015-01-01

    The potential for enhancing superhard steel and cast iron cutting tool performance on the basis of microstructural modifications of the tool materials is studied. Hybrid machining tools with mixed diamond and cBN grains, as well as machining tools with composite nanomodified metallic binder...... are developed, and tested experimentally and numerically. It is demonstrated that both the combination of diamond and cBN (hybrid structure) and the nanomodification of the metallic binder (with hexagonal boron nitride/hBN platelets) lead to significant improvement of the cast iron machining performance. The superhard tools...... compared to the tool with the original binder. A computational model of the hybrid superhard tools is developed, and applied to the analysis of structure-performance relationships of the tools....

  17. phenix.model_vs_data: a high-level tool for the calculation of crystallographic model and data statistics.

    Science.gov (United States)

    Afonine, Pavel V; Grosse-Kunstleve, Ralf W; Chen, Vincent B; Headd, Jeffrey J; Moriarty, Nigel W; Richardson, Jane S; Richardson, David C; Urzhumtsev, Alexandre; Zwart, Peter H; Adams, Paul D

    2010-08-01

    phenix.model_vs_data is a high-level command-line tool for the computation of crystallographic model and data statistics, and the evaluation of the fit of the model to data. Analysis of all Protein Data Bank structures that have experimental data available shows that in most cases the reported statistics, in particular R factors, can be reproduced within a few percentage points. However, there are a number of outliers where the recomputed R values are significantly different from those originally reported. The reasons for these discrepancies are discussed.

  18. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish a workflow for, and to demonstrate, a unique set of web application tools linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  19. A software tool to assess uncertainty in transient-storage model parameters using Monte Carlo simulations

    Science.gov (United States)

    Ward, Adam S.; Kelleher, Christa A.; Mason, Seth J. K.; Wagener, Thorsten; McIntyre, Neil; McGlynn, Brian L.; Runkel, Robert L.; Payn, Robert A.

    2017-01-01

    Researchers and practitioners alike often need to understand and characterize how water and solutes move through a stream in terms of the relative importance of in-stream and near-stream storage and transport processes. In-channel and subsurface storage processes are highly variable in space and time and difficult to measure. Storage estimates are commonly obtained using transient-storage models (TSMs) of the experimentally obtained solute-tracer test data. The TSM equations represent key transport and storage processes with a suite of numerical parameters. Parameter values are estimated via inverse modeling, in which parameter values are iteratively changed until model simulations closely match observed solute-tracer data. Several investigators have shown that TSM parameter estimates can be highly uncertain. When this is the case, parameter values cannot be used reliably to interpret stream-reach functioning. However, authors of most TSM studies do not evaluate or report parameter certainty. Here, we present a software tool linked to the One-dimensional Transport with Inflow and Storage (OTIS) model that enables researchers to conduct uncertainty analyses via Monte-Carlo parameter sampling and to visualize uncertainty and sensitivity results. We demonstrate application of our tool to 2 case studies and compare our results to output obtained from more traditional implementation of the OTIS model. We conclude by suggesting best practices for transient-storage modeling and recommend that future applications of TSMs include assessments of parameter certainty to support comparisons and more reliable interpretations of transport processes.
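
    A minimal sketch of the underlying Monte-Carlo idea, using a stand-in transfer function rather than the OTIS transient-storage equations, looks like this: sample parameter sets, score each simulated tracer curve against observations, and inspect the spread of the best-scoring ("behavioural") sets as a measure of parameter certainty.

      # Hedged sketch of Monte-Carlo parameter uncertainty analysis (toy model).
      import numpy as np

      rng = np.random.default_rng(1)
      t = np.linspace(0, 10, 50)

      def tracer_model(t, A, k):
          # stand-in breakthrough curve, not the OTIS equations
          return A * t * np.exp(-k * t)

      obs = tracer_model(t, 2.0, 0.8) + rng.normal(0, 0.02, t.size)

      n = 5000
      A_s = rng.uniform(0.5, 4.0, n)
      k_s = rng.uniform(0.2, 2.0, n)
      rmse = np.array([np.sqrt(np.mean((tracer_model(t, A, k) - obs) ** 2))
                       for A, k in zip(A_s, k_s)])

      behavioural = rmse < np.quantile(rmse, 0.01)    # keep the best 1% of samples
      print("A range:", A_s[behavioural].min(), A_s[behavioural].max())
      print("k range:", k_s[behavioural].min(), k_s[behavioural].max())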

  20. An Innovative Interactive Modeling Tool to Analyze Scenario-Based Physician Workforce Supply and Demand

    Science.gov (United States)

    Gupta, Saurabh; Black-Schaffer, W. Stephen; Crawford, James M.; Gross, David; Karcher, Donald S.; Kaufman, Jill; Knapman, Doug; Prystowsky, Michael B.; Wheeler, Thomas M.; Bean, Sarah; Kumar, Paramhans; Sharma, Raghav; Chamoli, Vaibhav; Ghai, Vikrant; Gogia, Vineet; Weintraub, Sally; Cohen, Michael B.

    2015-01-01

    Effective physician workforce management requires that the various organizations comprising the House of Medicine be able to assess their current and future workforce supply. This information has direct relevance to funding of graduate medical education. We describe a dynamic modeling tool that examines how individual factors and practice variables can be used to measure and forecast the supply and demand for existing and new physician services. The system we describe, while built to analyze the pathologist workforce, is sufficiently broad and robust for use in any medical specialty. Our design provides a computer-based software model populated with data from surveys and best estimates by specialty experts about current and new activities in the scope of practice. The model describes the steps needed and data required for analysis of supply and demand. Our modeling tool allows educators and policy makers, in addition to physician specialty organizations, to assess how various factors may affect demand (and supply) of current and emerging services. Examples of factors evaluated include types of professional services (3 categories with 16 subcategories), service locations, elements related to the Patient Protection and Affordable Care Act, new technologies, aging population, and changing roles in capitated, value-based, and team-based systems of care. The model also helps identify where physicians in a given specialty will likely need to assume new roles, develop new expertise, and become more efficient in practice to accommodate new value-based payment models. PMID:28725751

  1. An Innovative Interactive Modeling Tool to Analyze Scenario-Based Physician Workforce Supply and Demand

    Directory of Open Access Journals (Sweden)

    Saurabh Gupta BPharm

    2015-10-01

    Full Text Available Effective physician workforce management requires that the various organizations comprising the House of Medicine be able to assess their current and future workforce supply. This information has direct relevance to funding of graduate medical education. We describe a dynamic modeling tool that examines how individual factors and practice variables can be used to measure and forecast the supply and demand for existing and new physician services. The system we describe, while built to analyze the pathologist workforce, is sufficiently broad and robust for use in any medical specialty. Our design provides a computer-based software model populated with data from surveys and best estimates by specialty experts about current and new activities in the scope of practice. The model describes the steps needed and data required for analysis of supply and demand. Our modeling tool allows educators and policy makers, in addition to physician specialty organizations, to assess how various factors may affect demand (and supply) of current and emerging services. Examples of factors evaluated include types of professional services (3 categories with 16 subcategories), service locations, elements related to the Patient Protection and Affordable Care Act, new technologies, aging population, and changing roles in capitated, value-based, and team-based systems of care. The model also helps identify where physicians in a given specialty will likely need to assume new roles, develop new expertise, and become more efficient in practice to accommodate new value-based payment models.

  2. BSim: an agent-based tool for modeling bacterial populations in systems and synthetic biology.

    Directory of Open Access Journals (Sweden)

    Thomas E Gorochowski

    Full Text Available Large-scale collective behaviors such as synchronization and coordination spontaneously arise in many bacterial populations. With systems biology attempting to understand these phenomena, and synthetic biology opening up the possibility of engineering them for our own benefit, there is growing interest in how bacterial populations are best modeled. Here we introduce BSim, a highly flexible agent-based computational tool for analyzing the relationships between single-cell dynamics and population level features. BSim includes reference implementations of many bacterial traits to enable the quick development of new models partially built from existing ones. Unlike existing modeling tools, BSim fully considers spatial aspects of a model, allowing for the description of intricate micro-scale structures and enabling the modeling of bacterial behavior in more realistic three-dimensional, complex environments. The new opportunities that BSim opens are illustrated through several diverse examples covering: spatial multicellular computing, modeling complex environments, population dynamics of the lac operon, and the synchronization of genetic oscillators. BSim is open source software that is freely available from http://bsim-bccs.sf.net and distributed under the Open Source Initiative (OSI) recognized MIT license. Developer documentation and a wide range of example simulations are also available from the website. BSim requires Java version 1.6 or higher.

  3. OMNIITOX - operational life-cycle impact assessment models and information tools for practitioners

    DEFF Research Database (Denmark)

    Molander, S; Lidholm, Peter; Schowanek, Diederik

    2004-01-01

    This article is the preamble to a set of articles describing initial results from an on-going European Commission funded, 5th Framework project called OMNIITOX, Operational Models aNd Information tools for Industrial applications of eco/TOXicological impact assessments. The different parts of this case study-driven project are briefly presented and put in relation to the aims of contributing to an operational life cycle impact assessment (LCIA) model for impacts of toxicants. The present situation has been characterised by methodological difficulties, both regarding the choice of the characterisation model(s) and limited input data on chemical properties, which often has resulted in the omission of toxicants from the LCIA, or at best a focus on well characterised chemicals. The project addresses both problems and integrates models, as well as data, in an information system, the OMNIITOX IS.

  4. FAST: A Fuel And Sheath Modeling Tool for CANDU Reactor Fuel

    Science.gov (United States)

    Prudil, Andrew Albert

    Understanding the behaviour of nuclear fuel during irradiation is a complicated multiphysics problem involving neutronics, chemistry, radiation physics, material science, solid mechanics, heat transfer and thermal-hydraulics. Due to the complexity and interdependence of the physics and models involved, fuel modeling is typically done with numerical models. Advancements in both computer hardware and software have made possible new, more complex and sophisticated fuel modeling codes. The Fuel And Sheath modelling Tool (FAST) is a fuel performance code that has been developed for modeling nuclear fuel behaviour under normal and transient conditions. The FAST code includes models for heat generation and transport, thermal expansion, elastic strain, densification, fission product swelling, pellet relocation, contact, grain growth, fission gas release, gas and coolant pressure and sheath creep. These models are coupled and solved numerically using the Comsol Multiphysics finite-element platform. The model utilizes a radial-axial geometry of a fuel pellet (including dishing and chamfering) and the accompanying fuel sheath, allowing the model to predict circumferential ridging. This model has evolved from previous treatments developed at the Royal Military College. The model has now been significantly advanced to include: a more detailed pellet geometry, localized pellet-to-sheath gap size and contact pressure, the ability to model cracked pellets, localized fuel burnup for material property models, improved UO2 densification behaviour, a fully 2-dimensional model for the sheath, additional creep models, additional material models, an FEM Booth-diffusion model for fission gas release (including the ability to model temperature and power changes), a capability for end-of-life predictions, and the ability to utilize text files as model inputs; it also provides a first-time integration of normal operating conditions (NOC) and transient fuel models into a single code (which has never been achieved

  5. Tav4SB: integrating tools for analysis of kinetic models of biological systems

    Directory of Open Access Journals (Sweden)

    Rybiński Mikołaj

    2012-04-01

    Full Text Available Background: Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases the integration of software tools for life science research and provides a common workflow-based framework for computational experiments in Biology. Results: The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations, which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow the user to perform numerical simulations or model checking of, respectively, deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of the possibilities offered by our project we apply multi-parameter sensitivity analysis. To visualize the results of model analysis a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project's Web page: http://bioputer.mimuw.edu.pl/tav4sb/. Conclusions: The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as the accessibility and usability of remote services.

  6. GIS as an Integration Tool for Hydrologic Modeling: Spatial Data Management, Analysis and Visualization

    Science.gov (United States)

    Setegn, S. G.; Lawrence, A.; Mahmoudi, M.

    2015-12-01

    The Applied Research Center at Florida International University (ARC-FIU) is supporting the soil and groundwater remediation efforts of the U.S. Department of Energy (DOE) Savannah River Site (SRS) by developing a surface water model to simulate the hydrology and the fate and transport of contaminants and sediment in the Tims Branch watershed. The first phase of model development was initiated in 2014 using the MIKE SHE/MIKE 11 hydrological modeling package, which has a geographic information system (GIS) user interface built into its system that can directly use spatial GIS databases (geodatabases) for model inputs. This study developed an ArcGIS geodatabase to support the hydrological modeling work for SRS. The coupling of a geodatabase with MIKE SHE/MIKE 11 numerical models can serve as an efficient tool that significantly reduces the time needed for data preparation. The geodatabase provides an advanced spatial data structure needed to address the management, processing, and analysis of large GIS and time-series datasets derived from multiple sources that are used for numerical model calibration, uncertainty analysis, and simulation of flow and contaminant fate and transport during extreme climatic events. The geodatabase developed is based on the ArcHydro and ArcGIS Base Map data models with modifications made for project-specific input parameters. The significance of this approach was to ensure its replicability for potential application in other watersheds. This paper describes the process of development of the SRS geodatabase and the application of GIS tools to pre-process and analyze hydrological model data; automate repetitive geoprocessing tasks; and produce maps for visualization of the surface water hydrology of the Tims Branch watershed. Key Words: GIS, hydrological modeling, geodatabase, hydrology, MIKE SHE/MIKE 11

  7. Open Tools for Integrated Modelling to Understand SDG development - The OPTIMUS program

    Science.gov (United States)

    Howells, Mark; Zepeda, Eduardo; Rogner, H. Holger; Sanchez, Marco; Roehrl, Alexander; Cicowiez, Matrin; Mentis, Dimitris; Korkevelos, Alexandros; Taliotis, Constantinos; Broad, Oliver; Alfstad, Thomas

    2016-04-01

    electrification simulator; a national CLEW tool allows for the optimization of national-level integrated resource use, and Macro-CLEW presents the same while allowing for detailed economic-biophysical interactions. Finally, the open Model Management Infrastructure (MoManI) is presented, which allows for the rapid prototyping of additions to existing resource optimization tools, or of entirely new ones. Collectively these tools provide insights into some fifteen of the SDGs and are made publicly available with support to governments and academic institutions.

  8. Modeling the Effects of Tool Shoulder and Probe Profile Geometries on Friction Stirred Aluminum Welds Using Response Surface Methodology

    Institute of Scientific and Technical Information of China (English)

    H.K.Mohanty; M.M.Mahapatra; P.Kumar; P.Biswas; N.R.Mandal

    2012-01-01

    The present paper discusses the modeling of tool geometry effects on friction stir aluminum welds using response surface methodology. The friction stir welding tools were designed with different shoulder and tool probe geometries based on a design matrix. The design matrix covered three types of tools, based on three types of probes, with three levels each for defining the shoulder surface type and probe profile geometries. The effects of tool shoulder and probe geometries on friction stirred aluminum welds were then experimentally investigated with respect to weld strength, weld cross-section area, grain size of the weld and grain size of the thermo-mechanically affected zone. These effects were modeled using multiple and response surface regression analysis. The response surface regression modeling was found to be appropriate for defining the friction stir weldment characteristics.
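
    As a reminder of what such a response-surface fit involves, the sketch below fits a full quadratic model in two coded tool-geometry factors to a response by ordinary least squares and locates the stationary point of the fitted surface; the numbers are illustrative, not the paper's weld data.

      # Hedged sketch of a two-factor response-surface (quadratic) regression.
      import numpy as np

      # x1 = shoulder geometry (coded), x2 = probe profile (coded), y = weld strength
      x1 = np.array([-1, -1, -1, 0, 0, 0, 1, 1, 1], float)
      x2 = np.array([-1, 0, 1, -1, 0, 1, -1, 0, 1], float)
      y = np.array([182, 190, 185, 200, 214, 205, 188, 196, 183], float)

      # design matrix for y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
      X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("coefficients:", np.round(beta, 2))

      # stationary point of the fitted surface (candidate optimum geometry)
      H = np.array([[2 * beta[4], beta[3]], [beta[3], 2 * beta[5]]])
      g = -np.array([beta[1], beta[2]])
      print("stationary point (coded units):", np.linalg.solve(H, g))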

  9. MoManI: a tool to facilitate research, analysis, and teaching of computer models

    Science.gov (United States)

    Howells, Mark; Pelakauskas, Martynas; Almulla, Youssef; Tkaczyk, Alan H.; Zepeda, Eduardo

    2017-04-01

    Allocating limited resource efficiently is a task to which efficient planning and policy design aspires. This may be a non-trivial task. For example, the seventh sustainable development goal (SDG) of Agenda 2030 is to provide access to affordable sustainable energy to all. On the one hand, energy is required to realise almost all other SDGs. (A clinic requires electricity for fridges to store vaccines for maternal health, irrigate agriculture requires energy to pump water to crops in dry periods etc.) On the other hand, the energy system is non-trivial. It requires the mapping of resource, its conversion into useable energy and then into machines that we use to meet our needs. That requires new tools that draw from standard techniques, best-in-class models and allow the analyst to develop new models. Thus we present the Model Management Infrastructure (MoManI). MoManI is used to develop, manage, run, store input and results data for linear programming models. MoManI, is a browser-based open source interface for systems modelling. It is available to various user audiences, from policy makers and planners through to academics. For example, we implement the Open Source energy Modelling System (OSeMOSYS) in MoManI. OSeMOSYS is a specialized energy model generator. A typical OSeMOSYS model would represent the current energy system of a country, region or city; in it, equations and constraints are specified; and calibrated to a base year. From that future technologies and policy options are represented. From those scenarios are designed and run. Efficient allocation of energy resource and expenditure on technology is calculated. Finally, results are visualized. At present this is done in relatively rigid interfaces or via (for some) cumbersome text files. Implementing and operating OSeMOSYS in MoManI shortens the learning curve and reduces phobia associated with the complexity of computer modelling, thereby supporting effective capacity building activities. The novel

  10. SURFACE ROUGHNESS PREDICTION MODEL FOR ULTRAPRECISION TURNING ALUMINIUM ALLOY WITH A SINGLE CRYSTAL DIAMOND TOOL

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A surface roughness model utilizing the regression analysis method is developed for predicting the roughness of ultraprecision machined surfaces with a single crystal diamond tool. The effects of the main variables, such as cutting speed, feed, and depth of cut, on surface roughness are also analyzed in diamond turning of aluminum alloy. In order to predict and control the surface roughness before ultraprecision machining, a constrained variable metric method is used to select the optimum cutting conditions during process planning. Extensive experimental results show that the model can predict the surface roughness effectively under the given cutting conditions.
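
    The process-planning step can be sketched as a small constrained optimization. Assuming a fitted power-law roughness model Ra = C * v^a * f^b * d^c with fictitious coefficients (the paper's actual model and variable metric method are not reproduced here), SciPy's constrained minimiser selects cutting conditions that maximise the material removal rate subject to a roughness limit.

      # Hedged sketch: choose cutting conditions under a roughness constraint.
      import numpy as np
      from scipy.optimize import minimize, NonlinearConstraint

      C, a, b, c = 0.8, -0.15, 0.9, 0.1     # fictitious fitted coefficients

      def Ra(x):                             # x = (speed m/min, feed mm/rev, depth mm)
          v, f, d = x
          return C * v**a * f**b * d**c

      def neg_mrr(x):                        # maximise v*f*d by minimising its negative
          v, f, d = x
          return -(v * f * d)

      res = minimize(neg_mrr, x0=[150.0, 0.02, 0.05],
                     bounds=[(50, 300), (0.005, 0.05), (0.01, 0.10)],
                     constraints=NonlinearConstraint(Ra, -np.inf, 0.05))
      v, f, d = res.x
      print(f"v={v:.0f} m/min, f={f:.4f} mm/rev, d={d:.3f} mm, Ra={Ra(res.x):.4f} um")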

  11. An interdisciplinary framework for participatory modeling design and evaluation—What makes models effective participatory decision tools?

    Science.gov (United States)

    Falconi, Stefanie M.; Palmer, Richard N.

    2017-02-01

    Increased requirements for public involvement in water resources management (WRM) over the past century have stimulated the development of more collaborative decision-making methods. Participatory modeling (PM) uses computer models to inform and engage stakeholders in the planning process in order to influence collaborative decisions in WRM. Past evaluations of participatory models focused on process and final outcomes, yet were hindered by diversity of purpose and inconsistent documentation. This paper presents a two-stage framework for evaluating PM based on mechanisms for improving model effectiveness as participatory tools. In the first stage, five dimensions characterize the "who, when, how, and why" of each participatory effort; in the second, models are evaluated as "boundary objects," a concept used to describe tools that bridge understanding and translate different bodies of knowledge to improve credibility, salience, and legitimacy. This evaluation framework is applied to five existing case studies from the literature. Though the goals of participation can be diverse, the novel contribution of the proposed two-stage framework is the flexibility it has to evaluate a wide range of cases that differ in scope, modeling approach, and participatory context. Also, the evaluation criteria provide a structured vocabulary based on clear mechanisms that extend beyond previous process-based and outcome-based evaluations. Effective models are those that take advantage of mechanisms that facilitate dialogue and resolution and improve the accessibility and applicability of technical knowledge. Furthermore, the framework can help build more complete records and systematic documentation of evidence to help standardize the field of PM.

  12. A prototype computer-aided modelling tool for life-support system models

    Science.gov (United States)

    Preisig, H. A.; Lee, Tae-Yeong; Little, Frank

    1990-01-01

    Based on the canonical decomposition of physical-chemical-biological systems, a prototype kernel has been developed to efficiently model alternative life-support systems. It supports (1) work in an interdisciplinary group through an easy-to-use, mostly graphical interface, (2) modularized object-oriented model representation, (3) reuse of models, (4) inheritance of structures from model object to model object, and (5) a model database. The kernel is implemented in Modula-2 and presently operates on an IBM PC.

  13. DYNAMO-HIA--a Dynamic Modeling tool for generic Health Impact Assessments.

    Directory of Open Access Journals (Sweden)

    Stefan K Lhachimi

    Full Text Available BACKGROUND: Currently, no standard tool is publicly available that allows researchers or policy-makers to quantify the impact of policies using epidemiological evidence within the causal framework of Health Impact Assessment (HIA). A standard tool should comply with three technical criteria (real-life population, dynamic projection, explicit risk-factor states) and three usability criteria (modest data requirements, rich model output, generally accessible) to be useful in the applied setting of HIA. With DYNAMO-HIA (Dynamic Modeling for Health Impact Assessment), we introduce such a generic software tool specifically designed to facilitate quantification in the assessment of the health impacts of policies. METHODS AND RESULTS: DYNAMO-HIA quantifies the impact of user-specified risk-factor changes on multiple diseases and in turn on overall population health, comparing one reference scenario with one or more intervention scenarios. The Markov-based modeling approach allows for explicit risk-factor states and simulation of a real-life population. A built-in parameter estimation module ensures that only standard population-level epidemiological evidence is required, i.e. data on incidence, prevalence, relative risks, and mortality. DYNAMO-HIA provides a rich output of summary measures--e.g. life expectancy and disease-free life expectancy--and detailed data--e.g. prevalences and mortality/survival rates--by age, sex, and risk-factor status over time. DYNAMO-HIA is controlled via a graphical user interface and is publicly available from the internet, ensuring general accessibility. We illustrate the use of DYNAMO-HIA with two example applications: a policy causing an overall increase in alcohol consumption and quantifying the disease-burden of smoking. CONCLUSION: By combining modest data needs with general accessibility and user friendliness within the causal framework of HIA, DYNAMO-HIA is a potential standard tool for health impact assessment based
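
    The Markov-based mechanics can be illustrated in miniature. The sketch below, with invented transition rates rather than DYNAMO-HIA's epidemiological inputs, moves a cohort through explicit risk-factor/disease states and compares a reference scenario against an intervention scenario.

      # Hedged sketch of a Markov cohort comparison (toy rates, not DYNAMO-HIA data).
      import numpy as np

      # states: 0 healthy non-smoker, 1 healthy smoker, 2 diseased, 3 dead
      def transition(smoking_start_rate):
          return np.array([
              [0.97 - smoking_start_rate, smoking_start_rate, 0.02, 0.01],
              [0.02, 0.90, 0.06, 0.02],
              [0.00, 0.00, 0.92, 0.08],
              [0.00, 0.00, 0.00, 1.00],
          ])

      def simulate(P, years=30):
          pop = np.array([0.7, 0.3, 0.0, 0.0])   # initial prevalence by state
          for _ in range(years):
              pop = pop @ P                       # one yearly Markov step
          return pop

      ref = simulate(transition(0.05))
      policy = simulate(transition(0.01))         # scenario: fewer smoking starters
      print("diseased, reference vs policy: %.3f vs %.3f" % (ref[2], policy[2]))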

  14. Community Intercomparison Suite (CIS) v1.4.0: a tool for intercomparing models and observations

    Science.gov (United States)

    Watson-Parris, Duncan; Schutgens, Nick; Cook, Nicholas; Kipling, Zak; Kershaw, Philip; Gryspeerdt, Edward; Lawrence, Bryan; Stier, Philip

    2016-09-01

    The Community Intercomparison Suite (CIS) is an easy-to-use command-line tool which has been developed to allow the straightforward intercomparison of remote sensing, in situ and model data. While there are a number of tools available for working with climate model data, the large diversity of sources (and formats) of remote sensing and in situ measurements necessitated a novel software solution. Developed by a professional software company, CIS supports a large number of gridded and ungridded data sources "out-of-the-box", including climate model output in NetCDF or the UK Met Office pp file format, CloudSat, CALIOP (Cloud-Aerosol Lidar with Orthogonal Polarization), MODIS (MODerate resolution Imaging Spectroradiometer), Cloud and Aerosol CCI (Climate Change Initiative) level 2 satellite data and a number of in situ aircraft and ground station data sets. The open-source architecture also supports user-defined plugins to allow many other sources to be easily added. Many of the key operations required when comparing heterogenous data sets are provided by CIS, including subsetting, aggregating, collocating and plotting the data. Output data are written to CF-compliant NetCDF files to ensure interoperability with other tools and systems. The latest documentation, including a user manual and installation instructions, can be found on our website (http://cistools.net). Here, we describe the need which this tool fulfils, followed by descriptions of its main functionality (as at version 1.4.0) and plugin architecture which make it unique in the field.
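
    Of the operations listed, collocation is the least standard. The sketch below shows the basic idea of nearest-neighbour collocation of ungridded observations onto a model grid; CIS itself is a command-line tool, and this toy uses planar distances rather than proper great-circle ones.

      # Hedged sketch of nearest-neighbour collocation (not the CIS implementation).
      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(3)
      obs_lon = rng.uniform(-10, 10, 500)
      obs_lat = rng.uniform(40, 60, 500)
      obs_val = np.sin(np.radians(obs_lat)) + 0.05 * rng.standard_normal(500)

      glon, glat = np.meshgrid(np.arange(-10, 11, 2.5), np.arange(40, 61, 2.5))
      grid_pts = np.column_stack([glon.ravel(), glat.ravel()])

      tree = cKDTree(np.column_stack([obs_lon, obs_lat]))
      dist, idx = tree.query(grid_pts, k=5)          # 5 nearest obs per grid point
      collocated = obs_val[idx].mean(axis=1).reshape(glon.shape)
      print(collocated.shape, collocated.round(2)[0, :4])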

  15. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    Science.gov (United States)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important part of validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementations and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To satisfy this, a number of tree and graph visualization tools were researched and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  16. Comparative evaluation of spectroscopic models using different multivariate statistical tools in a multicancer scenario.

    Science.gov (United States)

    Ghanate, A D; Kothiwale, S; Singh, S P; Bertrand, Dominique; Krishna, C Murali

    2011-02-01

    Cancer is now recognized as one of the major causes of morbidity and mortality. Histopathological diagnosis, the gold standard, is shown to be subjective, time-consuming, prone to interobserver disagreement, and often fails to predict prognosis. Optical spectroscopic methods are being contemplated as adjuncts or alternatives to conventional cancer diagnostics. The most important aspect of these approaches is their objectivity, and multivariate statistical tools play a major role in realizing it. However, rigorous evaluation of the robustness of spectral models is a prerequisite. The utility of Raman spectroscopy in the diagnosis of cancers has been well established. Until now, the specificity and applicability of spectral models have been evaluated for specific cancer types. In this study, we have evaluated the utility of spectroscopic models representing normal and malignant tissues of the breast, cervix, colon, larynx, and oral cavity in a broader perspective, using different multivariate tests. The limit test, which was used in our earlier study, gave high sensitivity but suffered from poor specificity. The performance of other methods such as factorial discriminant analysis and partial least squares discriminant analysis is on par with more complex nonlinear methods such as decision trees, but they provide very little information about the classification model. This comparative study thus demonstrates not just the efficacy of Raman spectroscopic models but also the applicability and limitations of different multivariate tools for discrimination under complex conditions such as the multicancer scenario.
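
    To indicate the kind of workflow being compared, the sketch below runs a PLS discriminant analysis on synthetic two-class "spectra" with scikit-learn; the data are random stand-ins, not the Raman measurements of the study.

      # Hedged sketch of PLS-DA on synthetic spectra (illustrative data only).
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(7)
      n, p = 120, 300                        # 120 spectra, 300 wavenumber channels
      X = rng.standard_normal((n, p))
      y = np.repeat([0, 1], n // 2)
      X[y == 1, 100:110] += 0.8              # a subtle class-specific band

      Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)
      pls = PLSRegression(n_components=5).fit(Xtr, ytr)
      pred = (pls.predict(Xte).ravel() > 0.5).astype(int)   # threshold the PLS score
      print("accuracy: %.2f" % (pred == yte).mean())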

  17. SModelS: A Tool for Making Systematic Use of Simplified Models Results

    Science.gov (United States)

    Waltenberger, Wolfgang; SModelS Group.

    2016-10-01

    We present an automated software tool "SModelS" to systematically confront theories Beyond the Standard Model (BSM) with experimental data. The tool consists of a general procedure to decompose such BSM theories into their Simplified Models Spectra (SMS). In addition, SModelS features a database containing the majority of the published SMS results of CMS and ATLAS. These results consist of the 95% confidence level upper limits on signal production cross sections. The two components together allow us to quickly confront any BSM model with LHC results. As a show-case example we briefly discuss an application of our procedure to a specific supersymmetric model. One of our ongoing efforts is to extend the framework to also include efficiency maps produced either by the experimental collaborations, by efforts within the phenomenological groups, or possibly by ourselves. While the current implementation can handle null results only, our ultimate goal is to build the Next Standard Model in a bottom-up fashion from both negative and positive results of several experiments. The implementation is open source, written in python, and available from http://smodels.hephy.at.
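
    The decision rule at the heart of such a confrontation is compact: interpolate the published upper limit at the model's mass point and compare it with the predicted cross section. The sketch below uses a fictitious digitised limit curve, not the SModelS database or API.

      # Hedged sketch of an upper-limit comparison (invented limit curve).
      import numpy as np

      # fictitious 95% CL upper-limit curve: sigma95 [pb] vs mediator mass [GeV]
      masses = np.array([400.0, 600.0, 800.0, 1000.0, 1200.0])
      sigma95 = np.array([1.50, 0.40, 0.12, 0.05, 0.02])

      def r_value(mass, sigma_pred):
          ul = np.interp(mass, masses, sigma95)
          return sigma_pred / ul          # r >= 1 means the point is excluded

      print("r =", round(r_value(700.0, 0.30), 2))   # 0.30 pb predicted at 700 GeV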

  18. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    Energy Technology Data Exchange (ETDEWEB)

    Staub, Florian [CERN, Theoretical Physics Department, Geneva (Switzerland); Athron, Peter [Monash University, ARC Centre of Excellence for Particle Physics at the Terascale, School of Physics, Melbourne, VIC (Australia); Basso, Lorenzo [CPPM, Aix-Marseille Universite, CNRS-IN2P3, UMR 7346, Marseille Cedex 9 (France); Goodsell, Mark D. [Sorbonne Universites, LPTHE, UMR 7589, CNRS and Universite Pierre et Marie Curie, Paris Cedex 05 (France); Harries, Dylan [The University of Adelaide, Department of Physics, ARC Centre of Excellence for Particle Physics at the Terascale, Adelaide, SA (Australia); Krauss, Manuel E.; Nickel, Kilian; Opferkuch, Toby [Bethe Center for Theoretical Physics and Physikalisches Institut der Universitaet Bonn, Bonn (Germany); Ubaldi, Lorenzo [Tel-Aviv University, Raymond and Beverly Sackler School of Physics and Astronomy, Tel Aviv (Israel); Vicente, Avelino [Instituto de Fisica Corpuscular (CSIC-Universitat de Valencia), Valencia (Spain); Voigt, Alexander [Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany)

    2016-09-15

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model. (orig.)

  19. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    Energy Technology Data Exchange (ETDEWEB)

    Staub, Florian [CERN, Geneva (Switzerland). Theoretical Physics Dept.; Athron, Peter [Monash Univ., Melbourne (Australia). ARC Center of Excellence for Particle Physics at the Terascale; Basso, Lorenzo [Aix-Marseille Univ., CNRS-IN2P3, UMR 7346 (France). CPPM; and others

    2016-02-15

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model.

  20. Comparative evaluation of spectroscopic models using different multivariate statistical tools in a multicancer scenario

    Science.gov (United States)

    Ghanate, A. D.; Kothiwale, S.; Singh, S. P.; Bertrand, Dominique; Krishna, C. Murali

    2011-02-01

    Cancer is now recognized as one of the major causes of morbidity and mortality. Histopathological diagnosis, the gold standard, is shown to be subjective, time-consuming, prone to interobserver disagreement, and often fails to predict prognosis. Optical spectroscopic methods are being contemplated as adjuncts or alternatives to conventional cancer diagnostics. The most important aspect of these approaches is their objectivity, and multivariate statistical tools play a major role in realizing it. However, rigorous evaluation of the robustness of spectral models is a prerequisite. The utility of Raman spectroscopy in the diagnosis of cancers has been well established. Until now, the specificity and applicability of spectral models have been evaluated for specific cancer types. In this study, we have evaluated the utility of spectroscopic models representing normal and malignant tissues of the breast, cervix, colon, larynx, and oral cavity in a broader perspective, using different multivariate tests. The limit test, which was used in our earlier study, gave high sensitivity but suffered from poor specificity. The performance of other methods such as factorial discriminant analysis and partial least squares discriminant analysis is on par with more complex nonlinear methods such as decision trees, but they provide very little information about the classification model. This comparative study thus demonstrates not just the efficacy of Raman spectroscopic models but also the applicability and limitations of different multivariate tools for discrimination under complex conditions such as the multicancer scenario.

  1. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    Science.gov (United States)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important part of validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementations and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To satisfy this, a number of tree and graph visualization tools were researched and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  2. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    Science.gov (United States)

    Staub, Florian; Athron, Peter; Basso, Lorenzo; Goodsell, Mark D.; Harries, Dylan; Krauss, Manuel E.; Nickel, Kilian; Opferkuch, Toby; Ubaldi, Lorenzo; Vicente, Avelino; Voigt, Alexander

    2016-09-01

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model.

  3. Visinets: a web-based pathway modeling and dynamic visualization tool.

    Directory of Open Access Journals (Sweden)

    Jozef Spychala

    Full Text Available In this report we describe a novel graphically oriented method for pathway modeling and a software package that allows for both modeling and visualization of biological networks in a user-friendly format. The Visinets mathematical approach is based on causal mapping (CMAP), which has been fully integrated with the graphical interface. Such integration allows for a fully graphical and interactive process of modeling, from building the network to simulation of the finished model. To test the performance of the Visinets software we have applied it to: (a) create an executable EGFR-MAPK pathway model using an intuitive graphical way of modeling based on biological data, and (b) translate an existing ordinary differential equation (ODE) based insulin signaling model into the CMAP formalism and compare the results. Our testing fully confirmed the potential of the CMAP method for broad application in pathway modeling and visualization and, additionally, showed a significant advantage in computational efficiency. Furthermore, we showed that the Visinets web-based graphical platform, along with a standardized method of pathway analysis, may offer a novel and attractive alternative for dynamic simulation in real time for broader use in biomedical research. Since Visinets uses graphical elements with the mathematical formulas hidden from the users, we believe that this tool may be particularly suited for those who are new to pathway modeling and lack the in-depth modeling skills often required when using other software packages.
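
    A minimal flavour of causal-map dynamics, with an invented update rule and weights rather than Visinets' actual equations, can be written in a few lines: node activities are repeatedly updated from weighted influences and squashed into [0, 1].

      # Hedged sketch of causal-map style dynamics (toy weights and update rule).
      import numpy as np

      nodes = ["EGF", "EGFR", "RAS", "ERK"]
      W = np.zeros((4, 4))           # W[i, j]: influence of node j on node i
      W[1, 0] = 1.0                  # EGF activates EGFR
      W[2, 1] = 0.9                  # EGFR activates RAS
      W[3, 2] = 0.8                  # RAS activates ERK
      W[1, 3] = -0.5                 # ERK feeds back negatively on EGFR

      x = np.array([1.0, 0.0, 0.0, 0.0])      # stimulate with EGF
      for step in range(25):
          x = 1.0 / (1.0 + np.exp(-4.0 * (W @ x - 0.25)))   # squashed update
          x[0] = 1.0                           # hold the input node on
      print(dict(zip(nodes, x.round(2))))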

  4. Hydrological model of the Lower Biebrza Basin; using the model as a management tool

    NARCIS (Netherlands)

    Nauta, A.B.; Bielecka, J.; Querner, E.P.

    2005-01-01

    The main objective of the research is the setup of a groundwater model which can be used for predicting consequences of different land use management options. The constructed Lower Biebrza model is based on the SIMGRO model (SIMulation of GROundwater and surface water levels), a distributed physical

  5. Improving Water Management Decision Support Tools Using NASA Satellite and Modeling Data

    Science.gov (United States)

    Toll, D. L.; Arsenault, K.; Nigro, J.; Pinheiro, A.; Engman, E. T.; Triggs, J.; Cosgrove, B.; Alonge, C.; Boyle, D.; Allen, R.; Townsend, P.; Ni-Meister, W.

    2006-05-01

    One of twelve Applications of National priority within NASA's Applied Science Program, the Water Management Program Element addresses concerns and decision making related to water availability, water forecasts and water quality. The goal of the Water Management Program Element is to encourage water management organizations to use NASA Earth science data, model products, technology and other capabilities in their decision support tools for problem solving. The Water Management Program Element partners with Federal agencies, academia and private firms, and may include international organizations. This paper further describes the Water Management Program with the objective of informing the applications community of the potential opportunities for using NASA science products for problem solving. We illustrate some ongoing Water Management application projects that evaluate and benchmark NASA data with partnering federal agencies and their decision support tools: 1) the Environmental Protection Agency for water quality; 2) the Bureau of Reclamation for water supply, demand and forecasting; and 3) the NOAA National Weather Service for improved weather prediction. Examples of the types of NASA contributions to these agency decision support tools include: 1) satellite observations within models that assist in estimating water storage, i.e., snow water equivalent, soil moisture, aquifer volumes, or reservoir storage; 2) model-derived products, i.e., evapotranspiration, precipitation, runoff, groundwater recharge, and other 4-dimensional data assimilation products; 3) improved water quality assessments using better inputs from NASA models (precipitation, evaporation) and satellite observations (e.g., temperature, turbidity, land cover) to nonpoint source models; and 4) water (i.e., precipitation) and temperature predictions from days to decades over local, regional and global scales.

  6. NOTATION TOOLS OF BUSINESS MODELING OF THE SERVICES ON REAL ESTATE MARKET

    Directory of Open Access Journals (Sweden)

    Mishlanova Marina Yur’evna

    2016-04-01

    The article is devoted to the development of the main provisions of realtor business modeling. The paper presents the development of a notational complex involved in the design of the conceptual model, the formation of a reference model of the real estate business, and the basic rules for implementing the model. Important notational aspects of the proposed model are highlighted. The functional orientation of the real estate business towards rendering services reflects a functional approach to business modeling. In order to enable assessment of the offered services, a nested model of the object is proposed. A reasoned functional approach using object-based elements allows optimizing the processes of business modeling and the assessment of the results. The article discusses functional modeling of business focused on results. Synchronizing the functional model with the models of business processes and the sub-models of objects, in particular the model of the business result, contributes to the improvement of the notation tools. The article presents the adaptation of the business model template to the conditions of realtor activity. The proposed reference model specifies a logical scheme of decomposition of the activity, which distinguishes economic, social and other values. The decomposition of services into functional groups, with account for individual values and functional modules, is presented: buying and selling real estate; mortgages and loans; rent of residential and commercial property; independent valuation of real estate; and consultations concerning real estate transactions. Focusing on the results of business processes and the performance standards of realtor organizations, a transitional notation for evaluating business performance efficiency is developed. A simple feedback method for assessing customer satisfaction and, consequently, system efficiency is offered.

  7. Bayesian Reliability Modeling and Assessment Solution for NC Machine Tools under Small-sample Data

    Institute of Scientific and Technical Information of China (English)

    YANG Zhaojun; KAN Yingnan; CHEN Fei; XU Binbin; CHEN Chuanhai; YANG Chuangui

    2015-01-01

    Although Markov chain Monte Carlo (MCMC) algorithms are accurate, many factors may cause instability when they are utilized in reliability analysis; such instability makes these algorithms unsuitable for widespread engineering applications. Thus, a reliability modeling and assessment solution aimed at small-sample data of numerical control (NC) machine tools is proposed on the basis of Bayesian theory. An expert-judgment process of fusing multi-source prior information is developed to obtain the Weibull parameters' prior distributions and reduce the subjective bias of usual expert-judgment methods. The grid approximation method is applied to the two-parameter Weibull distribution to derive the formulas for the parameters' posterior distributions and overcome the calculation difficulty of high-dimensional integration. The method is then applied to the real data of a type of NC machine tool to implement a reliability assessment and obtain the mean time between failures (MTBF). The relative error of the proposed method is 5.8020×10⁻⁴ compared with the MTBF obtained by the MCMC algorithm. This result indicates that the proposed method is as accurate as MCMC. The newly developed solution for reliability modeling and assessment of NC machine tools under small-sample data is easy, practical, and highly suitable for widespread application in the engineering field; in addition, the solution does not reduce accuracy.
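
    The grid approximation described here can be reproduced in a few lines: discretize the two Weibull parameters, evaluate prior × likelihood on the grid, normalize, and read off the posterior-mean MTBF. A minimal sketch under an assumed flat prior and made-up failure data, not the multi-source expert priors of the paper:

```python
import numpy as np
from scipy.special import gamma

t = np.array([420., 610., 890., 1300., 1750.])   # hypothetical times between failures, h

shape = np.linspace(0.5, 3.0, 200)               # Weibull shape (beta) grid
scale = np.linspace(200., 4000., 300)            # Weibull scale (eta) grid, h
B, E = np.meshgrid(shape, scale, indexing="ij")

# Log-likelihood of a complete Weibull sample, evaluated on the whole grid.
loglik = sum(np.log(B / E) + (B - 1) * np.log(ti / E) - (ti / E) ** B for ti in t)

post = np.exp(loglik - loglik.max())             # flat prior: posterior ∝ likelihood
post /= post.sum()

mtbf_grid = E * gamma(1.0 + 1.0 / B)             # MTBF = eta * Gamma(1 + 1/beta)
print("posterior-mean MTBF:", (post * mtbf_grid).sum())
```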

  8. RHydro - Hydrological models and tools to represent and analyze hydrological data in R

    Science.gov (United States)

    Reusser, D. E.; Buytaert, W.; Vitolo, C.

    2012-04-01

    In hydrology, basic equations and procedures keep being implemented from scratch by scientists, with the potential for errors and inefficiency. The use of libraries can overcome these problems. As an example, hydrological libraries could contain: 1. major representations of hydrological processes such as infiltration, sub-surface runoff and routing algorithms; 2. scaling functions, for instance to combine remote sensing precipitation fields with rain gauge data; 3. data consistency checks; 4. performance measures. Here we present a beginning for such a library implemented in the high-level data programming language R. Currently, Topmodel, the abc model, HBV, a multi-model ensemble called FUSE, data import routines for WaSiM-ETH, as well as basic visualization and evaluation tools are implemented. Care is taken to make functions and models compatible with other existing frameworks in hydrology, such as Hydromad.
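
    To illustrate what a library entry for one of these conceptual models might look like, here is a minimal sketch of the classic "abc" rainfall-runoff model mentioned above (a, b and c are the storage, loss and release fractions; the parameter values and rainfall series are arbitrary):

```python
import numpy as np

def abc_model(precip, a=0.4, b=0.2, c=0.1, s0=0.0):
    """Classic 'abc' conceptual rainfall-runoff model.

    Q_t = (1 - a - b) * P_t + c * S_{t-1}
    S_t = S_{t-1} + a * P_t - c * S_{t-1}
    """
    s, runoff = s0, []
    for p in precip:
        runoff.append((1.0 - a - b) * p + c * s)
        s = s + a * p - c * s
    return np.array(runoff)

rain = np.array([0.0, 12.0, 5.0, 0.0, 0.0, 20.0, 3.0])  # mm/day, illustrative
print(abc_model(rain))
```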

  9. Progress in the Development of an Integrated Modeling Tool to Support DIII-D and EAST

    Science.gov (United States)

    Ren, Q.; Lao, L. L.; Chu, M. S.; St. John, H. E.; Abla, G.; Collier, A.; Prater, R.; Park, J. M.; Li, G.; Guo, W.; Pan, C.; Srinivasan, R.; Worrall, M.

    2009-11-01

    Recent progress in the development of the IMFIT Integrated modeling tool is presented. The goal of IMFIT is to develop a modern and efficient integrated modeling platform to support DIII-D and EAST research, including the capability to simulate the behavior of tokamak discharges. Recent progress includes user-friendly and Python-based GUIs with multi-links to equilibrium, transport, and stability codes to facilitate modeling and analysis, and EFIT F90 upgrade with dynamic memory allocation and MPI option to support multiple devices and grid sizes. Through the GUI, straightforward analysis for kinetic EFIT reconstruction is made available. Ongoing developments include design of efficient algorithms to support interactions among physics modules such as EFIT/ONETWO/TGLF coupling for scenario development and transport flux analysis and EFIT/PEST3/TORAY coupling for modeling of tearing mode stability. Details will be presented.

  10. The Shape of Dark Matter Haloes II. The Galactus HI Modelling & Fitting Tool

    CERN Document Server

    Peters, S P C; Allen, R J; Freeman, K C

    2016-01-01

    We present a new HI modelling tool called Galactus. The program has been designed to perform automated fits of disc-galaxy models to observations. It includes a treatment for the self-absorption of the gas. The software has been released into the public domain. We describe the design philosophy and inner workings of the program. After this, we model the face-on galaxy NGC2403, using both self-absorption and optically thin models, showing that self-absorption occurs even in face-on galaxies. It is shown that the maximum surface brightness plateaus seen in Paper I of this series are indeed signs of self-absorption. The apparent HI mass of an edge-on galaxy can be drastically lower compared to that same galaxy seen face-on. The Tully-Fisher relation is found to be relatively free from self-absorption issues.

  11. DEVELOPMENT OF FUZZY MODEL FOR POWDER MIXED ELECTRO DISCHARGE MACHINING USING COPPER AND GRAPHITE TOOL MATERIAL

    Directory of Open Access Journals (Sweden)

    SONI S.S.

    2012-09-01

    This paper describes the development of a fuzzy logic model for the powder mixed electro discharge machining (PMEDM) process. The developed fuzzy model implements triangular and trapezoidal membership functions for fuzzification and the centre-of-area method for defuzzification. The process parameters selected as control variables for the experimental work were tool material, type of powder, concentration of powder in the dielectric medium, and peak current. The machining operation was conducted using copper and graphite as electrode materials on mild steel workpiece material. The powder additives used in the experiment were aluminum and silicon, because of their significantly different electrical and thermal properties. The dielectric fluid used was kerosene. The response parameters selected are material removal rate and electrode wear rate. Response surfaces are developed from the fuzzy system model, and exemplar plots are developed to compare the responses from the fuzzy model and the experiments.
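
    The two ingredients named here, triangular/trapezoidal membership functions and centre-of-area defuzzification, are compact enough to sketch directly. The universe of discourse and the aggregated output set below are illustrative stand-ins, not the paper's actual rule base:

```python
import numpy as np

def tri_mf(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def centroid(x, mu):
    """Centre-of-area defuzzification of a fuzzy set mu(x)."""
    return np.trapz(mu * x, x) / np.trapz(mu, x)

x = np.linspace(0.0, 10.0, 1001)                  # e.g. a material-removal-rate axis
# Aggregated output set: two triangular sets clipped by the firing strengths
# (0.7 and 0.4) of two hypothetical rules.
mu = np.maximum(np.minimum(tri_mf(x, 1, 3, 5), 0.7),
                np.minimum(tri_mf(x, 4, 7, 9), 0.4))
print("crisp output:", centroid(x, mu))
```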

  12. Modeling of edge effect in subaperture tool influence functions of computer controlled optical surfacing.

    Science.gov (United States)

    Wan, Songlin; Zhang, Xiangchao; He, Xiaoying; Xu, Min

    2016-12-20

    Computer controlled optical surfacing requires an accurate tool influence function (TIF) for reliable path planning and deterministic fabrication. Near the edge of the workpieces, the TIF has a nonlinear removal behavior, which will cause a severe edge-roll phenomenon. In the present paper, a new edge pressure model is developed based on the finite element analysis results. The model is represented as the product of a basic pressure function and a correcting function. The basic pressure distribution is calculated according to the surface shape of the polishing pad, and the correcting function is used to compensate the errors caused by the edge effect. Practical experimental results demonstrate that the new model can accurately predict the edge TIFs with different overhang ratios. The relative error of the new edge model can be reduced to 15%.

  13. Numerical modelling of tools steel hardening. A thermal phenomena and phase transformations

    Directory of Open Access Journals (Sweden)

    T. Domański

    2010-01-01

    In this paper a model of tool steel hardening that takes into consideration thermal phenomena and phase transformations in the solid state is presented. In the modelling of thermal phenomena the heat transfer equation has been solved by the finite element method. The continuous heating (CHT) and continuous cooling (CCT) diagrams of the considered steel are used in the model of phase transformations. The fractions of phases transformed during continuous heating (austenite) and continuous cooling (pearlite or bainite) are described in the model by the Johnson-Mehl-Avrami formula. For cooling rates >100 K/s the modified Koistinen-Marburger equation is used; the modified Koistinen-Marburger equation determines the forming fraction of martensite.
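
    In their standard forms (the notation here is the usual one; the record itself gives no symbols), the two kinetics laws referenced above are the Johnson-Mehl-Avrami equation for diffusional transformations and the Koistinen-Marburger equation for the martensite fraction formed below the start temperature:

$$
X(t) = 1 - \exp\left(-b\,t^{\,n}\right), \qquad
X_M(T) = 1 - \exp\left(-k\,(M_s - T)\right),
$$

    where $X$ is the transformed phase fraction, $b$ and $n$ are transformation-dependent coefficients, $M_s$ is the martensite start temperature, and $k \approx 0.011\,\mathrm{K^{-1}}$ in the original Koistinen-Marburger fit for carbon steels.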

  14. Clinical Prediction Model and Tool for Assessing Risk of Persistent Pain After Breast Cancer Surgery

    DEFF Research Database (Denmark)

    Meretoja, Tuomo J; Andersen, Kenneth Geving; Bruce, Julie

    2017-01-01

    ... at 1 year postoperatively were developed by logistic regression analyses in the Finnish patient cohort. The models were tested in two independent cohorts from Denmark and Scotland by assessing the areas under the receiver operating characteristic curves (ROC-AUCs). The outcome variable was moderate ... high body mass index (P = .039), axillary lymph node dissection (P = .008), and more severe acute postoperative pain intensity at the seventh postoperative day (P = .003) predicted persistent pain in the final prediction model, which performed well in the Danish (ROC-AUC, 0.739) and Scottish (ROC-AUC, 0.740) cohorts. At the 20% risk level, the model had 32.8% and 47.4% sensitivity and 94.4% and 82.4% specificity in the Danish and Scottish cohorts, respectively. Conclusion: Our validated prediction models and an online risk calculator provide clinicians and researchers with a simple tool to screen ...
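
    At the point of use, a prediction model of this kind reduces to a logistic equation over the patient's predictor values plus a screening threshold. A minimal sketch of how such a risk calculator operates; the coefficients below are invented placeholders, not the published model:

```python
import math

# Hypothetical logistic-regression coefficients (placeholders, NOT the published model).
INTERCEPT = -3.2
COEFS = {"high_bmi": 0.45, "axillary_dissection": 0.80, "acute_pain_day7": 0.25}

def persistent_pain_risk(patient):
    """Predicted probability from a logistic model: 1 / (1 + exp(-linear_predictor))."""
    lp = INTERCEPT + sum(COEFS[k] * v for k, v in patient.items())
    return 1.0 / (1.0 + math.exp(-lp))

patient = {"high_bmi": 1, "axillary_dissection": 1, "acute_pain_day7": 6}
risk = persistent_pain_risk(patient)
print(f"risk = {risk:.1%}; screen = {risk >= 0.20}")   # 20% risk threshold
```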

  15. Mathematical modelling of migration: A suitable tool for the enforcement authorities?

    DEFF Research Database (Denmark)

    Petersen, Jens Højslev; Trier, Xenia Thorsager; Fabech, B.

    2005-01-01

    A few years ago, it became accepted that the plastics industry could use migration modelling for compliance testing. When a calculation confirms that the migration of a compound from a plastic material or article is below the specific migration limit, this is considered sufficient documentation for compliance with legislation. In the case of non-compliance, the result needs to be verified experimentally. The European Commission recommends that the enforcement authorities use migration modelling as well, to avoid long and expensive analysis. The aim of the present work was to investigate the practical possibilities of implementing migration-modelling software as a tool in official food control and possibly in improving the own-check programmes of Danish plastic-converting plants. Food inspectors from nine regional food control centres initially attended a training course in the use of a commercial modelling ...

  16. An improved model for the oPtImal Measurement Probes Allocation tool

    Energy Technology Data Exchange (ETDEWEB)

    Sterle, C., E-mail: claudio.sterle@unina.it [Consorzio CREATE/Dipartimento di Ingegneria Elettrica e delle Tecnologie dell’Informazione, Università degli Studi di Napoli Federico II, Via Claudio 21, 80125 Napoli (Italy); Neto, A.C. [Fusion for Energy, 08019 Barcelona (Spain); De Tommasi, G. [Consorzio CREATE/Dipartimento di Ingegneria Elettrica e delle Tecnologie dell’Informazione, Università degli Studi di Napoli Federico II, Via Claudio 21, 80125 Napoli (Italy)

    2015-10-15

    Highlights: • The problem of optimally allocating the probes of a diagnostic system is tackled. • The problem is decomposed in two consecutive optimization problems. • Two original ILP models are proposed and sequentially solved to optimality. • The proposed ILP models improve and extend the previous work present in literature. • Real size instances have been optimally solved with very low computation time. - Abstract: The oPtImal Measurement Probes Allocation (PIMPA) tool has been recently proposed in [1] to maximize the reliability of a tokamak diagnostic system against the failure of one or more of the processing nodes. PIMPA is based on the solution of integer linear programming (ILP) problems, and it minimizes the effect of the failure of a data acquisition component. The first formulation of the PIMPA model did not support the concept of individual slots. This work presents an improved ILP model that addresses the above mentioned problem, by taking into account all the individual probes.
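
    The record does not give the ILP formulation, but one plausible reading of "minimizing the effect of the failure of a data acquisition component" is a balanced assignment: place each probe on exactly one acquisition node while minimizing the worst-case number of probes lost with any single node failure. A sketch of that simplified model using the PuLP library (probe/node counts are arbitrary):

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, value

probes, nodes = range(12), range(3)
prob = LpProblem("probe_allocation", LpMinimize)

assign = LpVariable.dicts("assign", (probes, nodes), cat=LpBinary)
worst = LpVariable("worst_case_loss", lowBound=0)

prob += worst                                   # objective: minimize the worst node load
for p in probes:                                # each probe goes to exactly one node
    prob += lpSum(assign[p][n] for n in nodes) == 1
for n in nodes:                                 # every node load is bounded by `worst`
    prob += lpSum(assign[p][n] for p in probes) <= worst

prob.solve()
print("worst-case probes lost on a single-node failure:", value(worst))
```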

  17. Modeling of Complexometric Titration Data: Applications and Implications of New Computational Tools and Thermodynamic Data

    Science.gov (United States)

    Hudson, R.; Omanovic, D.; Kogut, M.; Voelker, B. M.

    2016-02-01

    Complexometric titration of natural ligands in seawater using the competitive ligand equilibration-adsorptive cathodic stripping voltammetry (CLE-AdCSV) method is the method of choice for characterizing the organic complexation of Cu and Fe in seawater. Interpreting such titration data is made difficult by the complexity of the modeling process, which arises from the need to estimate non-linear model equations, the potential for artifacts, and the use of reference equilibrium constants that have been the subject of only limited study. Due to the need to model multi-component equilibrium systems when interpreting these titration data, a variety of approximations have been made in order to allow standard linear and non-linear regression tools to be applied. Two software tools, KINETEQL and ProMCC, solve the model equations exactly and allow users to estimate complexation model parameters accurately. ProMCC excels in visualization and ease-of-use, while KINETEQL provides the user with flexibility in the definition of equilibrium models and has the additional capability of solving reaction kinetics problems. A detailed example of the application of KINETEQL to simulating the kinetics of Cu(II) complexation by EDTA in seawater will be illustrated. The implications of kinetics for experimental determination of the stability constants of natural Cu- and Fe-binding ligands will be addressed. These modeling tools make it feasible to design experiments and analyze datasets using new, complex approaches to data analysis, i.e., data from multiple CLE-AdCSV titrations obtained in different analytical windows. This approach can help solve the problem of internal calibration in waters that contain mixtures of weak and strong ligands. Because it attempts to model data that span a much wider range in chemistries, the "multiwindow" approach is especially vulnerable to bias in the reference complex stability constants. The difficulty of obtaining coherent models of multiwindow CLE-AdCSV datasets

  18. Application of Mathematical Modelling as a Tool to Analyze the EEG Signals in Rat Model of Focal Cerebral Ischemia

    Science.gov (United States)

    Paul, S.; Bhattacharya, P.; Pandey, A. K.; Patnaik, R.

    2014-01-01

    The present paper envisages the application of mathematical modelling with the autoregressive (AR) method as a tool to analyze electroencephalogram (EEG) data in rat subjects with transient focal cerebral ischemia. This modelling method was used to determine the frequencies and characteristic changes in brain waveforms which occur as a result of disorders or fluctuating physiological states. This method of analysis was utilized to ensure an actual correlation between the different mathematical paradigms. The EEG data were obtained from different regions of the rat brain and were modelled by the AR method on a MATLAB platform. AR modelling was utilized to study the long-term functional outcomes of a stroke and is also preferable for EEG signal analysis because the signals consist of discrete frequency intervals. Modern spectral analysis, namely AR spectrum analysis, was used to correlate the conditional and prevalent changes in brain function in response to a stroke.
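
    AR modelling of a signal segment comes down to solving the Yule-Walker equations for the autoregressive coefficients, from which a parametric power spectrum follows. A minimal numpy/scipy sketch on a synthetic signal (the sampling rate and the 10 Hz test component are assumptions; the EEG data themselves are obviously not reproduced here):

```python
import numpy as np
from scipy.linalg import toeplitz, solve

def yule_walker(x, order):
    """Fit AR(p) coefficients by solving the Yule-Walker equations."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:] / len(x)  # autocovariances
    a = solve(toeplitz(r[:order]), r[1:order + 1])             # AR coefficients
    sigma2 = r[0] - a @ r[1:order + 1]                         # innovation variance
    return a, sigma2

def ar_spectrum(a, sigma2, freqs, fs):
    """Parametric PSD of the fitted AR model."""
    k = np.arange(1, len(a) + 1)
    denom = np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs / fs, k)) @ a) ** 2
    return sigma2 / denom

fs = 250.0                                          # assumed sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # 10 Hz rhythm + noise
a, s2 = yule_walker(x, order=8)
freqs = np.linspace(1, 40, 200)
print("spectral peak near", freqs[np.argmax(ar_spectrum(a, s2, freqs, fs))], "Hz")
```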

  19. An Interactive Tool for Analysis and Optimization of Texture Parameters in Photorealistic Virtual 3d Models

    Science.gov (United States)

    Sima, A. A.; Buckley, S. J.; Viola, I.

    2012-07-01

    Texture mapping is a common method for combining surface geometry with image data, with the resulting photorealistic 3D models being suitable not only for visualization purposes but also for interpretation and spatial measurement in many application fields, such as cultural heritage and the earth sciences. When acquiring images for the creation of photorealistic models, it is usual to collect more data than is finally necessary for the texturing process. Images may be collected from multiple locations, sometimes with different cameras or lens configurations, and large amounts of overlap may exist. Consequently, much redundancy may be present, requiring sorting to choose the most suitable images to texture the model triangles. This paper presents a framework for visualization and analysis of the geometric relations between triangles of the terrain model and covering image sets. The application provides decision support for the selection of an image subset optimized for 3D model texturing purposes, for non-specialists. It aims to improve the communication of geometrical dependencies between model triangles and the available digital images, through the use of static and interactive information visualization methods. The tool was used for computer-aided selection of image subsets optimized for texturing of 3D geological outcrop models. The resulting textured models were of high quality and with a minimum of missing texture, and the time spent on reprocessing was reduced. Anecdotal evidence indicated that increased user confidence in the final textured model quality and completeness makes the framework highly beneficial.

  20. Modeling and Adhesive Tool Wear in Dry Drilling of Aluminum Alloys

    Science.gov (United States)

    Girot, F.; Gutiérrez-Orrantia, M. E.; Calamaz, M.; Coupard, D.

    2011-01-01

    One of the challenges in aeronautic drilling operations is the elimination of cutting fluids while maintaining the quality of drilled parts. This paper therefore aims to increase the tool life and process quality by working on relationships existing between drilling parameters (cutting speed and feed rate), coatings and tool geometry. In dry drilling, the phenomenon of Built-Up Layer is the predominant damage mechanism. A model fitting the axial force with the cutting parameters and the damage has been developed. The burr thickness and its dispersion decrease with the feed rate. The current diamond coatings which exhibit a strong adhesion to the carbide substrate can limit this adhesive layer phenomenon. A relatively smooth nano-structured coating strongly limits the development of this layer.

  1. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    Science.gov (United States)

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.

  2. Development of modelling method selection tool for health services management: From problem structuring methods to modelling and simulation methods

    Directory of Open Access Journals (Sweden)

    Naseer Aisha

    2011-05-01

    Background: There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, the current use is limited and answers to questions such as what methods to use and when remain somewhat underdeveloped. Aim: The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them or better commission relevant modelling and simulation work. Methods: This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and four input resources required (time, money, knowledge and data). Results: The characterisation is presented in matrix forms to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills are scarce let alone money and time. Conclusions: A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision making processes. In particular it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to be obtained from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on methods comparison and selection.

  3. Development of modelling method selection tool for health services management: from problem structuring methods to modelling and simulation methods.

    Science.gov (United States)

    Jun, Gyuchan T; Morris, Zoe; Eldabi, Tillal; Harper, Paul; Naseer, Aisha; Patel, Brijesh; Clarkson, John P

    2011-05-19

    There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, the current use is limited and answers to questions such as what methods to use and when remain somewhat underdeveloped. The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them or better commission relevant modelling and simulation work. This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and four input resources required (time, money, knowledge and data). The characterisation is presented in matrix forms to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills are scarce let alone money and time. A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision making processes. In particular it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to be obtained from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on methods comparison and selection.

  4. SPRINT: A Tool to Generate Concurrent Transaction-Level Models from Sequential Code

    Directory of Open Access Journals (Sweden)

    Cockx Johan

    2007-01-01

    A high-level concurrent model such as a SystemC transaction-level model can provide early feedback during the exploration of implementation alternatives for state-of-the-art signal processing applications like video codecs on a multiprocessor platform. However, the creation of such a model starting from sequential code is a time-consuming and error-prone task. It is typically done only once, if at all, for a given design. This lack of exploration of the design space often leads to a suboptimal implementation. To support our systematic C-based design flow, we have developed a tool to generate a concurrent SystemC transaction-level model for user-selected task boundaries. Using this tool, different parallelization alternatives have been evaluated during the design of an MPEG-4 simple profile encoder and an embedded zero-tree coder. Generation plus evaluation of an alternative was possible in less than six minutes. This is fast enough to allow extensive exploration of the design space.

  5. SPRINT: A Tool to Generate Concurrent Transaction-Level Models from Sequential Code

    Directory of Open Access Journals (Sweden)

    Richard Stahl

    2007-01-01

    A high-level concurrent model such as a SystemC transaction-level model can provide early feedback during the exploration of implementation alternatives for state-of-the-art signal processing applications like video codecs on a multiprocessor platform. However, the creation of such a model starting from sequential code is a time-consuming and error-prone task. It is typically done only once, if at all, for a given design. This lack of exploration of the design space often leads to a suboptimal implementation. To support our systematic C-based design flow, we have developed a tool to generate a concurrent SystemC transaction-level model for user-selected task boundaries. Using this tool, different parallelization alternatives have been evaluated during the design of an MPEG-4 simple profile encoder and an embedded zero-tree coder. Generation plus evaluation of an alternative was possible in less than six minutes. This is fast enough to allow extensive exploration of the design space.

  6. Model-based reasoning: using visual tools to reveal student learning.

    Science.gov (United States)

    Luckie, Douglas; Harrison, Scott H; Ebert-May, Diane

    2011-03-01

    Using visual models is common in science and should become more common in classrooms. Our research group has developed and completed studies on the use of a visual modeling tool, the Concept Connector. This modeling tool consists of an online concept mapping Java applet that has automatic scoring functions we refer to as Robograder. The Concept Connector enables students in large introductory science courses to visualize their thinking through online model building. The Concept Connector's flexible scoring system, based on tested grading schemes as well as instructor input, has enabled >1,000 physiology students to build maps of their ideas about plant and animal physiology with the guidance of automatic and immediate online scoring of homework. Criterion concept maps developed by instructors in this project contain numerous expert-generated or "correct" propositions connecting two concept words together with a linking phrase. In this study, holistic algorithms were used to test automated methods of scoring concept maps that might work as well as a human grader.

  7. ADOPT: A tool for automatic detection of tectonic plates at the surface of convection models

    Science.gov (United States)

    Mallard, C.; Jacquet, B.; Coltice, N.

    2017-08-01

    Mantle convection models with plate-like behavior produce surface structures comparable to Earth's plate boundaries. However, analyzing those structures is a difficult task, since convection models produce, as on Earth, diffuse deformation and elusive plate boundaries. Therefore we present here and share a quantitative tool to identify plate boundaries and produce plate polygon layouts from the results of numerical models of convection: Automatic Detection Of Plate Tectonics (ADOPT). This digital tool operates within the free open-source visualization software Paraview. It is based on image segmentation techniques to detect objects. The fundamental algorithm used in ADOPT is the watershed transform. We transform the output of convection models into a topographic map, the crest lines being the regions of deformation (plate boundaries) and the catchment basins being the plate interiors. We propose two generic protocols (the field and the distance methods) that we test against an independent visual detection of plate polygons. We show that ADOPT is effective in identifying the smaller plates and in closing plate polygons in areas where boundaries are diffuse or elusive. ADOPT allows the export of plate polygons in the standard OGR-GMT format for visualization, modification, and analysis under generic software packages like GMT or GPlates.
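
    The watershed transform at the core of ADOPT is available off the shelf; a minimal sketch of the "topographic map" idea on a synthetic deformation field, using scikit-image (the marker-seeding threshold is an assumption here, not one of the paper's two actual protocols):

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

# Synthetic surface deformation field: two 'plates' separated by a high-strain band.
y, x = np.mgrid[0:200, 0:200]
deformation = np.exp(-((x - y) ** 2) / 500.0)      # crest line along the diagonal

# Seed one marker per low-deformation basin (plate interior).
interiors, n_plates = ndi.label(deformation < 0.1)

# Flood the 'topography': catchment basins become plate polygons,
# crest lines (high deformation) become the plate boundaries.
plates = watershed(deformation, markers=interiors)
print("plates found:", n_plates, "| label image shape:", plates.shape)
```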

  8. System Solver: an open source tool for mathematically modelling dynamical systems

    Directory of Open Access Journals (Sweden)

    Efraín Domínguez

    2010-10-01

    The following paper presents a freeware modelling tool for simulating dynamic systems that can be represented by either an ordinary differential equation (ODE) or a set of differential equations of different orders. The main idea behind this software development is that many physical, biological, ecological, economic, chemical, social and engineering problems can be expressed in this way. Furthermore, the solution of these problems requires some expertise in numerical methods and programming, and such knowledge is uncommon among some of the experts in those scientific domains. A tool to fill this knowledge gap, increase productivity within modelling-related research, and support the teaching of mathematical modelling topics is thus needed. This paper introduces System Solver, a computer application that facilitates the formulation of initial value problems for ODE systems, numerically solves these problems, and provides the user with not only the solution but also debugged Visual Basic source code for the application. The obtained code can be easily shared among researchers, which facilitates the replication of numerical experiments even across different operating systems. The software's introduction is accompanied by examples from different domains, including one example from stochastic modelling.
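
    The class of problems System Solver targets, an initial value problem for an ODE system with higher-order equations rewritten as first-order systems, is easy to illustrate. A sketch of a damped oscillator x'' + 2ζωx' + ω²x = 0 solved with SciPy (System Solver itself emits Visual Basic, so this is an analogy, not its output; the parameter values are illustrative):

```python
import numpy as np
from scipy.integrate import solve_ivp

OMEGA, ZETA = 2.0, 0.15   # natural frequency and damping ratio (illustrative)

def rhs(t, y):
    """x'' + 2*zeta*omega*x' + omega^2*x = 0 rewritten as a first-order system."""
    x, v = y
    return [v, -2 * ZETA * OMEGA * v - OMEGA ** 2 * x]

sol = solve_ivp(rhs, (0.0, 10.0), y0=[1.0, 0.0], t_eval=np.linspace(0, 10, 201))
print("displacement at t = 10 s:", sol.y[0, -1])
```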

  9. A Modeling Tool for Evaluating Climate Change Impacts on Water Supply System

    Science.gov (United States)

    Chuang, L.; Tung, C.; Liu, T.

    2009-12-01

    Climate change may exacerbate short-term climate variability and produce more extreme hydrological events, which may in turn impact human society and the natural environment. Socioeconomic development depends on adequate water resources, but climate change may affect water supply systems, including available streamflow, groundwater, and irrigation water demand. The purpose of this study is to apply an integrated modeling tool to assess climate change impacts on regional water supply systems and then to propose response strategies that strengthen adaptive capacity and achieve sustainable water use. The modeling tool integrates the functions of downscaling, weather generation, hydrological modeling, and an interface for linking system dynamics models. The Touchien river basin in Taiwan, which hosts a high-tech industry park, is chosen as the study area. The vulnerability of the water supply system was evaluated for present and future conditions. The results demonstrated that the water supply system could meet current water demand, but might be subjected to serious water shortage due to future climate change and increasing water demand. Finally, this study provides suggestions to government agencies for developing better water resources management strategies to mitigate the impacts of a changing climate.

  10. The use of the articulated total body model as a robot dynamics simulation tool

    Science.gov (United States)

    Obergfell, Louise A.; Avula, Xavier J. R.; Kalegs, Ints

    1988-01-01

    The Articulated Total Body (ATB) model is a computer simulation program which was originally developed for the study of aircrew member dynamics during ejection from high-speed aircraft. This model is fully three-dimensional and is based on the rigid-body dynamics of coupled systems, using Euler's equations of motion with constraint relations of the type employed in the Lagrange method. In this paper the use of the ATB model as a robot dynamics simulation tool is discussed and various simulations are demonstrated. For this purpose the ATB model has been modified to allow for the application of torques at the joints as functions of the state variables of the system. Specifically, the motion of a robotic arm with six revolute articulations, with joint torques prescribed as functions of angular displacement and angular velocity, is demonstrated. The simulation procedures developed in this work may serve as valuable tools for analyzing robotic mechanisms, dynamic effects, joint load transmission, feedback control algorithms employed in actuator control, and end-effector trajectories.
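
    The rigid-body core referred to here is Euler's equations in the body frame; in standard notation (not reproduced from the paper),

$$
\mathbf{I}\,\dot{\boldsymbol{\omega}} + \boldsymbol{\omega} \times \left(\mathbf{I}\,\boldsymbol{\omega}\right) = \boldsymbol{\tau},
$$

    where $\mathbf{I}$ is the inertia tensor of a segment, $\boldsymbol{\omega}$ its angular velocity and $\boldsymbol{\tau}$ the applied torque; Lagrange-type constraint relations couple these equations across the joints, and the prescribed joint torques $\tau_i(\theta_i, \dot{\theta}_i)$ close the loop for the robotic simulations described.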

  11. Coarse-grained models of protein folding: toy models or predictive tools?

    Science.gov (United States)

    Clementi, Cecilia

    2008-02-01

    Coarse-grained models are emerging as a practical alternative to all-atom simulations for the characterization of protein folding mechanisms over long time scales. While a decade ago minimalist toy models were mainly designed to test general hypotheses on the principles regulating protein folding, the latest coarse-grained models are increasingly realistic and can be used to characterize quantitatively the detailed folding mechanism of specific proteins. The ability of such models to reproduce the essential features of folding dynamics suggests that each single atomic degree of freedom is not by itself particularly relevant to folding and supports a statistical mechanical approach to characterize folding transitions. When combined with more refined models and with experimental studies, the systematic investigation of protein systems and complexes using coarse-grained models can advance our theoretical understanding of the actual organizing principles that emerge from the complex network of interactions among protein atomic constituents.

  12. Modeling and Control of the Cobelli Model as a Personalized Prescriptive Tool for Diabetes Treatment

    Science.gov (United States)

    2016-11-05

    ... utilized with Model Predictive Control (MPC) to adequately choose medications for future dosing based on physiological accuracy and convenience. A proof of concept demonstrated the ... iteration of the model is complete or the simulation is ended. [Figure 4: Model Predictive Control Algorithm with Time Sample Visual]

  13. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  14. An Interactive Tool For Semi-automated Statistical Prediction Using Earth Observations and Models

    Science.gov (United States)

    Zaitchik, B. F.; Berhane, F.; Tadesse, T.

    2015-12-01

    We developed a semi-automated statistical prediction tool applicable to concurrent analysis or seasonal prediction of any time series variable in any geographic location. The tool was developed using Shiny, JavaScript, HTML and CSS. A user can extract a predictand by drawing a polygon over a region of interest on the provided user interface (global map). The user can select the Climatic Research Unit (CRU) precipitation or Climate Hazards Group InfraRed Precipitation with Station data (CHIRPS) as predictand. They can also upload their own predictand time series. Predictors can be extracted from sea surface temperature, sea level pressure, winds at different pressure levels, air temperature at various pressure levels, and geopotential height at different pressure levels. By default, reanalysis fields are applied as predictors, but the user can also upload their own predictors, including a wide range of compatible satellite-derived datasets. The package generates correlations of the variables selected with the predictand. The user also has the option to generate composites of the variables based on the predictand. Next, the user can extract predictors by drawing polygons over the regions that show strong correlations (composites). Then, the user can select some or all of the statistical prediction models provided. Provided models include Linear Regression models (GLM, SGLM), Tree-based models (bagging, random forest, boosting), Artificial Neural Network, and other non-linear models such as Generalized Additive Model (GAM) and Multivariate Adaptive Regression Splines (MARS). Finally, the user can download the analysis steps they used, such as the region they selected, the time period they specified, the predictand and predictors they chose and preprocessing options they used, and the model results in PDF or HTML format. Key words: Semi-automated prediction, Shiny, R, GLM, ANN, RF, GAM, MARS

  15. Python tools for rapid development, calibration, and analysis of generalized groundwater-flow models

    Science.gov (United States)

    Starn, J. J.; Belitz, K.

    2014-12-01

    National-scale water-quality data sets for the United States have been available for several decades; however, groundwater models to interpret these data are available for only a small percentage of the country. Generalized models may be adequate to explain and project groundwater-quality trends at the national scale by using regional-scale models (defined as watersheds at or between the HUC-6 and HUC-8 levels). Coast-to-coast data such as the National Hydrologic Dataset Plus (NHD+) make it possible to extract the basic building blocks for a model anywhere in the country. IPython notebooks have been developed to automate the creation of generalized groundwater-flow models from the NHD+. The notebook format allows rapid testing of methods for model creation, calibration, and analysis. Capabilities within the Python ecosystem greatly speed up the development and testing of algorithms. GeoPandas is used for very efficient geospatial processing. Raster processing includes the Geospatial Data Abstraction Library and image processing tools. Model creation is made possible through Flopy, a versatile input and output writer for several MODFLOW-based flow and transport model codes. Interpolation, integration, and map plotting included in the standard Python tool stack are also used, making the notebook a comprehensive platform within which to build and evaluate general models. Models with alternative boundary conditions, number of layers, and cell spacing can be tested against one another and evaluated by using water-quality data. Novel calibration criteria were developed by comparing modeled heads to land-surface and surface-water elevations. Information, such as predicted age distributions, can be extracted from general models and tested for its ability to explain water-quality trends. Groundwater ages then can be correlated with horizontal and vertical hydrologic position, a relation that can be used for statistical assessment of likely groundwater-quality conditions.

  16. A web GIS based integrated flood assessment modeling tool for coastal urban watersheds

    Science.gov (United States)

    Kulkarni, A. T.; Mohanty, J.; Eldho, T. I.; Rao, E. P.; Mohan, B. K.

    2014-03-01

    Urban flooding has become an increasingly important issue in many parts of the world. In this study, an integrated flood assessment model (IFAM) is presented for coastal urban flood simulation. A web-based GIS framework has been adopted to organize the spatial datasets for the study area and to run the model within this framework. The integrated flood model consists of a mass-balance-based 1-D overland flow model, a 1-D finite-element channel flow model based on the diffusion wave approximation, and a quasi-2D raster flood inundation model based on the continuity equation. The model code is written in MATLAB and the application is integrated within a web GIS server product, viz. Web Gram Server™ (WGS), developed at IIT Bombay using Java, JSP and JQuery technologies. Its user interface is developed using OpenLayers, and the attribute data are stored in the MySQL open-source DBMS. The model is integrated within WGS and is called via JavaScript. The application has been demonstrated for two coastal urban watersheds of Navi Mumbai, India. Simulated flood extents for the extreme rainfall event of 26 July 2005 in the two urban watersheds of Navi Mumbai city are presented and discussed. The study demonstrates the effectiveness of the flood simulation tool in a web GIS environment in facilitating data access and visualization of GIS datasets and simulation results.

  17. Advanced semi-active engine and transmission mounts: tools for modelling, analysis, design, and tuning

    Science.gov (United States)

    Farjoud, Alireza; Taylor, Russell; Schumann, Eric; Schlangen, Timothy

    2014-02-01

    This paper is focused on modelling, design, and testing of semi-active magneto-rheological (MR) engine and transmission mounts used in the automotive industry. The purpose is to develop a complete analysis, synthesis, design, and tuning tool that reduces the need for expensive and time-consuming laboratory and field tests. A detailed mathematical model of such devices is developed using multi-physics modelling techniques for physical systems with various energy domains. The model includes all major features of an MR mount including fluid dynamics, fluid track, elastic components, decoupler, rate-dip, gas-charged chamber, MR fluid rheology, magnetic circuit, electronic driver, and control algorithm. Conventional passive hydraulic mounts can also be studied using the same mathematical model. The model is validated using standard experimental procedures. It is used for design and parametric study of mounts; effects of various geometric and material parameters on dynamic response of mounts can be studied. Additionally, this model can be used to test various control strategies to obtain best vibration isolation performance by tuning control parameters. Another benefit of this work is that nonlinear interactions between sub-components of the mount can be observed and investigated. This is not possible by using simplified linear models currently available.

  18. MatVPC: A User-Friendly MATLAB-Based Tool for the Simulation and Evaluation of Systems Pharmacology Models.

    Science.gov (United States)

    Biliouris, K; Lavielle, M; Trame, M N

    2015-09-01

    Quantitative systems pharmacology (QSP) models are progressively entering the arena of contemporary pharmacology. The efficient implementation and evaluation of complex QSP models necessitates the development of flexible computational tools that are built into QSP mainstream software. To this end, we present MatVPC, a versatile MATLAB-based tool that accommodates QSP models of any complexity level. MatVPC executes Monte Carlo simulations as well as automatic construction of visual predictive checks (VPCs) and quantified VPCs (QVPCs).
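
    The core of a VPC is simple to state: simulate the model many times with random parameter draws and overlay prediction percentiles on the observations. A minimal numpy sketch with a one-compartment kinetic model standing in for a QSP model (all parameter values and variability magnitudes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 24.0, 49)                        # time grid, h
n_sim, dose = 1000, 100.0                             # Monte Carlo replicates, dose in mg

# Lognormal between-subject variability on clearance and volume (illustrative).
cl = 5.0 * np.exp(0.3 * rng.standard_normal(n_sim))   # L/h
v = 50.0 * np.exp(0.2 * rng.standard_normal(n_sim))   # L

# Concentration-time profiles for every replicate, shape (n_sim, len(t)).
conc = (dose / v)[:, None] * np.exp(-(cl / v)[:, None] * t)

# VPC bands: 5th/50th/95th prediction percentiles at every time point.
p5, p50, p95 = np.percentile(conc, [5, 50, 95], axis=0)
print("median concentration at 2 h:", p50[np.searchsorted(t, 2.0)], "mg/L")
```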

  19. Modelling of stratification in cryogenic launch vehicle tanks in a fast engineering tool

    Science.gov (United States)

    van Foreest, Arnold

    Thermal stratification in cryogenic launch vehicle tanks can lead to several problems, such as sudden pressure drops in the tank due to sloshing of the stratified liquid, or cavitation in rocket engine turbopumps. To obtain an optimal stage design, the stratification process must be taken into account. Currently, stratification is often modelled by 3D CFD solvers, which is an extremely time-consuming process. Analytical models do exist but are inaccurate. This paper will show how the currently existing analytical models are improved, by using experimental data and results obtained from numerical calculations using the 3D CFD tool FLOW 3D. The goal is to be able to model a stratification process of a few hundred seconds in just a few seconds of CPU time, about a factor of 100 faster than the physical process takes. A simulation using a 3D flow solver can take multiple days; setting up the model for a 3D flow solver can take even longer. Therefore it would be a big advantage to have fast engineering tools describing the process, so that stratification can be taken into account in the preliminary design phase. The stratification process has been investigated experimentally at ZARM (Center of Applied Space Technology and Microgravity), using a closed tank filled with liquid nitrogen. Due to unavoidable heat leaks from the surroundings, the liquid will start to heat up and thermal layers will form. The experiments are simulated using the commercial 3D flow solver FLOW 3D. Once satisfying numerical results have been obtained, the stratification process can be investigated in more detail. The dimensioning parameters can be determined and their influence quantified. From these analyses it has been found that, for example, heat conduction through the tank wall in the tangential direction has a big impact on the formation of thermal layers. The currently available analytical models for ...

  20. A flexible, interactive software tool for fitting the parameters of neuronal models

    Directory of Open Access Journals (Sweden)

    Péter Friedrich

    2014-07-01

    The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problem of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential) integrate-and-fire neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting ...
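
    As a toy version of the passive-parameter fitting mentioned above, one can fit the membrane resistance and time constant of a current-step charging curve V(t) = V_rest + R·I·(1 − e^(−t/τ)) with SciPy's least-squares fitter. The synthetic "data" and true parameter values below are invented, not the biological traces used in the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

V_REST, I_STEP = -70.0, 0.1      # resting potential (mV) and current step (nA), assumed known

def step_response(t, r_m, tau):
    """Passive membrane response to a current step (r_m in MOhm, tau in ms)."""
    return V_REST + r_m * I_STEP * (1.0 - np.exp(-t / tau))

t = np.linspace(0.0, 100.0, 200)                     # ms
rng = np.random.default_rng(1)
data = step_response(t, r_m=150.0, tau=20.0) + 0.3 * rng.standard_normal(t.size)

popt, _ = curve_fit(step_response, t, data, p0=[100.0, 10.0])
print("fitted R_m = %.1f MOhm, tau = %.1f ms" % tuple(popt))
```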

  1. Establishing a novel modeling tool: a python-based interface for a neuromorphic hardware system.

    Science.gov (United States)

    Brüderle, Daniel; Müller, Eric; Davison, Andrew; Muller, Eilif; Schemmel, Johannes; Meier, Karlheinz

    2009-01-01

    Neuromorphic hardware systems provide new possibilities for the neuroscience modeling community. Due to the intrinsic parallelism of the micro-electronic emulation of neural computation, such models are highly scalable without a loss of speed. However, the communities of software simulator users and neuromorphic engineering in neuroscience are rather disjoint. We present a software concept that provides the possibility to establish such hardware devices as valuable modeling tools. It is based on the integration of the hardware interface into a simulator-independent language which allows for unified experiment descriptions that can be run on various simulation platforms without modification, implying experiment portability and a huge simplification of the quantitative comparison of hardware and simulator results. We introduce an accelerated neuromorphic hardware device and describe the implementation of the proposed concept for this system. An example setup and results acquired by utilizing both the hardware system and a software simulator are demonstrated.
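
    The key idea, one experiment description that runs unchanged on either a software simulator or the hardware backend, can be caricatured in a few lines of Python. This is a toy of the concept only; the class and method names below are invented for illustration and are not the actual simulator-independent language described in the record:

```python
# Toy illustration of a backend-independent experiment description.
# Both backends expose the same two calls, so the experiment code never changes.

class SoftwareSimulator:
    def create_population(self, n):
        print(f"[software] allocating {n} model neurons")
    def run(self, t_ms):
        print(f"[software] integrating equations for {t_ms} ms")

class NeuromorphicHardware:
    def create_population(self, n):
        print(f"[hardware] mapping {n} neurons onto analog circuits")
    def run(self, t_ms):
        print(f"[hardware] emulating {t_ms} ms in accelerated wall-clock time")

def experiment(backend):
    """Identical experiment description, portable across backends."""
    backend.create_population(100)
    backend.run(1000.0)

for backend in (SoftwareSimulator(), NeuromorphicHardware()):
    experiment(backend)   # results from both runs can then be compared directly
```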

  2. Social network modeling: a powerful tool for the study of group scale phenomena in primates.

    Science.gov (United States)

    Jacobs, Armand; Petit, Odile

    2011-08-01

    Social Network Analysis is now a valuable tool to study social complexity in many animal species, including primates. However, this framework has rarely been used to implement quantitative data on the social structure of a group within computer models. Such approaches allow the investigation of how social organization constrains other traits and also how these traits can impact the social organization in return. In this commentary, we discuss the powerful potential of social network modeling as a way to study group scale phenomena in primates. We describe the advantages of using such a method and we focus on the specificity of this approach in primates, given the particularities of their social networks compared with those of other taxa. We also give practical considerations and a list of examples as for the choice of parameters that can be used to implement the social layer within the models.

  3. Establishing a novel modeling tool: a python-based interface for a neuromorphic hardware system

    Directory of Open Access Journals (Sweden)

    Daniel Brüderle

    2009-06-01

    Neuromorphic hardware systems provide new possibilities for the neuroscience modeling community. Due to the intrinsic parallelism of the micro-electronic emulation of neural computation, such models are highly scalable without a loss of speed. However, the communities of software simulator users and neuromorphic engineering in neuroscience are rather disjoint. We present a software concept that provides the possibility to establish such hardware devices as valuable modeling tools. It is based on the integration of the hardware interface into a simulator-independent language which allows for unified experiment descriptions that can be run on various simulation platforms without modification, implying experiment portability and a huge simplification of the quantitative comparison of hardware and simulator results. We introduce an accelerated neuromorphic hardware device and describe the implementation of the proposed concept for this system. An example setup and results acquired by utilizing both the hardware system and a software simulator are demonstrated.

  4. Bayesian network as a modelling tool for risk management in agriculture

    DEFF Research Database (Denmark)

    Rasmussen, Svend; Madsen, Anders L.; Lund, Mogens

    The importance of risk management increases as farmers become more exposed to risk. But risk management is a difficult topic because income risk is the result of the complex interaction of multiple risk factors combined with the effect of an increasing array of possible risk management tools. In this paper we use Bayesian networks as an integrated modelling approach for representing uncertainty and analysing risk management in agriculture. It is shown how historical farm account data may be efficiently used to estimate conditional probabilities, which are the core elements in Bayesian network models. We further show how the Bayesian network model RiBay is used for stochastic simulation of farm income, and we demonstrate how RiBay can be used to simulate risk management at the farm level. It is concluded that the key strength of a Bayesian network is the transparency of assumptions ...

  5. Validation and Use of a Predictive Modeling Tool: Employing Scientific Findings to Improve Responsible Conduct of Research Education.

    Science.gov (United States)

    Mulhearn, Tyler J; Watts, Logan L; Todd, E Michelle; Medeiros, Kelsey E; Connelly, Shane; Mumford, Michael D

    2017-01-01

    Although recent evidence suggests ethics education can be effective, the nature of specific training programs, and their effectiveness, varies considerably. Building on a recent path modeling effort, the present study developed and validated a predictive modeling tool for responsible conduct of research education. The predictive modeling tool allows users to enter ratings in relation to a given ethics training program and receive instantaneous evaluative information for course refinement. Validation work suggests the tool's predicted outcomes correlate strongly (r = 0.46) with objective course outcomes. Implications for training program development and refinement are discussed.

  6. Hybrid Neural Network Approach Based Tool for the Modelling of Photovoltaic Panels

    Directory of Open Access Journals (Sweden)

    Antonino Laudani

    2015-01-01

    A hybrid neural network approach based tool for identifying the photovoltaic one-diode model is presented. The generalization capabilities of neural networks are used together with the robustness of the reduced form of the one-diode model. Indeed, from the studies performed by the authors and the works present in the literature, it was found that a direct computation of the five parameters via a multiple-input, multiple-output neural network is a very difficult task. The reduced form consists of a series of explicit formulae in support of the neural network, which, in our case, is aimed at predicting just two of the five parameters identifying the model; the other three parameters are computed by the reduced form. The present hybrid approach is efficient from the computational cost point of view and accurate in the estimation of the five parameters. It constitutes a complete and extremely easy tool, suitable for implementation in a microcontroller-based architecture. Validations are made on about 10,000 PV panels belonging to the California Energy Commission database.
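
    For context, the five parameters of the one-diode model are conventionally the photocurrent, the diode saturation current, the ideality factor, and the series and shunt resistances. The sketch below evaluates the standard implicit single-diode equation for a given parameter set; the numbers are placeholders, and the paper's reduced-form formulae and neural network are not reproduced here.

```python
import numpy as np
from scipy.optimize import brentq

# Standard one-diode model:
#   I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh
# Placeholder parameter values; in the paper two of the five come from a
# neural network and three from the reduced-form explicit formulae.
Iph, I0, n, Rs, Rsh = 5.0, 1e-9, 1.3, 0.2, 300.0
Vt = 0.025852 * 36  # thermal voltage at ~300 K times 36 series cells

def current(V):
    """Solve the implicit diode equation for I at a given terminal voltage."""
    f = lambda I: (Iph - I0 * (np.exp((V + I * Rs) / (n * Vt)) - 1)
                   - (V + I * Rs) / Rsh - I)
    return brentq(f, -2 * Iph, 2 * Iph)

voltages = np.linspace(0.0, 22.0, 50)
currents = np.array([current(v) for v in voltages])
powers = voltages * currents
print(f"approx. maximum power point: {powers.max():.1f} W")
```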

  7. DESTINY: A Comprehensive Tool with 3D and Multi-Level Cell Memory Modeling Capability

    Directory of Open Access Journals (Sweden)

    Sparsh Mittal

    2017-09-01

    To enable the design of large-capacity memory structures, novel memory technologies such as non-volatile memory (NVM) and novel fabrication approaches, e.g., 3D stacking and multi-level cell (MLC) design, have been explored. The existing modeling tools, however, cover only a few memory technologies, technology nodes and fabrication approaches. We present DESTINY, a tool for modeling 2D/3D memories designed using SRAM, resistive RAM (ReRAM), spin transfer torque RAM (STT-RAM), phase change RAM (PCM) and embedded DRAM (eDRAM), and 2D memories designed using spin orbit torque RAM (SOT-RAM), domain wall memory (DWM) and Flash memory. In addition to single-level cell (SLC) designs for all of these memories, DESTINY also supports modeling MLC designs for NVMs. We have extensively validated DESTINY against commercial and research prototypes of these memories. DESTINY is very useful for performing design-space exploration across several dimensions, such as optimizing for a target (e.g., latency, area or energy-delay product) for a given memory technology, choosing the suitable memory technology or fabrication method (i.e., 2D vs. 3D) for a given optimization target, etc. We believe that DESTINY will boost studies of next-generation memory architectures used in systems ranging from mobile devices to extreme-scale supercomputers. The latest source code of DESTINY is available from the following git repository: https://bitbucket.org/sparshmittal/destinyv2.
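
    As a flavor of the kind of design-space exploration such a tool enables, the sketch below picks the technology minimizing an energy-delay product from a table of per-access latency and energy figures. The numbers are made up for illustration and do not come from DESTINY's models.

```python
# Toy design-space exploration: choose the memory technology with the best
# energy-delay product (EDP). Latency (ns) and energy (pJ) values are invented.
candidates = {
    "SRAM":    {"latency_ns": 1.0, "energy_pj": 10.0, "area_mm2": 4.0},
    "STT-RAM": {"latency_ns": 3.0, "energy_pj": 6.0,  "area_mm2": 1.5},
    "ReRAM":   {"latency_ns": 5.0, "energy_pj": 4.0,  "area_mm2": 1.0},
    "eDRAM":   {"latency_ns": 2.0, "energy_pj": 8.0,  "area_mm2": 2.0},
}

def edp(tech):
    """Energy-delay product of one candidate technology."""
    c = candidates[tech]
    return c["latency_ns"] * c["energy_pj"]

for tech in sorted(candidates, key=edp):
    print(f"{tech:8s} EDP = {edp(tech):5.1f} ns*pJ")
print("best technology for EDP target:", min(candidates, key=edp))
```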

  8. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    Directory of Open Access Journals (Sweden)

    Wilson David P

    2008-02-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated.
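
    The core workflow — Latin hypercube sampling of parameter space followed by rank-correlation-based sensitivity measures — can be illustrated independently of the Matlab toolbox. A minimal Python sketch, with a made-up two-parameter model standing in for the epidemic model:

```python
import numpy as np
from scipy.stats import qmc, spearmanr

# Latin hypercube sample of a 2D parameter space (illustrative ranges).
sampler = qmc.LatinHypercube(d=2, seed=0)
unit = sampler.random(n=500)
lower, upper = [0.1, 0.05], [0.5, 0.2]   # e.g., transmission and recovery rates
params = qmc.scale(unit, lower, upper)

def model_output(beta, gamma):
    """Stand-in model: basic reproduction number of a simple epidemic model."""
    return beta / gamma

outputs = np.array([model_output(b, g) for b, g in params])

# Rank correlation as a simple sensitivity measure for each parameter.
for i, name in enumerate(["beta", "gamma"]):
    rho, p = spearmanr(params[:, i], outputs)
    print(f"{name}: Spearman rho = {rho:+.2f} (p = {p:.1e})")
```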

  9. Fourier transform based dynamic error modeling method for ultra-precision machine tool

    Science.gov (United States)

    Chen, Guoda; Liang, Yingchun; Ehmann, Kornel F.; Sun, Yazhou; Bai, Qingshun

    2014-08-01

    In some industrial fields, the workpiece surface needs to meet not only the demand of surface roughness but also strict requirements on multi-scale frequency domain errors. The ultra-precision machine tool is the most important carrier for the ultra-precision machining of parts, and its errors are the key factor influencing the multi-scale frequency domain errors of the machined surface. Volumetric error modeling is the important bridge linking the machine errors to the machined surface errors. However, the error modeling methods available from previous research are hard to use for analyzing the relationship between the dynamic errors of the machine motion components and the multi-scale frequency domain errors of the machined surface, an analysis that serves as an important reference in the design and accuracy improvement of ultra-precision machine tools. In this paper, a Fourier transform based dynamic error modeling method is presented, built on the theoretical basis of rigid body kinematics and homogeneous transformation matrices. A case study is carried out, which shows that the proposed method can provide a consistent and regular numerical description of the machine dynamic errors and the volumetric errors. The proposed method has strong potential for the prediction of frequency domain errors on the machined surface, extraction of multi-scale frequency domain error information, and analysis of the relationship between the machine motion components and frequency domain errors of the machined surface.
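
    The two ingredients the method combines — homogeneous transformation matrices for chaining the machine's motion components, and the Fourier transform for moving the resulting error motions into the frequency domain — can be sketched as follows. This is a generic illustration of those building blocks, with a fabricated error signal, not a reconstruction of the paper's model.

```python
import numpy as np

def translation(x, y, z):
    """4x4 homogeneous transform for a pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Fabricated straightness error of a linear axis sampled along its travel:
# a slow bow plus a small periodic component (e.g., from the ball screw).
n, travel = 1024, 100.0                      # samples, mm
pos = np.linspace(0.0, travel, n)
err_y = 1e-3 * np.sin(2 * np.pi * pos / travel) + 2e-4 * np.sin(2 * np.pi * pos / 5.0)

# Chain the ideal axis motion with its error motion; track the tool point.
tool_y = np.array([(translation(p, 0, 0) @ translation(0, e, 0))[1, 3]
                   for p, e in zip(pos, err_y)])

# Spatial-frequency content of the error motion via the FFT.
spectrum = np.abs(np.fft.rfft(tool_y - tool_y.mean())) / n
freqs = np.fft.rfftfreq(n, d=travel / n)     # cycles per mm
peak = freqs[np.argmax(spectrum[1:]) + 1]
print(f"dominant spatial frequency: {peak:.3f} cycles/mm")
```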

  10. The smoke-fireplume model: tool for eventual application to prescribed burns and wildland fires.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D. F.; Dunn, W. E.; Lazaro, M. A.; Policastro, A. J.

    1999-08-17

    Land managers are increasingly implementing strategies that employ fire in prescribed burns to sustain ecosystems, and they plan to sustain the rate of increase in its use over the next five years. In planning and executing the expanded use of fire in wildland treatment, it is important to estimate the human health and safety consequences, property damage, and the extent of visibility degradation from the resulting conflagration: pyrolysis gases, soot and smoke generated during flaming, smoldering and/or glowing fires. Traditional approaches have often employed the analysis of weather observations and forecasts to determine whether a prescribed burn will affect populations, property, or protected Class I areas. However, the complexity of the problem lends itself to advanced PC-based models that are simple to use for both calculating the emissions from the burning of wildland fuels and the downwind dispersion of smoke and other products of pyrolysis, distillation, and/or fuel combustion. These models need to address the effects of residual smoldering combustion, including plume dynamics and optical effects. In this paper, we discuss a suite of tools that can be applied for analyzing dispersion. These tools include the dispersion models FIREPLUME and SMOKE, together with the meteorological preprocessor SEBMET.
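
    The FIREPLUME and SMOKE models themselves are not described in enough detail here to reproduce; as a generic illustration of the downwind dispersion calculation such tools perform, below is a textbook Gaussian plume estimate of ground-level concentration. All source and dispersion values are placeholders.

```python
import numpy as np

def gaussian_plume(y, z, Q, u, H, sigma_y, sigma_z):
    """Textbook Gaussian plume concentration (g/m^3) with ground reflection.

    Q: emission rate (g/s), u: wind speed (m/s), H: effective release height (m),
    sigma_y/sigma_z: dispersion coefficients (m) at the downwind distance of interest.
    """
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Placeholder smoke source: 500 g/s released at 50 m effective height, 3 m/s wind,
# with dispersion coefficients assumed for ~1 km downwind in neutral conditions.
c = gaussian_plume(y=0.0, z=1.5, Q=500.0, u=3.0,
                   H=50.0, sigma_y=70.0, sigma_z=30.0)
print(f"ground-level centerline concentration: {c * 1e6:.0f} ug/m^3")
```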

  11. Sampling and sensitivity analyses tools (SaSAT) for computational modelling.

    Science.gov (United States)

    Hoare, Alexander; Regan, David G; Wilson, David P

    2008-02-27

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab, a numerical mathematical software package, and utilises algorithms contained in the Matlab Statistics Toolbox. However, Matlab is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated.

  12. Cost Benefit Analysis Modeling Tool for Electric vs. ICE Airport Ground Support Equipment – Development and Results

    Energy Technology Data Exchange (ETDEWEB)

    James Francfort; Kevin Morrow; Dimitri Hochard

    2007-02-01

    This report documents efforts to develop a computer tool for modeling the comparative economic payback of airport ground support equipment (GSE) propelled by either electric motors or gasoline and diesel engines. The types of GSE modeled are pushback tractors, baggage tractors, and belt loaders. The GSE modeling tool includes an emissions module that estimates the amount of tailpipe emissions saved by replacing internal combustion engine GSE with electric GSE. This report contains the modeling assumptions, methodology, a user's manual, and modeling results. The model was developed based on the operations of two airlines at four United States airports.
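
    The report's modeling assumptions are not reproduced here; the sketch below just illustrates the shape of such a payback calculation, with every cost figure invented for the example.

```python
# Toy payback comparison: electric vs. internal-combustion baggage tractor.
# All figures are invented placeholders, not values from the report.
premium = 25_000.0          # extra purchase cost of the electric unit ($)
fuel_cost_per_hr = 6.50     # diesel fuel cost for the ICE unit ($/h)
elec_cost_per_hr = 1.80     # charging cost for the electric unit ($/h)
maint_saving_per_hr = 1.20  # maintenance savings of the electric drive ($/h)
hours_per_year = 2_000      # annual utilization

saving_per_hr = (fuel_cost_per_hr - elec_cost_per_hr) + maint_saving_per_hr
annual_saving = saving_per_hr * hours_per_year
payback_years = premium / annual_saving
print(f"simple payback: {payback_years:.1f} years "
      f"(${annual_saving:,.0f} saved per year)")
```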

  13. GraphCrunch 2: Software tool for network modeling, alignment and clustering

    Directory of Open Access Journals (Sweden)

    Hayes Wayne

    2011-01-01

    Background: Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. Results: We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other...
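
    The comparison GraphCrunch automates — fit random-graph models to a data network and compare easily computable properties — can be sketched with networkx. The Erdős–Rényi model below is one standard null model; the "data" network here is itself synthetic for the sake of a runnable example.

```python
import networkx as nx

# Synthetic stand-in for a PPI "data" network (scale-free-ish).
data_net = nx.barabasi_albert_graph(n=500, m=3, seed=1)

# Null model with matched size and density: Erdos-Renyi random graph.
n, m = data_net.number_of_nodes(), data_net.number_of_edges()
p = 2.0 * m / (n * (n - 1))
model_net = nx.gnp_random_graph(n, p, seed=2)

# Compare easily computable network properties, as GraphCrunch-style tools do.
for name, g in [("data", data_net), ("ER model", model_net)]:
    print(f"{name:9s} clustering = {nx.average_clustering(g):.4f}, "
          f"max degree = {max(d for _, d in g.degree())}")
```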

  14. Infrastructure requirement of knowledge management system model of statistical learning tool (SLT) for education community

    Science.gov (United States)

    Abdullah, Rusli; Samah, Bahaman Abu; Bolong, Jusang; D'Silva, Jeffrey Lawrence; Shaffril, Hayrol Azril Mohamed

    2014-09-01

    Today, teaching and learning (T&L) using technology as a tool is becoming more important, especially in the field of statistics as a subject matter in the higher education environment. Even though there are many types of statistical learning tool (SLT) technology that can be used to support and enhance the T&L environment, there is a lack of a common standard knowledge management portal to provide guidance, especially in relation to the infrastructure requirements of SLT, for serving the community of users (CoU) such as educators, students and other parties interested in applying this technology as a tool for their T&L. Therefore, there is a need for a common standard infrastructure requirement of a knowledge portal to help the CoU manage statistical knowledge by acquiring, storing, disseminating and applying it for their specific purposes. Furthermore, having this infrastructure requirement of a knowledge portal model of SLT as guidance for promoting knowledge of best practice among the CoU can also enhance the quality and productivity of their work towards excellence of statistical knowledge application in the education system environment.

  15. Reticella: An Execution Trace Slicing and Visualization Tool Based on a Behavior Model

    Science.gov (United States)

    Noda, Kunihiro; Kobayashi, Takashi; Yamamoto, Shinichiro; Saeki, Motoshi; Agusa, Kiyoshi

    Program comprehension using dynamic information is one of the key tasks of software maintenance. Software visualization with sequence diagrams is a promising technique to help developers comprehend the behavior of object-oriented systems effectively. There are many tools that support the automatic generation of a sequence diagram from execution traces. However, it is still difficult to understand the behavior because the size of sequence diagrams automatically generated from massive amounts of execution traces tends to be beyond a developer's capacity. In this paper, we propose an execution trace slicing and visualization method. Our proposed method is capable of slice calculation based on a behavior model which can treat dependencies derived from static and dynamic analysis, and it supports various programs, including those using exceptions and multi-threading. We also introduce our tool, which performs the proposed slice calculation on the Eclipse platform. We show the applicability of our proposed method by applying the tool to two Java programs as case studies. As a result, we confirm the effectiveness of our proposed method for understanding the behavior of object-oriented systems.

  16. N2A: a computational tool for modeling from neurons to algorithms.

    Science.gov (United States)

    Rothganger, Fredrick; Warrender, Christina E; Trumbo, Derek; Aimone, James B

    2014-01-01

    The exponential increase in available neural data has combined with the exponential growth in computing ("Moore's law") to create new opportunities to understand neural systems at large scale and high detail. The ability to produce large and sophisticated simulations has introduced unique challenges to neuroscientists. Computational models in neuroscience are increasingly broad efforts, often involving the collaboration of experts in different domains. Furthermore, the size and detail of models have grown to levels for which understanding the implications of variability and assumptions is no longer trivial. Here, we introduce the model design platform N2A which aims to facilitate the design and validation of biologically realistic models. N2A uses a hierarchical representation of neural information to enable the integration of models from different users. N2A streamlines computational validation of a model by natively implementing standard tools in sensitivity analysis and uncertainty quantification. The part-relationship representation allows both network-level analysis and dynamical simulations. We will demonstrate how N2A can be used in a range of examples, including a simple Hodgkin-Huxley cable model, basic parameter sensitivity of an 80/20 network, and the expression of the structural plasticity of a growing dendrite and stem cell proliferation and differentiation.

  17. N2A: a computational tool for modeling from neurons to algorithms

    Directory of Open Access Journals (Sweden)

    Fredrick eRothganger

    2014-01-01

    The exponential increase in available neural data has combined with the exponential growth in computing (Moore's law) to create new opportunities to understand neural systems at large scale and high detail. The ability to produce large and sophisticated simulations has introduced unique challenges to neuroscientists. Computational models in neuroscience are increasingly broad efforts, often involving the collaboration of experts in different domains. Furthermore, the size and detail of models have grown to levels for which understanding the implications of variability and assumptions is no longer trivial. Here, we introduce the model design platform N2A which aims to facilitate the design and validation of biologically realistic models. N2A uses a hierarchical representation of neural information to enable the integration of models from different users. N2A streamlines computational validation of a model by natively implementing standard tools in sensitivity analysis and uncertainty quantification. The part-relationship representation allows both network-level analysis and dynamical simulations. We will demonstrate how N2A can be used in a range of examples, including a simple Hodgkin-Huxley cable model, basic parameter sensitivity of an 80/20 network, and the expression of the structural plasticity of a growing dendrite and stem cell proliferation and differentiation.

  18. An Effective Modeling Approach for High Efficient Solar Cell Using Virtual Wafer Fabrication Tools

    Directory of Open Access Journals (Sweden)

    Khomdram Jolson Singh

    2011-01-01

    In order to give a real understanding and realization of all the phenomena occurring inside photovoltaic cell devices, the development of a reliable simulated model is essential. In this paper, a novel method for developing a realistic model of an efficient solar cell is presented. An efficient model of a dual-junction InGaP/GaAs solar cell with a GaAs tunnel diode is prepared and fully simulated using the Silvaco VWF/ATLAS code. An optimization of the window layer, ARC, BSF, etc. is also performed, incorporating the effect of several different parameters on the performance of this model. The major stages of the process are explained and the simulation results are compared with published experimental data to demonstrate the accuracy of the results produced by the model utilizing this technique. For this optimized InGaP/GaAs dual-junction cell model, having a 125 nm DLAR on an 18 nm InAlP textured window with an effective 500 nm InAlGaP bottom BSF, a maximum conversion efficiency of 32.20% (1 sun) and 36.67% (1000 suns) is obtained under AM1.5G illumination. The introduction of this modeling technique to the photovoltaic community will prove to be of great importance in aiding the design and development of advanced solar cells using Silvaco Virtual Wafer Fabrication Tools.
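
    As a side note on the figure of merit quoted here: conversion efficiency follows from the open-circuit voltage, short-circuit current density and fill factor relative to the incident power density. A quick illustrative computation (values invented, not taken from the simulated cell above):

```python
# Illustrative conversion-efficiency computation for a solar cell.
# All electrical values are invented, not taken from the simulated cell.
voc = 2.4        # open-circuit voltage (V)
jsc = 0.015      # short-circuit current density (A/cm^2)
ff = 0.87        # fill factor
p_in = 0.100     # AM1.5G incident power density (W/cm^2, ~1 sun)

efficiency = voc * jsc * ff / p_in
print(f"conversion efficiency: {efficiency:.1%}")
```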

  19. Modeling of the flow stress for AISI H13 Tool Steel during Hard Machining Processes

    Science.gov (United States)

    Umbrello, Domenico; Rizzuti, Stefania; Outeiro, José C.; Shivpuri, Rajiv

    2007-04-01

    In general, the flow stress models used in computer simulation of machining processes are a function of effective strain, effective strain rate and the temperature developed during the cutting process. However, these models do not adequately describe the material behavior in hard machining, where material hardness values between 45 and 60 HRC are used. Thus, depending on the specific material hardness, different material models must be used in modeling the cutting process. This paper describes the development of hardness-based flow stress and fracture models for AISI H13 tool steel, which can be applied over the range of material hardness mentioned above. These models were implemented in a non-isothermal viscoplastic numerical model to simulate the machining process for AISI H13 with various hardness values, applying different cutting regime parameters. The predicted results are validated by comparing them with experimental results found in the literature. They are found to predict reasonably well both the cutting forces and the change in chip morphology from continuous to segmented chip as the material hardness changes.
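
    The paper's hardness-based model itself is not given in this abstract; as a generic illustration of a flow stress model of the stated form (a function of strain, strain rate, and temperature), here is the widely used Johnson-Cook relation with placeholder coefficients.

```python
import numpy as np

def johnson_cook(strain, strain_rate, T,
                 A=900.0, B=500.0, n=0.25, C=0.014, m=1.1,
                 eps0=1.0, T_room=293.0, T_melt=1750.0):
    """Generic Johnson-Cook flow stress (MPa); coefficients are placeholders,
    not the paper's hardness-based model for AISI H13."""
    strain_term = A + B * strain**n
    rate_term = 1.0 + C * np.log(strain_rate / eps0)
    T_star = (T - T_room) / (T_melt - T_room)
    thermal_term = 1.0 - T_star**m
    return strain_term * rate_term * thermal_term

# Flow stress at conditions typical of hard machining (illustrative).
sigma = johnson_cook(strain=1.5, strain_rate=1e4, T=800.0)
print(f"flow stress: {sigma:.0f} MPa")
```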

  20. Multiscale modeling of brain dynamics: from single neurons and networks to mathematical tools.

    Science.gov (United States)

    Siettos, Constantinos; Starke, Jens

    2016-09-01

    The extreme complexity of the brain naturally requires mathematical modeling approaches on a large variety of scales; the spectrum ranges from single neuron dynamics over the behavior of groups of neurons to neuronal network activity. Thus, the connection between the microscopic scale (single neuron activity) and macroscopic behavior (emergent behavior of the collective dynamics), and vice versa, is a key to understanding the brain in its complexity. In this work, we attempt a review of a wide range of approaches, ranging from the modeling of single neuron dynamics to machine learning. The models include biophysical as well as data-driven phenomenological models. The discussed models include Hodgkin-Huxley, FitzHugh-Nagumo, coupled oscillators (Kuramoto oscillators, Rössler oscillators, and the Hindmarsh-Rose neuron), integrate-and-fire, networks of neurons, and neural field equations. In addition to the mathematical models, important mathematical methods in multiscale modeling and reconstruction of causal connectivity are sketched. The methods include linear and nonlinear tools from statistics, data analysis, and time series analysis up to differential equations, dynamical systems, and bifurcation theory, including Granger causal connectivity analysis, phase synchronization connectivity analysis, principal component analysis (PCA), independent component analysis (ICA), manifold learning algorithms such as ISOMAP and diffusion maps, and equation-free techniques. WIREs Syst Biol Med 2016, 8:438-458. doi: 10.1002/wsbm.1348
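
    Among the network-scale models listed, the Kuramoto system is perhaps the simplest to sketch: coupled phase oscillators whose collective synchronization is summarized by an order parameter. A minimal illustration (parameters arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Kuramoto model: d(theta_i)/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)
N, K, dt, steps = 200, 2.0, 0.01, 5000
omega = rng.normal(0.0, 1.0, N)          # natural frequencies
theta = rng.uniform(0.0, 2 * np.pi, N)   # initial phases

for _ in range(steps):
    # Mean-field form of the coupling via the complex order parameter.
    z = np.mean(np.exp(1j * theta))
    r, psi = np.abs(z), np.angle(z)
    theta += dt * (omega + K * r * np.sin(psi - theta))

print(f"order parameter r = {np.abs(np.mean(np.exp(1j * theta))):.2f}")
```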