WorldWideScience

Sample records for modelling tool websim-milq

  1. A new web-based modelling tool (Websim-MILQ) aimed at optimisation of thermal treatments in the dairy industry

    NARCIS (Netherlands)

    Schutyser, M.A.I.; Straatsma, J.; Keijzer, P.M.; Verschueren, M.; Jong, de P.

    2008-01-01

    In the framework of a cooperative EU research project (MILQ-QC-TOOL) a web-based modelling tool (Websim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs.

  2. Green Infrastructure Modeling Tools

    Science.gov (United States)

    Modeling tools support planning and design decisions on a range of scales from setting a green infrastructure target for an entire watershed to designing a green infrastructure practice for a particular site.

  3. Population Density Modeling Tool

    Science.gov (United States)

    2012-06-26

    Population Density Modeling Tool, by Davy Andrew Michael Knott and David Burke, report NAWCADPAX/TR-2012/194, Patuxent River, Maryland, 26 June 2012. (The remainder of the indexed text is report-form boilerplate.)

  4. Tools for Model Evaluation

    DEFF Research Database (Denmark)

    Olesen, H. R.

    1998-01-01

    Proceedings of the Twenty-Second NATO/CCMS International Technical Meeting on Air Pollution Modeling and Its Application, held June 6-10, 1997, in Clermont-Ferrand, France.

  5. Integrating a Decision Management Tool with UML Modeling Tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Numerous design decisions are made while developing software systems, which influence the architecture of these systems as well as following decisions. A number of decision management tools already exist for capturing, documenting, and maintaining design decisions, but also for guiding developers through the development process. In this report, we propose an integration of a decision management and a UML-based modeling tool, based on use cases we distill from a case study: the modeling tool shall show all decisions related to a model and allow its users to extend or update them; the decision management tool shall trigger the modeling tool to realize design decisions in the models. We define tool-independent concepts and architecture building blocks supporting these use cases and present how they can be implemented in the IBM Rational Software Modeler and Architectural Decision Knowledge Wiki. This seamless...

  6. Alien wavelength modeling tool and field trial

    DEFF Research Database (Denmark)

    Sambo, N.; Sgambelluri, A.; Secondini, M.

    2015-01-01

    A modeling tool is presented for pre-FEC BER estimation of PM-QPSK alien wavelength signals. A field trial is demonstrated and used as validation of the tool's correctness. A very close correspondence between the performance of the field trial and the one predicted by the modeling tool has been found.

  7. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; if a certified tools list is one day established, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, there was a high enough level of agreement for the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. This list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions.

  8. Comparison of two different modelling tools

    DEFF Research Database (Denmark)

    Brix, Wiebke; Elmegaard, Brian

    2009-01-01

    In this paper a test case is solved using two different modelling tools, Engineering Equation Solver (EES) and WinDali, in order to compare the tools. The system of equations solved is a static model of an evaporator used for refrigeration. The evaporator consists of two parallel channels, and i...

  9. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  10. ANSYS tools in modeling tires

    Science.gov (United States)

    Ali, Ashraf; Lovell, Michael

    1995-01-01

    This presentation summarizes the capabilities in the ANSYS program that relate to the computational modeling of tires. The power and the difficulties associated with modeling nearly incompressible rubber-like materials using hyperelastic constitutive relationships are highlighted from a developer's point of view. The topics covered include a hyperelastic material constitutive model for rubber-like materials, a general overview of contact-friction capabilities, and the acoustic fluid-structure interaction problem for noise prediction. Brief theoretical development and example problems are presented for each topic.

  11. The european Trans-Tools transport model

    NARCIS (Netherlands)

    Rooijen, T. van; Burgess, A.

    2008-01-01

    The paper presents the use of ArcGIS in the Transtools Transport Model, TRANS-TOOLS, created by an international consortium for the European Commission. The model describes passenger as well as freight transport in Europe with all medium and long distance modes (cars, vans, trucks, train, inland waterways)...

  12. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

    We present a system level design methodology, called ForSyDe. ForSyDe is available under the open source approach, which allows small and medium enterprises (SME) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system level modeling of a simple industrial use case, and we...

  13. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry has accumulated more than 15 years of history already. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditionally reserved for desktop applications. Web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web is compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide a comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges for the management and analysis of large data sets. Web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  14. Graphical tools for model selection in generalized linear models.

    Science.gov (United States)

    Murray, K; Heritier, S; Müller, S

    2013-11-10

    Model selection techniques have existed for many years; however, to date, simple, clear and effective methods of visualising the model building process are sparse. This article describes graphical methods that assist in the selection of models and the comparison of many different selection criteria. Specifically, we describe, for logistic regression, how to visualise measures of description loss and of model complexity to address the model selection dilemma. We advocate the use of the bootstrap to assess the stability of selected models and to enhance our graphical tools. We demonstrate which variables are important using variable inclusion plots and show that these can be invaluable for the model building process. We show with two case studies how these proposed tools are useful for learning more about the important variables in the data and how they can assist the understanding of the model building process.
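
    As an illustration of the bootstrap-based variable inclusion idea described above, the following minimal Python sketch tallies how often each predictor is selected across bootstrap resamples of a logistic regression problem. The toy data, the scikit-learn selector, and all names are illustrative assumptions, not the authors' code.

        # Bootstrap variable-inclusion tally for logistic regression (sketch).
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import SequentialFeatureSelector
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X, y = make_classification(n_samples=300, n_features=6,
                                   n_informative=3, random_state=0)

        counts = np.zeros(X.shape[1])
        n_boot = 50
        for _ in range(n_boot):
            idx = rng.integers(0, len(y), len(y))      # bootstrap resample
            selector = SequentialFeatureSelector(
                LogisticRegression(max_iter=1000), n_features_to_select=3)
            selector.fit(X[idx], y[idx])
            counts += selector.get_support()           # tally selected variables

        print("inclusion frequency per variable:", counts / n_boot)

    Plotting these frequencies against model size is one simple way to realize the "variable inclusion plot" idea in practice.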

  15. Animal models: an important tool in mycology.

    Science.gov (United States)

    Capilla, Javier; Clemons, Karl V; Stevens, David A

    2007-12-01

    Animal models of fungal infections are, and will remain, a key tool in the advancement of medical mycology. Many different types of animal models of fungal infection have been developed, with murine models the most frequently used, for studies of pathogenesis, virulence, immunology, diagnosis, and therapy. The ability to control numerous variables in performing the model allows us to mimic human disease states and quantitatively monitor the course of the disease. However, no single model can answer all questions, and different animal species or different routes of infection can show somewhat different results. Thus, the choice of which animal model to use must be made carefully, addressing the type of human disease to mimic, the parameters to follow, and the collection of the appropriate data to answer the questions being asked. This review addresses a variety of uses for animal models in medical mycology. It focuses on the most clinically important diseases affecting humans and cites various examples of the different types of studies that have been performed. Overall, animal models of fungal infection will continue to be valuable tools in addressing questions concerning fungal infections and contribute to our deeper understanding of how these infections occur, progress, and can be controlled and eliminated.

  16. A tool box for implementing supersymmetric models

    Science.gov (United States)

    Staub, Florian; Ohl, Thorsten; Porod, Werner; Speckner, Christian

    2012-10-01

    We present a framework for performing a comprehensive analysis of a large class of supersymmetric models, including spectrum calculation, dark matter studies and collider phenomenology. To this end, the respective model is defined in an easy and straightforward way using the Mathematica package SARAH. SARAH then generates model files for CalcHep which can be used with micrOMEGAs as well as model files for WHIZARD and O'Mega. In addition, Fortran source code for SPheno is created which facilitates the determination of the particle spectrum using two-loop renormalization group equations and one-loop corrections to the masses. As an additional feature, the generated SPheno code can write out input files suitable for use with HiggsBounds to apply bounds coming from the Higgs searches to the model. Combining all programs provides a closed chain from model building to phenomenology.

    Program summary:
    Program title: SUSY Phenomenology toolbox
    Catalog identifier: AEMN_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMN_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 140206
    No. of bytes in distributed program, including test data, etc.: 1319681
    Distribution format: tar.gz
    Programming language: Autoconf, Mathematica
    Computer: PC running Linux, Mac
    Operating system: Linux, Mac OS
    Classification: 11.6
    Nature of problem: Comprehensive studies of supersymmetric models beyond the MSSM are considerably complicated by the number of different tasks that have to be accomplished, including the calculation of the mass spectrum and the implementation of the model into tools for performing collider studies, calculating the dark matter density and checking the compatibility with existing collider bounds (in particular, from the Higgs searches).
    Solution method: The...

  17. WMT: The CSDMS Web Modeling Tool

    Science.gov (United States)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can: design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged...
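
    The record above describes a layered REST architecture (wmt-db, wmt-api, wmt-exe) driven from a browser client. The same style of API can be exercised from a script; in the sketch below the host, route, and payload fields are hypothetical placeholders, not the actual WMT API.

        # Hedged sketch of posting a model description to a REST modeling service.
        import json
        import urllib.request

        BASE = "https://example.edu/wmt-api"         # hypothetical server

        model = {
            "name": "delta-demo",
            "components": ["hydrotrend", "cem"],     # illustrative component ids
            "parameters": {"hydrotrend": {"run_duration": 100}},
        }

        req = urllib.request.Request(
            BASE + "/models",                        # hypothetical route
            data=json.dumps(model).encode(),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            print(json.load(resp))                   # e.g. the stored model id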

  18. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    In the last 30 years, numerous model-driven software generation approaches have been offered to address development productivity and the quality of the resulting software. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims for Unified Modeling Language (UML) models at different levels of abstraction, arguing that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending, for the most advanced ones, with a code generator (possibly using a scripting or domain-specific (DSL) language), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  19. New tools for generation IV assemblies modelling

    International Nuclear Information System (INIS)

    Sylvie Aniel-Buchheit; Edwige Richebois

    2005-01-01

    Full text of publication follows: In the framework of the development of generation IV concepts, the need for new assembly modelling tools arises. These concepts present more geometrical and spectral heterogeneities (radially and axially). Moreover, thermal-hydraulics and neutronics aspects are so closely related that coupled computations are necessary. That raises the need for more precise and flexible tools presenting 3D features. The 3D coupling of the thermal-hydraulic code FLICA4 with the Monte Carlo neutronics code TRIPOLI4 was developed in that frame. This new tool enables, for the first time, realistic axial and radial power profiles with real feedback effects to be obtained in an assembly where thermal-hydraulics and neutronics effects are closely related. The BWR is the existing concept whose heterogeneous characteristics are closest to the various newly proposed concepts. This assembly design is thus chosen to compare this new tool, presenting real 3D characteristics, to the existing ones. For design studies, the evaluation of the assembly behaviour currently necessitates a depletion scheme using a 3D thermal-hydraulics assembly calculation coupled with a 1D axial neutronics deterministic calculation (or an axial power profile chosen as a function of the assembly-averaged burn-up). The 3D neutronics code (CRONOS2) uses neutronic data built by 2D deterministic assembly calculations without feedback. These cross-section libraries enable feedbacks to be taken into account via parameters such as fuel temperature, moderator density and temperature (history parameters such as void and control rod are not useful in design evaluation). Recently, the library build-up has been replaced by on-line multi-2D deterministic assembly calculations performed by a cell code (APOLLO2). That avoids interpolation between pre-determined parameters in the cross-section data used by the 1D axial neutronics calculation and enables a radial power map to be given to the 3D thermal...

  20. Collaboro: a collaborative (meta) modeling tool

    Directory of Open Access Journals (Sweden)

    Javier Luis Cánovas Izquierdo

    2016-10-01

    Full Text Available Software development is becoming more and more collaborative, emphasizing the role of end-users in the development process to make sure the final product will satisfy customer needs. This is especially relevant when developing Domain-Specific Modeling Languages (DSMLs), which are modeling languages specifically designed to carry out the tasks of a particular domain. While end-users are actually the experts of the domain for which a DSML is developed, their participation in the DSML specification process is still rather limited nowadays. In this paper, we propose a more community-aware language development process by enabling the active participation of all community members (both developers and end-users) from the very beginning. Our proposal, called Collaboro, is based on a DSML itself, enabling the representation of change proposals during the language design and the discussion (and trace back) of possible solutions, comments and decisions arising during the collaboration. Collaboro also incorporates a metric-based recommender system to help community members to define high-quality notations for the DSMLs. We also show how Collaboro can be used at the model-level to facilitate the collaborative specification of software models. Tool support is available both as an Eclipse plug-in and a web-based solution.

  1. Collaborative Inquiry Learning: Models, tools, and challenges

    Science.gov (United States)

    Bell, Thorsten; Urhahne, Detlef; Schanze, Sascha; Ploetzner, Rolf

    2010-02-01

    Collaborative inquiry learning is one of the most challenging and exciting ventures for today's schools. It aims at bringing a new and promising culture of teaching and learning into the classroom where students in groups engage in self-regulated learning activities supported by the teacher. It is expected that this way of learning fosters students' motivation and interest in science, that they learn to perform steps of inquiry similar to scientists and that they gain knowledge on scientific processes. Starting from general pedagogical reflections and science standards, the article reviews some prominent models of inquiry learning. This comparison results in a set of inquiry processes being the basis for cooperation in the scientific network NetCoIL. Inquiry learning is conceived in several ways with emphasis on different processes. For an illustration of the spectrum, some main conceptions of inquiry and their focuses are described. In the next step, the article describes exemplary computer tools and environments from within and outside the NetCoIL network that were designed to support processes of collaborative inquiry learning. These tools are analysed by describing their functionalities as well as effects on student learning known from the literature. The article closes with challenges for further developments elaborated by the NetCoIL network.

  2. A pandemic influenza modeling and visualization tool

    Energy Technology Data Exchange (ETDEWEB)

    Maciejewski, Ross; Livengood, Philip; Rudolph, Stephen; Collins, Timothy F.; Ebert, David S.; Brigantic, Robert T.; Corley, Courtney D.; Muller, George A.; Sanders, Stephen W.

    2011-08-01

    The National Strategy for Pandemic Influenza outlines a plan for community response to a potential pandemic. In this outline, state and local communities are charged with enhancing their preparedness. In order to help public health officials better understand these charges, we have developed a modeling and visualization toolkit (PanViz) for analyzing the effect of decision measures implemented during a simulated pandemic influenza scenario. Spread vectors based on the point of origin and distance traveled over time are calculated, and the factors of age distribution and population density are taken into account. Healthcare officials are able to explore the effects of the pandemic on the population through a spatiotemporal view, moving forward and backward through time and inserting decision points at various days to determine the impact. Linked statistical displays are also shown, providing county-level summaries of data in terms of the number of sick, hospitalized and dead as a result of the outbreak. Currently, this tool has been deployed in Indiana State Department of Health planning and preparedness exercises, and as an educational tool for demonstrating the impact of social distancing strategies during the recent H1N1 (swine flu) outbreak.

  3. Multidisciplinary Modelling Tools for Power Electronic Circuits

    DEFF Research Database (Denmark)

    Bahman, Amir Sajjad

    This thesis presents multidisciplinary modelling techniques in a Design For Reliability (DFR) approach for power electronic circuits. With increasing penetration of renewable energy systems, the demand for reliable power conversion systems is becoming critical. Since a large part of electricity is processed through power electronics, highly efficient, sustainable, reliable and cost-effective power electronic devices are needed. Reliability of a product is defined as the ability to perform within its predefined functions under given conditions in a specific time. Because power electronic devices ... package, e.g. power module, the DFR approach meets trade-offs in electrical, thermal and mechanical design of the device. Today, virtual prototyping of power electronic circuits using advanced simulation tools is becoming attractive due to cost/time saving in building potential designs. With simulations...

  4. Atmospheric Model Evaluation Tool for meteorological and air quality simulations

    Science.gov (United States)

    The Atmospheric Model Evaluation Tool compares model predictions to observed data from various meteorological and air quality observation networks to help evaluate meteorological and air quality simulations.
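
    The paired model-versus-observation statistics such a tool automates can be sketched in a few lines; the column names and numbers below are illustrative only, not AMET's actual data formats.

        # Minimal model-vs-observation comparison statistics (sketch).
        import numpy as np
        import pandas as pd

        df = pd.DataFrame({
            "obs":   [12.1, 8.4, 15.0, 9.7],   # e.g. observed ozone, ppb
            "model": [13.0, 7.9, 16.2, 11.1],  # collocated model predictions
        })

        bias = (df["model"] - df["obs"]).mean()
        rmse = np.sqrt(((df["model"] - df["obs"]) ** 2).mean())
        corr = df["model"].corr(df["obs"])
        print(f"mean bias={bias:.2f}  RMSE={rmse:.2f}  r={corr:.2f}")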

  5. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives developing tools for clinical information modeling identified, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  6. Thermal behaviour modelling of superplastic forming tools

    OpenAIRE

    Velay, Vincent; Cutard, Thierry; Guegan, N.

    2008-01-01

    High-temperature operational conditions of superplastic forming (SPF) tools induce very complex thermomechanical loadings responsible for their failure. Various materials can be used to manufacture forming tools: ceramics, refractory castables or heat-resistant steels. In this paper, an experimental and numerical analysis is performed in order to characterise the environmental loadings undergone by the tool, whatever the considered material. This investigation allows a thermal calculation...

  7. Modeling, methodologies and tools for molecular and nano-scale communications

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    (Preliminary) The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront of their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  8. Developing a Modeling Tool Using Eclipse

    NARCIS (Netherlands)

    Kirtley, Nick; Waqas Kamal, Ahmad; Avgeriou, Paris

    2008-01-01

    Tool development using an open source platform provides autonomy to users to change, use, and develop cost-effective software with freedom from licensing requirements. However, open source tool development poses a number of challenges, such as poor documentation and continuous evolution. In this...

  9. Simulation Tools Model Icing for Aircraft Design

    Science.gov (United States)

    2012-01-01

    LEWICE has evolved over the years from strictly a research tool to one used routinely by industry and other government agencies. Glenn contractor William Wright has been the architect of this development, supported by a team of researchers investigating icing physics, creating validation data, and ensuring development according to standard software engineering practices. The program provides a virtual simulation environment for determining where water droplets strike an airfoil in flight, what kind of ice would result, and what shape that ice would take. Users can enter geometries for specific, two-dimensional cross sections of an airfoil or other airframe surface and then apply a range of inputs - different droplet sizes, temperatures, airspeeds, and more - to model how ice would build up on the surface in various conditions. The program's versatility, ease of use, and speed - LEWICE can run through complex icing simulations in only a few minutes - have contributed to it becoming a popular resource in the aviation industry.

  10. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report provides a description of the wind turbine modelling, both at a component level and at a system level.

  11. Modeling and Simulation Tools for Heavy Lift Airships

    Science.gov (United States)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed wing and rotary wing aircraft a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the National laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  12. Modeling and Tool Wear in Routing of CFRP

    International Nuclear Information System (INIS)

    Iliescu, D.; Fernandez, A.; Gutierrez-Orrantia, M. E.; Lopez de Lacalle, L. N.; Girot, F.

    2011-01-01

    This paper presents the prediction and evaluation of feed force in routing of carbon composite material. In order to extend tool life and improve the quality of the machined surface, a better understanding of uncoated and coated tool behaviour is required. This work describes (1) the optimization of the geometry of multiple-teeth tools, minimizing the tool wear and the feed force, (2) the optimization of the tool coating, and (3) the development of a phenomenological model relating the feed force to the routing parameters and the tool wear. The experimental results indicate that the feed rate, the cutting speed and the tool wear are the most significant factors affecting the feed force. In the case of multiple-teeth tools, a particular geometry with 14 teeth right helix right cut and 11 teeth left helix right cut gives the best results. A thick AlTiN coating or a diamond coating can dramatically improve the tool life while minimizing the axial force, roughness and delamination. A wear model has then been developed based on an abrasive behaviour of the tool. The model links the feed force to the tool geometry parameters (tool diameter), to the process parameters (feed rate, cutting speed and depth of cut) and to the wear. The model presented has been verified by experimental tests.
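
    A phenomenological force model of the kind described, linking feed force to process parameters and wear, can be fitted by nonlinear least squares. The power-law form, the coefficients and the synthetic data in this sketch are assumptions for illustration, not the model from the paper.

        # Fitting an assumed phenomenological feed-force model (sketch).
        import numpy as np
        from scipy.optimize import curve_fit

        def feed_force(X, K, a, b, c):
            f, vc, w = X                  # feed rate, cutting speed, flank wear
            return K * f**a * vc**b * (1.0 + c * w)

        f  = np.array([0.05, 0.10, 0.10, 0.20, 0.20, 0.30])  # mm/tooth (synthetic)
        vc = np.array([150., 150., 300., 150., 300., 300.])  # m/min
        w  = np.array([0.00, 0.05, 0.10, 0.10, 0.20, 0.30])  # mm
        F  = np.array([35., 52., 75., 80., 115., 160.])      # measured force, N

        popt, _ = curve_fit(feed_force, (f, vc, w), F, p0=[100, 0.5, 0.2, 1.0])
        print("fitted K, a, b, c:", popt)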

  13. Integrating decision management with UML modeling concepts and tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    2009-01-01

    Numerous design decisions including architectural decisions are made while developing a software system, which influence the architecture of the system as well as subsequent decisions. Several tools already exist for managing design decisions, i.e. capturing, documenting, and maintaining them, but also for guiding the user by proposing subsequent decisions. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, the decisions are typically not connected to these models. In this paper, we propose an integration of a decision management and a UML-based modeling tool, based on use cases we distill from an example: the UML modeling tool shall show all decisions related to a model and allow extending or updating them; the decision management tool shall trigger the modeling tool...

  14. Student Model Tools Code Release and Documentation

    DEFF Research Database (Denmark)

    Johnson, Matthew; Bull, Susan; Masci, Drew

    The report discusses the tool's strengths and areas of improvement (Section 6). Several key appendices are attached to this report, including user manuals for teachers and students (Appendix 3). Fundamentally, all relevant information is included in the report for those wishing to do further development work with the tool...

  15. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    To illustrate these concepts, a number of examples are used. These include models of polymer membranes, distillation and catalyst behaviour. Some detailed considerations within these models are stated and discussed. Model generation concepts are introduced, and ideas of a reference model are given that show...

  16. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper

    2003-01-01

    Experimental modelling is an important tool for study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models, and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied for improvement of the reliability of physical model results. This paper demonstrates by examples that numerical modelling benefits in various ways from experimental studies (in large and small laboratory facilities). The examples range from very general hydrodynamic descriptions of wave phenomena to specific hydrodynamic interaction with structures. The examples also show that numerical model development benefits from international co-operation and sharing of high quality results.

  17. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  18. Storm Water Management Model Climate Adjustment Tool (SWMM-CAT)

    Science.gov (United States)

    The US EPA’s newest tool, the Stormwater Management Model (SWMM) – Climate Adjustment Tool (CAT) is meant to help municipal stormwater utilities better address potential climate change impacts affecting their operations. SWMM, first released in 1971, models hydrology and hydrauli...

  19. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report provides a description of the wind turbine modelling, both at a component level and at a system level.

  20. The scientific modeling assistant: An advanced software tool for scientific model building

    Science.gov (United States)

    Keller, Richard M.; Sims, Michael H.

    1991-01-01

    Viewgraphs on the Scientific Modeling Assistant, an advanced software tool for scientific model building, are presented. The objective is to build a specialized software tool to assist in scientific model-building.

  1. Scratch as a computational modelling tool for teaching physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-05-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling programs. In this article, we briefly discuss why Scratch could be a useful tool for computational modelling in the primary or secondary physics classroom, and we present practical examples of how it can be used to build a model.
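
    The kind of stepwise motion model that pupils assemble from Scratch blocks (a forever loop that changes a velocity variable by -g*dt and a position variable by v*dt) can be sketched in a few lines of Python for comparison; the falling-ball scenario and all numbers are illustrative, not an example from the article.

        # Scratch-style step-by-step motion model, written in Python (sketch).
        dt, g = 0.02, 9.8          # time step (s), gravity (m/s^2)
        y, vy = 10.0, 0.0          # initial height (m) and vertical speed (m/s)

        t = 0.0
        while y > 0.0:             # Scratch: repeat until touching the ground
            vy -= g * dt           # change vy by -g*dt
            y  += vy * dt          # change y by vy*dt
            t  += dt
        print(f"ball reaches the ground after about {t:.2f} s")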

  2. Shape: A 3D Modeling Tool for Astrophysics.

    Science.gov (United States)

    Steffen, Wolfgang; Koning, Nicholas; Wenger, Stephan; Morisset, Christophe; Magnor, Marcus

    2011-04-01

    We present a flexible interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data type, and amount that is required for a reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model prediction with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been successfully used in a number of astrophysical research projects.

  3. Spatial Modeling Tools for Cell Biology

    Science.gov (United States)

    2006-10-01

    ...of the cell's total volume. The cytosol contains thousands of enzymes that are responsible for the catalyzation of glycolysis and gluconeogenesis ... dog, swine and pig models [Pantely, 1990, 1991; Stanley 1992]. In these studies, blood flow through the left anterior descending (LAD) coronary ... perfusion. In conclusion, even though our model falls within the (rather large) error bounds of experimental dog, pig and swine models, the...

  4. Towards a generalized energy prediction model for machine tools.

    Science.gov (United States)

    Bhinge, Raunak; Park, Jinkyoo; Law, Kincho H; Dornfeld, David A; Helu, Moneer; Rachuri, Sudarsan

    2017-04-01

    Energy prediction of machine tools can deliver many advantages to a manufacturing enterprise, ranging from energy-efficient process planning to machine tool monitoring. Physics-based energy prediction models have been proposed in the past to understand the energy usage pattern of a machine tool. However, uncertainties in both the machine and the operating environment make it difficult to predict the energy consumption of the target machine reliably. Taking advantage of the opportunity to collect extensive, contextual, energy-consumption data, we discuss a data-driven approach to develop an energy prediction model of a machine tool in this paper. First, we present a methodology that can efficiently and effectively collect and process data extracted from a machine tool and its sensors. We then present a data-driven model that can be used to predict the energy consumption of the machine tool for machining a generic part. Specifically, we use Gaussian Process (GP) Regression, a non-parametric machine-learning technique, to develop the prediction model. The energy prediction model is then generalized over multiple process parameters and operations. Finally, we apply this generalized model with a method to assess uncertainty intervals to predict the energy consumed to machine any part using a Mori Seiki NVD1500 machine tool. Furthermore, the same model can be used during process planning to optimize the energy-efficiency of a machining process.
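
    A minimal sketch of the approach described, Gaussian Process regression mapping process parameters to energy with an uncertainty interval on the prediction, is given below using scikit-learn. The process-parameter features and energy values are synthetic assumptions, not the Mori Seiki NVD1500 data.

        # GP regression for energy prediction with uncertainty (sketch).
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        # columns: feed rate (mm/min), spindle speed (rpm), depth of cut (mm)
        X = np.array([[200, 2000, 0.5], [400, 2000, 1.0],
                      [200, 4000, 1.0], [400, 4000, 0.5]], dtype=float)
        y = np.array([1.8, 2.9, 2.4, 2.6])   # energy per operation, kJ (synthetic)

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=[100, 1000, 0.5])
                                      + WhiteKernel(), normalize_y=True)
        gp.fit(X, y)

        mean, std = gp.predict([[300, 3000, 0.8]], return_std=True)
        print(f"predicted energy: {mean[0]:.2f} kJ "
              f"+/- {1.96 * std[0]:.2f} (95% interval)")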

  5. A model of tool wear monitoring system for turning

    OpenAIRE

    Šimunović, Goran; Ficko, Mirko; Šarić, Tomislav; Milošević, Mijodrag; Antić, Aco

    2015-01-01

    Acquiring high-quality and timely information on the tool wear condition in real time is a necessary prerequisite for identifying the degree of tool wear, which significantly improves the stability and quality of the machining process. This paper defines a model of a tool wear monitoring system, with special emphasis on the module for acquisition and processing of the vibration acceleration signal, applying discrete wavelet transformations (DWT) for signal decomposition. The paper prese...
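
    The DWT-based processing step described here can be sketched with PyWavelets: decompose the vibration signal and use each sub-band's energy as a wear-sensitive feature. The synthetic signal, sample rate and wavelet choice below are assumptions for illustration, not the paper's configuration.

        # DWT decomposition of a vibration signal into sub-band energies (sketch).
        import numpy as np
        import pywt   # PyWavelets

        fs = 10_000                                   # sample rate, Hz (assumed)
        t = np.arange(0, 0.1, 1 / fs)
        signal = np.sin(2 * np.pi * 120 * t) + 0.5 * np.random.randn(t.size)

        coeffs = pywt.wavedec(signal, "db4", level=4)     # A4, D4, D3, D2, D1
        energies = [float(np.sum(c ** 2)) for c in coeffs]
        print("sub-band energies (A4, D4..D1):", np.round(energies, 1))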

  6. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    Models are playing important roles in design and analysis of chemicals/bio-chemicals based products and the processes that manufacture them. Model-based methods and tools have the potential to decrease the number of experiments, which can be expensive and time consuming, and to point to candidates where the experimental effort could be focused. In this project a general modelling framework for systematic model building through modelling templates, which supports the reuse of existing models via its new model import and export capabilities, has been developed. The new feature for model transfer has been developed by establishing a connection with an external modelling environment for code generation. The main contribution of this thesis is the creation of modelling templates and their connection with other modelling tools within a modelling framework. The goal was to create a user...

  7. Agent Based Modeling as an Educational Tool

    Science.gov (United States)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent-based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model-based curriculum in the classroom given current and anticipated core and content standards. (Figure captions: simulation using California GIS data; simulation of high school student lunch popularity using an aerial photograph on top of a terrain value map.)
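
    A classroom-scale example of the kind of agent-based model students might build is sketched below; this toy rumor-spreading model is purely illustrative and is not one of the simulations cited in the record.

        # Toy agent-based model: random walkers spreading a "rumor" on a grid.
        import random

        random.seed(1)
        N, STEPS, SIZE = 30, 200, 20
        # each agent: [x, y, informed?]
        agents = [[random.randrange(SIZE), random.randrange(SIZE), False]
                  for _ in range(N)]
        agents[0][2] = True                          # one informed agent to start

        for _ in range(STEPS):
            for a in agents:                         # random walk on a torus
                a[0] = (a[0] + random.choice([-1, 0, 1])) % SIZE
                a[1] = (a[1] + random.choice([-1, 0, 1])) % SIZE
            informed_cells = {(a[0], a[1]) for a in agents if a[2]}
            for a in agents:                         # same-cell contact spreads it
                if (a[0], a[1]) in informed_cells:
                    a[2] = True

        print("informed after", STEPS, "steps:",
              sum(a[2] for a in agents), "of", N)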

  8. Graphical Tools for Linear Structural Equation Modeling

    Science.gov (United States)

    2014-06-01

    ...the regression coefficient βSA.CQ1 vanishes, which can be used to test whether the specification of Model 2 is compatible with the data. Most ... because they are all compatible with the graph in Figure 19a, which displays the skeleton and v-structures. Note that we cannot reverse the edge from ... implications of linear structural equation models. R-428, <http://ftp.cs.ucla.edu/pub/stat_ser/r428.pdf>, CA. To appear in Proceedings of AAAI-2014.

  9. Toposcopy: A modelling tool for CityGML

    NARCIS (Netherlands)

    Groneman, A.; Zlatanova, S.

    2009-01-01

    The new 3D standard CityGML has been attracting a lot of attention in the last few years. Many characteristics of the XML-based format make it suitable for storage and exchange of virtual 3D city models. It provides possibilities to store semantic and geometric information and has the potential to...

  10. Using the IEA ETSAP modelling tools for Denmark

    DEFF Research Database (Denmark)

    Grohnheit, Poul Erik

    ...-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project "NEEDS - New Energy Externalities Developments for Sustainability". ETSAP is contributing to a part of NEEDS that develops ... the Centre for Energy, Environment and Health (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model...

  11. Predictions of titanium alloy properties using thermodynamic modeling tools

    Science.gov (United States)

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.

    2005-12-01

    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.

  12. A tool for model based diagnostics of the AGS Booster

    International Nuclear Information System (INIS)

    Luccio, A.

    1993-01-01

    A model-based algorithmic tool was developed to search for lattice errors by a systematic analysis of orbit data in the AGS Booster synchrotron. The algorithm employs transfer matrices calculated with MAD between points in the ring. Iterative model fitting of the data allows one to find and eventually correct magnet displacements and angles or field errors. The tool, implemented on a HP-Apollo workstation system, has proved very general and of immediate physical interpretation
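
    The model-fitting step described, finding magnet errors that best explain measured orbit data given model transfer matrices, reduces to a least-squares problem. In this sketch the response matrix is random, standing in for the MAD-computed matrices, and all numbers are illustrative; it is not the AGS Booster tool itself.

        # Least-squares recovery of magnet kicks from orbit data (sketch).
        import numpy as np

        rng = np.random.default_rng(0)
        n_bpms, n_magnets = 24, 6
        R = rng.normal(size=(n_bpms, n_magnets))   # model orbit response matrix
        true_kicks = np.array([0, 0.3, 0, 0, -0.2, 0])
        orbit = R @ true_kicks + 0.01 * rng.normal(size=n_bpms)  # noisy BPM data

        kicks, *_ = np.linalg.lstsq(R, orbit, rcond=None)  # least-squares fit
        print("recovered kicks:", np.round(kicks, 2))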

  13. Static Stiffness Modeling of Parallel Kinematics Machine Tool Joints

    OpenAIRE

    O. K. Akmaev; B. A. Enikeev; A. I. Nigmatullin

    2015-01-01

    The possible variants of an original parallel kinematics machine-tool structure are explored in this article. A new Hooke's universal joint design based on needle roller bearings with the ability of a preload setting is proposed. The bearing stiffness modeling is carried out using a variety of methods. The elastic deformation modeling of a Hooke's joint and a spherical rolling joint has been developed to assess the possibility of using these joints in machine tools with parallel kinematics...

  14. Transparent Model Transformation: Turning Your Favourite Model Editor into a Transformation Tool

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald; Strüber, Daniel

    2015-01-01

    Current model transformation languages are supported by dedicated editors, often closely coupled to a single execution engine. We introduce Transparent Model Transformation, a paradigm enabling modelers to specify transformations using a familiar tool: their model editor. We also present VMTL, a ... model transformation tool sharing the model editor's benefits, transparently.

  15. Risk Assessment in Fractured Clayey Tills - Which Modeling Tools?

    DEFF Research Database (Denmark)

    Chambon, Julie Claire Claudia; Bjerg, Poul Løgstrup; Binning, Philip John

    2012-01-01

    The article presents different tools available for risk assessment in fractured clayey tills, and their advantages and limitations are discussed. Because of the complex processes occurring during contaminant transport through fractured media, the development of simple practical tools for risk assessment is challenging and the inclusion of the relevant processes is difficult. Furthermore, the lack of long-term monitoring data prevents verification of the accuracy of the different conceptual models. Further investigations based on long-term data and numerical modeling are needed to accurately describe contaminant transport in fractured media and to develop practical tools with the relevant processes and level of complexity.

  16. Rasp Tool on Phoenix Robotic Arm Model

    Science.gov (United States)

    2008-01-01

    This close-up photograph taken at the Payload Interoperability Testbed at the University of Arizona, Tucson, shows the motorized rasp protruding from the bottom of the scoop on the engineering model of NASA's Phoenix Mars Lander's Robotic Arm. The rasp will be placed against the hard Martian surface to cut into the hard material and acquire an icy soil sample for analysis by Phoenix's scientific instruments. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is led by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  17. Modeling Tools for Drilling, Reservoir Navigation, and Formation Evaluation

    Directory of Open Access Journals (Sweden)

    Sushant Dutta

    2012-06-01

    Full Text Available The oil and gas industry routinely uses borehole tools for measuring or logging rock and fluid properties of geologic formations to locate hydrocarbons and maximize their production. Pore fluids in formations of interest are usually hydrocarbons or water. Resistivity logging is based on the fact that oil and gas have a substantially higher resistivity than water. The first resistivity log was acquired in 1927, and resistivity logging is still the foremost measurement used for drilling and evaluation. However, the acquisition and interpretation of resistivity logging data has grown in complexity over the years. Resistivity logging tools operate in a wide range of frequencies (from DC to GHz) and encounter extremely high (several orders of magnitude) conductivity contrast between the metal mandrel of the tool and the geologic formation. Typical challenges include arbitrary angles of tool inclination, full tensor electric and magnetic field measurements, and interpretation of complicated anisotropic formation properties. These challenges combine to form some of the most intractable computational electromagnetic problems in the world. Reliable, fast, and convenient numerical modeling of logging tool responses is critical for tool design, sensor optimization, virtual prototyping, and log data inversion. This spectrum of applications necessitates both depth and breadth of modeling software, from blazing fast one-dimensional (1-D) modeling codes to advanced three-dimensional (3-D) modeling software, and from in-house developed codes to commercial modeling packages. In this paper, with the help of several examples, we demonstrate our approach for using different modeling software to address different drilling and evaluation applications. In one example, fast 1-D modeling provides proactive geosteering information from a deep-reading azimuthal propagation resistivity measurement. In the second example, a 3-D model with multiple vertical resistive fractures...

  18. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work-flows. ... The use of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted.

  19. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user/developer to identify the processing field at the top of the sequence and to send into the computing module only the data related to the requested result. The remaining data are not relevant and would slow down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and costs. To do so, the processing sequence must be reviewed and several modeling tools added. The existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases computing time and costs. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  20. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer......’s preferences, goals and processes from their interaction with a computer-aided design tool, and suggests methods and domains within game development where such a model can be applied. We describe how designer modeling could be integrated with current work on automated and mixed-initiative content creation...

  1. Fish habitat simulation models and integrated assessment tools

    International Nuclear Information System (INIS)

    Harby, A.; Alfredsen, K.

    1999-01-01

    Because of human development, water use is increasing in importance, and this worldwide trend is leading to a growing number of user conflicts and a strong need for assessment tools to measure the impacts both on the ecosystem and on the different users and user groups. The quantitative tools must allow a comparison of alternatives, different user groups, etc., and the tools must be integrated, since impact assessments involve different disciplines. Fish species, especially young ones, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured; impacts on user groups are not included. The paper concentrates on fish habitat simulation models, with methods and examples from Norway, and includes some ideas on integrated modelling tools for impact assessment studies. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge in hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of using two- and three-dimensional models include: they do not need any calibration, just validation; they are predictive; and they can be more cost effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored in a multi-disciplinary study. The choice of a suitable model should be based on the available data and possible data acquisition; the available manpower, computer, and software resources; and the needed output and its accuracy. 58 refs

  2. Simulation Tools for Electrical Machines Modelling: Teaching and ...

    African Journals Online (AJOL)

    Simulation tools are used both for research and teaching to allow a good comprehension of the systems under study before practical implementations. This paper illustrates the way MATLAB is used to model non-linearities in a synchronous machine. The machine is modeled in the rotor reference frame with currents as state ...

  3. Advanced REACH Tool (ART) : Calibration of the mechanistic model

    NARCIS (Netherlands)

    Schinkel, J.; Warren, N.; Fransman, W.; Tongeren, M. van; McDonnell, P.; Voogd, E.; Cherrie, J.W.; Tischer, M.; Kromhout, H.; Tielemans, E.

    2011-01-01

    The mechanistic model of the Advanced Reach Tool (ART) provides a relative ranking of exposure levels from different scenarios. The objectives of the calibration described in this paper are threefold: to study whether the mechanistic model scores are accurately ranked in relation to exposure

  4. Molecular Modeling: A Powerful Tool for Drug Design and Molecular ...

    Indian Academy of Sciences (India)

    Molecular modeling has become a valuable and essential tool to medicinal chemists in the drug design process. Molecular modeling describes the generation, manipulation or representation of three-dimensional structures of molecules and associated physico-chemical properties. It involves a range of computerized ...

  5. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. Collectively, the concept of managing the information available in such a data repository is known as Business Intelligence (BI). This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.

  6. Hypermedia as an experiential learning tool: a theoretical model

    OpenAIRE

    Jose Miguel Baptista Nunes; Susan P. Fowell

    1996-01-01

    The process of methodical design and development is of extreme importance in the production of educational software. However, this process will only be effective if it is based on a theoretical model that explicitly defines what educational approach is being used and how specific features of the technology can best support it. This paper proposes a theoretical model of how hypermedia can be used as an experiential learning tool. The development of the model was based on an experiential learni...

  7. Modeling and Simulation Tools: From Systems Biology to Systems Medicine.

    Science.gov (United States)

    Olivier, Brett G; Swat, Maciej J; Moné, Martijn J

    2016-01-01

    Modeling is an integral component of modern biology. In this chapter we look into the role of the model, as it pertains to Systems Medicine, and the software that is required to instantiate and run it. We do this by comparing the development, implementation, and characteristics of tools that have been developed to work with two divergent methodologies: Systems Biology and Pharmacometrics. From the Systems Biology perspective we consider the concept of "Software as a Medical Device" and what this may imply for the migration of research-oriented, simulation software into the domain of human health. In our second perspective, we see how in practice hundreds of computational tools already accompany drug discovery and development at every stage of the process. Standardized exchange formats are required to streamline the model exchange between tools, which would minimize translation errors and reduce the required time. With the emergence, almost 15 years ago, of the SBML standard, a large part of the domain of interest is already covered and models can be shared and passed from software to software without recoding them. Until recently the last stage of the process, the pharmacometric analysis used in clinical studies carried out on subject populations, lacked such an exchange medium. We describe a new emerging exchange format in Pharmacometrics which covers non-linear mixed effects models, the standard statistical model type used in this area. By interfacing these two formats, the entire domain can be covered by complementary standards and, subsequently, by the corresponding tools.

  8. Static Stiffness Modeling of Parallel Kinematics Machine Tool Joints

    Directory of Open Access Journals (Sweden)

    O. K. Akmaev

    2015-09-01

    Full Text Available The possible variants of an original parallel kinematics machine-tool structure are explored in this article. A new Hooke's universal joint design based on needle roller bearings with the ability to set a preload is proposed. The bearing stiffness modeling is carried out using a variety of methods. The elastic deformation modeling of a Hooke's joint and a spherical rolling joint has been developed to assess the possibility of using these joints in machine tools with parallel kinematics.

  9. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods.

  10. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in the state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  11. Analytical Modelling Of Milling For Tool Design And Selection

    Science.gov (United States)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-05-01

    This paper presents an efficient analytical model which makes it possible to simulate a wide range of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of selected tool geometry parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools.

  12. Analytical Modelling Of Milling For Tool Design And Selection

    International Nuclear Information System (INIS)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-01-01

    This paper presents an efficient analytical model which makes it possible to simulate a wide range of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of selected tool geometry parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools.

  13. Development Life Cycle and Tools for XML Content Models

    Energy Technology Data Exchange (ETDEWEB)

    Kulvatunyou, Boonserm [ORNL; Morris, Katherine [National Institute of Standards and Technology (NIST); Buhwan, Jeong [POSTECH University, South Korea; Goyal, Puja [National Institute of Standards and Technology (NIST)

    2004-11-01

    Many integration projects today rely on shared semantic models based on standards represented using Extensible Markup Language (XML) technologies. Shared semantic models typically evolve and require maintenance. In addition, to promote interoperability and reduce integration costs, the shared semantics should be reused as much as possible. Semantic components must be consistent and valid in terms of agreed-upon standards and guidelines. In this paper, we describe an activity model for the creation, use, and maintenance of a shared semantic model that is coherent and supports efficient enterprise integration. We then use this activity model to frame our research and the development of tools to support those activities. We provide overviews of these tools, primarily in the context of the W3C XML Schema. At present, we focus our work on the W3C XML Schema as the representation of choice, due to its extensive adoption by industry.

  14. Designing tools for oil exploration using nuclear modeling

    Science.gov (United States)

    Mauborgne, Marie-Laure; Allioli, Françoise; Manclossi, Mauro; Nicoletti, Luisa; Stoller, Chris; Evans, Mike

    2017-09-01

    When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time-consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and expand the database, to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross-section databases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  15. Designing tools for oil exploration using nuclear modeling

    Directory of Open Access Journals (Sweden)

    Mauborgne Marie-Laure

    2017-01-01

    Full Text Available When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time-consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and expand the database, to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross-section databases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  16. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    Full Text Available In computer science, the metamodelling approach is becoming increasingly popular for software systems development. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, we analyse physical models; the analysis reveals the invariant physical structures that we propose to use as the basic abstractions of the physical metamodel. These form a system of geometrical objects, allowing one to build the spatial structure of physical models and to set a distribution of physical properties. To such a geometry of distributed physical properties, different mathematical methods can be applied. To validate the proposed metamodelling approach, we consider the developed prototypes of software tools.

  17. HMMEditor: a visual editing tool for profile hidden Markov model

    Directory of Open Access Journals (Sweden)

    Cheng Jianlin

    2008-03-01

    Full Text Available Abstract Background Profile Hidden Markov Model (HMM) is a powerful statistical model to represent a family of DNA, RNA, and protein sequences. Profile HMM has been widely used in bioinformatics research such as sequence alignment, gene structure prediction, motif identification, protein structure prediction, and biological database search. However, few comprehensive, visual editing tools for profile HMMs are publicly available. Results We developed a visual editor for profile Hidden Markov Models (HMMEditor). HMMEditor can visualize the profile HMM architecture, transition probabilities, and emission probabilities. Moreover, it provides functions to edit and save HMMs and their parameters. Furthermore, HMMEditor allows users to align a sequence against the profile HMM and to visualize the corresponding Viterbi path. Conclusion HMMEditor provides a set of unique functions to visualize and edit a profile HMM. It is a useful tool for biological sequence analysis and modeling. Both the HMMEditor software and web service are freely available.
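
    As a minimal sketch of the Viterbi decoding that HMMEditor visualizes, the following toy example computes the most probable state path through a generic two-state HMM; the states, probabilities and sequence are invented for illustration and are not taken from the HMMEditor code or API.

        import math

        # Toy HMM: two hidden states emitting DNA symbols (illustrative values only).
        states = ["match", "insert"]
        start = {"match": 0.8, "insert": 0.2}
        trans = {"match": {"match": 0.9, "insert": 0.1},
                 "insert": {"match": 0.4, "insert": 0.6}}
        emit = {"match": {"A": 0.4, "C": 0.1, "G": 0.1, "T": 0.4},
                "insert": {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}}

        def viterbi(seq):
            # V[t][s]: log-probability of the best path ending in state s at position t.
            V = [{s: math.log(start[s]) + math.log(emit[s][seq[0]]) for s in states}]
            back = [{}]
            for t in range(1, len(seq)):
                V.append({})
                back.append({})
                for s in states:
                    prev, score = max(((p, V[t - 1][p] + math.log(trans[p][s]))
                                       for p in states), key=lambda x: x[1])
                    V[t][s] = score + math.log(emit[s][seq[t]])
                    back[t][s] = prev
            # Trace back the most probable state path from the best final state.
            last = max(V[-1], key=V[-1].get)
            path = [last]
            for t in range(len(seq) - 1, 0, -1):
                path.append(back[t][path[-1]])
            return list(reversed(path))

        print(viterbi("ACGTGT"))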

  18. Accessing Curriculum Through Technology Tools (ACTTT): A Model Development Project

    Science.gov (United States)

    Daytner, Katrina M.; Johanson, Joyce; Clark, Letha; Robinson, Linda

    2012-01-01

    Accessing Curriculum Through Technology Tools (ACTTT), a project funded by the U.S. Office of Special Education Programs (OSEP), developed and tested a model designed to allow children in early elementary school, including those "at risk" and with disabilities, to better access, participate in, and benefit from the general curriculum.

  19. Combining modelling tools to evaluate a goose management scheme

    NARCIS (Netherlands)

    Baveco, Hans; Bergjord, Anne Kari; Bjerke, Jarle W.; Chudzińska, Magda E.; Pellissier, Loïc; Simonsen, Caroline E.; Madsen, Jesper; Tombre, Ingunn M.; Nolet, Bart A.

    2017-01-01

    Many goose species feed on agricultural land, and with growing goose numbers, conflicts with agriculture are increasing. One possible solution is to designate refuge areas where farmers are paid to leave geese undisturbed. Here, we present a generic modelling tool that can be used to designate the

  20. Combining modelling tools to evaluate a goose management scheme.

    NARCIS (Netherlands)

    Baveco, J.M.; Bergjord, A.K.; Bjerke, J.W.; Chudzińska, M.E.; Pellissier, L.; Simonsen, C.E.; Madsen, J.; Tombre, Ingunn M.; Nolet, B.A.

    2017-01-01

    Many goose species feed on agricultural land, and with growing goose numbers, conflicts with agriculture are increasing. One possible solution is to designate refuge areas where farmers are paid to leave geese undisturbed. Here, we present a generic modelling tool that can be used to designate the

  1. Integrated landscape/hydrologic modeling tool for semiarid watersheds

    Science.gov (United States)

    Mariano Hernandez; Scott N. Miller

    2000-01-01

    An integrated hydrologic modeling/watershed assessment tool is being developed to aid in determining the susceptibility of semiarid landscapes to natural and human-induced changes across a range of scales. Watershed processes are by definition spatially distributed and are highly variable through time, and this approach is designed to account for their spatial and...

  2. Molecular Modeling: A Powerful Tool for Drug Design and Molecular ...

    Indian Academy of Sciences (India)

    Molecular Modeling: A Powerful Tool for Drug Design and Molecular Docking. Rama Rao Nadendla. General Article, Resonance – Journal of Science Education, Volume 9, Issue 5, May 2004, pp. 51-60.

  3. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    Science.gov (United States)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information, so little interaction with the model developer is required. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool; therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.
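
    For context, Roger's rational function approximation mentioned in this record conventionally takes the form below in the Laplace variable s; the number of lag terms n_L and the pole values b_j are modeling choices and are not taken from the report:

        \[ \mathbf{Q}(s) \approx \mathbf{A}_0 + \mathbf{A}_1 s + \mathbf{A}_2 s^2 + \sum_{j=1}^{n_L} \mathbf{A}_{j+2}\, \frac{s}{s + b_j} \]

    where Q is the generalized aerodynamic force matrix fitted at discrete reduced frequencies and the A_i are real coefficient matrices. The lag terms make the approximation rational in s, which is what allows the aeroservoelastic model to be cast into state space form.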

  4. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. The model takes the five dimensions of requirements and three of characteristics from the SERVQUAL method, and the application methodology from the QFD method. The originality of the SQ model consists in computing a global index that reflects the level to which the quality characteristics fulfil the customers' requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.

  5. Greenhouse gases from wastewater treatment - A review of modelling tools.

    Science.gov (United States)

    Mannina, Giorgio; Ekama, George; Caniani, Donatella; Cosenza, Alida; Esposito, Giovanni; Gori, Riccardo; Garrido-Baserba, Manel; Rosso, Diego; Olsson, Gustaf

    2016-05-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state of the art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge of the processes related to N2O formation, especially due to autotrophic biomass, is still incomplete. The literature review also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs. Indeed, several studies have confirmed that a plant-wide view of WWTPs has to be taken in order to make them as sustainable as possible. Mechanistic dynamic models have been demonstrated to be the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs due to the huge difficulties related to data availability and model complexity. For further improvement in GHG plant-wide modelling, and to favour its use at real, large scale, knowledge of the mechanisms involved in GHG formation and release, and data acquisition, must be enhanced. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Evaluating EML Modeling Tools for Insurance Purposes: A Case Study

    Directory of Open Access Journals (Sweden)

    Mikael Gustavsson

    2010-01-01

    Full Text Available As with any situation that involves economic risk, refineries may share their risk with insurers. The decision process generally includes modelling to determine the extent to which the process area can be damaged. At the extreme end of modelling, the so-called Estimated Maximum Loss (EML) scenarios are found. These scenarios predict the maximum loss a particular installation can sustain. Unfortunately, no standard model for this exists, so insurers reach different results by applying different models and different assumptions. Therefore, a study has been conducted on a case in a Swedish refinery where several scenarios had previously been modelled by two different insurance brokers using two different software packages, ExTool and SLAM. This study reviews the concept of EML and analyses the models used to see which parameters are most uncertain. A third model, EFFECTS, was also employed in an attempt to reach a conclusion with higher reliability.

  7. Scenario Evaluator for Electrical Resistivity survey pre-modeling tool

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D.; Halford, Keith J.; Binley, Andrew; Lane, John W.; Werkema, Dale D.

    2017-01-01

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.

  8. A communication tool to improve the patient journey modeling process.

    Science.gov (United States)

    Curry, Joanne; McGregor, Carolyn; Tracy, Sally

    2006-01-01

    Quality improvement is high on the agenda of Health Care Organisations (HCO) worldwide. Patient journey modeling is a relatively recent innovation in healthcare quality improvement that models the patient's movement through the HCO by viewing it from a patient-centric perspective. Critical to the success of redesigning the care process is the involvement of all stakeholders and their commitment to actively participate in the process. Tools which promote this type of communication are a critical enabler that can significantly affect the overall process redesign outcomes. Such a tool must also be able to incorporate additional factors such as relevant policies and procedures, staff roles, system usage and measurements such as process time and cost. This paper presents a graphically based communication tool that can be used as part of the patient journey modeling process to promote stakeholder involvement, commitment and ownership, as well as highlighting the relationship of other relevant variables that contribute to the patient's journey. Examples of how the tool has been used and the framework employed are demonstrated via a midwife-led primary care case study. A key contribution of this research is the provision of a graphical communication framework that is simple to use, is easily understood by a diverse range of stakeholders and enables ready recognition of patient journey issues. Results include strong stakeholder buy-in and significant enhancement to the overall design of the future patient journey. Initial results indicate that the use of such a communication tool can improve the patient journey modeling process and the overall quality improvement outcomes.

  9. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    -process design. Illustrative examples highlighting the need for efficient model-based systems will be presented, where the need for predictive models for innovative chemical product-process design will be highlighted. The examples will cover aspects of chemical product-process design where the idea of the grand......The design, development and reliability of a chemical product and the process to manufacture it, need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial and error based experiments......, which can be expensive and time consuming. An alternative approach is the use of a systematic model-based framework according to an established work-flow in product-process design, replacing some of the time consuming and/or repetitive experimental steps. The advantages of the use of a model...

  10. DsixTools: the standard model effective field theory toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)

    2017-06-15

    We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED Renormalization group evolution below the electroweak scale. (orig.)
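
    As a hedged aside, the one-loop renormalization group evolution implemented by the SMEFTrunner module has the standard schematic structure below; this is the generic textbook form, not a quotation from the DsixTools documentation:

        \[ \mu \frac{dC_i}{d\mu} = \frac{1}{16\pi^2} \sum_j \gamma_{ij} C_j \]

    where the C_i are Wilson coefficients in the Warsaw basis and γ is the one-loop anomalous dimension matrix.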

  11. Numerical Model Metrics Tools in Support of Navy Operations

    Science.gov (United States)

    Dykes, J. D.; Fanguy, P.

    2017-12-01

    Increasing demand for accurate ocean forecasts relevant to Navy mission decision makers calls for tools that quickly provide relevant numerical model metrics to forecasters. Growing modelling capabilities, with ever-higher-resolution domains including coupled and ensemble systems, together with the increasing volume of observations and other data sources against which to compare model output, require more tools that enable the forecaster to do more with less. These data can be appropriately handled in a geographic information system (GIS) and fused together to provide useful information and analyses, and ultimately a better understanding of how the pertinent model performs against ground truth. Oceanographic measurements like surface elevation, profiles of temperature and salinity, and wave height can all be incorporated into a set of layers correlated to geographic information such as bathymetry and topography. In addition, an automated system that runs concurrently with the models on high performance machines matches routinely available observations to modelled values to form a database of matchups with which statistics can be calculated and displayed, to facilitate validation of forecast state and derived variables. ArcMAP, developed by Environmental Systems Research Institute, is a GIS application used by the Naval Research Laboratory (NRL) and naval operational meteorological and oceanographic centers to analyse the environment in support of a range of Navy missions. For example, acoustic propagation in the ocean is described with a three-dimensional analysis of sound speed that depends on profiles of temperature, pressure and salinity predicted by the Navy Coastal Ocean Model. The data and model output must include geo-referencing information suitable for accurately placing the data within the ArcMAP framework. NRL has developed tools that facilitate merging these geophysical data and their analyses, including intercomparisons between model

  12. Model Fusion Tool - the Open Environmental Modelling Platform Concept

    Science.gov (United States)

    Kessler, H.; Giles, J. R.

    2010-12-01

    The vision of an Open Environmental Modelling Platform is to seamlessly link geoscience data, concepts and models to aid decision making in times of environmental change. Governments and their executive agencies across the world are facing increasing pressure to make decisions about the management of resources in light of population growth and environmental change. In the UK, for example, groundwater is becoming a scarce resource for large parts of its most densely populated areas. At the same time, river and groundwater flooding resulting from high rainfall events is increasing in scale and frequency, and sea level rise is threatening the defences of coastal cities. There is also a need for affordable housing, improved transport infrastructure and waste disposal, as well as sources of renewable energy and sustainable food production. These challenges can only be resolved if solutions are based on sound scientific evidence. Although we have knowledge and understanding of many individual processes in the natural sciences, it is clear that a single science discipline is unable to answer these questions and their inter-relationships. Modern science increasingly employs computer models to simulate the natural, economic and human system. Management and planning require scenario modelling, forecasts and ‘predictions’. Although the outputs are often impressive in terms of apparent accuracy and visualisation, they are inherently not suited to simulating the response to feedbacks from other models of the earth system, such as the impact of human actions. Geological Survey Organisations (GSO) are increasingly employing advances in Information Technology to visualise and improve their understanding of geological systems. Instead of two-dimensional paper maps and reports, many GSOs now produce three-dimensional geological framework models and groundwater flow models as their standard output. Additionally, the British Geological Survey has developed standard routines to link geological

  13. MODELING OF ANIMATED SIMULATIONS BY MAXIMA PROGRAM TOOLS

    Directory of Open Access Journals (Sweden)

    Nataliya O. Bugayets

    2015-06-01

    Full Text Available The article deals with methodical features of teaching computer simulation of systems and processes using animation. It notes the importance of visual educational material, which combines the sensory and reasoning sides of cognition. The concept of modeling and the process of building models are explained. Attention is paid to developing the skills that are essential for effective learning of animated simulation with visual aids. The graphical tools of the computer mathematics system Maxima for animated simulation are described. Examples of creating animated visual-aid models and of their use for developing research skills are presented.

  14. Transfer Entropy as a Tool for Hydrodynamic Model Validation

    Directory of Open Access Journals (Sweden)

    Alicia Sendrowski

    2018-01-01

    Full Text Available The validation of numerical models is an important component of modeling to ensure reliability of model outputs under prescribed conditions. In river deltas, robust validation of models is paramount given that models are used to forecast land change and to track water, solid, and solute transport through the deltaic network. We propose using transfer entropy (TE to validate model results. TE quantifies the information transferred between variables in terms of strength, timescale, and direction. Using water level data collected in the distributary channels and inter-channel islands of Wax Lake Delta, Louisiana, USA, along with modeled water level data generated for the same locations using Delft3D, we assess how well couplings between external drivers (river discharge, tides, wind and modeled water levels reproduce the observed data couplings. We perform this operation through time using ten-day windows. Modeled and observed couplings compare well; their differences reflect the spatial parameterization of wind and roughness in the model, which prevents the model from capturing high frequency fluctuations of water level. The model captures couplings better in channels than on islands, suggesting that mechanisms of channel-island connectivity are not fully represented in the model. Overall, TE serves as an additional validation tool to quantify the couplings of the system of interest at multiple spatial and temporal scales.
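
    For reference, transfer entropy from a source X to a target Y is conventionally defined as below; the history lengths k and l are analysis choices that this record does not state:

        \[ TE_{X \to Y} = \sum p\left(y_{t+1}, y_t^{(k)}, x_t^{(l)}\right) \log \frac{p\left(y_{t+1} \mid y_t^{(k)}, x_t^{(l)}\right)}{p\left(y_{t+1} \mid y_t^{(k)}\right)} \]

    i.e., the reduction in uncertainty about the next value of Y provided by the past of X beyond what the past of Y alone provides, which is what gives TE its strength, timescale and direction attributes.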

  15. Neural Networks for Hydrological Modeling Tool for Operational Purposes

    Science.gov (United States)

    Bhatt, Divya; Jain, Ashu

    2010-05-01

    Hydrological models are useful in many water resources applications such as flood control, irrigation and drainage, hydro power generation, water supply, erosion and sediment control, etc. Estimates of runoff are needed in many water resources planning, design, development, operation and maintenance activities. Runoff is generally computed using rainfall-runoff models. Computer-based hydrologic models have become popular for obtaining hydrological forecasts and for managing water systems. The rainfall-runoff library (RRL) is computer software developed by the Cooperative Research Centre for Catchment Hydrology (CRCCH), Australia, consisting of five different conceptual rainfall-runoff models, and it has been in operation in many water resources applications in Australia. Recently, soft artificial intelligence tools such as Artificial Neural Networks (ANNs) have become popular for research purposes but have not been adopted in operational hydrological forecasts. There is a strong need to develop ANN models based on real catchment data and compare them with the conceptual models actually in use in real catchments. In this paper, the results from an investigation on the use of RRL and ANNs are presented. Out of the five conceptual models in the RRL toolkit, the SimHyd model has been used. A genetic algorithm has been used as the optimizer in the RRL to calibrate the SimHyd model. Trial-and-error procedures were employed to arrive at the best values of the various parameters involved in the GA optimizer to develop the SimHyd model. The results obtained from the best configuration of the SimHyd model are presented here. A feed-forward neural network structure trained by the back-propagation algorithm has been adopted to develop the ANN models. The daily rainfall and runoff data derived from Bird Creek Basin, Oklahoma, USA have been employed to develop all the models included here. A wide range of error statistics have been used to evaluate the performance of all the models.
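
    A minimal sketch of the kind of feed-forward network with back-propagation described in this record, trained on synthetic rainfall-runoff pairs; the data, layer sizes, learning rate and epochs are illustrative assumptions, not the Bird Creek configuration.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic stand-in for daily rainfall (mm) and a runoff-like target.
        X = rng.uniform(0, 50, size=(200, 1))
        y = 0.6 * X + 5 * np.sin(X / 8) + rng.normal(0, 1, X.shape)
        Xn = X / 50.0  # scale inputs to keep tanh units out of saturation

        # One hidden layer of tanh units, linear output.
        W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
        W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
        lr = 0.05

        for epoch in range(5000):
            h = np.tanh(Xn @ W1 + b1)      # hidden activations
            pred = h @ W2 + b2             # network output
            err = pred - y                 # residuals
            # Mean-squared-error gradients via the chain rule (back-propagation).
            gW2 = h.T @ err / len(Xn); gb2 = err.mean(0)
            dh = (err @ W2.T) * (1 - h ** 2)
            gW1 = Xn.T @ dh / len(Xn); gb1 = dh.mean(0)
            W1 -= lr * gW1; b1 -= lr * gb1
            W2 -= lr * gW2; b2 -= lr * gb2

        print("final MSE:", float((err ** 2).mean()))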

  16. Theoretical Modeling of Rock Breakage by Hydraulic and Mechanical Tool

    Directory of Open Access Journals (Sweden)

    Hongxiang Jiang

    2014-01-01

    Full Text Available Rock breakage by coupled mechanical and hydraulic action has been developed over the past several decades, but theoretical study of rock fragmentation by a mechanical tool with water pressure assistance was still lacking. A theoretical model of rock breakage by a mechanical tool was developed based on rock fracture mechanics and the solution of Boussinesq's problem; it can explain the process of rock fragmentation as well as predict the peak reacting force. A theoretical model of rock breakage by coupled mechanical and hydraulic action was then developed according to the superposition principle of intensity factors at the crack tip; the reacting force of the mechanical tool assisted by hydraulic action can be reduced considerably if a crack of critical length is produced by mechanical or hydraulic impact. The experimental results indicated that the peak reacting force could be reduced by about 15% assisted by medium water pressure, and the rapid reduction of reacting force after the peak value decreased the specific energy consumption of rock fragmentation by the mechanical tool. Crack formation by mechanical or hydraulic impact was the prerequisite for improving the effectiveness of combined breakage.
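
    The superposition principle invoked in this record can be written compactly as below; the individual stress intensity contributions and the fracture toughness depend on crack geometry and loading, which the record does not specify:

        \[ K_I^{\mathrm{total}} = K_I^{\mathrm{mech}} + K_I^{\mathrm{hyd}} \ge K_{IC} \]

    i.e., the crack propagates when the sum of the mode-I stress intensity factors from the mechanical tool and the water pressure reaches the rock's fracture toughness K_IC, which is why hydraulic assistance lowers the mechanical force required.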

  17. Using the IEA ETSAP modelling tools for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Grohnheit, Poul Erik

    2008-12-15

    An important part of the cooperation within the IEA (International Energy Agency) is organised through national contributions to 'Implementation Agreements' on energy technology and energy analyses. One of them is ETSAP (Energy Technology Systems Analysis Programme), started in 1976. Denmark has signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, 'Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems', for the period 2005 to 2007. The main activity is semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project 'NEEDS - New Energy Externalities Developments for Sustainability'. ETSAP is contributing to a part of NEEDS that develops the TIMES model for 29 European countries with assessment of future technologies. An additional project, 'Monitoring and Evaluation of the RES directives: implementation in EU27 and policy recommendations for 2020' (RES2020) under Intelligent Energy Europe, was added, as well as the Danish 'Centre for Energy, Environment and Health' (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report on the development of a model for Denmark, focusing on the tools and features that allow comparison with other countries and, particularly, evaluation of assumptions and results in international models covering Denmark. (au)

  18. Designing a training tool for imaging mental models

    Science.gov (United States)

    Dede, Christopher J.; Jayaram, Geetha

    1990-01-01

    The training process can be conceptualized as the student acquiring an evolutionary sequence of classification-problem-solving mental models. For example, a physician learns (1) classification systems for patient symptoms, diagnostic procedures, diseases, and therapeutic interventions and (2) interrelationships among these classifications (e.g., how to use diagnostic procedures to collect data about a patient's symptoms in order to identify the disease so that therapeutic measures can be taken). This project developed functional specifications for a computer-based tool, Mental Link, that allows the evaluative imaging of such mental models. The fundamental design approach underlying this representational medium is traversal of virtual cognition space. Typically intangible cognitive entities and the links among them become visible as a three-dimensional web that represents a knowledge structure. The tool has a high degree of flexibility and customizability to allow extension to other types of uses, such as a front-end to an intelligent tutoring system, knowledge base, hypermedia system, or semantic network.

  19. ADAS tools for collisional–radiative modelling of molecules

    Energy Technology Data Exchange (ETDEWEB)

    Guzmán, F., E-mail: francisco.guzman@cea.fr [Department of Physics, University of Strathclyde, Glasgow G4 0NG (United Kingdom); CEA, IRFM, Saint-Paul-lez-Durance 13108 (France); O’Mullane, M.; Summers, H.P. [Department of Physics, University of Strathclyde, Glasgow G4 0NG (United Kingdom)

    2013-07-15

    New theoretical and computational tools for molecular collisional–radiative models are presented. An application to the hydrogen molecule system has been made. At the same time, a structured database has been created where fundamental cross sections and rates for individual processes, as well as derived data (effective coefficients), are stored. Relative populations for the vibrational states of the ground electronic state of H{sub 2} are presented, and this vibronic-resolution model is compared with electronic resolution, where vibronic transitions are summed over vibrational sub-states. Some new reaction rates are calculated by means of the impact parameter approximation. Computational tools have been developed to automate the process and simplify data assembly. Effective (collisional–radiative) rate coefficients versus temperature and density are presented.

  20. ADAS tools for collisional-radiative modelling of molecules

    Science.gov (United States)

    Guzmán, F.; O'Mullane, M.; Summers, H. P.

    2013-07-01

    New theoretical and computational tools for molecular collisional-radiative models are presented. An application to the hydrogen molecule system has been made. At the same time, a structured database has been created where fundamental cross sections and rates for individual processes, as well as derived data (effective coefficients), are stored. Relative populations for the vibrational states of the ground electronic state of H2 are presented, and this vibronic-resolution model is compared with electronic resolution, where vibronic transitions are summed over vibrational sub-states. Some new reaction rates are calculated by means of the impact parameter approximation. Computational tools have been developed to automate the process and simplify data assembly. Effective (collisional-radiative) rate coefficients versus temperature and density are presented.

  1. Introduction to genetic algorithms as a modeling tool

    International Nuclear Information System (INIS)

    Wildberger, A.M.; Hickok, K.A.

    1990-01-01

    Genetic algorithms are search and classification techniques modeled on natural adaptive systems. This is an introduction to their use as a modeling tool, with emphasis on prospects for their application in the power industry. It is intended to provide enough background information for its audience to begin to follow technical developments in genetic algorithms and to recognize those which might impact on electric power engineering. Beginning with a discussion of genetic algorithms and their origin as a model of biological adaptation, their advantages and disadvantages are described in comparison with other modeling tools such as simulation and neural networks, in order to provide guidance in selecting appropriate applications. In particular, their use for improving expert systems from actual data is described, and they are suggested as an aid in building mathematical models. Using the Thermal Performance Advisor as an example, it is suggested how genetic algorithms might be used to make a conventional expert system and a mathematical model of a power plant adapt automatically to changes in the plant's characteristics.
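
    A minimal sketch of the selection-crossover-mutation loop such algorithms use, applied to a toy bit-string problem; the population size, mutation rate and fitness function are illustrative assumptions only.

        import random

        random.seed(1)
        N_BITS = 20  # toy objective: maximize the number of 1-bits

        def fitness(ind):
            return sum(ind)

        def evolve(pop_size=30, generations=60, p_mut=0.02):
            pop = [[random.randint(0, 1) for _ in range(N_BITS)]
                   for _ in range(pop_size)]
            for _ in range(generations):
                # Tournament selection: keep the fitter of two random individuals.
                parents = [max(random.sample(pop, 2), key=fitness)
                           for _ in range(pop_size)]
                nxt = []
                for i in range(0, pop_size, 2):
                    a, b = parents[i], parents[i + 1]
                    cut = random.randrange(1, N_BITS)  # one-point crossover
                    for child in (a[:cut] + b[cut:], b[:cut] + a[cut:]):
                        # Bit-flip mutation applied gene by gene.
                        nxt.append([g ^ 1 if random.random() < p_mut else g
                                    for g in child])
                pop = nxt
            return max(pop, key=fitness)

        best = evolve()
        print(fitness(best), "of", N_BITS)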

  2. Surviving the present: Modeling tools for organizational change

    Energy Technology Data Exchange (ETDEWEB)

    Pangaro, P. (Pangaro Inc., Washington, DC (United States))

    1992-01-01

    The nuclear industry, like the rest of modern American business, is beset by a confluence of economic, technological, competitive, regulatory, and political pressures. For better or worse, business schools and management consultants have leapt to the rescue, offering the most modern conveniences that they can purvey. Recent advances in the study of organizations have led to new tools for their analysis, revision, and repair. There are two complementary tools that do not impose values or injunctions in themselves. One, called the organization modeler, captures the hierarchy of purposes that organizations and their subparts carry out. Any deficiency or pathology is quickly illuminated, and requirements for repair are made clear. The second, called THOUGHTSTICKER, is used to capture the semantic content of the conversations that occur across the interactions of parts of an organization. The distinctions and vocabulary in the language of an organization, and the relations within that domain, are elicited from the participants so that all three are available for debate and refinement. The product of the applications of these modeling tools is not the resulting models but rather the enhancement of the organization as a consequence of the process of constructing them.

  3. Surviving the present: Modeling tools for organizational change

    International Nuclear Information System (INIS)

    Pangaro, P.

    1992-01-01

    The nuclear industry, like the rest of modern American business, is beset by a confluence of economic, technological, competitive, regulatory, and political pressures. For better or worse, business schools and management consultants have leapt to the rescue, offering the most modern conveniences that they can purvey. Recent advances in the study of organizations have led to new tools for their analysis, revision, and repair. There are two complementary tools that do not impose values or injunctions in themselves. One, called the organization modeler, captures the hierarchy of purposes that organizations and their subparts carry out. Any deficiency or pathology is quickly illuminated, and requirements for repair are made clear. The second, called THOUGHTSTICKER, is used to capture the semantic content of the conversations that occur across the interactions of parts of an organization. The distinctions and vocabulary in the language of an organization, and the relations within that domain, are elicited from the participants so that all three are available for debate and refinement. The product of the applications of these modeling tools is not the resulting models but rather the enhancement of the organization as a consequence of the process of constructing them

  4. Modelling stillbirth mortality reduction with the Lives Saved Tool

    Directory of Open Access Journals (Sweden)

    Hannah Blencowe

    2017-11-01

    Full Text Available Abstract Background The worldwide burden of stillbirths is large, with an estimated 2.6 million babies stillborn in 2015, including 1.3 million dying during labour. The Every Newborn Action Plan set a stillbirth target of ≤12 per 1000 in all countries by 2030. Planning tools will be essential as countries set policy and plan investment to scale up interventions to meet this target. This paper summarises the approach taken for modelling the impact of scaling up health interventions on stillbirths in the Lives Saved Tool (LiST), and potential future refinements. Methods The specific application to stillbirths of the general method for modelling the impact of interventions in LiST is described. The evidence for the effectiveness of potential interventions to reduce stillbirths is reviewed, and the assumptions on the affected fraction of stillbirths who could potentially benefit from these interventions are presented. The current assumptions and their effects on stillbirth reduction are described and potential future improvements discussed. Results High-quality evidence is not available for all parameters in the LiST stillbirth model. Cause-specific mortality data are not available for stillbirths; therefore stillbirths are modelled in LiST using an attributable-fraction approach by timing of stillbirths (antepartum/intrapartum). Of 35 potential interventions to reduce stillbirths identified, eight interventions are currently modelled in LiST. These include childbirth care, induction for prolonged pregnancy, multiple micronutrient and balanced energy supplementation, malaria prevention, and detection and management of hypertensive disorders of pregnancy, diabetes and syphilis. For three of the interventions (childbirth care, detection and management of hypertensive disorders of pregnancy, and diabetes), the estimate of effectiveness is based on expert opinion through a Delphi process. Only for malaria is coverage information available, with coverage
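
    Schematically, an attributable-fraction impact calculation of the kind this record describes can be written as below; this is a generic reconstruction of the approach named in the abstract, not the published LiST equations:

        \[ \Delta D \approx D_0 \times AF \times E \times (c_1 - c_0) \]

    where D_0 is the baseline number of stillbirths of the relevant timing (antepartum or intrapartum), AF the affected fraction that could benefit from the intervention, E the intervention effectiveness, and c_0, c_1 the baseline and scaled-up coverage.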

  5. An ensemble model of QSAR tools for regulatory risk assessment.

    Science.gov (United States)

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

    Quantitative structure activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model includes a cut-off parameter that allows selection of the desired trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) in predicting carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). Leave-one-out cross-validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8 % and 80.4 %, and balanced accuracy: 80.6 % and 80.8 %) and highest inter-rater agreement [kappa ( κ ): 0
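
    A minimal sketch of consensus classification over several QSAR tools with a tunable cut-off; the tool names, calls and reliability weights are invented, and this weighted vote stands in for, rather than reproduces, the paper's Bayesian model.

        # Combine binary carcinogenicity calls from several hypothetical QSAR tools.
        def consensus(predictions, weights, cutoff=0.5):
            """predictions: dict tool -> 0/1 call; weights: dict tool -> reliability."""
            score = sum(weights[t] * predictions[t] for t in predictions)
            total = sum(weights[t] for t in predictions)
            return 1 if score / total >= cutoff else 0

        calls = {"toolA": 1, "toolB": 0, "toolC": 1, "toolD": 0}
        weights = {"toolA": 0.9, "toolB": 0.6, "toolC": 0.7, "toolD": 0.5}

        # Lowering the cut-off favours sensitivity (fewer false negatives),
        # the trade-off regulators typically prefer.
        for c in (0.3, 0.5, 0.7):
            print(c, consensus(calls, weights, cutoff=c))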

  6. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia

    2015-01-07

We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is carried out by proposing parametrized coarse-grained dynamics and finding the optimal parameter set for which the relative entropy rate with respect to the atomistic dynamics is minimized. The minimization problem leads to a generalization of the force matching methods to non-equilibrium systems. A multiplicative noise example reveals the importance of the diffusion coefficient in the optimization problem.
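In notation chosen here for illustration (the abstract itself gives no formulas), the fitting problem can be written as minimizing the relative entropy rate between the path measure Q of the atomistic process and that of the parametrized coarse-grained process P^θ:

$$\theta^{*} \;=\; \underset{\theta}{\arg\min}\; \mathcal{H}(Q \,\|\, P^{\theta}), \qquad \mathcal{H}(Q \,\|\, P^{\theta}) \;=\; \lim_{T \to \infty} \frac{1}{T}\, \mathbb{E}_{Q}\!\left[\, \log \frac{dQ|_{[0,T]}}{dP^{\theta}|_{[0,T]}} \,\right],$$

which, for diffusion-type dynamics, reduces to matching drift (force) terms on average, hence the stated generalization of force matching.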

  7. Programming Models and Tools for Intelligent Embedded Systems

    DEFF Research Database (Denmark)

    Sørensen, Peter Verner Bojsen

    Design automation and analysis tools targeting embedded platforms, developed using a component-based design approach, must be able to reason about the capabilities of the platforms. In the general case where nothing is assumed about the components comprising a platform or the platform topology......, analysis must be employed to determine its capabilities. This kind of analysis is the subject of this dissertation. The main contribution of this work is the Service Relation Model used to describe and analyze the flow of service in models of platforms and systems composed of re-usable components...

  8. Evaluation of air pollution modelling tools as environmental engineering courseware.

    Science.gov (United States)

    Souto González, J A; Bello Bugallo, P M; Casares Long, J J

    2004-01-01

The study of phenomena related to the dispersion of pollutants usually takes advantage of mathematical models based on the description of the different processes involved. This educational approach is especially important in air pollution dispersion, where the processes follow non-linear behaviour, so it is difficult to understand the relationships between inputs and outputs, and in a 3D context where it becomes hard to analyze alphanumeric results. In this work, three different software tools, serving as computer solvers for typical air pollution dispersion phenomena, are presented. Each software tool, developed for implementation on PCs, follows an approach representing one of three generations of programming languages (Fortran 77, Visual Basic and Java), applied over three different environments: MS-DOS, MS-Windows and the world wide web. The software tools were tested by students of environmental engineering (undergraduate) and chemical engineering (postgraduate), in order to evaluate the ability of these software tools to improve both theoretical and practical knowledge of the air pollution dispersion problem, and the impact of the different environments on the learning process in terms of content, ease of use and visualization of results.

  9. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of using such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  10. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

Full Text Available As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  11. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  12. Right approach to 3D modeling using CAD tools

    Science.gov (United States)

    Baddam, Mounica Reddy

The thesis provides a step-by-step methodology to enable an instructor dealing with CAD tools to optimally guide his/her students through an understandable 3D modeling approach which will not only enhance their knowledge about the tool's usage but also enable them to achieve their desired result in comparatively less time. In practice, there is very little information available on applying CAD skills to formal beginners' training sessions. Additionally, the advent of new software in the 3D domain makes keeping up to date a more difficult task. Keeping up with the industry's advanced requirements emphasizes the importance of more skilled hands in the field of CAD development, rather than just prioritizing manufacturing in terms of complex software features. The thesis analyses different 3D modeling approaches specific to the variety of CAD tools currently available in the market. Utilizing performance-time databases, learning curves have been generated to measure performance time, feature count, etc. Based on the results, improvement parameters have also been provided (Asperl, 2005).

  13. The EDF/SEPTEN crisis team calculation tools and models

    International Nuclear Information System (INIS)

    De Magondeaux, B.; Grimaldi, X.

    1993-01-01

Electricite de France (EDF) has developed a set of simplified tools and models called TOUTEC and CRISALIDE, intended for use by the French utility's National Crisis Team in order to perform the tasks of diagnosis and prognosis during an emergency situation. As a severe accident could have important radiological consequences, this method is focused on the diagnosis of the state of the safety barriers and on the prognosis of their behaviour. These tools allow the crisis team to provide public authorities with information on the radiological risk and to provide advice on managing the accident on the damaged unit. At a first level, TOUTEC is intended to complement the handbook with simplified calculation models and predefined relationships. It can avoid tedious calculation under stress conditions. The main items are the calculation of the primary circuit breach size and the evaluation of hydrogen overpressurization. The set of models called CRISALIDE is devoted to evaluating the following critical parameters: the delay before core uncovery, which would signify more severe consequences if it occurs; containment pressure behaviour; and finally the source term. With these models, the crisis team becomes able to take into account combinations of boundary conditions according to safety and auxiliary system availability

  14. MODERN TOOLS FOR MODELING ACTIVITY IT-COMPANIES

    Directory of Open Access Journals (Sweden)

    Марина Петрівна ЧАЙКОВСЬКА

    2015-05-01

Full Text Available Increasing competition in the market of web-based applications increases the importance of service quality and of optimizing the processes of interaction with customers. The purpose of the article is to develop recommendations for improving the business processes of IT enterprises in the web application segment based on technological tools for business modeling; to shape requirements for the development of an information system (IS) for customer interaction; and to analyse effective means of implementation and evaluate the economic effects of the introduction. A scheme of the business process of development and launch of a website was built, based on the analysis of business process models and “swim lane” models, and requirements for IS customer relationship management for a web studio were established. The market of software for creating such systems was analyzed, and the products corresponding to the requirements were selected. The IS was developed, tested and implemented in the company, and an appraisal of the economic effect was conducted.

  15. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

Full Text Available System modelling using the Unified Modelling Language (UML) is a task that must be solved for software development. The more complex software becomes, the higher the requirements to demonstrate the system to be developed, especially in its dynamic aspect, which in UML is offered by a sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors' research examines the abilities of modern UML modelling tools to offer automatic layout of the UML sequence diagram and analyses them according to criteria required for diagram perception.

  16. A new model for the sonic borehole logging tool

    International Nuclear Information System (INIS)

    Oelgaard, P.L.

    1990-12-01

A number of models for the sonic borehole logging tool have previously been developed. These models, which are mainly based on experimental data, are discussed and compared. On this background the new model is developed. It is based on the assumptions that the pores of low-porosity formations and the grains of high-porosity media may be approximated by cylinders, and that the dimensions of these cylinders are given by distribution functions. From these assumptions the transit time Δt p of low-porosity formations and Δt g of high-porosity media are calculated by use of the Monte Carlo method. Combining the Δt p and Δt g values obtained by use of selected weighting functions seems to permit the determination of the transit time Δt over the full porosity range (0 ≤ φ ≤ 100%). (author)
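Written out in notation of this summary's choosing (the abstract gives no explicit formula), the combination step amounts to a porosity-dependent blend of the two Monte Carlo estimates:

$$\Delta t(\phi) \;=\; w(\phi)\,\Delta t_{p}(\phi) \;+\; \bigl(1 - w(\phi)\bigr)\,\Delta t_{g}(\phi), \qquad 0 \le \phi \le 100\%,$$

with a weighting function w(φ) selected to approach 1 in the low-porosity limit, where the pore-cylinder picture applies, and 0 in the high-porosity limit, where the grain-cylinder picture applies.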

  17. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, Anca D.; Iov, Florin; Sørensen, Poul

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN) and it gathers and describes a whole wind turbine model database...... speed doubly-fed induction generator wind turbine concept 3. Variable speed multi-pole permanent magnet synchronous generator wind turbine concept These wind turbine concept models can be used and even extended for the study of different aspects, e.g. the assessment of power quality, control strategies......, connection of the wind turbine at different types of grid and storage systems. Different control strategies have been developed and implemented for these wind turbine concepts, their performance in normal or fault operation being assessed and discussed by means of simulations. The described control...

  18. Mathematical modeling of physiological systems: an essential tool for discovery.

    Science.gov (United States)

    Glynn, Patric; Unudurthi, Sathya D; Hund, Thomas J

    2014-08-28

    Mathematical models are invaluable tools for understanding the relationships between components of a complex system. In the biological context, mathematical models help us understand the complex web of interrelations between various components (DNA, proteins, enzymes, signaling molecules etc.) in a biological system, gain better understanding of the system as a whole, and in turn predict its behavior in an altered state (e.g. disease). Mathematical modeling has enhanced our understanding of multiple complex biological processes like enzyme kinetics, metabolic networks, signal transduction pathways, gene regulatory networks, and electrophysiology. With recent advances in high throughput data generation methods, computational techniques and mathematical modeling have become even more central to the study of biological systems. In this review, we provide a brief history and highlight some of the important applications of modeling in biological systems with an emphasis on the study of excitable cells. We conclude with a discussion about opportunities and challenges for mathematical modeling going forward. In a larger sense, the review is designed to help answer a simple but important question that theoreticians frequently face from interested but skeptical colleagues on the experimental side: "What is the value of a model?" Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  20. Fuzzy regression modeling for tool performance prediction and degradation detection.

    Science.gov (United States)

    Li, X; Er, M J; Lim, B S; Zhou, J H; Gan, O P; Rutkowski, L

    2010-10-01

In this paper, the viability of using the Fuzzy-Rule-Based Regression Modeling (FRM) algorithm for tool performance and degradation detection is investigated. The FRM is developed based on a multi-layered fuzzy-rule-based hybrid system with Multiple Regression Models (MRM) embedded into a fuzzy logic inference engine that employs Self Organizing Maps (SOM) for clustering. The FRM converts a complex nonlinear problem to a simplified linear format in order to further increase the accuracy of prediction and the rate of convergence. The efficacy of the proposed FRM is tested through a case study, namely predicting the remaining useful life of a ball nose milling cutter during a dry machining process of hardened tool steel with a hardness of 52-54 HRC. A comparative study is further made between four predictive models using the same set of experimental data. It is shown that the FRM is superior to conventional MRM, Back Propagation Neural Networks (BPNN) and Radial Basis Function Networks (RBFN) in terms of prediction accuracy and learning speed.
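The paper's exact architecture is not reproduced in this record, but the general pattern (partition the input space by clustering, fit a local linear regression per cluster, and blend predictions by fuzzy membership) can be sketched as below; k-means stands in for the SOM and Gaussian memberships stand in for the fuzzy inference engine, so this is an illustration of the idea rather than the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 2))   # e.g. features from force/vibration signals
y = np.sin(X[:, 0]) + 0.1 * X[:, 1] + rng.normal(0, 0.05, 200)  # wear proxy

k = 4
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
local = [LinearRegression().fit(X[km.labels_ == i], y[km.labels_ == i])
         for i in range(k)]             # one linear model per cluster

def predict(x, width=2.0):
    """Blend local linear models with Gaussian membership weights."""
    x = np.atleast_2d(np.asarray(x, dtype=float))
    d = np.linalg.norm(km.cluster_centers_ - x, axis=1)
    w = np.exp(-(d / width) ** 2)       # fuzzy memberships by distance
    w /= w.sum()
    return float(sum(wi * m.predict(x)[0] for wi, m in zip(w, local)))

print(predict([5.0, 3.0]))
```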

  1. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  2. Conceptual Models as Tools for Communication Across Disciplines

    Directory of Open Access Journals (Sweden)

    Marieke Heemskerk

    2003-12-01

    Full Text Available To better understand and manage complex social-ecological systems, social scientists and ecologists must collaborate. However, issues related to language and research approaches can make it hard for researchers in different fields to work together. This paper suggests that researchers can improve interdisciplinary science through the use of conceptual models as a communication tool. The authors share lessons from a workshop in which interdisciplinary teams of young scientists developed conceptual models of social-ecological systems using data sets and metadata from Long-Term Ecological Research sites across the United States. Both the process of model building and the models that were created are discussed. The exercise revealed that the presence of social scientists in a group influenced the place and role of people in the models. This finding suggests that the participation of both ecologists and social scientists in the early stages of project development may produce better questions and more accurate models of interactions between humans and ecosystems. Although the participants agreed that a better understanding of human intentions and behavior would advance ecosystem science, they felt that interdisciplinary research might gain more by training strong disciplinarians than by merging ecology and social sciences into a new field. It is concluded that conceptual models can provide an inspiring point of departure and a guiding principle for interdisciplinary group discussions. Jointly developing a model not only helped the participants to formulate questions, clarify system boundaries, and identify gaps in existing data, but also revealed the thoughts and assumptions of fellow scientists. Although the use of conceptual models will not serve all purposes, the process of model building can help scientists, policy makers, and resource managers discuss applied problems and theory among themselves and with those in other areas.

  3. ExEP yield modeling tool and validation test results

    Science.gov (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with the physics-based tests such as photometry and integration time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4 m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters, such as working angle, photon counts and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, the test revealed problems in the L2 halo orbit and in the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers of EXOSIMS and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.
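As a flavour of the unit-test style described, a deterministic photometry check might look like the sketch below; the function, constants and expected value are hypothetical placeholders, not the EXOSIMS API.

```python
import math
import unittest

def delta_mag(radius_km, albedo, sep_au, phase=0.5):
    """Hypothetical stand-in for a yield-tool photometry routine:
    planet-to-star flux ratio at quadrature, as a magnitude difference."""
    r_au = radius_km / 1.496e8                        # planet radius in AU
    flux_ratio = albedo * phase * (r_au / sep_au) ** 2
    return -2.5 * math.log10(flux_ratio)

class TestPhotometry(unittest.TestCase):
    def test_jupiter_analog_at_5_au(self):
        # Known RV planets at quadrature give deterministic expectations.
        dmag = delta_mag(71492.0, albedo=0.5, sep_au=5.0)
        self.assertAlmostEqual(dmag, 21.6, delta=0.1)  # illustrative tolerance

if __name__ == "__main__":
    unittest.main()
```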

  4. ISRU System Model Tool: From Excavation to Oxygen Production

    Science.gov (United States)

    Santiago-Maldonado, Edgardo; Linne, Diane L.

    2007-01-01

In the late 80's, conceptual designs for an in situ oxygen production plant were documented in a study by Eagle Engineering [1]. In the "Summary of Findings" of this study, it is clearly pointed out that: "reported process mass and power estimates lack a consistent basis to allow comparison." The study goes on to say: "A study to produce a set of process mass, power, and volume requirements on a consistent basis is recommended." Today, approximately twenty years later, as humans plan to return to the moon and venture beyond, the need for flexible, up-to-date models of the oxygen extraction production process has become even clearer. Multiple processes for the production of oxygen from lunar regolith are being investigated by NASA, academia, and industry. Three processes that have shown technical merit are molten regolith electrolysis, hydrogen reduction, and carbothermal reduction. These processes have been selected by NASA as the basis for the development of the ISRU System Model Tool (ISMT). In working to develop up-to-date system models for these processes, NASA hopes to accomplish the following: (1) help in the evaluation process to select the most cost-effective and efficient process for further prototype development, (2) identify key parameters, (3) optimize the excavation and oxygen production processes, and (4) provide estimates of energy and power requirements, mass and volume of the system, oxygen production rate, mass of regolith required, mass of consumables, and other important parameters. Also, as confidence and high fidelity are achieved with each component's model, new techniques and processes can be introduced and analyzed at a fraction of the cost of traditional hardware development and test approaches. A first-generation ISRU System Model Tool has been used to provide inputs to the Lunar Architecture Team studies.

  5. Modeling and Control of the Cobelli Model as a Personalized Prescriptive Tool for Diabetes Treatment

    Science.gov (United States)

    2016-11-05

A physiologically accurate model allows for the use of control theory to investigate applications as a personalized prescription tool. This research...utilization increases toward healthy levels. The second pathway is by decreasing the endogenous glucose production of the liver to the bloodstream [6,7

  6. Introducing Modeling Transition Diagrams as a Tool to Connect Mathematical Modeling to Mathematical Thinking

    Science.gov (United States)

    Czocher, Jennifer A.

    2016-01-01

    This study contributes a methodological tool to reconstruct the cognitive processes and mathematical activities carried out by mathematical modelers. Represented as Modeling Transition Diagrams (MTDs), individual modeling routes were constructed for four engineering undergraduate students. Findings stress the importance and limitations of using…

  7. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanical analysis. The applicability of the engineering integrity assessment system MASI for evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which limit its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of ductile failure behaviour of cracked structures. (author)
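For reference, the Gurson model in its widely used Gurson-Tvergaard-Needleman form couples the yield surface to the void volume fraction as

$$\Phi \;=\; \left(\frac{\sigma_{\mathrm{eq}}}{\sigma_{y}}\right)^{2} \;+\; 2\,q_{1} f^{*} \cosh\!\left(\frac{3\,q_{2}\,\sigma_{m}}{2\,\sigma_{y}}\right) \;-\; \bigl(1 + q_{3} {f^{*}}^{2}\bigr) \;=\; 0,$$

where σ_eq is the von Mises equivalent stress, σ_m the mean stress, σ_y the matrix flow stress, f* the effective void volume fraction and q1, q2, q3 fitting parameters. The transferability issue raised above concerns carrying the f*- and q-related parameters calibrated on tensile specimens over to cracked geometries.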

  8. MTK: An AI tool for model-based reasoning

    Science.gov (United States)

    Erickson, William K.; Schwartz, Mary R.

    1987-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control and trend analysis of the space station Thermal Management System (TMS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined, along with examples from the thermal system to highlight the motivating factors behind them. An overview of the capabilities of MTK is given.

  9. Port performance evaluation tool based on microsimulation model

    Directory of Open Access Journals (Sweden)

    Tsavalista Burhani Jzolanda

    2017-01-01

Full Text Available As port performance is becoming correlated with national competitiveness, the issue of port performance evaluation has gained significant attention. Port performance can be indicated simply by port service levels to ships (e.g., throughput, waiting time for berthing, etc.), as well as the utilization level of equipment and facilities within a certain period. The performance evaluation can then be used as a tool to develop related policies for improving the port's performance to be more effective and efficient. However, the evaluation is frequently conducted based on a deterministic approach, which hardly captures the natural variations of port parameters. Therefore, this paper presents a stochastic microsimulation model for investigating the impacts of port parameter variations on port performance. The variations are derived from actual data in order to provide more realistic results. The model is developed using MATLAB and Simulink based on queuing theory.
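As an illustration of the approach (not the authors' MATLAB/Simulink model), a minimal stochastic berth-queue simulation with sampled interarrival and service times might look like this; all rates are made-up placeholders.

```python
import heapq
import random

def simulate_port(n_ships=1000, berths=2, mean_interarrival=4.0,
                  mean_service=6.0, seed=1):
    """Stochastic berth queue: mean waiting time and rough berth utilization."""
    random.seed(seed)
    t = 0.0
    free = berths
    busy_until = []                    # min-heap of berth release times
    waits, busy_time = [], 0.0
    for _ in range(n_ships):
        t += random.expovariate(1.0 / mean_interarrival)  # next ship arrives
        while busy_until and busy_until[0] <= t:          # release finished berths
            heapq.heappop(busy_until)
            free += 1
        if free > 0:
            start = t
            free -= 1
        else:                          # wait for the earliest berth to free up
            start = heapq.heappop(busy_until)
        service = random.expovariate(1.0 / mean_service)
        heapq.heappush(busy_until, start + service)
        waits.append(start - t)
        busy_time += service
    return sum(waits) / len(waits), busy_time / (berths * t)

mean_wait, util = simulate_port()
print(f"mean wait: {mean_wait:.2f} h, berth utilization: {util:.0%}")
```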

  10. Isotopes as validation tools for global climate models

    International Nuclear Information System (INIS)

    Henderson-Sellers, A.

    2001-01-01

    Global Climate Models (GCMs) are the predominant tool with which we predict the future climate. In order that people can have confidence in such predictions, GCMs require validation. As almost every available item of meteorological data has been exploited in the construction and tuning of GCMs to date, independent validation is very difficult. This paper explores the use of isotopes as a novel and fully independent means of evaluating GCMs. The focus is the Amazon Basin which has a long history of isotope collection and analysis and also of climate modelling: both having been reported for over thirty years. Careful consideration of the results of GCM simulations of Amazonian deforestation and climate change suggests that the recent stable isotope record is more consistent with the predicted effects of greenhouse warming, possibly combined with forest removal, than with GCM predictions of the effects of deforestation alone

  11. Edge effect modeling of small tool polishing in planetary movement

    Science.gov (United States)

    Li, Qi-xin; Ma, Zhen; Jiang, Bo; Yao, Yong-sheng

    2018-03-01

As one of the most challenging problems in Computer Controlled Optical Surfacing (CCOS), the edge effect greatly affects polishing accuracy and efficiency. CCOS relies on a stable tool influence function (TIF); however, at the edge of the mirror surface, with the grinding head partly off the mirror, the contact area and pressure distribution change, resulting in a non-linear change of the TIF and leading to tilting or sagging at the edge of the mirror. In order to reduce these adverse effects and improve polishing accuracy and efficiency, in this paper we used finite element simulation to analyze the pressure distribution at the mirror edge and combined it with an improved traditional method to establish a new model. The new method fully considers the non-uniformity of the pressure distribution. After modeling the TIFs at different locations, the description and prediction of the edge effects are realized, which has positive significance for the control and suppression of edge effects
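The usual starting point for TIF modeling in CCOS, and the reason the edge pressure distribution matters, is Preston's equation, under which the local material removal rate scales with contact pressure and relative speed:

$$\frac{\partial z}{\partial t}(x,y) \;=\; \kappa \, P(x,y)\, V(x,y),$$

with Preston coefficient κ. When the tool overhangs the edge, P(x,y) redistributes over the reduced contact area, so the TIF deforms non-linearly exactly as described above. (Stated here as standard background; the paper's own formulation is not given in the abstract.)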

  12. Empirical flow parameters : a tool for hydraulic model validity

    Science.gov (United States)

    Asquith, William H.; Burley, Thomas E.; Cleveland, Theodore G.

    2013-01-01

The objectives of this project were: (1) to determine and present, from existing data in Texas, relations between observed stream flow, topographic slope, mean section velocity, and other hydraulic factors, to produce charts such as Figure 1, and to produce empirical distributions of the various flow parameters to provide a methodology to "check if model results are way off!"; (2) to produce a statistical regional tool to estimate mean velocity or other selected parameters for storm flows or other conditional discharges at ungauged locations (most bridge crossings) in Texas, to provide a secondary way to compare such values to a conventional hydraulic modeling approach; and (3) to present ancillary values such as Froude number, stream power, Rosgen channel classification, sinuosity, and other selected characteristics (readily determinable from existing data) to provide additional information to engineers concerned with the hydraulic-soil-foundation component of transportation infrastructure.
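Two of the ancillary quantities named have standard definitions that such a tool would evaluate from the same hydraulic data, namely the Froude number and the stream power per unit channel length:

$$\mathrm{Fr} \;=\; \frac{V}{\sqrt{g\,D}}, \qquad \Omega \;=\; \rho\, g\, Q\, S,$$

with V the mean section velocity, D the hydraulic depth, g gravitational acceleration, ρ water density, Q discharge and S the slope; a Froude number near or above 1 flags supercritical flow and is a quick sanity check on model output.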

  13. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed

  14. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements of cryogenics and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of various components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for achieving maximum liquefaction in the plant, considering the constraints of other parameters. The analysis results give a clear idea for deciding various parameter values before implementation of the actual plant in the field. They also give an idea about the productivity and profitability of the given plant configuration, which informs the design of an efficient, productive plant
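For a basic Linde-Hampson cycle, for instance, the liquid yield that such an optimization targets follows from an energy balance around the heat exchanger, throttling valve and separator:

$$y \;=\; \frac{h_{1} - h_{2}}{h_{1} - h_{f}},$$

where h1 is the enthalpy of the low-pressure gas leaving the heat exchanger, h2 the enthalpy of the compressed gas entering it, and hf the enthalpy of the saturated liquid. A process simulator supplies these state enthalpies from its property packages, so maximizing y reduces to tuning pressures and temperatures under the plant's constraints.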

  15. MODEL CAR TRANSPORT SYSTEM - MODERN ITS EDUCATION TOOL

    Directory of Open Access Journals (Sweden)

    Karel Bouchner

    2017-12-01

Full Text Available The model car transport system is a laboratory intended for practical development in the area of motor traffic. It is also an important education tool for students' hands-on training, enabling students to test the results of their own studies. The main part of the model car transportation network is a model at a scale of 1:87 (HO), based on component units of the FALLER Car system, e.g. cars, traffic lights, carriageways, parking spaces, stop sections, branch-off junctions, sensors and control sections. The model enables the simulation of real traffic situations. It includes motor traffic in a city and in a small village, and on a carriageway between a city and a village including a railway crossing. The traffic infrastructure includes different kinds of intersections, such as T-junctions, a classic four-way crossroad and a four-way traffic circle, with and without traffic light control. Another important part of the model is a segment of highway which includes an elevated crossing with highway approaches and exits.

  16. An artificial intelligence tool for complex age-depth models

    Science.gov (United States)

    Bradley, E.; Anderson, K. A.; de Vesine, L. R.; Lai, V.; Thomas, M.; Nelson, T. H.; Weiss, I.; White, J. W. C.

    2017-12-01

    CSciBox is an integrated software system for age modeling of paleoenvironmental records. It incorporates an array of data-processing and visualization facilities, ranging from 14C calibrations to sophisticated interpolation tools. Using CSciBox's GUI, a scientist can build custom analysis pipelines by composing these built-in components or adding new ones. Alternatively, she can employ CSciBox's automated reasoning engine, Hobbes, which uses AI techniques to perform an in-depth, autonomous exploration of the space of possible age-depth models and presents the results—both the models and the reasoning that was used in constructing and evaluating them—to the user for her inspection. Hobbes accomplishes this using a rulebase that captures the knowledge of expert geoscientists, which was collected over the course of more than 100 hours of interviews. It works by using these rules to generate arguments for and against different age-depth model choices for a given core. Given a marine-sediment record containing uncalibrated 14C dates, for instance, Hobbes tries CALIB-style calibrations using a choice of IntCal curves, with reservoir age correction values chosen from the 14CHRONO database using the lat/long information provided with the core, and finally composes the resulting age points into a full age model using different interpolation methods. It evaluates each model—e.g., looking for outliers or reversals—and uses that information to guide the next steps of its exploration, and presents the results to the user in human-readable form. The most powerful of CSciBox's built-in interpolation methods is BACON, a Bayesian sedimentation-rate algorithm—a powerful but complex tool that can be difficult to use. Hobbes adjusts BACON's many parameters autonomously to match the age model to the expectations of expert geoscientists, as captured in its rulebase. It then checks the model against the data and iteratively re-calculates until it is a good fit to the data.

  17. Watershed modeling tools and data for prognostic and diagnostic

    Science.gov (United States)

    Chambel-Leitao, P.; Brito, D.; Neves, R.

    2009-04-01

When eutrophication is considered an important process to control, this can be accomplished by reducing nitrogen and phosphorus losses from both point and nonpoint sources, and by assessing the effectiveness of the pollution reduction strategy. The HARP-NUT guidelines (Guidelines on Harmonized Quantification and Reporting Procedures for Nutrients) (Borgvang & Selvik, 2000) are presented by OSPAR as the best common quantification and reporting procedures for calculating the reduction of nutrient inputs. In 2000, OSPAR adopted the HARP-NUT guidelines on a trial basis. They were intended to serve as a tool for OSPAR Contracting Parties to report, in a harmonized manner, their different commitments, present or future, with regard to nutrients under the OSPAR Convention, in particular the "Strategy to Combat Eutrophication". The HARP-NUT Guidelines (Borgvang and Selvik, 2000; Schoumans, 2003) were developed to quantify and report on the individual sources of nitrogen and phosphorus discharges/losses to surface waters (Source Orientated Approach). These results can be compared to the total riverine nitrogen and phosphorus loads measured at downstream monitoring points (Load Orientated Approach), as load reconciliation. Nitrogen and phosphorus retention in river systems represents the connecting link between the "Source Orientated Approach" and the "Load Orientated Approach". Both approaches are necessary for verification purposes and both may be needed for providing the information required for the various commitments. Guidelines 2, 3, 4 and 5 are mainly concerned with source estimation. They present a set of simple calculations that allow the estimation of the origin of loads. Guideline 6 is a particular case where the application of a model is advised, in order to estimate the sources of nutrients from diffuse sources associated with land use/land cover. The model chosen for this was the SWAT model (Arnold & Fohrer, 2005), because it is suggested in guideline 6 and because it

  18. Slab2 - Updated Subduction Zone Geometries and Modeling Tools

    Science.gov (United States)

    Moore, G.; Hayes, G. P.; Portner, D. E.; Furtney, M.; Flamme, H. E.; Hearne, M. G.

    2017-12-01

The U.S. Geological Survey database of global subduction zone geometries (Slab1.0) is a highly utilized dataset that has been applied to a wide range of geophysical problems. In 2017, these models were improved and expanded upon as part of the Slab2 modeling effort. With a new data-driven approach that can be applied to a broader range of tectonic settings and geophysical data sets, we have generated a model set that will serve as a more comprehensive, reliable, and reproducible resource for three-dimensional slab geometries at all of the world's convergent margins. The newly developed framework of Slab2 is guided by: (1) a large integrated dataset, consisting of a variety of geophysical sources (e.g., earthquake hypocenters, moment tensors, active-source seismic survey images of the shallow slab, tomography models, receiver functions, bathymetry, trench ages, and sediment thickness information); (2) a dynamic filtering scheme aimed at constraining incorporated seismicity to only slab-related events; (3) a 3-D data interpolation approach which captures both high-resolution shallow geometries and instances of slab rollback and overlap at depth; and (4) an algorithm which incorporates uncertainties of contributing datasets to identify the most probable surface depth over the extent of each subduction zone. Further layers will also be added to the base geometry dataset, such as historic moment release, earthquake tectonic provenance, and interface coupling. Along with access to several queryable data formats, all components have been wrapped into an open-source library in Python, such that suites of updated models can be released as further data become available. This presentation will discuss the extent of Slab2 development, as well as the current availability of the model and modeling tools.

  19. Using Modeling Tools to Better Understand Permafrost Hydrology

    Directory of Open Access Journals (Sweden)

    Clément Fabre

    2017-06-01

Full Text Available Modification of the hydrological cycle and, subsequently, of other global cycles is expected in Arctic watersheds owing to global change. Future climate scenarios imply widespread permafrost degradation caused by an increase in air temperature, and the expected effect on permafrost hydrology is immense. This study aims at analyzing and quantifying the daily water transfer in the largest Arctic river system, the Yenisei River in central Siberia, Russia, partially underlain by permafrost. The semi-distributed SWAT (Soil and Water Assessment Tool) hydrological model has been calibrated and validated at a daily time step in historical discharge simulations for the 2003–2014 period. The model parameters have been adjusted to embrace the hydrological features of permafrost. SWAT is shown to be capable of estimating water fluxes at a daily time step, especially during unfrozen periods, once specific climatic and soil conditions adapted to a permafrost watershed are considered. The model simulates an average annual contribution to runoff of 263 millimeters per year (mm yr−1), distributed as 152 mm yr−1 (58%) of surface runoff, 103 mm yr−1 (39%) of lateral flow and 8 mm yr−1 (3%) of return flow from the aquifer. These results are integrated over a reduced basin area downstream from large dams and are closer to observations than previous modeling exercises.

  20. Using urban forest assessment tools to model bird habitat potential

    Science.gov (United States)

    Lerman, Susannah B.; Nislow, Keith H.; Nowak, David J.; DeStefano, Stephen; King, David I.; Jones-Farrand, D. Todd

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure the urban forest provides habitat to sustain bird and other wildlife populations. The primary goal of this study was to integrate wildlife suitability indices to an existing national urban forest assessment tool, i-Tree. We quantified available habitat characteristics of urban forests for ten northeastern U.S. cities, and summarized bird habitat relationships from the literature in terms of variables that were represented in the i-Tree datasets. With these data, we generated habitat suitability equations for nine bird species representing a range of life history traits and conservation status that predicts the habitat suitability based on i-Tree data. We applied these equations to the urban forest datasets to calculate the overall habitat suitability for each city and the habitat suitability for different types of land-use (e.g., residential, commercial, parkland) for each bird species. The proposed habitat models will help guide wildlife managers, urban planners, and landscape designers who require specific information such as desirable habitat conditions within an urban management project to help improve the suitability of urban forests for birds.

  1. A Simple Evacuation Modeling and Simulation Tool for First Responders

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Daniel B [ORNL; Payne, Patricia W [ORNL

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including most importantly time-to-evacuate, to help first responders choose the best course of action.

  2. WIFIRE Data Model and Catalog for Wildfire Data and Tools

    Science.gov (United States)

    Altintas, I.; Crawl, D.; Cowart, C.; Gupta, A.; Block, J.; de Callafon, R.

    2014-12-01

    The WIFIRE project (wifire.ucsd.edu) is building an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. WIFIRE may be used by wildfire management authorities in the future to predict wildfire rate of spread and direction, and assess the effectiveness of high-density sensor networks in improving fire and weather predictions. WIFIRE has created a data model for wildfire resources including sensed and archived data, sensors, satellites, cameras, modeling tools, workflows and social information including Twitter feeds. This data model and associated wildfire resource catalog includes a detailed description of the HPWREN sensor network, SDG&E's Mesonet, and NASA MODIS. In addition, the WIFIRE data-model describes how to integrate the data from multiple heterogeneous sources to provide detailed fire-related information. The data catalog describes 'Observables' captured by each instrument using multiple ontologies including OGC SensorML and NASA SWEET. Observables include measurements such as wind speed, air temperature, and relative humidity, as well as their accuracy and resolution. We have implemented a REST service for publishing to and querying from the catalog using Web Application Description Language (WADL). We are creating web-based user interfaces and mobile device Apps that use the REST interface for dissemination to wildfire modeling community and project partners covering academic, private, and government laboratories while generating value to emergency officials and the general public. Additionally, the Kepler scientific workflow system is instrumented to interact with this data catalog to access real-time streaming and archived wildfire data and stream it into dynamic data-driven wildfire models at scale.
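A catalog query through such a REST interface might look like the sketch below; the endpoint URL, resource path and parameter names are hypothetical placeholders for illustration, not the actual WIFIRE API.

```python
import requests

# Hypothetical catalog endpoint and query parameters -- illustrative only.
BASE = "https://example.org/wifire/catalog"

resp = requests.get(
    f"{BASE}/observables",
    params={
        "network": "HPWREN",
        "observable": "wind_speed",
        "bbox": "-117.5,32.5,-116.5,33.5",   # lon/lat box around San Diego
    },
    timeout=10,
)
resp.raise_for_status()
for sensor in resp.json():
    print(sensor["id"], sensor.get("resolution"), sensor.get("accuracy"))
```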

  3. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    Science.gov (United States)

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  4. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

To deal with the complexity of operating and supervising large-scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. Therefore the requirements on software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates and thus simplifies the use of third-party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change in the code can produce side effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...

  5. The Innsbruck/ESO sky models and telluric correction tools*

    Directory of Open Access Journals (Sweden)

    Kimeswenger S.

    2015-01-01

While ground-based astronomical observatories just have to correct for the line-of-sight integral of these effects, the Čerenkov telescopes use the atmosphere as the primary detector. The measured radiation originates at lower altitudes and does not pass through the entire atmosphere. Thus, a decent knowledge of the profile of the atmosphere at any time is required. The latter cannot be achieved by photometric measurements of stellar sources. We show here the capabilities of our sky background model and data reduction tools for ground-based optical/infrared telescopes. Furthermore, we discuss the feasibility of monitoring the atmosphere above any observing site, and thus the possible application of the method for Čerenkov telescopes.

  6. Development of hydrogeological modelling tools based on NAMMU

    International Nuclear Information System (INIS)

    Marsic, N.; Hartley, L.; Jackson, P.; Poole, M.; Morvik, A.

    2001-09-01

A number of relatively sophisticated hydrogeological models were developed within the SR 97 project to handle issues such as nesting of scales and the effects of salinity. However, these issues and others are considered of significant importance and generality to warrant further development of the hydrogeological methodology. Several such developments based on the NAMMU package are reported here:
- Embedded grid: nesting of the regional- and site-scale models within the same numerical model has given greater consistency in the structural model representation and in the flow between scales. Since there is a continuous representation of the regional- and site-scales, the modelling of pathways from the repository no longer has to be contained wholly by the site-scale region. This allows greater choice in the size of the site-scale.
- Implicit Fracture Zones (IFZ): this method of incorporating the structural model is very efficient and allows changes to either the mesh or fracture zones to be implemented quickly. It also supports great flexibility in the properties of the structures and rock mass.
- Stochastic fractures: new functionality has been added to IFZ to allow arbitrary combinations of stochastic or deterministic fracture zones with the rock mass. Whether a fracture zone is modelled deterministically or stochastically, its statistical properties can be defined independently.
- Stochastic modelling: efficient methods for Monte Carlo simulation of stochastic permeability fields have been implemented and tested on SKB's computers.
- Visualisation: the visualisation tool Avizier for NAMMU has been enhanced such that it is efficient for checking models and presentation.
- PROPER interface: NAMMU outputs pathlines in PROPER format so that they can be included in the PA workflow.
The developed methods are illustrated by application to stochastic nested modelling of the Beberg site using data from SR 97. The model properties were in accordance with the regional- and site

  7. Extending the Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2016-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power of the model for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be…

  8. Planning the network of gas pipelines through modeling tools

    Energy Technology Data Exchange (ETDEWEB)

    Sucupira, Marcos L.L.; Lutif Filho, Raimundo B. [Companhia de Gas do Ceara (CEGAS), Fortaleza, CE (Brazil)

    2009-07-01

Natural gas is a source of non-renewable energy used by different sectors of the economy of Ceara. It may be used in industrial, residential and commercial settings, as an automotive fuel, for co-generation of energy, and as a source for generating electricity from heat. Thanks to its practicality, this energy source has strong market acceptance and serves a broad list of clients, which makes it possible to reach diverse parts of the city. Its distribution requires a complex network of pipelines that branches throughout the city to reach all potential clients interested in this source of energy. To facilitate the design, analysis and expansion of the distribution network, and the location of bottlenecks and breaks in it, modeling software is used that allows the network manager to handle the various pieces of information about the network. This paper presents the advantages of modeling the natural gas distribution networks of gas companies in Ceara, showing the tool used, the steps necessary for the implementation of the models, the advantages of using the software and the findings obtained with its use. (author)

  9. Complex Coronary Hemodynamics - Simple Analog Modelling as an Educational Tool.

    Science.gov (United States)

    Parikh, Gaurav R; Peter, Elvis; Kakouros, Nikolaos

    2017-01-01

    Invasive coronary angiography remains the cornerstone for evaluation of coronary stenoses despite there being a poor correlation between luminal loss assessment by coronary luminography and myocardial ischemia. This is especially true for coronary lesions deemed moderate by visual assessment. Coronary pressure-derived fractional flow reserve (FFR) has emerged as the gold standard for the evaluation of hemodynamic significance of coronary artery stenosis, which is cost effective and leads to improved patient outcomes. There are, however, several limitations to the use of FFR including the evaluation of serial stenoses. In this article, we discuss the electronic-hydraulic analogy and the utility of simple electrical modelling to mimic the coronary circulation and coronary stenoses. We exemplify the effect of tandem coronary lesions on the FFR by modelling of a patient with sequential disease segments and complex anatomy. We believe that such computational modelling can serve as a powerful educational tool to help clinicians better understand the complexity of coronary hemodynamics and improve patient care.
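    The electronic-hydraulic analogy described above can be made concrete with a short sketch. In the following Python fragment (a simplified illustration under assumed values, not the authors' model), the coronary circulation is reduced to an Ohm's-law circuit in which tandem stenoses and the microvascular bed act as series resistances, so FFR follows as the ratio of distal to aortic pressure:

      # Simplified electrical analog of tandem coronary stenoses (illustrative
      # values only). Pressure ~ voltage, flow ~ current, stenosis ~ resistance.
      p_aortic = 100.0          # mean aortic pressure (mmHg), hypothetical
      r_stenosis_1 = 0.2        # proximal stenosis resistance (arbitrary units)
      r_stenosis_2 = 0.3        # distal stenosis resistance
      r_microvascular = 1.0     # hyperemic microvascular resistance

      # Ohm's law for the series circuit: flow = pressure / total resistance
      flow = p_aortic / (r_stenosis_1 + r_stenosis_2 + r_microvascular)

      # Distal pressure after both stenoses, then FFR = Pd / Pa
      p_distal = p_aortic - flow * (r_stenosis_1 + r_stenosis_2)
      ffr = p_distal / p_aortic
      print(f"FFR across the tandem lesions: {ffr:.2f}")  # ~0.67 here

    The sketch also shows why serial stenoses are hard to assess one at a time: each lesion's pressure drop depends on the flow set by the whole series circuit.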

  10. A crowdsourcing model for creating preclinical medical education study tools.

    Science.gov (United States)

    Bow, Hansen C; Dattilo, Jonathan R; Jonas, Andrea M; Lehmann, Christoph U

    2013-06-01

    During their preclinical course work, medical students must memorize and recall substantial amounts of information. Recent trends in medical education emphasize collaboration through team-based learning. In the technology world, the trend toward collaboration has been characterized by the crowdsourcing movement. In 2011, the authors developed an innovative approach to team-based learning that combined students' use of flashcards to master large volumes of content with a crowdsourcing model, using a simple informatics system to enable those students to share in the effort of generating concise, high-yield study materials. The authors used Google Drive and developed a simple Java software program that enabled students to simultaneously access and edit sets of questions and answers in the form of flashcards. Through this crowdsourcing model, medical students in the class of 2014 at the Johns Hopkins University School of Medicine created a database of over 16,000 questions that corresponded to the Genes to Society basic science curriculum. An analysis of exam scores revealed that students in the class of 2014 outperformed those in the class of 2013, who did not have access to the flashcard system, and a survey of students demonstrated that users were generally satisfied with the system and found it a valuable study tool. In this article, the authors describe the development and implementation of their crowdsourcing model for creating study materials, emphasize its simplicity and user-friendliness, describe its impact on students' exam performance, and discuss how students in any educational discipline could implement a similar model of collaborative learning.

  11. Tools and Models for Integrating Multiple Cellular Networks

    Energy Technology Data Exchange (ETDEWEB)

    Gerstein, Mark [Yale Univ., New Haven, CT (United States). Gerstein Lab.

    2015-11-06

    In this grant, we have systematically investigated the integrated networks that are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open-source, are available for download from GitHub, and can be incorporated in the Knowledgebase. Here, we summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we have developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network. The algorithm is generally applicable to regulatory networks in prokaryotes, yeast, and higher organisms. Integrated datasets are extremely beneficial for understanding the biology of a system in a compact manner because they conflate multiple layers of information. Therefore, for Aim 2 of this grant, we have developed several tools and carried out analyses for integrating system-wide genomic information. To make use of structural data, we have developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we have organized the E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies. We then correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4].

  12. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    Energy Technology Data Exchange (ETDEWEB)

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD, in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control, so that a user's decision inputs can impact the scenario outcome, as well as integrated security and role-based access control for communication.
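    The first of the two algorithms above, space-time cluster search, can be sketched in a few lines. The following Python fragment is a deliberately simplified illustration, not the project's algorithm; the grid, window sizes, baseline rate, and threshold are all hypothetical. It flags space-time windows whose case counts exceed what a Poisson baseline makes plausible:

      # Toy space-time cluster flagging (illustrative only). Each case is
      # (x, y, day); a window is flagged when its count is implausible under
      # a Poisson baseline.
      from itertools import product
      from math import exp

      def poisson_sf(k, lam):
          """P(X >= k) for X ~ Poisson(lam), by summing the complement."""
          term, cdf = exp(-lam), exp(-lam)
          for i in range(1, k):
              term *= lam / i
              cdf += term
          return 1.0 - cdf

      def flag_clusters(cases, grid=10, days=30, win_xy=2, win_t=7,
                        baseline_rate=0.5, alpha=1e-3):
          flagged = []
          for gx, gy, t0 in product(range(grid), range(grid), range(days)):
              count = sum(1 for (x, y, t) in cases
                          if abs(x - gx) <= win_xy and abs(y - gy) <= win_xy
                          and t0 <= t < t0 + win_t)
              if count > 0 and poisson_sf(count, baseline_rate) < alpha:
                  flagged.append((gx, gy, t0, count))
          return flagged

      # Example: a burst of cases near (3, 3) around day 10 should be flagged.
      cases = [(3, 3, 10), (3, 4, 11), (2, 3, 12), (3, 3, 12), (4, 3, 13)]
      print(flag_clusters(cases)[:3])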

  13. Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models

    Energy Technology Data Exchange (ETDEWEB)

    Diakov, Victor [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cole, Wesley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-11-01

    Capacity expansion models (CEM) provide a high-level, long-term view of the prospects of the evolving power system. In simulating the possibilities of long-term capacity expansion, it is important to maintain the viability of power system operation on short-term (daily, hourly, and sub-hourly) scales. Production cost models (PCM) simulate routine power system operation on these shorter time scales using detailed load, transmission, and generation fleet data, minimizing production costs while meeting reliability requirements. When based on CEM 'predictions' about generating unit retirements and buildup, PCM provide a more detailed simulation of short-term system operation and, consequently, may confirm the validity of the capacity expansion predictions. Further, production cost model simulations of a system based on a capacity expansion model solution are 'evolutionarily' sound: the generator mix is the result of a logical sequence of unit retirements and buildup resulting from policy and incentives. The above has motivated us to bridge CEM with PCM by building a capacity expansion - to - production cost model Linking Tool (CEPCoLT). The Linking Tool is built to map capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally-defined ReEDS scenarios.

  14. An Integrated Package of Neuromusculoskeletal Modeling Tools in Simulink (TM)

    National Research Council Canada - National Science Library

    Davoodi, R

    2001-01-01

    .... Blocks representing the skeletal linkage, sensors, muscles, and neural controllers are developed using separate software tools and integrated in the powerful simulation environment of Simulink (Mathworks Inc., USA...

  15. Analytical Modeling Tool for Design of Hydrocarbon Sensitive Optical Fibers

    Directory of Open Access Journals (Sweden)

    Khalil Al Handawi

    2017-09-01

    Pipelines are the main means of transporting oil and gas products across large distances. Due to the severe conditions they operate in, they are regularly inspected for corrosion damage using conventional Pipeline Inspection Gages (PIGs). The motivation for researching a real-time distributed monitoring solution arose to mitigate costs and provide a proactive indication of potential failures. Fiber optic sensors with polymer claddings provide a means of detecting contact with hydrocarbons. By coating the fibers with a layer of metal similar in composition to that of the parent pipeline, corrosion of this coating may be detected when the polymer cladding underneath is exposed to the surrounding hydrocarbons contained within the pipeline. A Refractive Index (RI) change occurs in the polymer cladding, causing a loss in intensity of a traveling light pulse due to a reduction in the fiber's modal capacity. Intensity losses may be detected using Optical Time Domain Reflectometry (OTDR) while pinpointing the spatial location of the contact via time delay calculations of the back-scattered pulses. This work presents a theoretical model for the above sensing solution to provide a design tool for the fiber optic cable in the context of hydrocarbon sensing following corrosion of an external metal coating. Results are verified against experimental data published in the literature.
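    The time-delay localization mentioned above follows directly from the pulse round-trip time. The sketch below is an illustrative calculation, not the paper's model; the group index value is a typical assumption for silica fiber. It converts an OTDR round-trip delay into the distance along the fiber at which the intensity loss occurs:

      # Locating an OTDR-detected loss event from the round-trip delay of the
      # back-scattered pulse (illustrative values).
      C_VACUUM = 299_792_458.0   # speed of light (m/s)
      N_GROUP = 1.468            # typical group index of silica fiber (assumed)

      def event_distance(round_trip_s):
          """Distance to the loss event: the pulse travels out and back,
          hence the factor of 2 in the denominator."""
          return C_VACUUM * round_trip_s / (2.0 * N_GROUP)

      # A back-scattered pulse arriving 98 microseconds after launch
      # corresponds to an event roughly 10 km along the fiber.
      print(f"{event_distance(98e-6) / 1000:.2f} km")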

  16. The Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2015-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be targeted for…

  17. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    Science.gov (United States)

    2011-12-01

    In this FHWA-sponsored pooled-fund study, a set of decision-making tools based on the Analytic Hierarchy Process (AHP) was developed. This tool set is intended for transportation specialists and decision-makers to determine if ABC is more effective ...

  18. Response Surface Modeling Tool Suite, Version 1.x

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-05

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
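    The first two steps of the "Automated RSM" workflow, space-filling sampling followed by a Gaussian-process fit, can be sketched compactly. The fragment below is a minimal stand-in, not the tool itself: it uses SciPy's Latin Hypercube sampler and scikit-learn's Gaussian process regressor (both assumptions about tooling), and a toy two-parameter function in place of a TPMC drag simulation:

      # Sketch of the LHS + Gaussian-process steps of an RSM workflow
      # (illustrative; a toy function stands in for the TPMC drag simulation).
      import numpy as np
      from scipy.stats import qmc
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def toy_drag_coefficient(x):
          """Stand-in for a TPMC run: x = (speed ratio, temperature ratio)."""
          return 2.2 + 0.4 * np.exp(-x[:, 0]) + 0.1 * x[:, 1]

      # Step 1: Latin Hypercube Sample of the parameter space (2 dims here,
      # 1,000 ensemble members as in the abstract).
      sampler = qmc.LatinHypercube(d=2, seed=0)
      X = qmc.scale(sampler.random(n=1000), [0.5, 0.1], [10.0, 2.0])
      y = toy_drag_coefficient(X)

      # Step 2: fit a Gaussian process as the empirical response surface.
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                    normalize_y=True).fit(X, y)

      # Interpolate the drag coefficient at a new parameter point.
      mean, std = gp.predict(np.array([[4.0, 1.0]]), return_std=True)
      print(f"Cd = {mean[0]:.3f} +/- {std[0]:.3f}")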

  19. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    Progress report for the qualification exam of PhD student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM, and the concept of unifying an abstract syntax tree with the ability for isolated extensions, is described. The tool support includes a connection to UML and a test automation principle based on traces written as a kind of regular expression.

  20. Use of System Dynamics Techniques in the Garrison Health Modelling Tool

    Science.gov (United States)

    2010-11-01

    Joint Health Command (JHC) tasked DSTO to develop techniques for modelling Defence health service delivery in a Garrison environment in Australia. This report describes the use of system dynamics techniques in the Garrison Health Modelling Tool, a prototype software package designed to provide decision-support to JHC health officers and managers in a garrison environment.

  1. Hypersonic Control Modeling and Simulation Tool for Lifting Towed Ballutes, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Aerospace Corporation proposes to develop a hypersonic control modeling and simulation tool for hypersonic aeroassist vehicles. Our control and simulation...

  2. A Modeling Tool for Household Biogas Burner Flame Port Design

    Science.gov (United States)

    Decker, Thomas J.

    Anaerobic digestion is a well-known and potentially beneficial process for rural communities in emerging markets, providing the opportunity to generate usable gaseous fuel from agricultural waste. With recent developments in low-cost digestion technology, communities across the world are gaining affordable access to the benefits of anaerobic digestion derived biogas. For example, biogas can displace conventional cooking fuels such as biomass (wood, charcoal, dung) and Liquefied Petroleum Gas (LPG), effectively reducing harmful emissions and fuel cost respectively. To support the ongoing scaling effort of biogas in rural communities, this study has developed and tested a design tool aimed at optimizing flame port geometry for household biogas-fired burners. The tool consists of a multi-component simulation that incorporates three-dimensional CAD designs with simulated chemical kinetics and computational fluid dynamics. An array of circular and rectangular port designs was developed for a widely available biogas stove (called the Lotus) as part of this study. These port designs were created through guidance from previous studies found in the literature. The three highest performing designs identified by the tool were manufactured and tested experimentally to validate tool output and to compare against the original port geometry. The experimental results aligned with the tool's prediction for the three chosen designs. Each design demonstrated improved thermal efficiency relative to the original, with one configuration of circular ports exhibiting superior performance. The results of the study indicated that designing for a targeted range of port hydraulic diameter, velocity and mixture density in the tool is a relevant way to improve the thermal efficiency of a biogas burner. Conversely, the emissions predictions made by the tool were found to be unreliable and incongruent with laboratory experiments.
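    The port-geometry quantities that the tool targets, hydraulic diameter and port exit velocity, are simple to compute. The sketch below is an illustrative calculation with assumed numbers, not the study's CFD and chemical-kinetics tool; it evaluates both quantities for a hypothetical rectangular flame port:

      # Hydraulic diameter and exit velocity for a rectangular burner flame
      # port (illustrative numbers; not the study's simulation tool).
      def hydraulic_diameter(width_m, height_m):
          """D_h = 4 * area / wetted perimeter."""
          area = width_m * height_m
          perimeter = 2.0 * (width_m + height_m)
          return 4.0 * area / perimeter

      def port_exit_velocity(total_flow_m3s, n_ports, width_m, height_m):
          """Mean velocity through each port for a given total mixture flow."""
          return total_flow_m3s / (n_ports * width_m * height_m)

      # Hypothetical design: 30 ports of 2 mm x 4 mm, 0.6 L/s of biogas-air mix.
      d_h = hydraulic_diameter(0.002, 0.004)
      v = port_exit_velocity(0.6e-3, 30, 0.002, 0.004)
      print(f"D_h = {d_h * 1000:.2f} mm, v = {v:.2f} m/s")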

  3. SBML qualitative models: a model representation format and infrastructure to foster interactions between qualitative modelling formalisms and tools.

    Science.gov (United States)

    Chaouiya, Claudine; Bérenguier, Duncan; Keating, Sarah M; Naldi, Aurélien; van Iersel, Martijn P; Rodriguez, Nicolas; Dräger, Andreas; Büchel, Finja; Cokelaer, Thomas; Kowal, Bryan; Wicks, Benjamin; Gonçalves, Emanuel; Dorier, Julien; Page, Michel; Monteiro, Pedro T; von Kamp, Axel; Xenarios, Ioannis; de Jong, Hidde; Hucka, Michael; Klamt, Steffen; Thieffry, Denis; Le Novère, Nicolas; Saez-Rodriguez, Julio; Helikar, Tomáš

    2013-12-10

    Qualitative frameworks, especially those based on the logical discrete formalism, are increasingly used to model regulatory and signalling networks. A major advantage of these frameworks is that they do not require precise quantitative data, and that they are well-suited for studies of large networks. While numerous groups have developed specific computational tools that provide original methods to analyse qualitative models, a standard format to exchange qualitative models has been missing. We present the Systems Biology Markup Language (SBML) Qualitative Models Package ("qual"), an extension of the SBML Level 3 standard designed for computer representation of qualitative models of biological networks. We demonstrate the interoperability of models via SBML qual through the analysis of a specific signalling network by three independent software tools. Furthermore, the collective effort to define the SBML qual format paved the way for the development of LogicalModel, an open-source model library, which will facilitate the adoption of the format as well as the collaborative development of algorithms to analyse qualitative models. SBML qual allows the exchange of qualitative models among a number of complementary software tools. SBML qual has the potential to promote collaborative work on the development of novel computational approaches, as well as on the specification and the analysis of comprehensive qualitative models of regulatory and signalling networks.

  4. Modelling Machine Tools using Structure Integrated Sensors for Fast Calibration

    Directory of Open Access Journals (Sweden)

    Benjamin Montavon

    2018-02-01

    Monitoring of the relative deviation between commanded and actual tool tip position, which limits the volumetric performance of the machine tool, enables the use of contemporary methods of compensation to reduce tolerance mismatch and the uncertainties of on-machine measurements. The development of a primarily optical sensor setup capable of being integrated into the machine structure without limiting its operating range is presented. The use of a frequency-modulating interferometer and photosensitive arrays in combination with a Gaussian laser beam allows for fast and automated online measurements of the axes' motion errors and thermal conditions, with comparable accuracy, lower cost, and smaller dimensions compared to state-of-the-art optical measuring instruments for offline machine tool calibration. The development is tested through simulation of the sensor setup based on raytracing and Monte-Carlo techniques.

  5. Tool-Body Assimilation Model Based on Body Babbling and Neurodynamical System

    Directory of Open Access Journals (Sweden)

    Kuniyuki Takahashi

    2015-01-01

    We propose a new method of tool use based on a tool-body assimilation model that relies on body babbling and a neurodynamical system, enabling robots to use tools. Almost all existing studies of robot tool use require predetermined motions and tool features; the motion patterns are limited and the robots cannot use novel tools. Other studies fully search all available parameters for novel tools, but this leads to massive amounts of calculation. To solve these problems, we took the following approach: we used a humanoid robot model to generate random motions based on human body babbling. These rich motion experiences were used to train recurrent and deep neural networks for modeling a body image. Tool features were self-organized in parametric bias, modulating the body image according to the tool in use. Finally, we designed a neural network for the robot to generate motion only from the target image. Experiments were conducted with multiple tools for manipulating a cylindrical target object. The results show that the tool-body assimilation model is capable of motion generation.

  6. Using urban forest assessment tools to model bird habitat potential

    Science.gov (United States)

    Susannah B. Lerman; Keith H. Nislow; David J. Nowak; Stephen DeStefano; David I. King; D. Todd. Jones-Farrand

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure the urban forest provides habitat...

  7. Scenario Evaluator for Electrical Resistivity Survey Pre-modeling Tool

    Science.gov (United States)

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, su...

  8. MOVES - A tool for Modeling and Verification of Embedded Systems

    DEFF Research Database (Denmark)

    Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid

    2007-01-01

    We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks onto processing elements, the number and/or speed of processing elements, the size of local memories, and the operating systems (scheduling algorithms).

  9. An Energy Systems Modelling Tool for the Social Simulation Community

    NARCIS (Netherlands)

    Bollinger, L. Andrew; van Blijswijk, Martti J.; Dijkema, Gerard P.J.; Nikolic, Igor

    2016-01-01

    The growing importance of links between the social and technical dimensions of the electricity infrastructure means that many research problems cannot be effectively addressed without joint consideration of social and technical dynamics. This paper motivates the need for and introduces a tool to support such joint social-technical analyses.

  10. Advanced Computing Tools and Models for Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  11. Advanced Computing Tools and Models for Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, Robert; Ryne, Robert D.

    2008-01-01

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics

  12. GEOQUIMICO : an interactive tool for comparing sorption conceptual models (surface complexation modeling versus K[D])

    International Nuclear Information System (INIS)

    Hammond, Glenn E.; Cygan, Randall Timothy

    2007-01-01

    Within reactive geochemical transport, several conceptual models exist for simulating sorption processes in the subsurface. Historically, the K_D approach has been the method of choice due to its ease of implementation within a reactive transport model and straightforward comparison with experimental data. However, for modeling complex sorption phenomena (e.g. sorption of radionuclides onto mineral surfaces), this approach does not systematically account for variations in location, time, or chemical conditions, and more sophisticated methods such as a surface complexation model (SCM) must be utilized. It is critical to determine which conceptual model to use, that is, when such variation becomes important to regulatory decisions. The geochemical transport tool GEOQUIMICO has been developed to assist in this decision-making process. GEOQUIMICO provides a user-friendly framework for comparing the accuracy and performance of sorption conceptual models. The tool currently supports the K_D and SCM conceptual models. The code is written in the object-oriented Java programming language to facilitate model development and improve code portability. The basic theory underlying geochemical transport and the sorption conceptual models noted above is presented in this report. Explanations are provided of how these physicochemical processes are implemented in GEOQUIMICO, and a brief verification study comparing GEOQUIMICO results to data found in the literature is given
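    The difference in bookkeeping between the two conceptual models can be seen from the K_D case, which reduces to a single linear relationship. The sketch below illustrates the standard K_D formulation with hypothetical parameter values; it is not GEOQUIMICO's Java code. It computes the sorbed concentration S = K_D * C and the retardation factor used in transport:

      # Linear (K_D) sorption model: S = K_D * C, plus the retardation factor
      # R = 1 + (rho_b / theta) * K_D (standard formulation; values are
      # illustrative).
      def sorbed_concentration(kd_L_per_kg, c_aq_mg_per_L):
          """Sorbed mass per unit solid mass (mg/kg)."""
          return kd_L_per_kg * c_aq_mg_per_L

      def retardation_factor(kd_L_per_kg, bulk_density_kg_per_L, porosity):
          """Factor by which solute transport is slowed relative to water."""
          return 1.0 + (bulk_density_kg_per_L / porosity) * kd_L_per_kg

      kd = 5.0  # L/kg, hypothetical radionuclide/mineral pair
      print(sorbed_concentration(kd, 0.2))     # 1.0 mg/kg
      print(retardation_factor(kd, 1.6, 0.3))  # ~27.7

    An SCM, by contrast, recomputes the equivalent partitioning at every location and time from the local chemistry, which is exactly the variation the K_D constant cannot capture.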

  13. Software tools for object-based audio production using the Audio Definition Model

    OpenAIRE

    Geier, Matthias; Carpentier, Thibaut; Noisternig, Markus; Warusfel, Olivier

    2017-01-01

    We present a publicly available set of tools for the integration of the Audio Definition Model (ADM) in production workflows. ADM is an open metadata model for the description of channel-, scene-, and object-based media within a Broadcast Wave Format (BWF) container. The software tools were developed within the European research project ORPHEUS (https://orpheus-audio.eu/) that aims at developing new end-to-end object-based media chains for broadcast. These tools allow ...

  14. A unified tool for performance modelling and prediction

    International Nuclear Information System (INIS)

    Gilmore, Stephen; Kloul, Leila

    2005-01-01

    We describe a novel performability modelling approach, which facilitates the efficient solution of performance models extracted from high-level descriptions of systems. The notation which we use for our high-level designs is the Unified Modelling Language (UML) graphical modelling language. The technology which provides the efficient representation capability for the underlying performance model is the multi-terminal binary decision diagram (MTBDD)-based PRISM probabilistic model checker. The UML models are compiled through an intermediate language, the stochastic process algebra PEPA, before translation into MTBDDs for solution. We illustrate our approach on a real-world analysis problem from the domain of mobile telephony

  15. Implementing the Mother-Baby Model of Nursing Care Using Models and Quality Improvement Tools.

    Science.gov (United States)

    Brockman, Vicki

    As family-centered care has become the expected standard, many facilities follow the mother-baby model, in which care is provided to both a woman and her newborn in the same room by the same nurse. My facility employed a traditional model of nursing care, which was not evidence-based or financially sustainable. After implementing the mother-baby model, we experienced an increase in exclusive breastfeeding rates at hospital discharge, increased patient satisfaction, improved staff productivity and decreased salary costs, all while the number of births increased. Our change was successful because it was guided by the use of quality improvement tools, change theory and evidence-based practice models. © 2015 AWHONN.

  16. influence.ME: tools for detecting influential data in mixed effects models

    NARCIS (Netherlands)

    Nieuwenhuis, Rense; te Grotenhuis, M.; Pelzer, B.

    2012-01-01

    influence.ME provides tools for detecting influential data in mixed effects models. The application of these models has become common practice, but the development of diagnostic tools has lagged behind. influence.ME calculates standardized measures of influential data for the point estimates of generalized mixed effects models.

  17. A Decision Support Model and Tool to Assist Financial Decision-Making in Universities

    Science.gov (United States)

    Bhayat, Imtiaz; Manuguerra, Maurizio; Baldock, Clive

    2015-01-01

    In this paper, a model and tool is proposed to assist universities and other mission-based organisations to ascertain systematically the optimal portfolio of projects, in any year, meeting the organisation's risk tolerances and available funds. The model and tool presented build on previous work on university operations and decision support systems…

  18. Formal Development of a Tool for Automated Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Kjær, Andreas A.; Le Bliguet, Marie

    2011-01-01

    This paper describes a tool for formal modelling of relay interlocking systems and explains how it has been stepwise, formally developed using the RAISE method. The developed tool takes the circuit diagrams of a relay interlocking system as input and gives as its result a state transition system modelling the behaviour of the interlocking system.

  19. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    The amount of data available on the Internet is continuously increasing, and consequently there is a growing need for tools that help to analyse the data. Testing the consistency of data received from different sources is made difficult by the number of different languages and schemas being used. In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the user connects them by inserting modeling…; an important part of this technique is the attaching of OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration…

  20. Solid-state-drives (SSDs) modeling simulation tools & strategies

    CERN Document Server

    2017-01-01

    This book introduces simulation tools and strategies for complex systems of solid-state drives (SSDs), which consist of a flash multi-core microcontroller plus NAND flash memories. It provides a broad overview of the most popular simulation tools, with special focus on open-source solutions. VSSIM, NANDFlashSim and DiskSim are benchmarked against the performance of real SSDs under different traffic workloads. The pros and cons of each simulator are analyzed, and it is clearly indicated which kind of answers each of them can give and at what price. It is explained that speed and precision do not go hand in hand, and that it is important to understand when to simulate what, and with which tool. Being able to simulate SSD performance is mandatory to meet time-to-market targets, together with product cost and quality. Over the last few years the authors developed an advanced simulator named "SSDExplorer", which has been used to evaluate multiple phenomena with great accuracy, from QoS (Quality Of Service) to Read Retry, fr...

  1. Tools and data for the geochemical modeling. Thermodynamic data for sulfur species and background salts and tools for the uncertainty analysis; WEDA. Werkzeuge und Daten fuer die Geochemische Modellierung. Thermodynamische Daten fuer Schwefelspezies und Hintergrundsalze sowie Tools zur Unsicherheitsanalyse

    Energy Technology Data Exchange (ETDEWEB)

    Hagemann, Sven; Schoenwiese, Dagmar; Scharge, Tina

    2015-07-15

    The report on tools and data for the geochemical modeling covers the following issues: experimental methods and theoretical models, design of a thermodynamic model for reduced sulfur species, thermodynamic models for background salts, tools for the uncertainty and sensitivity analyses of geochemical equilibrium modeling.

  2. Prediction models : the right tool for the right problem

    NARCIS (Netherlands)

    Kappen, Teus H.; Peelen, Linda M.

    2016-01-01

    PURPOSE OF REVIEW: Perioperative prediction models can help to improve personalized patient care by providing individual risk predictions to both patients and providers. However, the scientific literature on prediction model development and validation can be quite technical and challenging to understand.

  3. The Quantum Atomic Model "Electronium": A Successful Teaching Tool.

    Science.gov (United States)

    Budde, Marion; Niedderer, Hans; Scott, Philip; Leach, John

    2002-01-01

    Focuses on the quantum atomic model Electronium. Outlines the Bremen teaching approach in which this model is used, and analyzes the learning of two students as they progress through the teaching unit. (Author/MM)

  4. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling a simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behavior...

  5. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    Science.gov (United States)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel material having hardness above 45 HRC (Hardness-Rockwell C) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance in hard turning. Various models in hard turning by cubic boron nitride tool have been reviewed, in an attempt to utilize appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces due to oblique cutting theory, the extended Lee and Shaffer force model, chip formation, and progressive flank wear are depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force based on the different cutting conditions and tool geometries, so that an appropriate model can be used according to user requirements in hard turning.
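    Of the models listed above, Usui's wear characteristic equation is compact enough to state directly. In its commonly cited form (reproduced here as a reference point with the symbols usually used in the literature, not as taken from any one of the reviewed papers), the wear rate depends on the contact stress, the sliding velocity, and the absolute temperature at the tool face:

      % Usui's wear characteristic equation (commonly cited form)
      \frac{dW}{dt} = A \, \sigma_t \, V_s \, \exp\!\left(-\frac{B}{T}\right)

    where dW/dt is the wear rate per unit contact area, sigma_t the normal contact stress, V_s the sliding velocity, T the absolute tool-face temperature, and A and B experimentally fitted constants.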

  6. Gsflow-py: An integrated hydrologic model development tool

    Science.gov (United States)

    Gardner, M.; Niswonger, R. G.; Morton, C.; Henson, W.; Huntington, J. L.

    2017-12-01

    Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHM) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorologic parameters over the model domain is difficult in complex terrain at the model resolution scale, but is necessary to drive realistic simulations. Historically, development of input data for IHM models has required extensive GIS and computer programming expertise which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverages, and meteorological distribution over the model domain.
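    One of the laborious steps the toolkit automates, deriving stream-network topology consistent with the model-resolution DEM, rests on a flow-direction calculation. The sketch below is a generic D8 flow-direction routine for illustration; it is not code from Gsflow-py. It assigns each DEM cell to its steepest downslope neighbor, the primitive from which cascade routing is built:

      # Generic D8 flow-direction sketch (not Gsflow-py code): each cell drains
      # to the neighbor with the steepest downhill slope.
      import numpy as np

      # Eight neighbor offsets; diagonal neighbors are sqrt(2) cells away.
      OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                 (0, 1), (1, -1), (1, 0), (1, 1)]

      def d8_flow_direction(dem, cell_size=1.0):
          rows, cols = dem.shape
          direction = np.full((rows, cols), -1)  # -1 = pit or edge outflow
          for r in range(rows):
              for c in range(cols):
                  best_slope, best_k = 0.0, -1
                  for k, (dr, dc) in enumerate(OFFSETS):
                      rr, cc = r + dr, c + dc
                      if 0 <= rr < rows and 0 <= cc < cols:
                          dist = cell_size * (2 ** 0.5 if dr and dc else 1.0)
                          slope = (dem[r, c] - dem[rr, cc]) / dist
                          if slope > best_slope:
                              best_slope, best_k = slope, k
                  direction[r, c] = best_k
          return direction

      dem = np.array([[5.0, 4.0, 3.0],
                      [4.0, 3.0, 2.0],
                      [3.0, 2.0, 1.0]])
      print(d8_flow_direction(dem))  # everything drains toward the low corner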

  7. Process models as tools in forestry research and management

    Science.gov (United States)

    Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox

    2001-01-01

    Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...

  8. Regional models - Emerging research tools for synoptic meteorologists

    Science.gov (United States)

    Keyser, Daniel; Uccellini, Louis W.

    1987-01-01

    A number of regional-scale numerical weather prediction models are discussed together with their application to the study of the structure and the dynamics of mesoscale phenomena. Consideration is given to investigations of natural phenomena (such as midlatitude cyclones and related baroclinic disturbances; upper-level jet-front systems; surface frontal zones, squall lines, and rain bands; mesoscale convective systems; and severe-storm environments) in which two operational models and four research models are used for regional-model studies. It is shown that these models provide investigators with four-dimensional dynamically consistent data sets to supplement and extend those available from observations.

  9. DiVinE-CUDA - A Tool for GPU Accelerated LTL Model Checking

    Directory of Open Access Journals (Sweden)

    Jiří Barnat

    2009-12-01

    In this paper we present a tool that performs CUDA-accelerated LTL model checking. The tool exploits the parallel algorithm MAP, adjusted to the NVIDIA CUDA architecture, in order to efficiently detect the presence of accepting cycles in a directed graph. Accepting cycle detection is the core algorithmic procedure in automata-based LTL model checking. We demonstrate that the tool outperforms the non-accelerated version of the algorithm, and we discuss where the limits of the tool are and what we intend to do in the future to avoid them.
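    The core procedure named above, accepting cycle detection via MAP (maximal accepting predecessors), can be sketched in its first phase. The Python fragment below is a simplified sequential illustration, not the CUDA kernel: it propagates each vertex's maximal accepting predecessor to a fixpoint and reports accepting vertices that are their own maximal accepting predecessor; the full algorithm iterates with re-ranked accepting states when this phase is inconclusive:

      # First phase of a MAP-style accepting-cycle check (illustration only,
      # not the DiVinE-CUDA kernel). Vertices are integers; 'accepting' marks
      # the Buechi accepting states.
      def map_first_phase(n_vertices, edges, accepting):
          NONE = -1
          map_val = [NONE] * n_vertices
          changed = True
          while changed:  # fixpoint propagation along all edges
              changed = False
              for u, v in edges:
                  candidate = max(map_val[u], u if u in accepting else NONE)
                  if candidate > map_val[v]:
                      map_val[v] = candidate
                      changed = True
          # An accepting vertex that is its own maximal accepting predecessor
          # lies on an accepting cycle.
          return [v for v in accepting if map_val[v] == v]

      # 0 -> 1 -> 2 -> 1 with vertex 1 accepting: 1 reaches itself.
      edges = [(0, 1), (1, 2), (2, 1)]
      print(map_first_phase(3, edges, accepting={1}))  # [1]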

  10. WeedML: a Tool for Collaborative Weed Demographic Modeling

    OpenAIRE

    Holst, Niels

    2010-01-01

    WeedML is a proposed standard to formulate models of weed demography, or maybe even complex models in general, that are both transparent and straightforward to re-use as building blocks for new models. The paper describes the design and thoughts behind WeedML which relies on XML and object-oriented systems development. Proof-of-concept software is provided as open-source C++ code and executables that can be downloaded freely.

  11. Ecological Modeling: A Tool for the Urban Educator.

    Science.gov (United States)

    Spikes, Frank

    Ecological modeling is a holistic systems level approach to situational analysis which can be used in planning activities for lifelong learning in an urban setting. It is the purpose of this essay to present a discussion of ecological modeling in its pure or conceptual sense and concomitantly to translate this analysis into an effective and…

  12. simulation tools for electrical machines modelling: teaching and ...

    African Journals Online (AJOL)

    Dr Obe

    used to model non-linearities in the synchronous machine. The machine is modeled in ... Those involved in undergraduate engineering education on electrical machines will find the script very useful in terms of ... Keywords: Asynchronous machine; MATLAB scripts; engineering education; skin-effect; saturation effect; dynamic ...

  13. KENO3D Visualization Tool for KENO V.a and KENO-VI Geometry Models

    International Nuclear Information System (INIS)

    Horwedel, J.E.; Bowman, S.M.

    2000-01-01

    Criticality safety analyses often require detailed modeling of complex geometries. Effective visualization tools can enhance checking the accuracy of these models. This report describes the KENO3D visualization tool developed at the Oak Ridge National Laboratory (ORNL) to provide visualization of KENO V.a and KENO-VI criticality safety models. The development of KENO3D is part of the current efforts to enhance the SCALE (Standardized Computer Analyses for Licensing Evaluations) computer software system

  14. Steam Generator Analysis Tools and Modeling of Degradation Mechanisms

    International Nuclear Information System (INIS)

    Yetisir, M.; Pietralik, J.; Tapping, R.L.

    2004-01-01

    The degradation of steam generators (SGs) has a significant effect on nuclear heat transport system effectiveness and on the lifetime and overall efficiency of a nuclear power plant. Hence, quantification of the effects of degradation mechanisms is an integral part of a SG degradation management strategy. Numerical analysis tools such as THIRST, a 3-dimensional (3D) thermal hydraulics code for recirculating SGs; SLUDGE, a 3D sludge prediction code; CHECWORKS, a flow-accelerated corrosion prediction code for nuclear piping; PIPO-FE, a SG tube vibration code; and VIBIC and H3DMAP, 3D non-linear finite-element codes to predict SG tube fretting wear, can be used to assess the impacts of various maintenance activities on SG thermal performance. These tools are also found to be invaluable at the design stage to influence the design, by determining margins or by helping the designers minimize or avoid known degradation mechanisms. In this paper, the aforementioned numerical tools and their application to degradation mechanisms in CANDU recirculating SGs are described. In addition, the following degradation mechanisms are identified and their effects on SG thermal efficiency and lifetime are quantified: primary-side fouling, secondary-side fouling, fretting wear, and flow-accelerated corrosion (FAC). Primary-side tube inner-diameter fouling has been a major contributor to SG thermal degradation. Using the results of thermalhydraulic analysis and field data, fouling margins are calculated. The individual effects of primary- and secondary-side fouling are separated through analyses, which allows station operators to decide what type of maintenance activity to perform and when to perform it. Prediction of the fretting-wear rate of tubes allows designers to decide on the number and locations of support plates and U-bend supports. The prediction of FAC rates for SG internals allows designers to select proper materials, and allows operators to adjust SG maintenance activities.

  15. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required, and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python with a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia was selected and all three models were tested by using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
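    Of the three BSA techniques in the tool, the frequency ratio model is the simplest to state: for each class of a conditioning factor, it is the share of hazard pixels falling in that class divided by the share of all pixels in that class. A minimal sketch (illustrative counts only, not the BSM Python source) is:

      # Frequency ratio (FR) per factor class: FR = (% of hazard pixels in the
      # class) / (% of all pixels in the class). Counts are made up.
      def frequency_ratios(class_pixels, class_hazard_pixels):
          total_pixels = sum(class_pixels.values())
          total_hazard = sum(class_hazard_pixels.values())
          fr = {}
          for cls in class_pixels:
              pct_hazard = class_hazard_pixels.get(cls, 0) / total_hazard
              pct_class = class_pixels[cls] / total_pixels
              fr[cls] = pct_hazard / pct_class
          return fr

      # Hypothetical slope classes for a landslide inventory.
      pixels = {"0-10deg": 50_000, "10-25deg": 30_000, "25+deg": 20_000}
      hazard = {"0-10deg": 100, "10-25deg": 500, "25+deg": 1_400}
      print(frequency_ratios(pixels, hazard))  # FR > 1 marks susceptible classes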

  16. Towards diagnostic tools for analysing Swarm data through model retrievals

    DEFF Research Database (Denmark)

    Kotsiaros, Stavros; Plank, Gernot; Haagmans, R.

    The objective of the Swarm mission is to provide the best ever survey of the geomagnetic field and its temporal dependency, and to gain new insights into improving our knowledge of the Earth's interior and climate. The Swarm concept consists of a constellation of three satellites in three different polar orbits between 300 and 550 km altitude. The goal of the current study is to build tools and analyze datasets in order to allow a fast diagnosis of the Swarm system performance in orbit during the commissioning phase and operations of the spacecraft. The effects on the reconstruction of the magnetic field are examined, for example to test the influence of the ionospheric residual signal or the impact of data selection on the lithospheric retrieval. Initially, the study considers one satellite and emphasises the lithospheric field reconstruction, but in a second step it is extended to a realistic Swarm constellation of three satellites.

  17. Econometric Model – A Tool in Financial Management

    Directory of Open Access Journals (Sweden)

    Riana Iren RADU

    2011-06-01

    The economic situation in Romania requires traders to carry out a rigorous analysis of the vulnerabilities and opportunities offered by the external environment, and a careful analysis of the internal conditions in which the entity operates. In this context, particular attention is paid to the indicators presented in the financial statements. They often serve as a basis for economic forecasts and future plans, and businesses that use them benefit from sound forecasting. In this paper we propose to analyze the comparative evolution of the main financial indicators highlighted in the financial statements (profit and loss) through a multi-equation econometric model, namely a dynamic Keynesian model.
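    As an illustration of what a dynamic specification of this kind involves, the sketch below estimates a simple dynamic Keynesian consumption equation C_t = a + b*Y_t + c*C_{t-1} by ordinary least squares. It is a generic textbook-style example on synthetic data, not the paper's Romanian dataset or its exact system of equations:

      # Generic dynamic consumption equation C_t = a + b*Y_t + c*C_{t-1},
      # estimated by OLS on synthetic data (illustrative only).
      import numpy as np

      rng = np.random.default_rng(1)
      T = 200
      income = 100 + np.cumsum(rng.normal(0.5, 1.0, T))  # synthetic income path
      consumption = np.empty(T)
      consumption[0] = 80.0
      for t in range(1, T):  # true coefficients: a=5, b=0.4, c=0.5
          consumption[t] = (5.0 + 0.4 * income[t] + 0.5 * consumption[t - 1]
                            + rng.normal(0.0, 0.5))

      # Regressor matrix [1, Y_t, C_{t-1}] for t = 1..T-1.
      X = np.column_stack([np.ones(T - 1), income[1:], consumption[:-1]])
      y = consumption[1:]
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)
      print(f"a = {coef[0]:.2f}, b = {coef[1]:.2f}, c = {coef[2]:.2f}")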

  18. Tactical Medical Logistics Planning Tool: Modeling Operational Risk Assessment

    National Research Council Canada - National Science Library

    Konoske, Paula

    2004-01-01

    The Tactical Medical Logistics Planning Tool (TML+) (1) models the patient flow from the point of injury through more definitive care, and (2) supports operations research and systems analysis studies, operational risk assessment, and field medical services planning. TML+...

  19. An Introduction to Model Selection: Tools and Algorithms

    Directory of Open Access Journals (Sweden)

    Sébastien Hélie

    2006-03-01

    Model selection is a complicated matter in science, and psychology is no exception. In particular, the high variance in the object of study (i.e., humans) prevents the use of Popper's falsification principle (which is the norm in other sciences). Therefore, the desirability of quantitative psychological models must be assessed by measuring the capacity of the model to fit empirical data. In the present paper, an error measure (likelihood), as well as five methods to compare model fits (the likelihood ratio test, Akaike's information criterion, the Bayesian information criterion, bootstrapping and cross-validation), are presented. The use of each method is illustrated by an example, and the advantages and weaknesses of each method are also discussed.
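    Two of the surveyed criteria reduce to one-line formulas, AIC = 2k - 2 ln L and BIC = k ln n - 2 ln L, while the likelihood ratio test compares nested models via a chi-square statistic. The sketch below applies all three to made-up log-likelihoods; the numbers are illustrative, not the paper's worked example:

      # Model-selection criteria from maximized log-likelihoods (generic
      # formulas; the fitted values are made up for illustration).
      import math
      from scipy.stats import chi2

      def aic(log_lik, k):
          return 2 * k - 2 * log_lik

      def bic(log_lik, k, n):
          return k * math.log(n) - 2 * log_lik

      # Hypothetical nested models fitted to n = 120 observations.
      n = 120
      ll_simple, k_simple = -250.0, 3     # restricted model
      ll_complex, k_complex = -244.0, 5   # full model

      print("AIC:", aic(ll_simple, k_simple), aic(ll_complex, k_complex))
      print("BIC:", bic(ll_simple, k_simple, n), bic(ll_complex, k_complex, n))

      # Likelihood ratio test: 2 * (ll_full - ll_restricted) ~ chi2(df).
      lr = 2 * (ll_complex - ll_simple)
      p = chi2.sf(lr, df=k_complex - k_simple)
      print(f"LR = {lr:.1f}, p = {p:.4f}")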

  20. Modeling and Calculator Tools for State and Local Transportation Resources

    Science.gov (United States)

    Air quality models, calculators, guidance and strategies are offered for estimating and projecting vehicle air pollution, including ozone or smog-forming pollutants, particulate matter and other emissions that pose public health and air quality concerns.

  1. Integration of Advanced Statistical Analysis Tools and Geophysical Modeling

    Science.gov (United States)

    2012-08-01

    [Abstract not recoverable; only figure and reference fragments survive. The recoverable citations concern statistical classification of buried unexploded ordnance using nonparametric prior models (IEEE Trans. Geosci. Remote Sensing, 45: 2794-2806, 2007) and subsurface discrimination using electromagnetic induction sensors (T. Bell and B. Barrow, IEEE Trans. Geosci. Remote Sensing, 39: 1286-1293, 2001).]

  2. Overview of software tools for modeling single event upsets in microelectronic devices

    Directory of Open Access Journals (Sweden)

    Anatoly Alexandrovich Smolin

    2016-10-01

    The paper presents the results of an analysis of existing simulation tools for evaluating the single event upset susceptibility of microelectronic devices with deep sub-micron feature sizes. These simulation tools are meant to replace the obsolete approach to single event rate estimation based on the integral rectangular parallelepiped (RPP) model. Three main approaches implemented in simulation tools are considered: combined use of particle transport codes and the rectangular parallelepiped model; combined use of particle transport codes, analytical models of charge collection, and circuit simulators; and combined use of particle transport codes and TCAD simulators.

  3. Monte Carlo tools for Beyond the Standard Model Physics, April 14-16

    DEFF Research Database (Denmark)

    Badger, Simon; Christensen, Christian Holm; Dalsgaard, Hans Hjersing

    2011-01-01

    This workshop aims to gather together theorists and experimentalists interested in developing and using Monte Carlo tools for Beyond the Standard Model physics, in an attempt to be prepared for the analysis of data from the Large Hadron Collider. Since a large number of excellent tools already exist…, the aims are: to identify promising models (or processes) for which the tools have not yet been constructed and start filling these gaps; and to propose ways to streamline the process of going from models to events, i.e. to make the process more user-friendly so that more people can get involved and perform serious collider studies.

  4. SARAH 4: A tool for (not only SUSY) model builders

    Science.gov (United States)

    Staub, Florian

    2014-06-01

    We present the new version of the Mathematica package SARAH which provides the same features for a non-supersymmetric model as previous versions for supersymmetric models. This includes an easy and straightforward definition of the model, the calculation of all vertices, mass matrices, tadpole equations, and self-energies. Also the two-loop renormalization group equations for a general gauge theory are now included and have been validated with the independent Python code PyR@TE. Model files for FeynArts, CalcHep/CompHep, WHIZARD and in the UFO format can be written, and source code for SPheno for the calculation of the mass spectrum, a set of precision observables, and the decay widths and branching ratios of all states can be generated. Furthermore, the new version includes routines to output model files for Vevacious for both, supersymmetric and non-supersymmetric, models. Global symmetries are also supported with this version and by linking Susyno the handling of Lie groups has been improved and extended.

  5. Bayesian Network Webserver: a comprehensive tool for biological network modeling.

    Science.gov (United States)

    Ziebarth, Jesse D; Bhattacharya, Anindya; Cui, Yan

    2013-11-01

    The Bayesian Network Webserver (BNW) is a platform for comprehensive network modeling of systems genetics and other biological datasets. It allows users to quickly and seamlessly upload a dataset, learn the structure of the network model that best explains the data and use the model to understand relationships between network variables. Many datasets, including those used to create genetic network models, contain both discrete (e.g. genotype) and continuous (e.g. gene expression traits) variables, and BNW allows for modeling hybrid datasets. Users of BNW can incorporate prior knowledge during structure learning through an easy-to-use structural constraint interface. After structure learning, users are immediately presented with an interactive network model, which can be used to make testable hypotheses about network relationships. BNW, including a downloadable structure learning package, is available at http://compbio.uthsc.edu/BNW. (The BNW interface for adding structural constraints uses HTML5 features that are not supported by the current version of Internet Explorer. We suggest using other browsers (e.g. Google Chrome or Mozilla Firefox) when accessing BNW). ycui2@uthsc.edu. Supplementary data are available at Bioinformatics online.

  6. Forest fire forecasting tool for air quality modelling systems

    Energy Technology Data Exchange (ETDEWEB)

    San Jose, R.; Perez, J.L.; Perez, L.; Gonzalez, R.M.; Pecci, J.; Palacios, M.

    2015-07-01

    Adverse effects of smoke on air quality are of great concern; however, even today the estimation of atmospheric fire emissions is a key issue. It is necessary to incorporate systems for predicting smoke into an air quality modelling system, and in this work a first attempt towards creating a system of this type is presented. Wildland fire spread and behavior are complex phenomena due to both the number of physico-chemical factors involved and the nonlinear relationships between variables. WRF-Fire was employed to simulate the spread and behavior of several real fires that occurred in the South-East of Spain and the North of Portugal. The use of fire behavior models requires the availability of high-resolution environmental and fuel data. A new custom fuel moisture content model has been developed; the new module allows the fuel moisture content of dead and live fuels to be calculated at each time step. The results confirm that the use of accurate meteorological data and a custom fuel moisture content model is crucial to obtaining precise simulations of fire behavior. To simulate air pollution over Europe, we use the regional meteorological-chemistry transport model WRF-Chem. In this contribution, we show the impact of using two different fire emission inventories (FINN and IS4FIRES) and how the coupled WRF-Fire-Chem model improves the results for forest fire emissions and smoke concentrations. The impact of the forest fire emissions on concentrations is evident, and it is quite clear from these simulations that the choice of emission inventory is very important. We conclude that using the WRF-Fire behavior model produces better results than using forest fire emission inventories, although the required computational power is much higher. (Author)

  7. Forest fire forecasting tool for air quality modelling systems

    International Nuclear Information System (INIS)

    San Jose, R.; Perez, J.L.; Perez, L.; Gonzalez, R.M.; Pecci, J.; Palacios, M.

    2015-01-01

    Adverse effects of smoke on air quality are of great concern; however, even today the estimation of atmospheric fire emissions is a key issue. It is necessary to integrate smoke-prediction systems into air quality modelling systems, and in this work a first attempt towards creating a system of this type is presented. Wildland fire spread and behavior are complex phenomena, owing both to the number of physico-chemical factors involved and to the nonlinear relationships between variables. WRF-Fire was employed to simulate the spread and behavior of several real fires that occurred in the south-east of Spain and the north of Portugal. The use of fire behavior models requires the availability of high-resolution environmental and fuel data. A new custom fuel moisture content model has been developed; the new module calculates the fuel moisture content of dead and live fuels at each time step. The results confirm that the use of accurate meteorological data and a custom fuel moisture content model is crucial for obtaining precise simulations of fire behavior. To simulate air pollution over Europe, we use the regional meteorological-chemistry transport model WRF-Chem. In this contribution, we show the impact of using two different fire emissions inventories (FINN and IS4FIRES) and how the coupled WRF-Fire-Chem model improves the results for forest fire emissions and smoke concentrations. The impact of the forest fire emissions on concentrations is evident, and it is quite clear from these simulations that the choice of emission inventory is very important. We conclude that using the WRF-Fire behavior model produces better results than using forest fire emission inventories, although the required computational power is much higher. (Author)

  8. Forest fire forecasting tool for air quality modelling systems

    International Nuclear Information System (INIS)

    San Jose, R.; Perez, J. L.; Perez, L.; Gonzalez, R. M.; Pecci, J.; Palacios, M.

    2015-01-01

    Adverse effects of smoke on air quality are of great concern; however, even today the estimation of atmospheric fire emissions is a key issue. It is necessary to integrate smoke-prediction systems into air quality modelling systems, and in this work a first attempt towards creating a system of this type is presented. Wildland fire spread and behavior are complex phenomena, owing both to the number of physico-chemical factors involved and to the nonlinear relationships between variables. WRF-Fire was employed to simulate the spread and behavior of several real fires that occurred in the south-east of Spain and the north of Portugal. The use of fire behavior models requires the availability of high-resolution environmental and fuel data. A new custom fuel moisture content model has been developed; the new module calculates the fuel moisture content of dead and live fuels at each time step. The results confirm that the use of accurate meteorological data and a custom fuel moisture content model is crucial for obtaining precise simulations of fire behavior. To simulate air pollution over Europe, we use the regional meteorological-chemistry transport model WRF-Chem. In this contribution, we show the impact of using two different fire emissions inventories (FINN and IS4FIRES) and how the coupled WRF-Fire-Chem model improves the results for forest fire emissions and smoke concentrations. The impact of the forest fire emissions on concentrations is evident, and it is quite clear from these simulations that the choice of emission inventory is very important. We conclude that using the WRF-Fire behavior model produces better results than using forest fire emission inventories, although the required computational power is much higher. (Author)

  9. Forest fire forecasting tool for air quality modelling systems

    Energy Technology Data Exchange (ETDEWEB)

    San Jose, R.; Perez, J. L.; Perez, L.; Gonzalez, R. M.; Pecci, J.; Palacios, M.

    2015-07-01

    Adverse effects of smoke on air quality are of great concern; however, even today the estimation of atmospheric fire emissions is a key issue. It is necessary to integrate smoke-prediction systems into air quality modelling systems, and in this work a first attempt towards creating a system of this type is presented. Wildland fire spread and behavior are complex phenomena, owing both to the number of physico-chemical factors involved and to the nonlinear relationships between variables. WRF-Fire was employed to simulate the spread and behavior of several real fires that occurred in the south-east of Spain and the north of Portugal. The use of fire behavior models requires the availability of high-resolution environmental and fuel data. A new custom fuel moisture content model has been developed; the new module calculates the fuel moisture content of dead and live fuels at each time step. The results confirm that the use of accurate meteorological data and a custom fuel moisture content model is crucial for obtaining precise simulations of fire behavior. To simulate air pollution over Europe, we use the regional meteorological-chemistry transport model WRF-Chem. In this contribution, we show the impact of using two different fire emissions inventories (FINN and IS4FIRES) and how the coupled WRF-Fire-Chem model improves the results for forest fire emissions and smoke concentrations. The impact of the forest fire emissions on concentrations is evident, and it is quite clear from these simulations that the choice of emission inventory is very important. We conclude that using the WRF-Fire behavior model produces better results than using forest fire emission inventories, although the required computational power is much higher. (Author)

  10. Monte Carlo tools for Beyond the Standard Model Physics, April 14-16

    DEFF Research Database (Denmark)

    Badger, Simon; Christensen, Christian Holm; Dalsgaard, Hans Hjersing

    2011-01-01

    While many tools already exist for the study of low-energy supersymmetry and the MSSM in particular, this workshop will instead focus on tools for alternative TeV-scale physics models. The main goals of the workshop are: to survey what is available; to provide feedback on user experiences with Monte Carlo tools for BSM...

  11. Static Stiffness Modeling of a Novel PKM-Machine Tool Structure

    Directory of Open Access Journals (Sweden)

    O. K. Akmaev

    2014-07-01

    Full Text Available This article presents a new configuration of a 3-dof machine tool with parallel kinematics. Elastic deformations of the machine tool have been modeled with finite elements, and stiffness coefficients at characteristic points of the working area have been calculated for different cutting forces.

  12. Biological profiling and dose-response modeling tools ...

    Science.gov (United States)

    Through its ToxCast project, the U.S. EPA has developed a battery of in vitro high-throughput screening (HTS) assays designed to assess the potential toxicity of environmental chemicals. At present, over 1800 chemicals have been tested in up to 600 assays, yielding a large number of concentration-response data sets. Standard processing of these data sets involves finding a best-fitting mathematical model and the set of parameters that specify this model. The model parameters include quantities such as the half-maximal activity concentration (or “AC50”) that have biological significance and can be used to inform the efficacy or potency of a given chemical with respect to a given assay. All of these data are processed and stored in an online-accessible database and website: http://actor.epa.gov/dashboard2. Results from these in vitro assays are used in a multitude of ways. New pathways and targets can be identified and incorporated into new or existing adverse outcome pathways (AOPs). Pharmacokinetic models such as those implemented in EPA’s HTTK R package can be used to translate an in vitro concentration into an in vivo dose; i.e., one can predict the oral equivalent dose that might be expected to activate a specific biological pathway. Such predicted values can then be compared with estimated actual human exposures to prioritize chemicals for further testing. Any quantitative examination should be accompanied by estimation of uncertainty. We are developing met
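
    To illustrate the fitting step described above, here is a minimal sketch in Python, assuming a three-parameter Hill model and synthetic data; the actual ToxCast pipeline uses its own suite of candidate models and selection criteria, and all values below are illustrative.

        import numpy as np
        from scipy.optimize import curve_fit

        def hill(conc, top, ac50, n):
            # Three-parameter Hill model: response rises from 0 to `top`,
            # reaching half-maximum at concentration `ac50` with slope `n`.
            return top * conc**n / (ac50**n + conc**n)

        # Synthetic concentration-response data (illustrative only).
        rng = np.random.default_rng(0)
        conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
        resp = hill(conc, 95.0, 1.2, 1.3) + rng.normal(0, 3, conc.size)

        # Fit and report the AC50, the biologically meaningful potency estimate.
        popt, pcov = curve_fit(hill, conc, resp, p0=[100.0, 1.0, 1.0])
        top, ac50, n = popt
        print(f"AC50 = {ac50:.2f} uM (fit), slope = {n:.2f}")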

  13. MARs Tools for Interactive ANalysis (MARTIAN): Google Maps Tools for Visual Exploration of Geophysical Modeling on Mars

    Science.gov (United States)

    Dimitrova, L. L.; Haines, M.; Holt, W. E.; Schultz, R. A.; Richard, G.; Haines, A. J.

    2006-12-01

    Interactive maps of surface-breaking faults and stress models on Mars provide important tools to engage undergraduate students, educators, and scientists with current geological and geophysical research. We have developed a map based on the Google Maps API -- an Internet-based tool combining DHTML and AJAX -- which allows very large maps to be viewed over the World Wide Web. Typically, small portions of the maps are downloaded as needed, rather than the entire image at once. This set-up enables relatively fast access for users with low bandwidth. Furthermore, Google Maps provides an extensible interactive interface, making it ideal for visualizing multiple data sets of the user's choice. The Google Maps API works primarily with data referenced to latitudes and longitudes, which is then mapped in Mercator projection only. We have developed utilities for general cylindrical coordinate systems by converting these coordinates into equivalent Mercator projection coordinates before including them on the map. The MARTIAN project is available at http://rock.geo.sunysb.edu/~holt/Mars/MARTIAN/. We begin with an introduction to the Martian surface using a topography model. Faults from several datasets are classified by type (extension vs. compression) and by time epoch. Deviatoric stresses due to gravitational potential energy differences, calculated from the topography and crustal thickness, can be overlain. Several quantitative measures for the fit of the stress field to the faults are also included. We provide introductory text and exercises spanning a range of topics: how faults are identified, what stress is and how it relates to faults, what gravitational potential energy is and how variations in it produce stress, how the models are created, and how these models can be evaluated and interpreted. The MARTIAN tool is used at Stony Brook University in GEO 310: Introduction to Geophysics, a class geared towards junior and senior geosciences majors. Although this project is in its
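
    The coordinate conversion mentioned above can be sketched in a few lines of Python, assuming the spherical Mercator projection used by Google Maps; the function name and the Mars radius value are illustrative assumptions, not code from the MARTIAN project.

        import math

        def lonlat_to_mercator(lon_deg, lat_deg, radius=3396190.0):
            # Project longitude/latitude (degrees) to spherical Mercator x/y
            # (metres). The default radius approximates the Mars equatorial
            # radius; the poles are not representable in Mercator.
            x = radius * math.radians(lon_deg)
            y = radius * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
            return x, y

        print(lonlat_to_mercator(137.4, -4.6))  # a point near Gale crater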

  14. CRISPR-Cas9: A Revolutionary Tool for Cancer Modelling

    Directory of Open Access Journals (Sweden)

    Raul Torres-Ruiz

    2015-09-01

    Full Text Available The cancer-modelling field is now experiencing a transformation with the recent emergence of the RNA-programmable CRISPR-Cas9 system, a flexible methodology to produce essentially any desired modification in the genome. Cancer is a multistep process that involves many genetic mutations and other genome rearrangements. Despite their importance, it is difficult to recapitulate the degree of genetic complexity found in patient tumors. The CRISPR-Cas9 system for genome editing has proven to be a robust technology that makes it possible to generate cellular and animal models that recapitulate those cooperative alterations rapidly and at low cost. In this review, we discuss the innovative applications of the CRISPR-Cas9 system to generate new models, providing a new way to interrogate the development and progression of cancers.

  15. Models and tools for studying drought stress responses in peas.

    Science.gov (United States)

    Magyar-Tábori, Katalin; Mendler-Drienyovszki, Nóra; Dobránszki, Judit

    2011-12-01

    The pea (Pisum sativum L.) is an important pulse crop, but its growing area is limited because of its relatively low yield stability. In many parts of the world the most important abiotic factor limiting the survival and yield of plants is restricted water supply, and crop productivity can only be increased by improving drought tolerance. Development of pea cultivars well adapted to dry conditions has been one of the major tasks in breeding programs. Conventional breeding of new cultivars for dry conditions has required extensive selection and testing for yield performance over diverse environments using various biometrical approaches. Several morphological and biochemical traits have been shown to be related to drought resistance, and methods based on physiological attributes can also be used in the development of better varieties. Osmoregulation plays a role in the maintenance of turgor pressure under water stress conditions, and information on the behaviour of genotypes under osmotic stress can help selection for drought resistance. Biotechnological approaches, including in vitro tests, genetic transformation, and the use of molecular markers and mutants, could be useful tools in pea breeding. In this minireview we summarize the present status of the different approaches related to improving drought stress tolerance in pea.

  16. Queuing Models: A Tool For Assessing The Profitability Of Barbing ...

    African Journals Online (AJOL)

    The study considers small-scale business as an option for reducing the unemployment rate in our society. It uses queuing models to assess the profitability of the barbing salon business in Agbor town of Delta State. The result of the study indicates that the distribution of inter-arrival times, service times, and waiting ...

  17. Numerical modeling as a tool for sustainable water management

    Science.gov (United States)

    Zacharias, I.; Dimitriou, E.; Koussouris, Th.

    2003-04-01

    Combining environmental preservation and economic prosperity is a primary objective of most developmental activities nowadays. Sustainable water resources management can contribute to achieving this objective, especially in wetland areas that often undergo significant stresses due to irrational water exploitation schemes. Applying numerical modeling to design sustainable water management scenarios has been common practice during the last decade, but it is also controversial among many scientists and environmental managers. This scientific effort attempted to develop and assess a methodology for forming water management plans in lake catchments by combining GIS applications, remote-sensing techniques and physically based hydrologic modeling. The advantages and disadvantages of the specific methodology, and particularly of the use of numerical modeling in the water management formation process, have been examined through a case study application in the Trichonis lake catchment, W. Greece. In this area, significant wetlands with the endangered calcareous fens habitat are encountered, and these have undergone significant degradation during the last 30 years. The results indicated that the particular methodology provided water management scenarios that fulfilled both the environmental and anthropogenic demands without compromising the replenishment potential of the local water resources. Numerical modeling operated efficiently, accelerated the formation of the water management plan and offered scenarios that can be easily applied and amended by the local water authorities.

  18. Mathematical modelling : a tool for hospital infection control

    NARCIS (Netherlands)

    Grundmann, H; Hellriegel, B

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide, and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has

  19. Mathematical modelling: a tool for hospital infection control

    NARCIS (Netherlands)

    Grundmann, Hajo; Hellriegel, B.

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide, and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has

  20. Mathematical modelling: a tool for hospital infection control.

    NARCIS (Netherlands)

    Grundmann, Hajo; Hellriegel, B

    2006-01-01

    Health-care-associated infections caused by antibiotic-resistant pathogens have become a menace in hospitals worldwide, and infection control measures have led to vastly different outcomes in different countries. During the past 6 years, a theoretical framework based on mathematical models has

  1. Agent-Based Modeling: A Powerful Tool for Tourism Researchers

    NARCIS (Netherlands)

    Nicholls, Sarah; Amelung, B.; Student, Jillian

    2017-01-01

    Agent-based modeling (ABM) is a way of representing complex systems of autonomous agents or actors, and of simulating the multiple potential outcomes of these agents’ behaviors and interactions in the form of a range of alternatives or futures. Despite the complexity of the tourism system, and the

  2. 3-C Models Teaching Tools to Promote Social Justice

    Science.gov (United States)

    Marbley, Aretha Faye; Rouson, Leon; Burley, Hansel; Ross, Wendy; Bonner, Fred A., II; Lértora, Ian; Huang, Shih-Han

    2017-01-01

    Equipping future professionals and educators with critical global multicultural competences and skills to work with people from diverse backgrounds is a challenge for both predominantly White institutions (PWIs) and Historically Black Colleges and Universities (HBCUs). The major objective of this article is to introduce an adaptable model with an…

  3. Modeling mind-wandering: a tool to better understand distraction

    NARCIS (Netherlands)

    van Vugt, Marieke; Taatgen, Niels; Sackur, Jerome; Bastian, Mikael; Taatgen, Niels; van Vugt, Marieke; Borst, Jelmer; Mehlhorn, Katja

    2015-01-01

    When we get distracted, we may engage in mind-wandering, or task-unrelated thinking, which impairs performance on cognitive tasks. Yet, we do not have cognitive models that make this process explicit. On the basis of both recent experiments that have started to investigate mind-wandering and

  4. Selecting Tools to Model Integer and Binomial Multiplication

    Science.gov (United States)

    Pratt, Sarah Smitherman; Eddy, Colleen M.

    2017-01-01

    Mathematics teachers frequently provide concrete manipulatives to students during instruction; however, the rationale for using certain manipulatives in conjunction with concepts may not be explored. This article focuses on area models that are currently used in classrooms to provide concrete examples of integer and binomial multiplication. The…

  5. Verifying OCL specifications of UML models : tool support and compositionality

    NARCIS (Netherlands)

    Kyas, Marcel

    2006-01-01

    The Unified Modelling Language (UML) and the Object Constraint Language (OCL) serve as specification languages for embedded and real-time systems used in a safety-critical environment. In this dissertation class diagrams, object diagrams, and OCL constraints are formalised. The formalisation

  6. The Visible Signature Modelling and Evaluation ToolBox

    Science.gov (United States)

    2008-12-01

    (Table-of-contents fragments recovered from the report: 2.5.1 Rhinoceros; 2.5.2 MODTRAN; HYDROLIGHT.) The commercial support software has a number of different functionalities. Rhinoceros provides the wireframe models required as input... Greyscale texture synthesis takes a greyscale input image and a uniform white-noise texture. The white-noise texture is modified to reproduce certain

  7. Computer modelling as a tool for understanding language evolution

    NARCIS (Netherlands)

    de Boer, Bart; Gontier, N; VanBendegem, JP; Aerts, D

    2006-01-01

    This paper describes the uses of computer models in studying the evolution of language. Language is a complex dynamic system that can be studied at the level of the individual and at the level of the population. Much of the dynamics of language evolution and language change occur because of the

  8. Molecular Modeling: A Powerful Tool for Drug Design and Molecular ...

    Indian Academy of Sciences (India)

    With the advent of programmable calculators (starting around 1956 with the introduction of Fortran) and of computers as visualization aids (around 1970), various applications of computer-assisted molecular modeling techniques have emerged. These methods are less complicated, fast, and able to handle very large systems ...

  9. Animal models for arthritis: innovative tools for prevention and treatment

    NARCIS (Netherlands)

    Kollias, G.; Papadaki, P.; Apparailly, F.; Vervoordeldonk, M.J.; Holmdahl, R.; Baumans, V.; Desaintes, C.; Di Santo, J.; Distler, J.; Garside, P.; Hegen, M.; Huizinga, T.W.J.; Jüngel, A.; Klareskog, L.; McInnes, I.; Ragoussis, I.; Schett, G.; Hart, B.t.; Tak, P.P.; Toes, R.; van den Berg, W.; Wurst, W.; Gay, S.

    2011-01-01

    The development of novel treatments for rheumatoid arthritis (RA) requires the interplay between clinical observations and studies in animal models. Given the complex molecular pathogenesis and highly heterogeneous clinical picture of RA, there is an urgent need to dissect its multifactorial nature

  10. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field strain distributions generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
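
    As a minimal sketch of the decomposition idea, the snippet below (Python, synthetic data) keeps only the lowest-frequency 2-D Fourier coefficients of a strain map as descriptors; Zernike moments would play the same role on circular domains, and the field, sizes and noise level here are illustrative assumptions.

        import numpy as np

        def fourier_descriptors(field, k=8):
            # Reduce a 2-D field to its k x k lowest-frequency Fourier
            # coefficients: ~1e4-1e5 pixels become ~1e2 descriptors.
            spectrum = np.fft.fft2(field)
            return spectrum[:k, :k].ravel()

        # Two synthetic 'strain maps': a model prediction and a noisy measurement.
        rng = np.random.default_rng(3)
        y, x = np.mgrid[0:128, 0:128] / 128.0
        model = np.sin(2 * np.pi * x) * np.cos(np.pi * y)
        experiment = model + rng.normal(0, 0.05, model.shape)

        d_model = fourier_descriptors(model)
        d_exp = fourier_descriptors(experiment)

        # A simple quantitative comparison of the descriptor vectors.
        rel_err = np.linalg.norm(d_exp - d_model) / np.linalg.norm(d_model)
        print(f"relative descriptor difference: {rel_err:.3f}")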

  11. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    Science.gov (United States)

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.

  12. About Using Predictive Models and Tools To Assess Chemicals under TSCA

    Science.gov (United States)

    As part of EPA's effort to promote chemical safety, OPPT provides public access to predictive models and tools which can help inform the public on the hazards and risks of substances and improve chemical management decisions.

  13. The Integrated Medical Model: A Decision Support Tool for In-flight Crew Health Care

    Science.gov (United States)

    Butler, Doug

    2009-01-01

    This viewgraph presentation reviews the development of an Integrated Medical Model (IMM) decision support tool for in-flight crew health care safety. Clinical methods, resources, and case scenarios are also addressed.

  14. An Evaluation of Growth Models as Predictive Tools for Estimates at Completion (EAC)

    National Research Council Canada - National Science Library

    Trahan, Elizabeth N

    2009-01-01

    ...) as the Estimates at Completion (EAC). Our research evaluates the prospect of nonlinear growth modeling as an alternative to the current predictive tools used for calculating EAC, such as the Cost Performance Index (CPI...

  15. Model-Based Design Tools for Extending COTS Components To Extreme Environments, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation in this Phase I project is to prove the feasibility of using model-based design (MBD) tools to predict the performance and useful life of...

  16. Model-Based Design Tools for Extending COTS Components To Extreme Environments, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation in this project is model-based design (MBD) tools for predicting the performance and useful life of commercial-off-the-shelf (COTS) components and...

  17. Prediction Model and Risk Stratification Tool for Survival in Patients With CKD

    Directory of Open Access Journals (Sweden)

    Alexander S. Goldfarb-Rumyantzev

    2018-03-01

    Conclusion: The risk stratification tool and prediction model of 2-year mortality demonstrated good performance and may be used in clinical practice to quantify the risk of death for individual patients with CKD.

  18. The 8 Learning Events Model: a Pedagogic Conceptual Tool Supporting Diversification of Learning Methods

    NARCIS (Netherlands)

    Verpoorten, Dominique; Poumay, M; Leclercq, D

    2006-01-01

    Please, cite this publication as: Verpoorten, D., Poumay, M., & Leclercq, D. (2006). The 8 Learning Events Model: a Pedagogic Conceptual Tool Supporting Diversification of Learning Methods. Proceedings of International Workshop in Learning Networks for Lifelong Competence Development, TENCompetence

  19. Physics-based Modeling Tools for Life Prediction and Durability Assessment of Advanced Materials, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The technical objectives of this program are: (1) to develop a set of physics-based modeling tools to predict the initiation of hot corrosion and to address pit and...

  20. Multi-Physics Computational Modeling Tool for Materials Damage Assessment, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is to provide a multi-physics modeling tool for materials damage assessment for application to future aircraft design. The software...

  1. Model Verification and Validation Using Graphical Information Systems Tools

    Science.gov (United States)

    2013-07-31

    coastal ocean sufficiently to have a complete picture of the flow. The analysis will thus consist of comparing these incomplete pictures of the current...50 cm. This would suggest that tidal flats would exist at synoptic scales but not daily because there are expanses of the lagoon that are < 50 cm...historical daily data from the correct time of year but not from the correct day. This indicates that the model flow is generally correct at synoptic

  2. 3D model tools for architecture and archaeology reconstruction

    Science.gov (United States)

    Vlad, Ioan; Herban, Ioan Sorin; Stoian, Mircea; Vilceanu, Clara-Beatrice

    2016-06-01

    The main objective of architectural and patrimonial surveying is to provide precise documentation of the status quo of the surveyed objects (monuments, buildings, archaeological objects and sites) for preservation and protection, for scientific studies and restoration purposes, and for presentation to the general public. Cultural heritage documentation involves an interdisciplinary approach whose purpose is an overall understanding of the object itself and an integration of the information that characterizes it. The accuracy and precision of the model are directly influenced by the quality of the measurements taken in the field and by the quality of the software. The software is under continuous development, which brings many improvements. On the other hand, compared to aerial photogrammetry, close-range photogrammetry, and particularly architectural photogrammetry, is not limited to vertical photographs taken with special cameras. The methodology of terrestrial photogrammetry has changed significantly, and various photographic acquisitions are widely in use. In this context, the present paper brings forward a comparative study of TLS (Terrestrial Laser Scanner) and digital photogrammetry for 3D modeling. The authors take into account the accuracy of the 3D models obtained, the overall costs involved for each technology and method, and the fourth dimension, time. The paper proves its applicability, as photogrammetric technologies are nowadays used at a large scale for obtaining 3D models of cultural heritage objects, efficacious in their assessment and monitoring, thus contributing to historic conservation. Its importance also lies in highlighting the advantages and disadvantages of each method used, a very important issue for both the industrial and scientific segments when facing decisions such as in which technology to invest more research and funds.

  3. Variable fused deposition modelling - concept design and tool path generation

    OpenAIRE

    Brooks, Hadley Laurence

    2011-01-01

    Current Fused Deposition Modelling (FDM) techniques use fixed diameter nozzles to deposit a filament of plastic layer by layer. The consequence is that the same small nozzle, essential for fine details, is also used to fill in relatively large volumes. In practice a Pareto-optimal nozzle diameter is chosen that attempts to maximise resolution while minimising build time. This paper introduces a concept for adapting an additive manufacturing system, which exploits a variable diameter nozzle fo...

  4. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even...... requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach....

  5. A Practical Probabilistic Graphical Modeling Tool for Weighing ...

    Science.gov (United States)

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad, using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities in a simplified hypothetical case study. We then combine the three lines of evidence, evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainty in the lines of evidence, including spatial and temporal variability as well as measurement errors. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations.
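
    A minimal sketch of the underlying updating mechanism, assuming two impact states, three conditionally independent lines of evidence, and illustrative probabilities (the paper's actual network, variables and parameters are richer):

        # P(evidence outcome | impact state) for three lines of evidence:
        # sediment chemistry, bioassay, benthic community (illustrative values).
        likelihoods = {
            "chemistry_exceedance":  {"impact": 0.80, "no_impact": 0.20},
            "bioassay_toxicity":     {"impact": 0.70, "no_impact": 0.15},
            "low_benthic_diversity": {"impact": 0.60, "no_impact": 0.25},
        }

        prior = {"impact": 0.5, "no_impact": 0.5}

        # Observed outcomes; update the posterior one line of evidence at a time.
        observed = ["chemistry_exceedance", "bioassay_toxicity"]
        posterior = dict(prior)
        for line in observed:
            for state in posterior:
                posterior[state] *= likelihoods[line][state]
            total = sum(posterior.values())
            posterior = {s: p / total for s, p in posterior.items()}
            print(f"after {line}: P(impact) = {posterior['impact']:.2f}")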

  6. Habitat hydraulic models - a tool for Danish stream quality assessment?

    DEFF Research Database (Denmark)

    Olsen, Martin

    In relation to the European Water Framework Directive (WFD), Danish water management has to change to a holistic management approach that considers both groundwater and surface waters at the same time. Furthermore, the WFD introduces the concept of "good ecological status", where the quality of the biol... in Danish stream management and stream quality assessment. The stream Ledreborg catchment is modelled using a precipitation-run-off model (NAM), and as an addition to the normal calibration procedure (Kronvang et al., 2000) the model is calibrated using DAISY-adjusted evaporation data. The impact of groundwater abstraction upon stream discharge is assessed, and in relation to this the relative importance of variations in precipitation, evaporation/temperature and groundwater abstraction is discussed. Physical habitat preferences for trout in the stream Ledreborg are assessed through a series of field...

  7. Tools for model-independent bounds in direct dark matter searches

    DEFF Research Database (Denmark)

    Cirelli, M.; Del Nobile, E.; Panci, P.

    2013-01-01

    We discuss a framework (based on non-relativistic operators) and a self-contained set of numerical tools to derive the bounds from some current direct detection experiments on virtually any arbitrary model of Dark Matter elastically scattering on nuclei.

  8. ADVISHE: A new tool to report validation of health-economic decision models

    NARCIS (Netherlands)

    Vemer, P.; Corro Ramos, I.; Van Voorn, G.; Al, M.J.; Feenstra, T.L.

    2014-01-01

    Background: Modelers and reimbursement decision makers could both profit from a more systematic reporting of the efforts to validate health-economic (HE) models. Objectives: Development of a tool to systematically report validation efforts of HE decision models and their outcomes. Methods: A gross

  9. Decision modelling tools for utilities in the deregulated energy market

    Energy Technology Data Exchange (ETDEWEB)

    Makkonen, S. [Process Vision Oy, Helsinki (Finland)

    2005-07-01

    , strategic decision support has also faced new challenges. This thesis introduces two applications involving multiple criteria decision making methods. The first application explores the decision making problem caused by the introduction of 'green' electricity that creates additional value for renewable energy. In this problem the stochastic multicriteria acceptability analysis method (SMAA) is applied. The second strategic multi-criteria decision making study discusses two different energy-related operations research problems: the elements of risk analysis in the energy field and the evaluation of different choices with a decision support tool accommodating incomplete preference information to help energy companies to select a proper risk management system. The application is based on the rank inclusion in criteria hierarchies (RICH) method. (orig.)

  10. Decision modelling tools for utilities in the deregulated energy market

    International Nuclear Information System (INIS)

    Makkonen, S.

    2005-01-01

    , strategic decision support has also faced new challenges. This thesis introduces two applications involving multiple criteria decision making methods. The first application explores the decision making problem caused by the introduction of 'green' electricity that creates additional value for renewable energy. In this problem the stochastic multicriteria acceptability analysis method (SMAA) is applied. The second strategic multi-criteria decision making study discusses two different energy-related operations research problems: the elements of risk analysis in the energy field and the evaluation of different choices with a decision support tool accommodating incomplete preference information to help energy companies to select a proper risk management system. The application is based on the rank inclusion in criteria hierarchies (RICH) method. (orig.)

  11. MetaboTools: A Comprehensive Toolbox for Analysis of Genome-Scale Metabolic Models

    Science.gov (United States)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-01-01

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration, and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. This computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community. PMID:27536246
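
    To make the constraint-based analysis at the core of such workflows concrete, here is a minimal flux balance analysis sketch on a toy three-reaction network, using scipy's linear programming solver rather than the COBRA Toolbox that the MetaboTools build on; the network, bounds and objective are illustrative assumptions.

        import numpy as np
        from scipy.optimize import linprog

        # Toy network:  -> A  (v1, uptake),  A -> B  (v2),  B ->  (v3, "biomass").
        # Rows are metabolites A and B; columns are reactions v1..v3.
        S = np.array([[1.0, -1.0,  0.0],
                      [0.0,  1.0, -1.0]])

        bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake limited to 10 units
        c = np.array([0.0, 0.0, -1.0])             # maximize v3 => minimize -v3

        # Steady-state constraint S v = 0 with the bounds above.
        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        v1, v2, v3 = res.x
        print(f"optimal biomass flux: {v3:.1f} (uptake {v1:.1f})")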

  12. How can land-use modelling tools inform bioenergy policies?

    Science.gov (United States)

    Davis, Sarah C.; House, Joanna I.; Diaz-Chavez, Rocio A.; Molnar, Andras; Valin, Hugo; DeLucia, Evan H.

    2011-01-01

    Targets for bioenergy have been set worldwide to mitigate climate change. Although feedstock sources are often ambiguous, pledges in European nations, the United States and Brazil amount to more than 100 Mtoe of biorenewable fuel production by 2020. As a consequence, the biofuel sector is developing rapidly, and it is increasingly important to distinguish bioenergy options that can address energy security and greenhouse gas mitigation from those that cannot. This paper evaluates how bioenergy production affects land-use change (LUC), and to what extent land-use modelling can inform sound decision-making. We identified local and global internalities and externalities of biofuel development scenarios, reviewed relevant data sources and modelling approaches, identified sources of controversy about indirect LUC (iLUC) and then suggested a framework for comprehensive assessments of bioenergy. Ultimately, plant biomass must be managed to produce energy in a way that is consistent with the management of food, feed, fibre, timber and environmental services. Bioenergy production provides opportunities for improved energy security, climate mitigation and rural development, but the environmental and social consequences depend on feedstock choices and geographical location. The most desirable solutions for bioenergy production will include policies that incentivize regionally integrated management of diverse resources with low inputs, high yields, co-products, multiple benefits and minimal risks of iLUC. Many integrated assessment models include energy resources, trade, technological development and regional environmental conditions, but do not account for biodiversity and lack detailed data on the location of degraded and underproductive lands that would be ideal for bioenergy production. Specific practices that would maximize the benefits of bioenergy production regionally need to be identified before a global analysis of bioenergy-related LUC can be accomplished.

  13. Modelling as an indispensable research tool in the information society.

    Science.gov (United States)

    Bouma, Johan

    2016-04-01

    Science and society would be well advised to develop a different relationship as the information revolution penetrates all aspects of modern life. Rather than producing clear answers to clear questions in a top-down manner, land-use issues related to the UN Sustainable Development Goals (SDGs) present "wicked" problems involving different, strongly opinionated, stakeholders with conflicting ideas and interests, and risk-averse politicians. The Dutch government has invited its citizens to develop a "science agenda" defining future research needs, implicitly suggesting that the research community is unable to do so. Time, therefore, for a pro-active approach to more convincingly define our "societal license to research". For soil science this could imply a focus on the SDGs, considering soils as living, characteristically different, dynamic bodies in a landscape, to be mapped in ways that allow generation of suitable modelling data. Models allow a dynamic characterization of water and nutrient regimes and plant growth in soils for both actual and future conditions, reflecting e.g. effects of climate or land-use change or alternative management practices. Engaging modern stakeholders in a bottom-up manner implies continuous involvement and "joint learning" from project initiation to completion, where modelling results act as building blocks to explore alternative scenarios. Modern techniques allow very rapid calculations and innovative visualization. Everything is possible, but only modelling can articulate the economic, social and environmental consequences of each scenario, demonstrating in a pro-active manner the crucial and indispensable role of research. But choices are to be made by stakeholders and reluctant policy makers, and certainly not by scientists, who should carefully guard their independence. Only clear results in the end are convincing proof of the impact of science, requiring therefore continued involvement of scientists up to the very end of projects. To

  14. Software Support of Modelling using Ergonomic Tools in Engineering

    Directory of Open Access Journals (Sweden)

    Darina Dupláková

    2017-08-01

    Full Text Available One of the preconditions for the correct development of industrial production is the continuous interconnection of virtual reality and the real world by computer software. Computer software is used for product modelling, creation of technical documentation, scheduling, management and optimization of manufacturing processes, and increasing the efficiency of human work in manufacturing plants. This article describes frequently used ergonomic software that helps to improve human work by reducing error rates, risk factors in the working environment, and injuries in workplaces, and by eliminating emerging occupational diseases. These tools fall within the field of micro-ergonomics and are applicable at the manufacturing level, with a flexible approach to solving the problems at hand.

  15. Systematic Methods and Tools for Computer Aided Modelling

    OpenAIRE

    Fedorova, Marina; Gani, Rafiqul; Sin, Gürkan

    2015-01-01

    Models play important roles in the design and analysis of chemical and biochemical products as well as of the processes that manufacture them. Model-based methods and tools have the potential to reduce the number of experiments, which can be expensive and time-consuming, and to select the candidates on which the experimental effort should be focused. In this project, a general modelling framework was developed for systematic model construction by means of model templates. The modelling framework supports a...

  16. Software Quality Assessment Tool Based on Meta-Models

    OpenAIRE

    Doneva Rositsa; Gaftandzhieva Silvia; Doneva Zhelyana; Staevsky Nevena

    2015-01-01

    In the software industry it is indisputably essential to control the quality of produced software systems in terms of capabilities for easy maintenance, reuse, portability and others, in order to ensure reliability in software development. But it is also clear that it is very difficult to achieve such control through ‘manual’ quality management. There are a number of approaches for software quality assurance based typically on software quality models (e.g. ISO 9126, McCall’s, Boehm’s...

  17. Tool Support for Collaborative Teaching and Learning of Object-Oriented Modelling

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Ratzer, Anne Vinter

    2002-01-01

    Modeling is central to doing and learning object-oriented development. We present a new tool, Ideogramic UML, for gesture-based collaborative modeling with the Unified Modeling Language (UML), which can be used to collaboratively teach and learn modeling. Furthermore, we discuss how we have effectively used Ideogramic UML to teach object-oriented modeling and the UML to groups of students using the UML for project assignments.

  18. Uranium resources evaluation model as an exploration tool

    International Nuclear Information System (INIS)

    Ruzicka, V.

    1976-01-01

    Evaluation of uranium resources, as conducted by the Uranium Resources Evaluation Section of the Geological Survey of Canada, comprises operations analogous with those performed during the preparatory stages of uranium exploration. The uranium resources evaluation model, simulating the estimation process, can be divided into four steps. The first step includes definition of major areas and ''unit subdivisions'' for which geological data are gathered, coded, computerized and retrieved. Selection of these areas and ''unit subdivisions'' is based on a preliminary appraisal of their favourability for uranium mineralization. The second step includes analyses of the data, definition of factors controlling uranium mineralization, classification of uranium occurrences into genetic types, and final delineation of favourable areas; this step corresponds to the selection of targets for uranium exploration. The third step includes geological field work; it is equivalent to geological reconnaissance in exploration. The fourth step comprises computation of resources; the preliminary evaluation techniques in the exploration are, as a rule, analogous with the simplest methods employed in the resource evaluation. The uranium resources evaluation model can be conceptually applied for decision-making during exploration or for formulation of exploration strategy using the quantified data as weighting factors. (author)

  19. Assessment of the Clinical Trainer as a Role Model: A Role Model Apperception Tool (RoMAT)

    NARCIS (Netherlands)

    Jochemsen-van der Leeuw, H. G. A. Ria; van Dijk, Nynke; Wieringa-de Waard, Margreet

    2014-01-01

    Purpose: Positive role modeling by clinical trainers is important for helping trainees learn professional and competent behavior. The authors developed and validated an instrument to assess clinical trainers as role models: the Role Model Apperception Tool (RoMAT). Method: On the basis of a 2011

  20. A Tool for Performance Modeling of Parallel Programs

    Directory of Open Access Journals (Sweden)

    J.A. González

    2003-01-01

    Full Text Available Current analytical models for performance prediction try to characterize the performance behavior of actual machines through a small set of parameters. In practice, substantial deviations are observed. These differences are due to factors such as memory hierarchies or network latency. A natural approach is to associate a different proportionality constant with each basic block and, analogously, to associate different latencies and bandwidths with each "communication block". Unfortunately, using this approach implies that the evaluation of parameters must be done for each algorithm. This is a heavy task, involving experiment design, timing, statistics, pattern recognition and multi-parameter fitting algorithms. Software support is required. We present a compiler that takes as source a C program annotated with complexity formulas and produces as output an instrumented code. The trace files obtained from the execution of the resulting code are analyzed with an interactive interpreter, giving us, among other information, the values of those parameters.
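
    The per-block parameter fitting described above can be sketched in a few lines, assuming timed runs of one "communication block" at several message sizes and the classic linear latency/bandwidth model t(n) = alpha + beta*n; the data below are synthetic, whereas the actual tool derives such fits from trace files.

        import numpy as np

        # Measured times (seconds) for one communication block at several
        # message sizes n (bytes); illustrative data with some noise.
        rng = np.random.default_rng(5)
        n = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
        t = 50e-6 + n / 1e9 + rng.normal(0, 2e-6, n.size)

        # Least-squares fit of t = alpha + beta * n.
        A = np.vstack([np.ones_like(n), n]).T
        (alpha, beta), *_ = np.linalg.lstsq(A, t, rcond=None)

        print(f"latency alpha = {alpha*1e6:.1f} us, "
              f"bandwidth 1/beta = {1/beta/1e9:.2f} GB/s")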

  1. Network Models: An Underutilized Tool in Wildlife Epidemiology?

    Directory of Open Access Journals (Sweden)

    Meggan E. Craft

    2011-01-01

    Full Text Available Although the approach of contact network epidemiology has been increasing in popularity for studying transmission of infectious diseases in human populations, it has generally been an underutilized approach for investigating disease outbreaks in wildlife populations. In this paper we explore the differences between the type of data that can be collected on human and wildlife populations, provide an update on recent advances that have been made in wildlife epidemiology by using a network approach, and discuss why networks might have been underutilized and why networks could and should be used more in the future. We conclude with ideas for future directions and a call for field biologists and network modelers to engage in more cross-disciplinary collaboration.

  2. Visual Basic, Excel-based fish population modeling tool - The pallid sturgeon example

    Science.gov (United States)

    Moran, Edward H.; Wildhaber, Mark L.; Green, Nicholas S.; Albers, Janice L.

    2016-02-10

    The model presented in this report is a spreadsheet-based model using Visual Basic for Applications within Microsoft Excel (http://dx.doi.org/10.5066/F7057D0Z), prepared in cooperation with the U.S. Army Corps of Engineers and U.S. Fish and Wildlife Service. It uses the same model structure and, initially, the same parameters as used by Wildhaber and others (2015) for pallid sturgeon. The difference between the model structure used for this report and that used by Wildhaber and others (2015) is that variance is not partitioned. For the model of this report, all variance is applied at the iteration and time-step levels of the model. Wildhaber and others (2015) partition variance into parameter variance (uncertainty about the value of a parameter itself) applied at the iteration level and temporal variance (uncertainty caused by random environmental fluctuations with time) applied at the time-step level. They included implicit individual variance (uncertainty caused by differences between individuals) within the time-step level. The interface developed for the model of this report is designed to allow the user the flexibility to change population model structure and parameter values and uncertainty separately for every component of the model. This flexibility makes the modeling tool potentially applicable to any fish species; however, the flexibility inherent in this modeling tool makes it possible for the user to obtain spurious outputs. The value and reliability of the model outputs are only as good as the model inputs. Using this modeling tool with improper or inaccurate parameter values, or for species for which the structure of the model is inappropriate, could lead to untenable management decisions. By facilitating fish population modeling, this modeling tool allows the user to evaluate a range of management options and implications. The goal of this modeling tool is to be a user-friendly modeling tool for developing fish population models useful to natural resource
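
    The two variance levels described above can be illustrated with a minimal stochastic projection in Python: the mean survival rate is drawn once per iteration (parameter uncertainty) and the realized survival fluctuates per time step (environmental variation). All parameter values are illustrative assumptions, not pallid sturgeon estimates.

        import numpy as np

        rng = np.random.default_rng(42)
        n_iter, n_years = 1000, 20
        pop = np.full(n_iter, 500.0)      # starting population per iteration

        for i in range(n_iter):
            # Iteration level: draw the mean survival once (parameter variance).
            s_mean = rng.normal(0.80, 0.05)
            for year in range(n_years):
                # Time-step level: yearly environmental fluctuation around s_mean.
                s = np.clip(rng.normal(s_mean, 0.03), 0.0, 1.0)
                recruits = rng.poisson(0.25 * pop[i])   # simple recruitment term
                pop[i] = s * pop[i] + recruits

        print(f"median population after {n_years} y: {np.median(pop):.0f}")
        print(f"90% interval: {np.percentile(pop, [5, 95]).round(0)}")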

  3. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report

    Science.gov (United States)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; Kramer-White, Julie A.; Labbe, Steve G.; Rotter, Hank A.

    2007-01-01

    In the spring of 2005, the NASA Engineering and Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and other tools are empirically derived. Each tool was created for a specific use and timeframe, including certification and real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC team conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  4. Model-based development of a course of action scheduling tool

    DEFF Research Database (Denmark)

    Kristensen, Lars Michael; Mechlenborg, Peter; Zhang, Lin

    2008-01-01

    This paper shows how a formal method in the form of Coloured Petri Nets (CPNs) and the supporting CPN Tools have been used in the development of the Course of Action Scheduling Tool (COAST). The aim of COAST is to support human planners in the specification and scheduling of tasks in a Course of Action. CPNs have been used to develop a formal model of the task execution framework underlying COAST. The CPN model has been extracted in executable form from CPN Tools and embedded directly into COAST, thereby automatically bridging the gap between the formal specification and its implementation. The scheduling capabilities of COAST are based on state space exploration of the embedded CPN model. Planners interact with COAST using a domain-specific graphical user interface (GUI) that hides the embedded CPN model and analysis algorithms. This means that COAST is based on a rigorous semantic model...

  5. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool for the BSA technique, the bivariate statistical modeler (BSM), is proposed. Three popular BSA techniques, namely the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are implemented in the newly proposed ArcMAP tool. The tool is programmed in Python and provided with a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
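
    Of the three techniques, the frequency ratio is the simplest: for each class of a conditioning factor it divides the share of hazard pixels falling in the class by the share of the study area the class covers. The sketch below computes it on synthetic grids (not the Malaysian test data).

```python
import numpy as np

# Hypothetical categorical conditioning factor (e.g., lithology classes)
# and a binary hazard inventory on the same grid.
rng = np.random.default_rng(0)
factor = rng.integers(1, 5, size=(200, 200))   # classes 1..4
hazard = rng.random((200, 200)) < 0.05         # ~5% hazard pixels

for cls in np.unique(factor):
    in_class = factor == cls
    pct_hazard = hazard[in_class].sum() / hazard.sum()  # hazard share
    pct_area = in_class.sum() / factor.size             # area share
    fr = pct_hazard / pct_area                          # frequency ratio
    print(f"class {cls}: FR = {fr:.3f}")
```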

  6. Next-Generation Model-based Variability Management: Languages and Tools

    OpenAIRE

    Acher , Mathieu; Heymans , Patrick; Collet , Philippe; Lahire , Philippe

    2012-01-01

    Variability modelling and management is a key activity in a growing number of software engineering contexts, from software product lines to dynamic adaptive systems. Feature models are the de facto standard for formally representing and reasoning about the commonality and variability of a software system. This tutorial aims at presenting the next generation of feature modelling languages and tools, directly applicable to a wide range of model-based variability problems and application...

  7. NREL Multiphysics Modeling Tools and ISC Device for Designing Safer Li-Ion Batteries

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, Ahmad A.; Yang, Chuanbo

    2016-03-24

    The National Renewable Energy Laboratory has developed a portfolio of multiphysics modeling tools to help battery designers better understand the response of lithium ion batteries to abusive conditions. We will discuss this portfolio, which includes coupled electrical, thermal, chemical, electrochemical, and mechanical modeling. These models can simulate the response of a cell to overheating, overcharge, mechanical deformation, nail penetration, and internal short circuit. Cell-to-cell thermal propagation modeling will be discussed.

  8. Rapid evaluation of machine tools with position-dependent milling stability based on response surface model

    Directory of Open Access Journals (Sweden)

    Li Zhang

    2016-03-01

    Full Text Available The milling stability is one of the important evaluation criteria of the dynamic characteristics of machine tools, and it is of great importance for machine tool design and manufacturing. The milling stability of machine tools generally varies with the position combinations of moving parts. The traditional milling stability analysis of machine tools is based on some specific positions in the whole workspace of the machine tool, and the results are not comprehensive. Furthermore, completing the analysis at multiple positions is very time-consuming. A new method to rapidly evaluate the stability of machine tools with position dependence is developed in this article. In this method, the key position combinations of moving parts are set as the samples of calculation to calculate the dynamic characteristics of machine tools with SAMCEF finite element simulation analysis software. Then the minimum critical axial cutting depth of each sample is obtained. The relationship between the position and the value of minimum critical axial cutting depth at any position in the whole workspace can be obtained through an established response surface model. The precision of the response surface model is evaluated and the model could be used to rapidly evaluate the milling stability of machine tools with position dependence. With a precision horizontal machining center with box-in-box structure as an example, the value of minimum critical axial cutting depth at any position is shown. This method of rapid evaluation of machine tools with position-dependent stability avoids complicated theoretical calculation, so it can be easily adopted by engineers and technicians in the design process of machine tools.
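
    The response-surface step amounts to fitting a low-order polynomial to the minimum critical axial cutting depth sampled at key position combinations. A minimal sketch with hypothetical samples (invented positions and depths, not the machining-center data) follows.

```python
import numpy as np

# Hypothetical samples: (x, y) positions of the moving parts and the
# minimum critical axial cutting depth a_lim (mm) obtained at each
# sample, e.g. from an FE modal analysis plus a stability calculation.
xy = np.array([[0.0, 0.0], [0.5, 0.0], [1.0, 0.0],
               [0.0, 0.5], [0.5, 0.5], [1.0, 0.5],
               [0.0, 1.0], [0.5, 1.0], [1.0, 1.0]])
a_lim = np.array([2.1, 2.4, 2.0, 2.6, 3.0, 2.5, 2.2, 2.5, 2.1])

# Second-order response surface: 1, x, y, x^2, x*y, y^2
x, y = xy[:, 0], xy[:, 1]
A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
coef, *_ = np.linalg.lstsq(A, a_lim, rcond=None)

def predict(xq, yq):
    """Evaluate the fitted surface at any position in the workspace."""
    return coef @ np.array([1.0, xq, yq, xq**2, xq * yq, yq**2])

print(f"predicted a_lim at workspace centre: {predict(0.5, 0.5):.2f} mm")
```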

  9. Error Modeling and Sensitivity Analysis of a Five-Axis Machine Tool

    Directory of Open Access Journals (Sweden)

    Wenjie Tian

    2014-01-01

    Full Text Available Geometric error modeling and its sensitivity analysis are carried out in this paper, which is helpful for precision design of machine tools. Screw theory and rigid body kinematics are used to establish the error model of an RRTTT-type five-axis machine tool, which enables the source errors affecting the compensable and uncompensable pose accuracy of the machine tool to be explicitly separated, thereby providing designers and/or field engineers with an informative guideline for the accuracy improvement by suitable measures, that is, component tolerancing in design, manufacturing, and assembly processes, and error compensation. The sensitivity analysis method is proposed, and the sensitivities of compensable and uncompensable pose accuracies are analyzed. The analysis results will be used for the precision design of the machine tool.

  10. Computer system for identification of tool wear model in hot forging

    Directory of Open Access Journals (Sweden)

    Wilkus Marek

    2016-01-01

    Full Text Available The aim of the research was to create a methodology that will enable effective and reliable prediction of tool wear. The idea of a hybrid model, which accounts for various mechanisms of tool material deterioration, is proposed in the paper. The mechanisms considered include abrasive wear, adhesive wear, thermal fatigue, mechanical fatigue, oxidation and plastic deformation. Individual models of varying complexity were used for the separate phenomena, and a strategy for combining these models in one hybrid system was developed to account for the synergy of the various mechanisms. The complex hybrid model was built on the basis of these individual models for the various wear mechanisms. The individual models range from phenomenological ones for abrasive wear to multi-scale methods for modelling micro-crack initiation and propagation that utilize virtual representations of granular microstructures. The latter have been intensively developed recently and potentially form a powerful tool that allows modelling of thermal and mechanical fatigue, accounting explicitly for the tool material microstructure.

  11. California Geriatric Education Center Logic Model: An Evaluation and Communication Tool

    Science.gov (United States)

    Price, Rachel M.; Alkema, Gretchen E.; Frank, Janet C.

    2009-01-01

    A logic model is a communications tool that graphically represents a program's resources, activities, priority target audiences for change, and the anticipated outcomes. This article describes the logic model development process undertaken by the California Geriatric Education Center in spring 2008. The CGEC is one of 48 Geriatric Education…

  12. Circumplex Model of Family Systems: A Treatment Tool in Family Counseling.

    Science.gov (United States)

    Maynard, Peter E.; Olson, David H.

    1987-01-01

    Describes the Circumplex Model of Marital and Family Systems and its diagnostic inventory, the Family Adaptation and Coping Evaluation Scales, as important tools for the family counselor. A brief case example demonstrates how the model can be used in counseling a multiproblem family. (NB)

  13. Coloured Petri Nets and CPN Tools for Modelling and Validation of Concurrent Systems

    DEFF Research Database (Denmark)

    Jensen, Kurt; Kristensen, Lars Michael; Wells, Lisa Marie

    2007-01-01

    Coloured Petri Nets (CPNs) is a language for the modelling and validation of systems in which concurrency, communication, and synchronisation play a major role. Coloured Petri Nets is a discrete-event modelling language combining Petri nets with the functional programming language Standard ML. Petri...... nets provide the foundation of the graphical notation and the basic primitives for modelling concurrency, communication, and synchronisation. Standard ML provides the primitives for the definition of data types, for describing data manipulation, and for creating compact and parameterisable models. A CPN...... taken to execute events in the modelled system. CPN Tools is an industrial-strength computer tool for constructing and analysing CPN models. Using CPN Tools, it is possible to investigate the behaviour of the modelled system using simulation, to verify properties by means of state space methods...

  14. Force Sensor Based Tool Condition Monitoring Using a Heterogeneous Ensemble Learning Model

    Directory of Open Access Journals (Sweden)

    Guofeng Wang

    2014-11-01

    Full Text Available Tool condition monitoring (TCM) plays an important role in improving machining efficiency and guaranteeing workpiece quality. In order to realize reliable recognition of the tool condition, a robust classifier needs to be constructed to depict the relationship between tool wear states and sensory information. However, because of the complexity of the machining process and the uncertainty of the tool wear evolution, it is hard for a single classifier to fit all the collected samples without sacrificing generalization ability. In this paper, heterogeneous ensemble learning is proposed to realize tool condition monitoring, in which the support vector machine (SVM), hidden Markov model (HMM) and radial basis function (RBF) are selected as base classifiers and a stacking ensemble strategy is further used to reflect the relationship between the outputs of these base classifiers and tool wear states. Based on the heterogeneous ensemble learning classifier, an online monitoring system is constructed in which the harmonic features are extracted from force signals and a minimal redundancy and maximal relevance (mRMR) algorithm is utilized to select the most prominent features. To verify the effectiveness of the proposed method, a titanium alloy milling experiment was carried out and samples with different tool wear states were collected to build the proposed heterogeneous ensemble learning classifier. Moreover, the homogeneous ensemble learning model and majority voting strategy are also adopted for comparison. The analysis and comparison results show that the proposed heterogeneous ensemble learning classifier performs better in both classification accuracy and stability.
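
    A heterogeneous stacking ensemble of this kind can be sketched with scikit-learn, as below on synthetic features; the MLP and naive Bayes base learners here are stand-ins for the paper's RBF network and HMM (a real HMM base learner would need a separate library such as hmmlearn), so this is an illustration of the strategy, not the paper's classifier.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Hypothetical stand-in for mRMR-selected harmonic force features with
# labelled tool wear states (0 = sharp, 1 = worn, 2 = severely worn).
X, y = make_classification(n_samples=600, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Heterogeneous base learners; the meta-learner stacks their outputs.
base = [("svm", SVC(probability=True, random_state=0)),
        ("rbf_like", MLPClassifier(max_iter=1000, random_state=0)),
        ("hmm_like", GaussianNB())]
stack = StackingClassifier(estimators=base,
                           final_estimator=LogisticRegression(max_iter=1000))
stack.fit(X_tr, y_tr)
print("held-out accuracy:", stack.score(X_te, y_te))
```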

  15. Tool flank wear model and parametric optimization in end milling of metal matrix composite using carbide tool: Response surface methodology approach

    Directory of Open Access Journals (Sweden)

    R. Arokiadass

    2012-04-01

    Full Text Available Highly automated CNC end milling machines in the manufacturing industry require a reliable model for the prediction of tool flank wear. This model can later be used to predict the tool flank wear (VBmax) according to the process parameters. In this investigation an attempt was made to develop an empirical relationship to predict the tool flank wear (VBmax) of carbide tools while machining LM25 Al/SiCp, incorporating process parameters such as spindle speed (N), feed rate (f), depth of cut (d) and various percentage weights of silicon carbide (S). Response surface methodology (RSM) was applied to optimize the end milling process parameters to attain the minimum tool flank wear. Predicted values obtained from the developed model and experimental results are compared, and an error of <5 percent is observed. In addition, it is concluded that the flank wear increases with the increase of SiCp percentage weight in the MMC.
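
    An RSM-style empirical model of this kind is a second-order polynomial regression in the process parameters. The sketch below fits one to synthetic design points; all numbers are invented stand-ins, not the LM25 Al/SiCp measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical design points: spindle speed N (rpm), feed rate f (mm/rev),
# depth of cut d (mm), SiCp weight fraction S (%), and "measured" VBmax.
rng = np.random.default_rng(1)
X = rng.uniform([1500, 0.02, 0.5, 5], [3500, 0.08, 2.0, 25], size=(30, 4))
vb = (0.02 + 1e-5 * X[:, 0] + 0.8 * X[:, 1] + 0.03 * X[:, 2]
      + 0.004 * X[:, 3] + rng.normal(0, 0.005, 30))  # synthetic response

# Second-order (RSM-style) model in the four process parameters.
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), vb)

x_new = np.array([[2500, 0.05, 1.0, 15]])  # a candidate parameter set
print("predicted VBmax (mm):", model.predict(poly.transform(x_new))[0])
```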

  16. Thermomechanical modelling of laser surface glazing for H13 tool steel

    Science.gov (United States)

    Kabir, I. R.; Yin, D.; Tamanna, N.; Naher, S.

    2018-03-01

    A two-dimensional thermomechanical finite element (FE) model of laser surface glazing (LSG) has been developed for H13 tool steel. The direct coupling technique of ANSYS 17.2 (APDL) has been utilised to solve the transient thermomechanical process. An H13 tool steel cylindrical cross-section has been modelled for laser powers of 200 W and 300 W at a constant 0.2 mm beam width and 0.15 ms residence time. The model can predict the temperature distribution and the stress-strain increments in the elastic and plastic regions in time and space. The crack formation tendency can also be estimated by analysing the von Mises stress in the heat-concentrated zone. Isotropic and kinematic hardening models have been applied separately to predict the after-yield phenomena. At 200 W laser power, the peak surface temperature achieved is 1520 K, which is below the melting point (1727 K) of H13 tool steel. For 300 W laser power, the peak surface temperature is 2523 K. Tensile residual stresses on the surface have been found after cooling, which are in agreement with the literature. The isotropic model shows higher residual stress that increases with laser power. Conversely, the kinematic model gives lower residual stress which decreases with laser power. Therefore, both plasticity models could work in LSG for H13 tool steel.

  17. Cloud-Based Tools to Support High-Resolution Modeling (Invited)

    Science.gov (United States)

    Jones, N.; Nelson, J.; Swain, N.; Christensen, S.

    2013-12-01

    The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HT Condor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HT Condor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archiving, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482

  18. Modelling thermomechanical conditions at the tool/matrix interface in Friction Stir Welding

    DEFF Research Database (Denmark)

    Schmidt, Henrik Nikolaj Blich; Hattel, Jesper

    2004-01-01

    In friction stir welding the material flow is among others controlled by the contact condition at the tool interface, the thermomechanical state of the matrix and the welding parameters. The conditions under which the deposition process is successful are not fully understood and in most models...... frictional and plastic dissipation. Of special interest is the contact condition along the shoulder/matrix and probe/matrix interfaces, as especially the latter affects the efficiency of the deposition process. The thermo-mechanical state in the workpiece is established by modelling both the dwell and weld...... presented previously in literature, the modelling of the material flow at the tool interface has been prescribed as boundary conditions, i.e. the material is forced to keep contact with the tool. The objective of the present work is to analyse the thermomechanical conditions under which a consolidated weld...

  19. Snoopy's hybrid simulator: a tool to construct and simulate hybrid biological models.

    Science.gov (United States)

    Herajy, Mostafa; Liu, Fei; Rohr, Christian; Heiner, Monika

    2017-07-28

    Hybrid simulation of (computational) biochemical reaction networks, which combines stochastic and deterministic dynamics, is an important direction to tackle future challenges due to complex and multi-scale models. Inherently hybrid computational models of biochemical networks entail two time scales: fast and slow. Therefore, it is intricate to efficiently and accurately analyse them using only either deterministic or stochastic simulation. However, there are only a few software tools that support such an approach. These tools are often limited with respect to the number as well as the functionalities of the provided hybrid simulation algorithms. We present Snoopy's hybrid simulator, an efficient hybrid simulation software which builds on Snoopy, a tool to construct and simulate Petri nets. Snoopy's hybrid simulator provides a wide range of state-of-the-art hybrid simulation algorithms. Using this tool, a computational model of biochemical networks can be constructed using a (coloured) hybrid Petri net's graphical notations, or imported from other compatible formats (e.g. SBML), and afterwards executed via dynamic or static hybrid simulation. Snoopy's hybrid simulator is a platform-independent tool providing an accurate and efficient simulation of hybrid (biological) models. It can be downloaded free of charge as part of Snoopy from http://www-dssz.informatik.tu-cottbus.de/DSSZ/Software/Snoopy .
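
    As a flavour of the fast/slow split that hybrid simulation exploits, the toy sketch below fires rare synthesis bursts stochastically (SSA-style exponential waiting times) while integrating fast first-order degradation deterministically in between. This is a deliberately simplified illustration with invented species and rates, not Snoopy's far more sophisticated hybrid algorithms.

```python
import random

# Toy hybrid scheme: slow stochastic bursts plus fast deterministic decay.
k_syn, burst, k_deg = 0.05, 20.0, 0.5    # slow rate, burst size, fast rate
x, t, t_end, dt = 10.0, 0.0, 100.0, 0.01
random.seed(3)

while t < t_end:
    # time to the next slow (stochastic) event
    t_next = min(t + random.expovariate(k_syn), t_end)
    while t < t_next:                    # deterministic fast dynamics
        x += -k_deg * x * dt             # explicit Euler step
        t += dt
    if t_next < t_end:
        x += burst                       # fire the slow reaction

print(f"state at t = {t:.2f}: x = {x:.2f}")
```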

  20. A Temperature Sensor Clustering Method for Thermal Error Modeling of Heavy Milling Machine Tools

    Directory of Open Access Journals (Sweden)

    Fengchun Li

    2017-01-01

    Full Text Available A clustering method is an effective way to select the proper temperature sensor locations for thermal error modeling of machine tools. In this paper, a new temperature sensor clustering method is proposed. By analyzing the characteristics of the temperatures of the sensors in a heavy floor-type milling machine tool, an indicator involving both the Euclidean distance and the correlation coefficient was proposed to reflect the differences between temperature sensors, and the indicator was expressed by a distance matrix to be used for hierarchical clustering. Then, the weight coefficient in the distance matrix and the number of clusters (groups) were optimized by a genetic algorithm (GA), and the fitness function of the GA was also rebuilt by establishing the thermal error model at one rotation speed, then deriving its accuracy at two different rotation speeds with a temperature disturbance. Thus, the parameters for clustering, as well as the final selection of the temperature sensors, were derived. Finally, the method proposed in this paper was verified on a machine tool. According to the selected temperature sensors, a thermal error model of the machine tool was established and used to predict the thermal error. The results indicate that the selected temperature sensors can accurately predict thermal error at different rotation speeds, and the proposed temperature sensor clustering method for sensor selection is expected to be used for thermal error modeling for other machine tools.
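
    The proposed indicator combines the Euclidean distance and the correlation coefficient into a single distance matrix for hierarchical clustering. A minimal sketch with synthetic sensor histories follows; the fixed weight below stands in for the GA-optimised one, and all data are invented.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

# Hypothetical temperature histories: 10 sensors x 200 samples drawn
# from two underlying thermal profiles.
rng = np.random.default_rng(7)
base1 = np.cumsum(rng.normal(0, 0.1, 200))
base2 = np.cumsum(rng.normal(0, 0.1, 200))
temps = np.vstack([base1 + rng.normal(0, 0.2, 200) for _ in range(5)] +
                  [base2 + rng.normal(0, 0.2, 200) for _ in range(5)])

w = 0.5  # weight between the two distance terms (GA-optimised in the paper)
eu = np.linalg.norm(temps[:, None, :] - temps[None, :, :], axis=2)
eu = eu / eu.max()                       # normalise the Euclidean part
corr = np.corrcoef(temps)                # correlation between sensors
dist = w * eu + (1 - w) * (1 - np.abs(corr))
np.fill_diagonal(dist, 0.0)

Z = linkage(squareform(dist, checks=False), method="average")
groups = fcluster(Z, t=2, criterion="maxclust")
print("sensor group labels:", groups)
```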

  1. Tools for macromolecular model building and refinement into electron cryo-microscopy reconstructions

    International Nuclear Information System (INIS)

    Brown, Alan; Long, Fei; Nicholls, Robert A.; Toots, Jaan; Emsley, Paul; Murshudov, Garib

    2015-01-01

    A description is given of new tools to facilitate model building and refinement into electron cryo-microscopy reconstructions. The recent rapid development of single-particle electron cryo-microscopy (cryo-EM) now allows structures to be solved by this method at resolutions close to 3 Å. Here, a number of tools to facilitate the interpretation of EM reconstructions with stereochemically reasonable all-atom models are described. The BALBES database has been repurposed as a tool for identifying protein folds from density maps. Modifications to Coot, including new Jiggle Fit and morphing tools and improved handling of nucleic acids, enhance its functionality for interpreting EM maps. REFMAC has been modified for optimal fitting of atomic models into EM maps. As external structural information can enhance the reliability of the derived atomic models, stabilize refinement and reduce overfitting, ProSMART has been extended to generate interatomic distance restraints from nucleic acid reference structures, and a new tool, LIBG, has been developed to generate nucleic acid base-pair and parallel-plane restraints. Furthermore, restraint generation has been integrated with visualization and editing in Coot, and these restraints have been applied to both real-space refinement in Coot and reciprocal-space refinement in REFMAC

  2. Tools for macromolecular model building and refinement into electron cryo-microscopy reconstructions

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Alan; Long, Fei; Nicholls, Robert A.; Toots, Jaan; Emsley, Paul; Murshudov, Garib, E-mail: garib@mrc-lmb.cam.ac.uk [MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge CB2 0QH (United Kingdom)

    2015-01-01

    A description is given of new tools to facilitate model building and refinement into electron cryo-microscopy reconstructions. The recent rapid development of single-particle electron cryo-microscopy (cryo-EM) now allows structures to be solved by this method at resolutions close to 3 Å. Here, a number of tools to facilitate the interpretation of EM reconstructions with stereochemically reasonable all-atom models are described. The BALBES database has been repurposed as a tool for identifying protein folds from density maps. Modifications to Coot, including new Jiggle Fit and morphing tools and improved handling of nucleic acids, enhance its functionality for interpreting EM maps. REFMAC has been modified for optimal fitting of atomic models into EM maps. As external structural information can enhance the reliability of the derived atomic models, stabilize refinement and reduce overfitting, ProSMART has been extended to generate interatomic distance restraints from nucleic acid reference structures, and a new tool, LIBG, has been developed to generate nucleic acid base-pair and parallel-plane restraints. Furthermore, restraint generation has been integrated with visualization and editing in Coot, and these restraints have been applied to both real-space refinement in Coot and reciprocal-space refinement in REFMAC.

  3. Agent-based modeling as a tool for program design and evaluation.

    Science.gov (United States)

    Lawlor, Jennifer A; McGirr, Sara

    2017-12-01

    Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including: engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field. Copyright © 2017 Elsevier Ltd. All rights reserved.
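
    As a flavour of the approach, the toy sketch below simulates hypothetical word-of-mouth enrolment in a programme, the kind of simple agent-based model an evaluator might use when setting performance targets; all agent counts and probabilities are invented.

```python
import random

# Toy agent-based model: enrolment spreads when an unenrolled agent
# happens to meet an enrolled peer. All numbers are hypothetical.
random.seed(11)
n_agents, n_steps, p_adopt = 200, 20, 0.05

enrolled = [False] * n_agents
enrolled[0] = True  # a single seed participant

for step in range(n_steps):
    for i in range(n_agents):
        if not enrolled[i]:
            peer = random.randrange(n_agents)  # meet one random peer
            if enrolled[peer] and random.random() < p_adopt:
                enrolled[i] = True
    print(f"step {step + 1:2d}: {sum(enrolled):3d} enrolled")
```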

  4. Model and numerical analysis of mechanical phenomena of tools steel hardening

    Directory of Open Access Journals (Sweden)

    A. Bokota

    2010-01-01

    Full Text Available In this paper a model of tool steel hardening that takes mechanical phenomena into consideration is presented. The stress and strain fields are obtained from FEM solutions of the equilibrium equations in rate form. The stresses generated during hardening were assumed to result from thermal load, structural deformation, plastic deformation and transformation plasticity. Thermophysical values in the constitutive relations depend upon both the temperature and the phase composition. The Huber-Mises condition with isotropic strengthening is used for the creation of plastic strains, while the Leblond model is applied to determine transformation plasticity. An analysis of the stresses associated with the hardening of elements made of tool steel was performed.

  5. An axisymmetrical non-linear finite element model for induction heating in injection molding tools

    DEFF Research Database (Denmark)

    Guerrier, Patrick; Nielsen, Kaspar Kirstein; Menotti, Stefano

    2016-01-01

    To analyze the heating and cooling phase of an induction heated injection molding tool accurately, the temperature dependent magnetic properties, namely the non-linear B-H curves, need to be accounted for in an induction heating simulation. Hence, a finite element model has been developed...... in to the injection molding tool. The model shows very good agreement with the experimental temperature measurements. It is also shown that the non-linearity can be used without the temperature dependency in some cases, and a proposed method is presented of how to estimate an effective linear permeability to use...

  6. Analysis, Design, Implementation and Evaluation of Graphical Design Tool to Develop Discrete Event Simulation Models Using Event Graphs and Simkit

    National Research Council Canada - National Science Library

    San

    2001-01-01

    ... (OR) modeling and analysis. However, designing and implementing DES can be a time-consuming and error-prone task. This thesis designed, implemented and evaluated a tool, the Event Graph Graphical Design Tool (EGGDT...

  7. Final Report: Simulation Tools for Parallel Microwave Particle in Cell Modeling

    International Nuclear Information System (INIS)

    Stoltz, Peter H.

    2008-01-01

    Transport of high-power rf fields and the subsequent deposition of rf power into plasma is an important component of developing tokamak fusion energy. Two limitations on rf heating are: (i) breakdown of the metallic structures used to deliver rf power to the plasma, and (ii) an incomplete understanding of how rf power couples into a plasma. Computer simulation is a main tool for helping solve both of these problems, but one of the premier tools, VORPAL, is traditionally too difficult for non-experts to use. During this Phase II project, we developed the VorpalView user interface tool. This tool provides Department of Energy researchers with a fully graphical interface for analyzing VORPAL output to more easily model rf power delivery and deposition in plasmas.

  8. Clinical Prediction Model and Tool for Assessing Risk of Persistent Pain After Breast Cancer Surgery

    DEFF Research Database (Denmark)

    Meretoja, Tuomo J; Andersen, Kenneth Geving; Bruce, Julie

    2017-01-01

    are missing. The aim was to develop a clinically applicable risk prediction tool. Methods The prediction models were developed and tested using three prospective data sets from Finland (n = 860), Denmark (n = 453), and Scotland (n = 231). Prediction models for persistent pain of moderate to severe intensity......), high body mass index ( P = .039), axillary lymph node dissection ( P = .008), and more severe acute postoperative pain intensity at the seventh postoperative day ( P = .003) predicted persistent pain in the final prediction model, which performed well in the Danish (ROC-AUC, 0.739) and Scottish (ROC......-AUC, 0.740) cohorts. At the 20% risk level, the model had 32.8% and 47.4% sensitivity and 94.4% and 82.4% specificity in the Danish and Scottish cohorts, respectively. Conclusion Our validated prediction models and an online risk calculator provide clinicians and researchers with a simple tool to screen...

  9. Modelling tools to evaluate China's future energy system - a review of the Chinese perspective

    DEFF Research Database (Denmark)

    Mischke, Peggy; Karlsson, Kenneth Bernard

    2014-01-01

    compares 18 energy modelling tools from ten Chinese institutions. These models have been described in English language publications between 2005 and 2013, although not all are published in peer-reviewed journals. When comparing the results for three main energy system indicators across models, this paper...... finds that there are considerable ranges in the reference scenarios: (i) GDP is projected to grow by 630–840% from 2010 to 2050, (ii) energy demand could increase by 200–300% from 2010 to 2050, and (iii) CO2 emissions could rise by 160–250% from 2010 to 2050. Although the access to the modelling tools...... and the underlying data remains challenging, this study concludes that the Chinese perspective, independently from the modelling approach and institution, suggests a rather gradual and long-term transition towards a low carbon economy in China. Few reference scenarios include an emission peak or stabilisation period...

  10. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as, for example, cyber-physical systems which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.

  11. Update on Small Modular Reactors Dynamics System Modeling Tool -- Molten Salt Cooled Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Richard Edward [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cetiner, Sacit M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fugate, David L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Qualls, A L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Borum, Robert C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Chaleff, Ethan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rogerson, Doug W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Batteh, John J. [Modelon Corporation (Sweden); Tiller, Michael M. [Xogeny Corporation, Canton, MI (United States)

    2014-08-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the third year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled) concepts, including the use of multiple coupled reactors at a single site. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor SMR models, ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface (ICHMI) technical area, and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the Modular Dynamic SIMulation (MoDSIM) tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the program, (2) developing a library of baseline component modules that can be assembled into full plant models using existing geometry and thermal-hydraulic data, (3) defining modeling conventions for interconnecting component models, and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  12. Thermal Error Modeling of a Machine Tool Using Data Mining Scheme

    Science.gov (United States)

    Wang, Kun-Chieh; Tseng, Pai-Chang

    In this paper the knowledge discovery technique is used to build an effective and transparent mathematic thermal error model for machine tools. Our proposed thermal error modeling methodology (called KRL) integrates the schemes of K-means theory (KM), rough-set theory (RS), and linear regression model (LR). First, to explore the machine tool's thermal behavior, an integrated system is designed to simultaneously measure the temperature ascents at selected characteristic points and the thermal deformations at spindle nose under suitable real machining conditions. Second, the obtained data are classified by the KM method, further reduced by the RS scheme, and a linear thermal error model is established by the LR technique. To evaluate the performance of our proposed model, an adaptive neural fuzzy inference system (ANFIS) thermal error model is introduced for comparison. Finally, a verification experiment is carried out and results reveal that the proposed KRL model is effective in predicting thermal behavior in machine tools. Our proposed KRL model is transparent, easily understood by users, and can be easily programmed or modified for different machining conditions.
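
    A rough sketch of the KM and LR stages is given below on synthetic data. The rough-set reduction is crudely approximated by keeping the sensor channel nearest each K-means centroid; that shortcut is an assumption for illustration only, not the paper's RS scheme.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

# Hypothetical data: 12 temperature channels x 300 samples, plus the
# measured spindle-nose thermal deformation (synthetic).
rng = np.random.default_rng(5)
temps = rng.normal(0, 1, (12, 300)).cumsum(axis=1)
deform = 0.6 * temps[2] - 0.3 * temps[7] + rng.normal(0, 0.5, 300)

# KM stage: group similar channels; then keep one representative each
# (a crude stand-in for the rough-set reduction).
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(temps)
keep = [int(np.argmin(np.linalg.norm(temps - c, axis=1)))
        for c in km.cluster_centers_]

# LR stage: linear thermal error model on the kept channels.
X = temps[keep].T
model = LinearRegression().fit(X, deform)
print("kept channels:", keep, " R^2:", round(model.score(X, deform), 3))
```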

  13. Interactive, open source, travel time scenario modelling: tools to facilitate participation in health service access analysis.

    Science.gov (United States)

    Fisher, Rohan; Lassa, Jonatan

    2017-04-18

    Modelling travel time to services has become a common public health tool for planning service provision but the usefulness of these analyses is constrained by the availability of accurate input data and limitations inherent in the assumptions and parameterisation. This is particularly an issue in the developing world where access to basic data is limited and travel is often complex and multi-modal. Improving the accuracy and relevance in this context requires greater accessibility to, and flexibility in, travel time modelling tools to facilitate the incorporation of local knowledge and the rapid exploration of multiple travel scenarios. The aim of this work was to develop simple open source, adaptable, interactive travel time modelling tools to allow greater access to and participation in service access analysis. Described are three interconnected applications designed to reduce some of the barriers to the more wide-spread use of GIS analysis of service access and allow for complex spatial and temporal variations in service availability. These applications are an open source GIS tool-kit and two geo-simulation models. The development of these tools was guided by health service issues from a developing world context but they present a general approach to enabling greater access to and flexibility in health access modelling. The tools demonstrate a method that substantially simplifies the process for conducting travel time assessments and demonstrate a dynamic, interactive approach in an open source GIS format. In addition this paper provides examples from empirical experience where these tools have informed better policy and planning. Travel and health service access is complex and cannot be reduced to a few static modeled outputs. The approaches described in this paper use a unique set of tools to explore this complexity, promote discussion and build understanding with the goal of producing better planning outcomes. The accessible, flexible, interactive and
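
    The core of such travel-time modelling is a cost-distance computation. The sketch below runs Dijkstra's algorithm over a small hypothetical raster of per-cell crossing times; the real tools operate on large grids with multi-modal cost surfaces encoded in the cell values.

```python
import heapq

# Hypothetical raster of per-cell crossing times (minutes); higher
# values could represent rough terrain or river crossings.
cost = [[1, 1, 5, 1],
        [1, 9, 5, 1],
        [1, 9, 1, 1],
        [1, 1, 1, 1]]
rows, cols = len(cost), len(cost[0])
clinic = (0, 0)                      # hypothetical service location

best = {clinic: 0.0}
pq = [(0.0, clinic)]
while pq:
    t, (r, c) = heapq.heappop(pq)
    if t > best.get((r, c), float("inf")):
        continue                     # stale queue entry
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < rows and 0 <= nc < cols:
            # edge weight: average of the two cells' crossing times
            nt = t + (cost[r][c] + cost[nr][nc]) / 2.0
            if nt < best.get((nr, nc), float("inf")):
                best[(nr, nc)] = nt
                heapq.heappush(pq, (nt, (nr, nc)))

print("travel time to farthest cell (min):", max(best.values()))
```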

  14. Bio-logic builder: a non-technical tool for building dynamical, qualitative models.

    Science.gov (United States)

    Helikar, Tomáš; Kowal, Bryan; Madrahimov, Alex; Shrestha, Manish; Pedersen, Jay; Limbu, Kahani; Thapa, Ishwor; Rowley, Thaine; Satalkar, Rahul; Kochi, Naomi; Konvalina, John; Rogers, Jim A

    2012-01-01

    Computational modeling of biological processes is a promising tool in biomedical research. While a large part of its potential lies in the ability to integrate it with laboratory research, modeling currently generally requires a high degree of training in mathematics and/or computer science. To help address this issue, we have developed a web-based tool, Bio-Logic Builder, that enables laboratory scientists to define mathematical representations (based on a discrete formalism) of biological regulatory mechanisms in a modular and non-technical fashion. As part of the user interface, generalized "bio-logic" modules have been defined to provide users with the building blocks for many biological processes. To build/modify computational models, experimentalists provide purely qualitative information about a particular regulatory mechanisms as is generally found in the laboratory. The Bio-Logic Builder subsequently converts the provided information into a mathematical representation described with Boolean expressions/rules. We used this tool to build a number of dynamical models, including a 130-protein large-scale model of signal transduction with over 800 interactions, influenza A replication cycle with 127 species and 200+ interactions, and mammalian and budding yeast cell cycles. We also show that any and all qualitative regulatory mechanisms can be built using this tool.
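
    The discrete formalism the tool targets reduces to Boolean update rules over network nodes. The toy circuit below, with hypothetical nodes and wiring rather than a Bio-Logic Builder export, shows a synchronous update of such rules in plain Python; the negative feedback makes the output oscillate.

```python
# Each node's next state is a Boolean expression over current states.
rules = {
    "ligand":    lambda s: s["ligand"],                      # external input
    "receptor":  lambda s: s["ligand"],                      # activation
    "output":    lambda s: s["receptor"] and not s["inhibitor"],
    "inhibitor": lambda s: s["output"],                      # negative feedback
}

state = {"ligand": True, "receptor": False,
         "output": False, "inhibitor": False}
for step in range(6):
    # synchronous update: all rules evaluated on the old state
    state = {node: rule(state) for node, rule in rules.items()}
    print(step, state)
```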

  15. Design process and tools for dynamic neuromechanical models and robot controllers.

    Science.gov (United States)

    Szczecinski, Nicholas S; Hunt, Alexander J; Quinn, Roger D

    2017-02-01

    We present a serial design process with associated tools to select parameter values for a posture and locomotion controller for simulation of a robot. The controller is constructed from dynamic neuron and synapse models and simulated with the open-source neuromechanical simulator AnimatLab 2. Each joint has a central pattern generator (CPG), whose neurons possess persistent sodium channels. The CPG rhythmically inhibits motor neurons that control the servomotor's velocity. Sensory information coordinates the joints in the leg into a cohesive stepping motion. The parameter value design process is intended to run on a desktop computer, and has three steps. First, our tool FEEDBACKDESIGN uses classical control methods to find neural and synaptic parameter values that stably and robustly control servomotor output. This method is fast, testing over 100 parameter value variations per minute. Next, our tool CPGDESIGN generates bifurcation diagrams and phase response curves for the CPG model. This reveals neural and synaptic parameter values that produce robust oscillation cycles, whose phase can be rapidly entrained to sensory feedback. It also designs the synaptic conductance of inter-joint pathways. Finally, to understand sensitivity to parameters and how descending commands affect a leg's stepping motion, our tool SIMSCAN runs batches of neuromechanical simulations with specified parameter values, which is useful for searching the parameter space of a complicated simulation. These design tools are demonstrated on a simulation of a robot, but may be applied to neuromechanical animal models or physical robots as well.

  16. User Friendly Open GIS Tool for Large Scale Data Assimilation - a Case Study of Hydrological Modelling

    Science.gov (United States)

    Gupta, P. K.

    2012-08-01

    Open source software (OSS) coding has tremendous advantages over proprietary software. These are primarily fuelled by high-level programming languages (Java, C++, Python, etc.) and open source geospatial libraries (GDAL/OGR, GEOS, GeoTools, etc.). Quantum GIS (QGIS) is a popular open source GIS package, which is licensed under the GNU GPL and is written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article emphasises exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy-to-learn Python programming language. In the present study, a tool has been developed to assimilate large spatio-temporal datasets such as national-level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect), landuse/landcover and multi-layer soil data for input into hydrological models. At present this tool has been developed for the Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In hydrological modelling, calibration and validation are important steps that are repetitively carried out for the same study region. As such, the developed tool is user friendly and can be used efficiently for these repetitive processes, reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large datasets in an organised manner.
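
    For readers unfamiliar with the QGIS plugin mechanism the article builds on, the skeleton below shows the minimal Python entry points (classFactory, initGui, unload) that QGIS expects. The plugin name, action label and message are placeholders, the plugin must be packaged with a metadata.txt, and the actual data-assimilation logic would go in run(); this only runs inside QGIS, not as a standalone script.

```python
from qgis.PyQt.QtWidgets import QAction

def classFactory(iface):
    # QGIS calls this from the plugin's __init__.py to create the plugin
    return AssimilationPlugin(iface)

class AssimilationPlugin:
    def __init__(self, iface):
        self.iface = iface          # reference to the QGIS interface

    def initGui(self):
        # add a toolbar button that triggers the tool
        self.action = QAction("Assimilate gridded data",
                              self.iface.mainWindow())
        self.action.triggered.connect(self.run)
        self.iface.addToolBarIcon(self.action)

    def unload(self):
        self.iface.removeToolBarIcon(self.action)

    def run(self):
        # placeholder: load rainfall/temperature/DEM grids and write
        # hydrological model inputs here
        self.iface.messageBar().pushMessage("Assimilation",
                                            "Not yet implemented")
```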

  17. Development of a surrogate model for elemental analysis using a natural gamma ray spectroscopy tool

    International Nuclear Information System (INIS)

    Zhang, Qiong

    2015-01-01

    A systematic computational method for obtaining accurate elemental standards efficiently for varying borehole conditions was developed based on Monte Carlo simulations, surrogate modeling, and data assimilation. Elemental standards are essential for spectral unfolding in formation evaluation applications commonly used for nuclear well logging tools. Typically, elemental standards are obtained by standardized measurements, but these experiments are expensive and lack the flexibility to address different logging conditions. In contrast, computer-based Monte Carlo simulations provide an accurate and more flexible approach to obtaining elemental standards for formation evaluation. The presented computational method recognizes that in contrast to typical neutron–photon simulations, where the source is typically artificial and well characterized (Galford, 2009), an accurate knowledge of the source is essential for matching the obtained Monte Carlo elemental standards with their experimental counterparts. Therefore, source distributions are adjusted to minimize the L2 difference between the Monte Carlo computed and experimental standards. Subsequently, an accurate surrogate model is developed accounting for different casing and cement thicknesses, and tool positions within the borehole. The adjusted source distributions are then utilized to generate and validate spectra for varying borehole conditions: tool position, casing and cement thickness. The effects of these conditions on the spectra are investigated and discussed in this work. Given that Monte Carlo modeling provides much lower cost and more flexibility, employing Monte Carlo computed standards could enhance the processing of nuclear tool logging data. - Highlights: • A novel computational model for efficiently computing elemental standards for varying borehole conditions has been developed. • A model of an experimental test pit was implemented in the Monte Carlo code GEANT4 for computing elemental standards.
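
    The source-adjustment step, minimising the L2 difference between computed and experimental standards, can be posed as a non-negative least-squares problem: the columns of a matrix hold simulated spectra for candidate source energy bins, and the weights give the adjusted source distribution. The sketch below uses synthetic data with hypothetical dimensions.

```python
import numpy as np
from scipy.optimize import nnls

# Columns of A: Monte Carlo spectra for candidate source energy bins;
# b: the measured elemental standard. All values are synthetic.
rng = np.random.default_rng(2)
A = rng.random((64, 6))             # 64 spectral channels, 6 source bins
w_true = np.array([0.1, 0.5, 0.2, 0.0, 0.15, 0.05])
b = A @ w_true + rng.normal(0, 0.01, 64)

# Non-negative weights minimising ||A w - b||_2.
w, residual = nnls(A, b)
print("adjusted source weights:", np.round(w, 3))
print("L2 residual:", round(residual, 4))
```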

  18. Modelling of the Contact Condition at the Tool/Matrix Interface in Friction Stir Welding

    DEFF Research Database (Denmark)

    Schmidt, Henrik Nikolaj Blich; Hattel, Jesper; Wert, John

    2003-01-01

    The objective of the present paper is to investigate the heat generation and contact condition during Friction Stir Welding (FSW). For this purpose, an analytical model is developed for the heat generation and this is combined with a Eulerian FE-analysis of the temperature field. The heat...... generation is closely related to the friction condition at the contact interface between the FSW tool and the weld piece material as well as the material flow in the weld matrix, since the mechanisms for heat generation by frictional and plastic dissipation are different. The heat generation from the tool...... is governed by the contact condition, i.e. whether there is sliding, sticking or partial sliding/sticking. The contact condition in FSW is complex (dependent on alloy, welding parameters, tool design etc.), and previous models (both analytical and numerical) for simulation of the heat generation assume...

  19. Implementation of Models for Building Envelope Air Flow Fields in a Whole Building Hygrothermal Simulation Tool

    DEFF Research Database (Denmark)

    Rode, Carsten; Grau, Karl

    2009-01-01

    cavity such as in the exterior cladding of building envelopes, i.e. a flow which is parallel to the construction plane. 2. Infiltration/exfiltration of air through the building envelope, i.e. a flow which is perpendicular to the construction plane. The new models make it possible to predict the thermal......Simulation tools are becoming available which predict the heat and moisture conditions in the indoor environment as well as in the envelope of buildings, and thus it has become possible to consider the important interaction between the different components of buildings and the different physical...... phenomena which occur. However, there is still room for further development of such tools. This paper will present an attempt to integrate modelling of air flows in building envelopes into a whole building hygrothermal simulation tool. Two kinds of air flows have been considered: 1. Air flow in ventilated...

  20. Implementation of Models for Building Envelope Air Flow Fields in a Whole Building Hygrothermal Simulation Tool

    DEFF Research Database (Denmark)

    Sørensen, Karl Grau; Rode, Carsten

    2009-01-01

    phenomena that occur. However, there is still room for further development of such tools. This paper will present an attempt to integrate modelling of air flows in building envelopes into a whole building hygrothermal simulation tool. Two kinds of air flows have been considered: (1) Air flow in a ventilated...... cavity such as behind the exterior cladding of a building envelope, i.e. a flow which is parallel to the construction plane. (2) Infiltration/exfiltration of air through the building envelope, i.e. a flow which is perpendicular to the constructionplane. The paper presents the models and how they have......Simulation tools are becoming available which predict the heat and moisture conditions in the indoor environment as well as in the envelope of buildings, and thus it has become possible to consider the important interaction between the different components of buildings and the different physical...

  1. Thermal Error Test and Intelligent Modeling Research on the Spindle of High Speed CNC Machine Tools

    Science.gov (United States)

    Luo, Zhonghui; Peng, Bin; Xiao, Qijun; Bai, Lu

    2018-03-01

    Thermal error is the main factor affecting the accuracy of precision machining. Reflecting the current research focus on machine tool thermal error, this paper presents an experimental study of thermal error testing and intelligent modeling for the spindle of a vertical high-speed CNC machine tool. Several thermal error testing devices are designed, in which 7 temperature sensors measure the temperature of the machine tool spindle system and 2 displacement sensors detect the thermal error displacement. A thermal error compensation model with good inversion prediction ability is established by applying principal component analysis, optimizing the temperature measuring points, extracting the characteristic values closely associated with the thermal error displacement, and using artificial neural network technology.

  2. Spindle Thermal Error Optimization Modeling of a Five-axis Machine Tool

    Science.gov (United States)

    Guo, Qianjian; Fan, Shuo; Xu, Rufeng; Cheng, Xiang; Zhao, Guoyong; Yang, Jianguo

    2017-05-01

    Aiming at the problem of low machining accuracy and uncontrollable thermal errors of NC machine tools, spindle thermal error measurement, modeling and compensation for a two-turntable five-axis machine tool are researched. Measurement experiments on heat sources and thermal errors are carried out, and the GRA (grey relational analysis) method is introduced for the selection of temperature variables used in thermal error modeling. In order to analyze the influence of different heat sources on spindle thermal errors, an ANN (artificial neural network) model is presented, and the ABC (artificial bee colony) algorithm is introduced to train the link weights of the ANN; the resulting ABC-NN (artificial bee colony based neural network) modeling method is proposed and used for the prediction of spindle thermal errors. To test the prediction performance of the ABC-NN model, an experimental system is developed, and the prediction results of LSR (least squares regression), ANN and ABC-NN are compared with the measured spindle thermal errors. Experimental results show that the prediction accuracy of the ABC-NN model is higher than that of LSR and ANN, with a residual error smaller than 3 μm, so the new modeling method is feasible. The proposed research provides guidance for compensating thermal errors and improving the machining accuracy of NC machine tools.
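
    The ABC training step is specific to the paper; as a rough, hypothetical stand-in, the sketch below uses scikit-learn's gradient-trained MLPRegressor and replaces grey relational analysis with simple correlation ranking. All data are synthetic, so this illustrates the modeling pipeline rather than reproducing ABC-NN.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in: 10 temperature channels over 400 samples, plus a
# spindle thermal error driven by two of the channels.
rng = np.random.default_rng(9)
temps = rng.normal(0, 1, (400, 10)).cumsum(axis=0)   # drifting signals
error = 4.0 * temps[:, 1] - 2.5 * temps[:, 6] + rng.normal(0, 0.5, 400)

# Keep the 4 channels most correlated with the measured error
# (a simple substitute for the paper's grey relational analysis).
corr = np.abs([np.corrcoef(temps[:, i], error)[0, 1] for i in range(10)])
keep = np.argsort(corr)[-4:]

X_tr, X_te, y_tr, y_te = train_test_split(temps[:, keep], error,
                                          random_state=0)
net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(X_tr, y_tr)
print("selected channels:", keep, " held-out R^2:",
      round(net.score(X_te, y_te), 3))
```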

  3. Reach adaptation: what determines whether we learn an internal model of the tool or adapt the model of our arm?

    Science.gov (United States)

    Kluzik, JoAnn; Diedrichsen, Jörn; Shadmehr, Reza; Bastian, Amy J

    2008-09-01

    We make errors when learning to use a new tool. However, the cause of error may be ambiguous: is it because we misestimated properties of the tool or of our own arm? We considered a well-studied adaptation task in which people made goal-directed reaching movements while holding the handle of a robotic arm. The robot produced viscous forces that perturbed reach trajectories. As reaching improved with practice, did people recalibrate an internal model of their arm, or did they build an internal model of the novel tool (robot), or both? What factors influenced how the brain solved this credit assignment problem? To investigate these questions, we compared transfer of adaptation between three conditions: catch trials in which robot forces were turned off unannounced, robot-null trials in which subjects were told that forces were turned off, and free-space trials in which subjects still held the handle but watched as it was detached from the robot. Transfer to free space was 40% of that observed in unannounced catch trials. We next hypothesized that transfer to free space might increase if the training field changed gradually, rather than abruptly. Indeed, this method increased transfer to free space from 40 to 60%. Therefore although practice with a novel tool resulted in formation of an internal model of the tool, it also appeared to produce a transient change in the internal model of the subject's arm. Gradual changes in the tool's dynamics increased the extent to which the nervous system recalibrated the model of the subject's own arm.

  4. Flexible global ocean-atmosphere-land system model. A modeling tool for the climate change research community

    International Nuclear Information System (INIS)

    Zhou, Tianjun; Yu, Yongqiang; Liu, Yimin; Wang, Bin

    2014-01-01

    First book available on systematic evaluations of the performance of the global climate model FGOALS. Covers the whole field, ranging from the development to the applications of this climate system model. Provides an outlook for the future development of the FGOALS model system. Offers a brief introduction to running FGOALS. Coupled climate system models are of central importance for climate studies. A new model known as FGOALS (the Flexible Global Ocean-Atmosphere-Land System model) has been developed by the State Key Laboratory of Numerical Modeling for Atmospheric Sciences and Geophysical Fluid Dynamics, Institute of Atmospheric Physics, Chinese Academy of Sciences (LASG/IAP, CAS), a first-tier national geophysical laboratory. It serves as a powerful tool, both for deepening our understanding of fundamental mechanisms of the climate system and for making decadal prediction and scenario projections of future climate change. "Flexible Global Ocean-Atmosphere-Land System Model: A Modeling Tool for the Climate Change Research Community" is the first book to offer systematic evaluations of this model's performance. It is comprehensive in scope, covering both developmental and application-oriented aspects of this climate system model. It also provides an outlook on future development of FGOALS and offers an overview of how to employ the model. It represents a valuable reference work for researchers and professionals working within the related areas of climate variability and change.

  5. GAMBIT. The global and modular beyond-the-standard-model inference tool

    International Nuclear Information System (INIS)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Dal, Lars A.; Gonzalo, Tomas E.; Krislock, Abram; Raklev, Are; Buckley, Andy; Chrzaszcz, Marcin; Conrad, Jan; Edsjoe, Joakim; Farmer, Ben; Lundberg, Johan; Cornell, Jonathan M.; Dickinson, Hugh; Jackson, Paul; White, Martin; Kvellestad, Anders; Savage, Christopher; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; Wild, Sebastian

    2017-01-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org. (orig.)

  6. GAMBIT. The global and modular beyond-the-standard-model inference tool

    Energy Technology Data Exchange (ETDEWEB)

    Athron, Peter; Balazs, Csaba [Monash University, School of Physics and Astronomy, Melbourne, VIC (Australia); Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); Bringmann, Torsten; Dal, Lars A.; Gonzalo, Tomas E.; Krislock, Abram; Raklev, Are [University of Oslo, Department of Physics, Oslo (Norway); Buckley, Andy [University of Glasgow, SUPA, School of Physics and Astronomy, Glasgow (United Kingdom); Chrzaszcz, Marcin [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Polish Academy of Sciences, H. Niewodniczanski Institute of Nuclear Physics, Krakow (Poland); Conrad, Jan; Edsjoe, Joakim; Farmer, Ben; Lundberg, Johan [AlbaNova University Centre, Oskar Klein Centre for Cosmoparticle Physics, Stockholm (Sweden); Stockholm University, Department of Physics, Stockholm (Sweden); Cornell, Jonathan M. [McGill University, Department of Physics, Montreal, QC (Canada); Dickinson, Hugh [University of Minnesota, Minnesota Institute for Astrophysics, Minneapolis, MN (United States); Jackson, Paul; White, Martin [Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); University of Adelaide, Department of Physics, Adelaide, SA (Australia); Kvellestad, Anders; Savage, Christopher [NORDITA, Stockholm (Sweden); McKay, James [Imperial College London, Blackett Laboratory, Department of Physics, London (United Kingdom); Mahmoudi, Farvah [Univ Lyon, Univ Lyon 1, ENS de Lyon, CNRS, Centre de Recherche Astrophysique de Lyon UMR5574, Saint-Genis-Laval (France); CERN, Theoretical Physics Department, Geneva (Switzerland); Martinez, Gregory D. [University of California, Physics and Astronomy Department, Los Angeles, CA (United States); Putze, Antje [LAPTh, Universite de Savoie, CNRS, Annecy-le-Vieux (France); Ripken, Joachim [Max Planck Institute for Solar System Research, Goettingen (Germany); Rogan, Christopher [Harvard University, Department of Physics, Cambridge, MA (United States); Saavedra, Aldo [Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); The University of Sydney, Faculty of Engineering and Information Technologies, Centre for Translational Data Science, School of Physics, Sydney, NSW (Australia); Scott, Pat [Imperial College London, Blackett Laboratory, Department of Physics, London (United Kingdom); Seo, Seon-Hee [Seoul National University, Department of Physics and Astronomy, Seoul (Korea, Republic of); Serra, Nicola [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Weniger, Christoph [University of Amsterdam, GRAPPA, Institute of Physics, Amsterdam (Netherlands); Wild, Sebastian [DESY, Hamburg (Germany); Collaboration: The GAMBIT Collaboration

    2017-11-15

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org. (orig.)
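
    As an illustration only (not GAMBIT code, whose modules, interfaces and scanners are far richer), the sketch below shows the basic shape of a modular global fit: independent log-likelihood contributions are summed at each point of a parameter scan. All function names, parameters and ranges here are invented.

```python
import numpy as np

# Toy modular global fit: independent log-likelihood "modules" are
# combined into one composite likelihood and scanned over parameters.

def loglike_relic_density(m, g):
    # hypothetical observable 1: pretend the data prefer m ~ 100
    return -0.5 * ((m - 100.0) / 10.0) ** 2

def loglike_collider(m, g):
    # hypothetical observable 2: pretend the data prefer g ~ 0.5
    return -0.5 * ((g - 0.5) / 0.1) ** 2

modules = [loglike_relic_density, loglike_collider]

rng = np.random.default_rng(1)
best = (-np.inf, None)
for _ in range(10000):                      # naive random scan
    m = rng.uniform(50, 200)                # mass-like parameter
    g = rng.uniform(0.0, 1.0)               # coupling-like parameter
    total = sum(f(m, g) for f in modules)   # composite log-likelihood
    if total > best[0]:
        best = (total, (m, g))

print("best-fit point:", best[1], "logL:", best[0])
```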

  7. New tools in modulating Maillard reaction from model systems to food

    NARCIS (Netherlands)

    Troise, A.D.

    2015-01-01

    New tools in modulating Maillard reaction from model systems to food
    The Maillard reaction (MR) governs the final quality of foods and occupies a prominent place in food science. The first stable compounds, the Amadori rearrangement products…

  8. Using Model-Eliciting Activities as a Tool to Identify and Develop Mathematically Creative Students

    Science.gov (United States)

    Coxbill, Emmy; Chamberlin, Scott A.; Weatherford, Jennifer

    2013-01-01

    Traditional classroom methods for identifying mathematically creative students have been inadequate. Identifying students who could potentially be mathematically creative is instrumental in the development of students and in meeting their affective and educational needs. One prospective identification tool is the use of model-eliciting activities…

  9. A new framework for modeling decentralized low impact developments using Soil and Water Assessment Tool

    Science.gov (United States)

    Assessing the performance of Low Impact Development (LID) practices at a catchment scale is important in managing urban watersheds. Few modeling tools exist that are capable of explicitly representing the hydrological mechanisms of LIDs while considering the diverse land uses of urban watersheds. ...

  10. OMNIITOX - operational life-cycle impact assessment models and information tools for practitioners

    DEFF Research Database (Denmark)

    Molander, S; Lidholm, Peter; Schowanek, Diederik

    2004-01-01

    This article is the preamble to a set of articles describing initial results from an on-going European Commission funded, 5th Framework project called OMNIITOX, Operational Models aNd Information tools for Industrial applications of eco/TOXicological impact assessments. The different parts of thi...

  11. Recommender System and Web 2.0 Tools to Enhance a Blended Learning Model

    Science.gov (United States)

    Hoic-Bozic, Natasa; Dlab, Martina Holenko; Mornar, Vedran

    2016-01-01

    Blended learning models that combine face-to-face and online learning are of great importance in modern higher education. However, their development should be in line with the recent changes in e-learning that emphasize a student-centered approach and use tools available on the Web to support the learning process. This paper presents research on…

  12. Towards Semantically Integrated Models and Tools for Cyber-Physical Systems Design

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Fitzgerald, John; Woodcock, Jim

    2016-01-01

    We describe an approach to the model-based engineering of embedded and cyber-physical systems, based on the semantic integration of diverse discipline-specific notations and tools. Using the example of a small unmanned aerial vehicle, we explain the need for multiple notations and collaborative...

  13. An online tool for business modelling and a refinement of the Business Canvas

    NARCIS (Netherlands)

    Rogier Brussee; Peter de Groot

    2016-01-01

    We give a refinement of the well known business model canvas by Osterwalder and Pigneur by splitting the basic blocks into further subblocks to reduce confusion and increase its expressive power. The splitting is used in an online tool which in addition comes with a set of questions to further

  14. Toward Enhancing Automated Credibility Assessment: A Model for Question Type Classification and Tools for Linguistic Analysis

    Science.gov (United States)

    Moffitt, Kevin Christopher

    2011-01-01

    The three objectives of this dissertation were to develop a question type model for predicting linguistic features of responses to interview questions, create a tool for linguistic analysis of documents, and use lexical bundle analysis to identify linguistic differences between fraudulent and non-fraudulent financial reports. First, The Moffitt…

  15. Simulation of Forming Process as an Educational Tool Using Physical Modeling

    Science.gov (United States)

    Abdullah, A. B.; Muda, M. R.; Samad, Z.

    2008-01-01

    Metal forming process simulation is costly, requiring dies, machines and material as well as tight process control, since the process involves very high pressures. A physical modeling technique has been developed that initiates a new era of educational tools for simulating the process effectively. Several publications and findings have…

  16. Using the Cognitive Apprenticeship Model with a Chat Tool to Enhance Online Collaborative Learning

    Science.gov (United States)

    Rodríguez-Bonces, Mónica; Ortiz, Kris

    2016-01-01

    In Colombia, many institutions are in the firm quest of virtual learning environments to improve instruction, and making the most of online tools is clearly linked to offering quality learning. Thus, the purpose of this action research was to identify how the Cognitive Apprenticeship Model enhances online collaborative learning by using a chat…

  17. What's new in the Atmospheric Model Evaluation Tool (AMET) version 1.3

    Science.gov (United States)

    A new version of the Atmospheric Model Evaluation Tool (AMET) has been released. The new version of AMET, version 1.3 (AMETv1.3), contains a number of updates and changes from the previous version of AMET (v1.2), released in 2012. First, the Perl scripts used in the previous ve...

  18. Exposure Modeling Tools and Databases for Consideration for Relevance to the Amended TSCA (ISES)

    Science.gov (United States)

    The Agency’s Office of Research and Development (ORD) has a number of ongoing exposure modeling tools and databases. These efforts are anticipated to be useful in supporting ongoing implementation of the amended Toxic Substances Control Act (TSCA). Under ORD’s Chemic...

  19. Validation of Multiple Tools for Flat Plate Photovoltaic Modeling Against Measured Data

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, J.; Whitmore, J.; Blair, N.; Dobos, A. P.

    2014-08-01

    This report expands upon a previous work by the same authors, published in the 40th IEEE Photovoltaic Specialists conference. In this validation study, comprehensive analysis is performed on nine photovoltaic systems for which NREL could obtain detailed performance data and specifications, including three utility-scale systems and six commercial scale systems. Multiple photovoltaic performance modeling tools were used to model these nine systems, and the error of each tool was analyzed compared to quality-controlled measured performance data. This study shows that, excluding identified outliers, all tools achieve annual errors within +/-8% and hourly root mean squared errors less than 7% for all systems. It is further shown using SAM that module model and irradiance input choices can change the annual error with respect to measured data by as much as 6.6% for these nine systems, although all combinations examined still fall within an annual error range of +/-8.5%. Additionally, a seasonal variation in monthly error is shown for all tools. Finally, the effects of irradiance data uncertainty and the use of default loss assumptions on annual error are explored, and two approaches to reduce the error inherent in photovoltaic modeling are proposed.
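
    The two headline metrics in the report, annual error and hourly RMSE, can be sketched as follows with synthetic data; the exact normalization used in the study (here, RMSE divided by mean measured power) is an assumption.

```python
import numpy as np

# Annual error and normalized hourly RMSE between modeled and measured
# PV output, computed on one year of hypothetical hourly data.

rng = np.random.default_rng(0)
measured = np.clip(rng.normal(50, 30, 8760), 0, None)   # hourly kW, one year
modeled = measured * 1.03 + rng.normal(0, 2, 8760)      # a model with +3% bias

annual_error = (modeled.sum() - measured.sum()) / measured.sum() * 100
hourly_rmse = np.sqrt(np.mean((modeled - measured) ** 2)) / measured.mean() * 100

print(f"annual error: {annual_error:+.1f}%")
print(f"normalized hourly RMSE: {hourly_rmse:.1f}%")
```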

  20. Characterization and Modeling of Insect Swarms Using tools from Fluid Dynamics

    Science.gov (United States)

    2016-09-01

    The goals of this project were to develop a laboratory system for quantitatively measuring the flight trajectories of swarming insects and to use the resulting data to evaluate currently used models of collective behavior. We were successful in completing both goals, leading to the first highly resolved, statistically robust data sets for insect swarms, which we…

  1. Visual Representation in GENESIS as a tool for Physical Modeling, Sound Synthesis and Musical Composition

    OpenAIRE

    Villeneuve, Jérôme; Cadoz, Claude; Castagné, Nicolas

    2015-01-01

    The motivation of this paper is to highlight the importance of visual representations for artists when modeling and simulating mass-interaction physical networks in the context of sound synthesis and musical composition. GENESIS is a musician-oriented software environment for sound synthesis and musical composition. However, despite this orientation, a substantial amount of effort has been put into building a rich variety of tools based on static or dynamic visual representations of models an...

  2. Rogeaulito: A World Energy Scenario Modeling Tool for Transparent Energy System Thinking

    International Nuclear Information System (INIS)

    Benichou, Léo; Mayr, Sebastian

    2014-01-01

    Rogeaulito is a world energy model for scenario building developed by the European think tank The Shift Project. It is a tool to explore world energy choices from a very long-term and systematic perspective. As a key feature and novelty, it computes energy supply and demand independently of each other, revealing any potentially missing energy supply by 2100. It is also simple to use, didactic, and open source. As such, it targets a broad user group and advocates for reproducibility and transparency in scenario modeling, as well as model-based learning. Rogeaulito applies an engineering approach using disaggregated data in a spreadsheet model.
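
    A minimal sketch of the model's key idea, under invented growth and decline paths: supply and demand are projected independently, and their difference exposes any missing supply over the horizon.

```python
import numpy as np

# Independent supply and demand projections; a positive gap means
# supply is missing. Numbers are assumptions for illustration only.

years = np.arange(2020, 2101, 10)
demand = 600 * 1.01 ** (years - 2020)     # EJ/yr, assumed growth path
supply = 550 * 0.995 ** (years - 2020)    # EJ/yr, assumed decline path

gap = demand - supply
for y, g in zip(years, gap):
    print(f"{y}: {'shortfall' if g > 0 else 'surplus'} of {abs(g):.0f} EJ/yr")
```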

  3. Hot metal temperature prediction in blast furnace using advanced model based on fuzzy logic tools

    Energy Technology Data Exchange (ETDEWEB)

    Martin, R.D.; Obeso, F.; Mochon, J.; Barea, R.; Jimenez, J.

    2007-05-15

    The present work presents a model based on fuzzy logic tools to predict and simulate the hot metal temperature in a blast furnace (BF). As input variables the model uses the control variables of an operating BF, such as moisture, pulverised coal injection, oxygen addition, mineral/coke ratio and blast volume, and it yields the hot metal temperature as its output. The variables employed to develop the model were obtained from data supplied by the sensors of an operating Spanish BF. In the model training stage, the adaptive neuro-fuzzy inference system (ANFIS) and subtractive clustering algorithms were used.
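
    As a hedged illustration of one ingredient mentioned above, the snippet below implements a compact version of Chiu's subtractive clustering on synthetic data; the study's actual data, radii and ANFIS training step are not reproduced here.

```python
import numpy as np

# Compact subtractive clustering (Chiu's algorithm), the kind of step
# used to seed fuzzy rules before ANFIS training; data are synthetic.

def subtractive_clustering(X, ra=1.0, eps=0.25):
    alpha, beta = 4 / ra**2, 4 / (1.5 * ra)**2
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise dist^2
    potential = np.exp(-alpha * d2).sum(axis=1)           # point potentials
    centers, p_first = [], potential.max()
    while potential.max() > eps * p_first:
        c = potential.argmax()                            # next center
        centers.append(X[c])
        potential -= potential[c] * np.exp(-beta * d2[:, c])  # subtract
    return np.array(centers)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (50, 2)), rng.normal(1, 0.1, (50, 2))])
print(subtractive_clustering(X))   # typically two centers, near (0,0) and (1,1)
```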

  4. FORMAL MODELLING OF BUSINESS RULES: WHAT KIND OF TOOL TO USE?

    Directory of Open Access Journals (Sweden)

    Sandra Lovrenčić

    2006-12-01

    Business rules are today an essential part of a business system model. At present, however, there are still various approaches to the definition and classification of this concept. Similarly, there are different approaches to the formalization and implementation of business rules. This paper investigates formalization using a formal language in association with straightforward domain modelling. Two tools that enable such an approach are described and compared according to several factors; they represent ontology modelling and UML, the widely used standard for object-oriented modelling. A simple example is also presented.

  5. Graphical and numerical diagnostic tools to assess suitability of multiple imputations and imputation models.

    Science.gov (United States)

    Bondarenko, Irina; Raghunathan, Trivellore

    2016-07-30

    Multiple imputation has become a popular approach for analyzing incomplete data. Many software packages are available to multiply impute the missing values and to analyze the resulting completed data sets. However, diagnostic tools to check the validity of the imputations are limited, and the majority of the currently available methods need considerable knowledge of the imputation model. In many practical settings, however, the imputer and the analyst may be different individuals or from different organizations, and the analyst model may or may not be congenial to the model used by the imputer. This article develops and evaluates a set of graphical and numerical diagnostic tools for two practical purposes: (i) for an analyst to determine whether the imputations are reasonable under his/her model assumptions without actually knowing the imputation model assumptions; and (ii) for an imputer to fine tune the imputation model by checking the key characteristics of the observed and imputed values. The tools are based on the numerical and graphical comparisons of the distributions of the observed and imputed values conditional on the propensity of response. The methodology is illustrated using simulated data sets created under a variety of scenarios. The examples focus on continuous and binary variables, but the principles can be used to extend methods for other types of variables. Copyright © 2016 John Wiley & Sons, Ltd.
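
    A minimal sketch of the underlying idea, with all data simulated: estimate each unit's response propensity, stratify on it, and compare observed and imputed values within strata. The paper's actual diagnostics are considerably more developed.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Propensity-stratified comparison of observed vs. imputed values.
rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)                            # fully observed covariate
y = 2 * x + rng.normal(size=n)                    # variable with missingness
observed = rng.random(n) < 1 / (1 + np.exp(-x))   # MAR response indicator

y_imp = y.copy()                                  # imputations from a slightly
y_imp[~observed] = 2.2 * x[~observed] + rng.normal(size=(~observed).sum())

propensity = (LogisticRegression()
              .fit(x[:, None], observed)
              .predict_proba(x[:, None])[:, 1])
bins = np.quantile(propensity, [0, .2, .4, .6, .8, 1.])
stratum = np.clip(np.digitize(propensity, bins) - 1, 0, 4)

for s in range(5):   # large within-stratum gaps flag questionable imputations
    obs_mean = y[observed & (stratum == s)].mean()
    imp_mean = y_imp[~observed & (stratum == s)].mean()
    print(f"stratum {s}: observed {obs_mean:+.2f}, imputed {imp_mean:+.2f}")
```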

  6. Multi-Model R-Tool for uncertainty assessment in landslides susceptibility analysis

    Science.gov (United States)

    Cosmin Sandric, Ionut; Chitu, Zenaida; Jurchescu, Marta; Micu, Mihai

    2014-05-01

    The evaluation of landslide susceptibility requires understanding of the spatial distribution of the factors that control slope instability. The behaviour of landslides is difficult to evaluate because of the variety of factors that trigger mass movements. The methodologies used are very diverse, based on statistical, probabilistic, deterministic or empirical methods, or a combination of them, and the main factors used for landslide susceptibility assessment comprise basic morphometric parameters, such as slope gradient, curvature, aspect and solar radiation, in combination with lithology, land use/land cover, soil types or soil properties. The reliability of susceptibility maps is mostly estimated by comparison with ground truth and visualized as charts and statistical tables, and less often as maps of landslide susceptibility uncertainty. Because numerous susceptibility models require similar inputs, we have developed a Multi-Model tool for R, a free software environment for statistical computing and graphics, which combines several landslide susceptibility models into one forecast, thereby further improving forecast accuracy. The tool takes all the predisposing factors as inputs and generates a susceptibility map for each model; it then combines the resulting susceptibility maps into a single map and assesses the uncertainty as a function of the susceptibility levels in each map. The final results are susceptibility and uncertainty maps derived from several susceptibility models. The Multi-Model R-Tool was tested in different areas of the Romanian Subcarpathians with very good results.
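
    A minimal sketch of the multi-model idea under assumed inputs: stack susceptibility maps from several models, combine them into one forecast, and express uncertainty as the spread between models at each cell. (The original tool is written in R; Python is used here for consistency with the other sketches.)

```python
import numpy as np

# Ensemble of susceptibility maps: mean = combined forecast,
# standard deviation = per-cell disagreement (uncertainty).

rng = np.random.default_rng(0)
maps = rng.random((4, 100, 100))     # 4 hypothetical model outputs in [0, 1]

combined = maps.mean(axis=0)         # one ensemble susceptibility map
uncertainty = maps.std(axis=0)       # high where the models disagree

print("cells with high susceptibility but low agreement:",
      int(((combined > 0.7) & (uncertainty > 0.25)).sum()))
```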

  7. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    Science.gov (United States)

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvement of companies' business processes (BPs). Minimising the risk of implementing an inappropriate BP is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effective business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterising a BPSS tool is a challenging task due to the complex selection criteria, which include the quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is difficult, so various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and the qualitative-to-quantitative (QQ) methodology, so that the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results.
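
    A toy flavor of the approach, with invented criteria, weights and grades; real DEX models aggregate through qualitative utility tables rather than weighted sums, with QQ refining the mapping to numbers.

```python
# Map qualitative grades to numbers (the "QQ" step, heavily simplified),
# then aggregate criterion scores into a ranking of candidate tools.

grades = {"poor": 1, "fair": 2, "good": 3, "excellent": 4}
weights = {"visual": 0.3, "simulation": 0.4, "statistics": 0.3}   # invented

tools = {
    "BPSS-A": {"visual": "good", "simulation": "excellent", "statistics": "fair"},
    "BPSS-B": {"visual": "excellent", "simulation": "fair", "statistics": "good"},
}

def score(ratings):
    return sum(w * grades[ratings[c]] for c, w in weights.items())

for name in sorted(tools, key=lambda t: -score(tools[t])):
    print(name, round(score(tools[name]), 2))
```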

  8. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    Science.gov (United States)

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-01-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…

  9. A MASTER THESIS ON PORTING THE ENTERPRISE ARCHITECTURE ANALYSIS TOOL TO ECLIPSE MODELING PROJECT

    OpenAIRE

    Ivanov, Stanislav

    2011-01-01

    This master's thesis is part of ongoing research on the EAT development project. Its main goal is to determine whether the Eclipse Modeling Project (EMP) can be used as an alternative to NetBeans as the platform for implementing the EAT tool. To fulfill this goal, it contains an analysis of the current version of the EAT tool and a design study for a new version built on EMP. The design addresses most of the issues related to building the new version and ultimately recommends porting EAT to EMP.

  10. Learning how to use a tool: Mutually exclusive tool-function mappings are selectively acquired from linguistic in-group models.

    Science.gov (United States)

    Pető, Réka; Elekes, Fruzsina; Oláh, Katalin; Király, Ildikó

    2018-07-01

    The current study investigated whether 4-year-olds used language as a cue to social group membership to infer whether the tool-use behavior of a model needed to be encoded as indicative of the tool's function. We built on children's tendency to treat functions as mutually exclusive, that is, their propensity to refrain from using the same tool for more than one function. We hypothesized that children would form mutually exclusive tool-function mappings only if the source of the function information was a linguistic in-group person (native) as opposed to an out-group (foreign) person. In Experiment 1, participants (N = 39) were presented with four tool-function pairs by a model who had previously spoken either in their native language or in a foreign language. During the test phase, children encountered new purposes for which they could either use the demonstrated tools' color variant or use another equally suitable, as yet unseen, alternative tool. In line with our predictions, children preferred to use the alternative tool for the new function only in the native language condition (native: 63.3%; foreign: 42.7%). Experiment 2 replicated the initial finding using another foreign language and demonstrated that the lack of mutually exclusive tool choice in the foreign condition did not originate from children's failure to encode the demonstration. These findings suggest that children restrict learning artifact functions from linguistic in-group models. The mutual exclusivity principle in the domain of function learning is used more flexibly than previously proposed. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. Experimental and Mathematical Modeling for Prediction of Tool Wear on the Machining of Aluminium 6061 Alloy by High Speed Steel Tools

    Directory of Open Access Journals (Sweden)

    Okokpujie Imhade Princess

    2017-12-01

    In recent machining operations, tool life is one of the most demanding concerns in the production process, especially in the automotive industry. The aim of this paper is to study tool wear on HSS tools in end milling of aluminium 6061 alloy. The experiments were carried out to investigate tool wear as a function of the machining parameters and to develop a mathematical model using response surface methodology. The machining parameters selected for the experiment are spindle speed (N), feed rate (f), axial depth of cut (a) and radial depth of cut (r). The experiment was designed using a central composite design (CCD) in which 31 samples were run on a SIEG 3/10/0010 CNC end milling machine. After each experiment the cutting tool was measured using a scanning electron microscope (SEM). The optimum machining parameter combination of spindle speed 2500 rpm, feed rate 200 mm/min, axial depth of cut 20 mm and radial depth of cut 1.0 mm was found to achieve the minimum tool wear of 0.213 mm. The mathematical model developed predicted the tool wear with 99.7% accuracy, which is within the acceptable range for tool wear prediction.

  12. Experimental and Mathematical Modeling for Prediction of Tool Wear on the Machining of Aluminium 6061 Alloy by High Speed Steel Tools

    Science.gov (United States)

    Okokpujie, Imhade Princess; Ikumapayi, Omolayo M.; Okonkwo, Ugochukwu C.; Salawu, Enesi Y.; Afolalu, Sunday A.; Dirisu, Joseph O.; Nwoke, Obinna N.; Ajayi, Oluseyi O.

    2017-12-01

    In recent machining operations, tool life is one of the most demanding concerns in the production process, especially in the automotive industry. The aim of this paper is to study tool wear on HSS tools in end milling of aluminium 6061 alloy. The experiments were carried out to investigate tool wear as a function of the machining parameters and to develop a mathematical model using response surface methodology. The machining parameters selected for the experiment are spindle speed (N), feed rate (f), axial depth of cut (a) and radial depth of cut (r). The experiment was designed using a central composite design (CCD) in which 31 samples were run on a SIEG 3/10/0010 CNC end milling machine. After each experiment the cutting tool was measured using a scanning electron microscope (SEM). The optimum machining parameter combination of spindle speed 2500 rpm, feed rate 200 mm/min, axial depth of cut 20 mm and radial depth of cut 1.0 mm was found to achieve the minimum tool wear of 0.213 mm. The mathematical model developed predicted the tool wear with 99.7% accuracy, which is within the acceptable range for tool wear prediction.
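
    For illustration, here is a two-factor version of the response-surface step with synthetic data; the study fits four factors over 31 CCD runs, and the coefficients below are invented.

```python
import numpy as np

# Fit a second-order polynomial (response surface) in two machining
# parameters to synthetic tool-wear data, then predict at one setting.

rng = np.random.default_rng(0)
N = rng.uniform(1000, 3000, 31)     # spindle speed, rpm (31 CCD-like runs)
f = rng.uniform(100, 300, 31)       # feed rate, mm/min
wear = 0.5 - 1e-4 * N + 2e-8 * N**2 + 1e-3 * f + rng.normal(0, 0.005, 31)

# design matrix for a quadratic model in two factors
X = np.column_stack([np.ones_like(N), N, f, N**2, f**2, N * f])
coef, *_ = np.linalg.lstsq(X, wear, rcond=None)

point = np.array([1, 2500, 200, 2500**2, 200**2, 2500 * 200])
print("predicted wear at N=2500, f=200:", point @ coef)
```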

  13. Modeling and evaluation of the influence of micro-EDM sparking state settings on the tool electrode wear behavior

    DEFF Research Database (Denmark)

    Puthumana, Govindan

    2017-01-01

    … materials characterized by considerable wear of the tool used for material removal. This paper presents an investigation involving modeling and estimation of the effect of the settings used to generate discharges in stable micro-EDM conditions on the phenomenon of tool electrode wear. A stable sparking … a condition for the minimum tool wear for this micro-EDM process configuration.

  14. Techniques for the construction of an elliptical-cylindrical model using circular rotating tools in non CNC machines

    International Nuclear Information System (INIS)

    Villalobos Mendoza, Brenda; Cordero Davila, Alberto; Gonzalez Garcia, Jorge

    2011-01-01

    This paper describes the construction of an elliptical-cylindrical model without spherical aberration using vertical rotating tools. The motor of the circular tool is mounted on an arm so that the tool rests on the surface; the arm, in turn, is moved by an X-Y table. The test method and the computer algorithms that predict the desired wear are described.

  15. Dynamic wind turbine models in power system simulation tool DIgSILENT

    DEFF Research Database (Denmark)

    Hansen, A.D.; Jauch, C.; Sørensen, Poul Ejnar

    2004-01-01

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT (Version 12.0). The developed models are a part of the results of a national research project whose overall objective is to create a model database in different simulation tools. … 1. Active stall wind turbine with induction generator; 2. Variable speed, variable pitch wind turbine with doubly-fed induction generator. These wind turbine concept models can be used and even extended for the study of different aspects, e.g. the assessment of power quality, control strategies, and connection of the wind turbine to different types of grid and storage systems. For both concepts, control strategies are developed and implemented, and their performance assessed and discussed by means of simulations.

  16. Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control

    Science.gov (United States)

    Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.

    2016-01-01

    A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.

  17. Modelling tools for assessing bioremediation performance and risk of chlorinated solvents in clay tills

    DEFF Research Database (Denmark)

    Chambon, Julie Claire Claudia

    The contaminants are trapped in the low-permeability matrix and can then slowly back-diffuse to the fracture network, forming a long-term secondary contamination source to the underlying aquifers. Because of the complex transport and degradation processes and the mass transfer limitations, risk assessment and remediation design are challenging. This thesis presents the development and application of analytical and numerical models to improve our understanding of transport and degradation processes in clay tills, which is crucial for assessing bioremediation performance and risk to groundwater. A set of modelling tools … of future remediation of chlorinated ethenes in low-permeability settings. In conclusion, this PhD project has developed our understanding of transport and degradation processes of chlorinated solvents in clay tills, and this knowledge was used to develop modelling tools for assessment of risk …

  18. A Distributed Electrochemistry Modeling Tool for Simulating SOFC Performance and Degradation

    Energy Technology Data Exchange (ETDEWEB)

    Recknagle, Kurtis P.; Ryan, Emily M.; Khaleel, Mohammad A.

    2011-10-13

    This report presents a distributed electrochemistry (DEC) model capable of investigating the electrochemistry and local conditions within the SOFC MEA based on the local microstructure and multi-physics. The DEC model can calculate the global current-voltage (I-V) performance of the cell as determined by the spatially varying local conditions through the thickness of the electrodes and electrolyte. The simulation tool is able to investigate the electrochemical performance based on characteristics of the electrode microstructure, such as particle size, pore size, electrolyte and electrode phase volume fractions, and triple-phase-boundary length. It can also investigate performance as affected by fuel and oxidant gas flow distributions and other environmental/experimental conditions such as temperature and fuel gas composition. The long-term objective for the DEC modeling tool is to investigate factors that cause electrode degradation and the decay of SOFC performance which decrease longevity.

  19. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bonvini, Marco [Whisker Labs, Oakland, CA (United States); Piette, Mary Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Page, Janie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lin, Guanjing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hu, R. Lilly [Univ. of California, Berkeley, CA (United States)

    2017-08-11

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
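
    An illustrative hybrid FDD check, with assumed logic and invented numbers rather than the paper's actual tool: a physics-based efficiency curve gives the expected chiller COP at a given load ratio, and a sustained measured shortfall against it raises a fault flag.

```python
import numpy as np

def expected_cop(load_ratio):
    # hypothetical part-load efficiency curve from design data
    return 5.5 * (1 - (load_ratio - 0.8) ** 2)

load = np.array([0.6, 0.7, 0.8, 0.9])
measured_cop = np.array([5.1, 5.2, 4.3, 4.2])   # fictitious trend data

residual = expected_cop(load) - measured_cop    # physics-model shortfall
fault = residual > 0.5                          # threshold in COP units
print("fault flagged at load ratios:", load[fault])
```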

  20. Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools

    Science.gov (United States)

    Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.

    2011-12-01

    Currently available, and soon-to-be-available, sophisticated 3D models of particle acceleration and transport in solar flares require a new level of user-friendly visualization and analysis tools allowing quick and easy adjustment of the model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc.). We report the current state of these tools, which are under development and have already proved highly efficient for direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. The various tools provided allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating at user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/non-thermal particle distribution models. By default, the application integrates IDL-callable DLL and shared libraries containing fast gyrosynchrotron (GS) emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows interchanging these default libraries with any user-defined IDL or external callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the tool's capacity and generality, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by the NORH and RHESSI instruments. We discuss further anticipated developments of the tools needed to accommodate…

  1. MbT-Tool: An open-access tool based on Thermodynamic Electron Equivalents Model to obtain microbial-metabolic reactions to be used in biotechnological process

    Directory of Open Access Journals (Sweden)

    Pablo Araujo Granda

    2016-01-01

    Modelling cellular metabolism is a strategic factor in investigating microbial behaviour and interactions, especially for biotechnological processes. A key factor in modelling microbial activity is the calculation of the nutrient amounts consumed and the products generated as a result of microbial metabolism. Representing metabolic pathways through balanced reactions is a complex and time-consuming task for biologists, ecologists, modellers and engineers. A new computational tool to represent microbial pathways through microbial metabolic reactions (MMRs), using the approach of the Thermodynamic Electron Equivalents Model, has been designed and implemented in the open-access framework NetLogo. This computational tool, called MbT-Tool (Metabolism based on Thermodynamics), can write MMRs for different microbial functional groups, such as aerobic heterotrophs, nitrifiers, denitrifiers, methanogens, sulphate reducers, sulphide oxidizers and fermenters. The MbT-Tool's code contains eighteen organic and twenty inorganic reduction half-reactions, four N-sources (NH4+, NO3−, NO2−, N2) for biomass synthesis and twenty-four microbial empirical formulas, one of which can be defined by the user (CnHaObNc). MbT-Tool is an open-source program capable of writing MMRs based on thermodynamic concepts, applicable to a wide range of academic research interested in designing, optimizing and modelling microbial activity without extensive chemical, microbiological and programming experience.
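
    A hedged sketch of the underlying bookkeeping: combining textbook-style reduction half-reactions into an overall reaction R = fe·Ra + fs·Rc − Rd, with species coefficients per electron equivalent (negative = consumed). The half-reactions and the fe/fs split below are illustrative, not MbT-Tool's actual tables.

```python
from collections import defaultdict

# Half-reactions as {species: coefficient per electron equivalent},
# written as reductions (electrons consumed, coefficient -1).
Rd = {"CO2": -0.125, "HCO3-": -0.125, "H+": -1.0, "e-": -1.0,
      "CH3COO-": 0.125, "H2O": 0.375}                     # acetate (donor)
Ra = {"O2": -0.25, "H+": -1.0, "e-": -1.0, "H2O": 0.5}    # oxygen (acceptor)
Rc = {"CO2": -0.2, "NH4+": -0.05, "HCO3-": -0.05, "H+": -1.0, "e-": -1.0,
      "C5H7O2N": 0.05, "H2O": 0.45}                       # cell synthesis

def overall(fe, fs):
    """R = fe*Ra + fs*Rc - Rd; electrons and protons cancel when fe + fs = 1."""
    R = defaultdict(float)
    for half, sign in ((Ra, fe), (Rc, fs), (Rd, -1.0)):
        for species, coeff in half.items():
            R[species] += sign * coeff
    return {s: round(c, 3) for s, c in R.items() if abs(c) > 1e-9}

print(overall(fe=0.6, fs=0.4))   # balanced overall metabolic reaction
```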

  2. Applications and issues of GIS as tool for civil engineering modeling

    Science.gov (United States)

    Miles, S.B.; Ho, C.L.

    1999-01-01

    A tool that has proliferated within civil engineering in recent years is geographic information systems (GIS). The goal of a tool is to supplement ability and knowledge that already exists, not to serve as a replacement for that which is lacking. To secure the benefits and avoid misuse of a burgeoning tool, engineers must understand the limitations, alternatives, and context of the tool. The common benefits of using GIS as a supplement to engineering modeling are summarized. Several brief case studies of GIS modeling applications are taken from popular civil engineering literature to demonstrate the wide use and varied implementation of GIS across the discipline. Drawing from the case studies, limitations regarding traditional GIS data models and the implementation of civil engineering models within current GIS are identified and countered by discussing the direction of the next generation of GIS. The paper concludes by highlighting the potential for the misuse of GIS in the context of engineering modeling and suggests that this potential can be reduced through education and awareness. The goal of this paper is to promote awareness of the issues related to GIS-based modeling and to assist in the formulation of questions regarding the application of current GIS. The technology has experienced much publicity of late, with many engineers being perhaps too excited about the usefulness of current GIS. An undoubtedly beneficial side effect of this, however, is that engineers are becoming more aware of GIS and, hopefully, the associated subtleties. Civil engineers must stay informed of GIS issues and progress, but more importantly, civil engineers must inform the GIS community to direct the technology development optimally.

  3. Hanford River Protection Project Life cycle Cost Modeling Tool to Enhance Mission Planning - 13396

    International Nuclear Information System (INIS)

    Dunford, Gary; Williams, David; Smith, Rick

    2013-01-01

    The Life cycle Cost Model (LCM) Tool is an overall systems model that incorporates budget and schedule impacts for the entire life cycle of the River Protection Project (RPP) mission, and is replacing the Hanford Tank Waste Operations Simulator (HTWOS) model as the foundation of the RPP system planning process. Currently, the DOE frequently requests HTWOS simulations of alternative technical and programmatic strategies for completing the RPP mission. Analysis of technical and programmatic changes can be performed with HTWOS; however, life cycle costs and schedules were previously generated by manual transfer of time-based data from HTWOS to Primavera P6. The LCM Tool automates the preparation of life cycle costs and schedules and is needed to provide timely turnaround capability for RPP mission alternative analyses. The LCM is the simulation component of the LCM Tool. It replaces the HTWOS model and adds new capability to support life cycle cost modeling. It is currently deployed in G22, but has been designed to work in any full object-oriented language with an extensive feature set focused on networking and cross-platform compatibility. The LCM retains the existing HTWOS functionality needed to support system planning and alternatives studies going forward. In addition, it incorporates new functionality, coding improvements that streamline programming and model maintenance, and the capability to import and export data using the LCM Database (LCMDB). The LCM Cost/Schedule (LCMCS) component contains cost and schedule data and logic. The LCMCS is used to generate life cycle costs and schedules for waste retrieval and processing scenarios. It uses time-based output data from the LCM to produce the logic ties in Primavera P6 necessary for shifting activities. The LCM Tool is evolving to address the needs of decision makers who want to understand the broad spectrum of risks facing complex organizations like DOE-RPP and to understand how near…

  4. CMS Partial Releases Model, Tools, and Applications. Online and Framework-Light Releases

    CERN Document Server

    Jones, Christopher D; Meschi, Emilio; Shahzad Muzaffar; Andreas Pfeiffer; Ratnikova, Natalia; Sexton-Kennedy, Elizabeth

    2009-01-01

    The CMS Software project CMSSW embraces more than a thousand packages organized in subsystems for analysis, event display, reconstruction, simulation, detector description, data formats, framework, utilities and tools. The release integration process is highly automated by using tools developed or adopted by CMS. Packaging in rpm format is a built-in step in the software build process. For several well-defined applications it is highly desirable to have only a subset of the full CMSSW package bundle. For example, High Level Trigger algorithms that run on the Online farm, and need to be rebuilt in a special way, require no simulation, event display, or analysis packages. Physics analysis applications in the ROOT environment require only a few core libraries and the description of CMS-specific data formats. We present a model of CMS Partial Releases, used for preparation of the customized CMS software builds, including a description of the tools used, the implementation, and how we deal with technical challenges, suc...

  5. PredicT-ML: a tool for automating machine learning model building with big clinical data.

    Science.gov (United States)

    Luo, Gang

    2016-01-01

    Predictive modeling is fundamental to transforming large clinical data sets, or "big clinical data," into actionable knowledge for various healthcare applications. Machine learning is a major predictive modeling approach, but two barriers make its use in healthcare challenging. First, a machine learning tool user must choose an algorithm and assign one or more model parameters called hyper-parameters before model training. The algorithm and hyper-parameter values used typically impact model accuracy by over 40 %, but their selection requires many labor-intensive manual iterations that can be difficult even for computer scientists. Second, many clinical attributes are repeatedly recorded over time, requiring temporal aggregation before predictive modeling can be performed. Many labor-intensive manual iterations are required to identify a good pair of aggregation period and operator for each clinical attribute. Both barriers result in time and human resource bottlenecks, and preclude healthcare administrators and researchers from asking a series of what-if questions when probing opportunities to use predictive models to improve outcomes and reduce costs. This paper describes our design of and vision for PredicT-ML (prediction tool using machine learning), a software system that aims to overcome these barriers and automate machine learning model building with big clinical data. The paper presents the detailed design of PredicT-ML. PredicT-ML will open the use of big clinical data to thousands of healthcare administrators and researchers and increase the ability to advance clinical research and improve healthcare.
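
    The two barriers described above can be sketched with standard libraries and toy data: temporal aggregation of a repeatedly recorded attribute, then an automated hyper-parameter search in place of manual tuning. PredicT-ML itself goes well beyond this; all data and parameter ranges below are invented.

```python
import numpy as np
import pandas as pd
from scipy.stats import randint
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

# Step 1: temporal aggregation of repeated clinical measurements.
events = pd.DataFrame({
    "patient": np.repeat(np.arange(200), 5),
    "glucose": np.random.default_rng(0).normal(100, 15, 1000),
})
features = events.groupby("patient")["glucose"].agg(["mean", "max"])
labels = (features["mean"] > 100).astype(int)      # toy outcome

# Step 2: automated search over hyper-parameter settings.
search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    {"n_estimators": randint(50, 300), "max_depth": randint(2, 10)},
    n_iter=10, cv=3, random_state=0,
)
search.fit(features, labels)
print(search.best_params_, round(search.best_score_, 3))
```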

  6. Tools for Resilience Management: Multidisciplinary Development of State-and-Transition Models for Northwest Colorado

    Directory of Open Access Journals (Sweden)

    Emily J. Kachergis

    2013-12-01

    Building models is an important way of integrating knowledge. Testing and updating models of social-ecological systems can inform management decisions and, ultimately, improve resilience. We report on the outcomes of a six-year, multidisciplinary model development process in the sagebrush steppe, USA. We focused on creating state-and-transition models (STMs), conceptual models of ecosystem change that represent nonlinear dynamics and are being adopted worldwide as tools for managing ecosystems. STM development occurred in four steps with four distinct sets of models: (1) local knowledge elicitation using semistructured interviews; (2) ecological data collection using an observational study; (3) model integration using participatory workshops; and (4) model simplification upon review of the literature by a multidisciplinary team. We found that different knowledge types are ultimately complementary. Many of the benefits of the STM-building process flowed from the knowledge integration steps, including improved communication, identification of uncertainties, and production of more broadly credible STMs that can be applied in diverse situations. The STM development process also generated hypotheses about sagebrush steppe dynamics that could be tested by future adaptive management and research. We conclude that multidisciplinary development of STMs has great potential for producing credible, useful tools for managing the resilience of social-ecological systems. Based on this experience, we outline a streamlined, participatory STM development process that integrates multiple types of knowledge and incorporates adaptive management.

  7. Identifying a minimal rheological configuration: a tool for effective and efficient constitutive modeling of soft tissues.

    Science.gov (United States)

    Jordan, Petr; Kerdok, Amy E; Howe, Robert D; Socrate, Simona

    2011-04-01

    We describe a modeling methodology intended as a preliminary step in the identification of appropriate constitutive frameworks for the time-dependent response of biological tissues. The modeling approach comprises a customizable rheological network of viscous and elastic elements governed by user-defined 1D constitutive relationships. The model parameters are identified by iterative nonlinear optimization, minimizing the error between experimental and model-predicted structural (load-displacement) tissue response under a specific mode of deformation. We demonstrate the use of this methodology by determining the minimal rheological arrangement, constitutive relationships, and model parameters for the structural response of various soft tissues, including ex vivo perfused porcine liver in indentation, ex vivo porcine brain cortical tissue in indentation, and ex vivo human cervical tissue in unconfined compression. Our results indicate that the identified rheological configurations provide good agreement with experimental data, including multiple constant strain rate load/unload tests and stress relaxation tests. Our experience suggests that the described modeling framework is an efficient tool for exploring a wide array of constitutive relationships and rheological arrangements, which can subsequently serve as a basis for 3D constitutive model development and finite-element implementations. The proposed approach can also be employed as a self-contained tool to obtain simplified 1D phenomenological models of the structural response of biological tissue to single-axis manipulations for applications in haptic technologies.
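
    The parameter-identification step can be sketched as follows, with synthetic data standing in for a measured force history: a standard-linear-solid relaxation response is fit by nonlinear least squares, in the spirit of (though much simpler than) the rheological networks described above.

```python
import numpy as np
from scipy.optimize import least_squares

# Fit F(t) = F_inf + (F_0 - F_inf) * exp(-t/tau) to a synthetic
# stress-relaxation record by iterative nonlinear optimization.

t = np.linspace(0, 10, 200)
rng = np.random.default_rng(0)
data = 2.0 + 3.0 * np.exp(-t / 1.5) + rng.normal(0, 0.05, t.size)

def residual(p):
    f_inf, df, tau = p
    return f_inf + df * np.exp(-t / tau) - data

fit = least_squares(residual, x0=[1.0, 1.0, 1.0], bounds=([0, 0, 1e-3], np.inf))
print("identified parameters:", fit.x)   # ~ [2.0, 3.0, 1.5]
```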

  8. Modeling Constellation Virtual Missions Using the Vdot(Trademark) Process Management Tool

    Science.gov (United States)

    Hardy, Roger; ONeil, Daniel; Sturken, Ian; Nix, Michael; Yanez, Damian

    2011-01-01

    The authors have identified a software tool suite that will support NASA's Virtual Mission (VM) effort. This is accomplished by transforming a spreadsheet database of mission events, task inputs and outputs, timelines, and organizations into process visualization tools and a Vdot process management model that includes embedded analysis software as well as requirements and information related to data manipulation and transfer. This paper describes the progress to date, and the application of the Virtual Mission to not only Constellation but to other architectures, and the pertinence to other aerospace applications. Vdot s intuitive visual interface brings VMs to life by turning static, paper-based processes into active, electronic processes that can be deployed, executed, managed, verified, and continuously improved. A VM can be executed using a computer-based, human-in-the-loop, real-time format, under the direction and control of the NASA VM Manager. Engineers in the various disciplines will not have to be Vdot-proficient but rather can fill out on-line, Excel-type databases with the mission information discussed above. The author s tool suite converts this database into several process visualization tools for review and into Microsoft Project, which can be imported directly into Vdot. Many tools can be embedded directly into Vdot, and when the necessary data/information is received from a preceding task, the analysis can be initiated automatically. Other NASA analysis tools are too complex for this process but Vdot automatically notifies the tool user that the data has been received and analysis can begin. The VM can be simulated from end-to-end using the author s tool suite. The planned approach for the Vdot-based process simulation is to generate the process model from a database; other advantages of this semi-automated approach are the participants can be geographically remote and after refining the process models via the human-in-the-loop simulation, the

  9. Impact of an electronic handoff documentation tool on team shared mental models in pediatric critical care.

    Science.gov (United States)

    Jiang, Silis Y; Murphy, Alexandrea; Heitkemper, Elizabeth M; Hum, R Stanley; Kaufman, David R; Mamykina, Lena

    2017-05-01

    To examine the impact of the implementation of an electronic handoff tool (the Handoff Tool) on shared mental models (SMM) within patient care teams as measured by content overlap and discrepancies in verbal handoff presentations given by different clinicians caring for the same patient. Researchers observed, recorded, and transcribed verbal handoffs given by different members of patient care teams in a pediatric intensive care unit. The transcripts were qualitatively coded and analyzed for content overlap scores and the number of discrepancies in handoffs of different team members before and after the implementation of the tool. Content overlap scores did not change post-implementation. The average number of discrepancies nearly doubled following the implementation (from 0.76 discrepancies per handoff group pre-implementation to 1.17 discrepancies per handoff group post-implementation); however, this change was not statistically significant (p=0.37). Discrepancies classified as related to dosage of treatment or procedure and to patients' symptoms increased in frequency post-implementation. The results suggest that the Handoff Tool did not have the desired positive impact on SMM within patient care teams. Future electronic tools for facilitating team handoff may need longer implementation times, complementary changes to handoff process and structure, and improved designs that integrate a common core of shared information with discipline-specific records. While electronic handoff tools provide great opportunities to improve communication and facilitate the formation of shared mental models within patient care teams, further work is necessary to realize their full potential. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability.

    Science.gov (United States)

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping; Moore, Helen M

    2015-12-01

    Biospecimens are essential resources for advancing basic and translational research. However, there is little data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, and allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting.
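
    The cost-recovery arithmetic such a tool automates can be illustrated as below; the cost categories, amounts and recovery target are all invented, not BEMT's defaults.

```python
# Toy cost-recovery fee calculation for a hypothetical biobank.
direct = {"consumables": 40_000, "staff_time": 120_000}          # per year
indirect = {"facilities": 30_000, "equipment_depreciation": 20_000}

annual_cost = sum(direct.values()) + sum(indirect.values())
specimens_per_year = 5_000
target_recovery = 0.6        # fraction of costs recovered via fees

fee_per_specimen = annual_cost * target_recovery / specimens_per_year
print(f"cost-recovery fee: ${fee_per_specimen:.2f} per specimen")   # $25.20
```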

  11. Application of the GEM Inventory Data Capture Tools for Dynamic Vulnerability Assessment and Recovery Modelling

    Science.gov (United States)

    Verrucci, Enrica; Bevington, John; Vicini, Alessandro

    2014-05-01

    A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010 to 2013 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate remotely sensed imagery with statistically sampled in-situ field data on buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment, and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes" - statistically inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publicly available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project, and these are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of (1) dynamic vulnerability assessment (pre-event) and (2) recovery monitoring and evaluation (post-event) are discussed, along with strategies for using the IDC Tools for these purposes. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Originally designed for seismic risk assessment, the IDCT tools clearly have relevance for multi-hazard risk assessment. When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated by the tools can reveal spatio-temporal patterns in the quality of recovery activities, and resilience trends can be…
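
    A minimal sketch of applying a "mapping scheme" of the kind described above: a statistically inferred building-type distribution for a homogeneous land-use zone is used to estimate the zone's exposure composition. The distribution below is invented.

```python
import numpy as np

# Apply a hypothetical mapping scheme (building-type proportions)
# to a zone with a known building count.
scheme = {"RC frame": 0.45, "unreinforced masonry": 0.35, "timber": 0.20}
buildings_in_zone = 1200

rng = np.random.default_rng(0)
types, probs = zip(*scheme.items())
counts = rng.multinomial(buildings_in_zone, probs)   # sampled composition
print(dict(zip(types, counts)))
```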

  12. Extending the 4I Organizational Learning Model: Information Sources, Foraging Processes and Tools

    Directory of Open Access Journals (Sweden)

    Tracy A. Jenkin

    2013-08-01

    The continued importance of organizational learning has recently led to several calls for further developing the theory. This article addresses these calls by extending Crossan, Lane and White's (1999) 4I model to include a fifth process, information foraging, and a fourth level, the tool. The resulting 5I organizational learning model can be generalized to a number of learning contexts, especially those that involve understanding and making sense of data and information. Given the need for organizations to both innovate and increase productivity, and the volumes of data and information that are available to support both, the 5I model addresses an important organizational issue.

  13. Sobol Sensitivity Analysis: A Tool to Guide the Development and Evaluation of Systems Pharmacology Models

    Science.gov (United States)

    Trame, MN; Lesko, LJ

    2015-01-01

    A systems pharmacology model typically integrates pharmacokinetic, biochemical network, and systems biology concepts into a unifying approach. It typically consists of a large number of parameters and reaction species that are interlinked based upon the underlying (patho)physiology and the mechanism of drug action. The more complex these models are, the greater the challenge of reliably identifying and estimating respective model parameters. Global sensitivity analysis provides an innovative tool that can meet this challenge. CPT Pharmacometrics Syst. Pharmacol. (2015) 4, 69–79; doi:10.1002/psp4.6; published online 25 February 2015 PMID:27548289
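
    A hedged sketch of a global (Sobol) sensitivity analysis on a stand-in model, assuming the SALib package's classic saltelli/sobol interface; a real systems pharmacology model would replace f() below, and the parameter names and bounds are invented.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["k_abs", "k_elim", "V_d"],
    "bounds": [[0.1, 1.0], [0.01, 0.5], [5.0, 50.0]],
}

X = saltelli.sample(problem, 1024)      # quasi-random parameter sets

def f(p):                               # toy "model output", e.g. an AUC
    k_abs, k_elim, v_d = p
    return 100.0 * k_abs / (k_elim * v_d)

Y = np.apply_along_axis(f, 1, X)
Si = sobol.analyze(problem, Y)          # first-order and total indices
print(dict(zip(problem["names"], np.round(Si["S1"], 2))))
```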

  14. A trade-off analysis design tool. Aircraft interior noise-motion/passenger satisfaction model

    Science.gov (United States)

    Jacobson, I. D.

    1977-01-01

    A design tool was developed to enhance aircraft passenger satisfaction. The effect of aircraft interior motion and noise on passenger comfort and satisfaction was modelled. The effects of individual aircraft noise sources were accounted for, and the impact of noise on passenger activities, as well as the noise levels needed to safeguard passenger hearing, was investigated. The motion/noise effect models provide a means for trade-off analyses between noise and motion variables, and also provide a framework for optimizing noise reduction among noise sources. Data for the models were collected onboard commercial aircraft flights and during specially scheduled tests.

  15. Catchment Models and Management Tools for diffuse Contaminants (Sediment, Phosphorus and Pesticides): DIFFUSE Project

    Science.gov (United States)

    Mockler, Eva; Reaney, Simeon; Mellander, Per-Erik; Wade, Andrew; Collins, Adrian; Arheimer, Berit; Bruen, Michael

    2017-04-01

    The agricultural sector is the most common suspected source of nutrient pollution in Irish rivers. However, it is also often the most difficult source to characterise due to its predominantly diffuse nature. Particulate phosphorus in surface water and dissolved phosphorus in groundwater are of particular concern in Irish water bodies. Hence the further development of models and indices to assess diffuse sources of contaminants is required for use by the Irish Environmental Protection Agency (EPA) to support river basin planning. Understanding connectivity in the landscape is a vital component of characterising the source-pathway-receptor relationships for water-borne contaminants, and hence is a priority in this research. The DIFFUSE Project will focus on connectivity modelling and the incorporation of connectivity into sediment, nutrient and pesticide risk mapping. The Irish approach to understanding and managing natural water bodies has developed substantially in recent years, assisted by outputs from multiple research projects, including modelling and analysis tools developed during the Pathways and CatchmentTools projects. These include the Pollution Impact Potential (PIP) maps, an example of research output that is used by the EPA to support catchment management. The PIP maps integrate an understanding of the pollution pressures and mobilisation pathways and, using the source-pathway-receptor model, provide a scientific basis for the evaluation of mitigation measures. These maps indicate the potential risk posed by nitrate and phosphate from diffuse agricultural sources to surface and groundwater receptors and delineate critical source areas (CSAs) as a means of facilitating the targeting of mitigation measures. Building on this previous research, the DIFFUSE Project will develop revised and new catchment management tools focused on connectivity, sediment, phosphorus and pesticides. The DIFFUSE project will strive to identify the state

  16. Tool-driven Design and Automated Parameterization for Real-time Generic Drivetrain Models

    Directory of Open Access Journals (Sweden)

    Schwarz Christina

    2015-01-01

    Full Text Available Real-time dynamic drivetrain modeling approaches have great potential for development cost reduction in the automotive industry. Even though real-time drivetrain models are available, these solutions are specific to single transmission topologies. In this paper an environment for the parameterization of such a solution is proposed, based on a generic method applicable to all types of gear transmission topologies. This enables tool-guided modeling by non-experts in the fields of mechanical engineering and control theory, leading to reduced development and testing efforts. The approach is demonstrated for an exemplary automatic transmission using the environment for automated parameterization. Finally, the parameterization is validated against vehicle measurement data.

  17. Dynamic wind turbine models in power system simulation tool DIgSILENT

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, A.C.; Jauch, C.; Soerensen, P.; Iov, F.; Blaabjerg, F.

    2003-12-01

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT (Version 12.0). The developed models are part of the results of a national research project whose overall objective is to create a model database in different simulation tools. This model database should be able to support the analysis of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The report provides a description of the wind turbine modelling, both at a component level and at a system level. It contains both the description of DIgSILENT built-in models for the electrical components of a grid-connected wind turbine (e.g. induction generators, power converters, transformers) and the models developed by the user, in the dynamic simulation language DSL of DIgSILENT, for the non-electrical components of the wind turbine (wind model, aerodynamic model, mechanical model). Issues concerning the initialisation of the wind turbine models within the power system simulation are also presented. The main attention in this report, however, is drawn to the modelling at the system level of two wind turbine concepts: (1) an active stall wind turbine with induction generator and (2) a variable speed, variable pitch wind turbine with doubly fed induction generator. These concept models can be used and even extended for the study of different aspects, e.g. the assessment of power quality, control strategies, connection of the wind turbine to different types of grid and storage systems. For both concepts, control strategies are developed and implemented, and their performance is assessed and discussed by means of simulations. (au)

  18. A tool for multi-scale modelling of the renal nephron

    Science.gov (United States)

    Nickerson, David P.; Terkildsen, Jonna R.; Hamilton, Kirk L.; Hunter, Peter J.

    2011-01-01

    We present the development of a tool, which provides users with the ability to visualize and interact with a comprehensive description of a multi-scale model of the renal nephron. A one-dimensional anatomical model of the nephron has been created and is used for visualization and modelling of tubule transport in various nephron anatomical segments. Mathematical models of nephron segments are embedded in the one-dimensional model. At the cellular level, these segment models use models encoded in CellML to describe cellular and subcellular transport kinetics. A web-based presentation environment has been developed that allows the user to visualize and navigate through the multi-scale nephron model, including simulation results, at the different spatial scales encompassed by the model description. The Zinc extension to Firefox is used to provide an interactive three-dimensional view of the tubule model and the native Firefox rendering of scalable vector graphics is used to present schematic diagrams for cellular and subcellular scale models. The model viewer is embedded in a web page that dynamically presents content based on user input. For example, when viewing the whole nephron model, the user might be presented with information on the various embedded segment models as they select them in the three-dimensional model view. Alternatively, the user chooses to focus the model viewer on a cellular model located in a particular nephron segment in order to view the various membrane transport proteins. Selecting a specific protein may then present the user with a description of the mathematical model governing the behaviour of that protein—including the mathematical model itself and various simulation experiments used to validate the model against the literature. PMID:22670210

  19. Testing and thermal modeling of radiant panels systems as commissioning tool

    International Nuclear Information System (INIS)

    Fonseca Diaz, Nestor; Cuevas, Cristian

    2010-01-01

    This paper presents the results of a study performed to develop a thermal model of radiant panel systems, to be used in situ as a diagnostic tool in commissioning processes to determine the main operating conditions of the system in cooling or heating mode. The model treats the radiant panels as a finned heat exchanger in dry regime. Using as inputs the ceiling and room dimensions, the radiant ceiling material properties and measurements of air and water mass flow rates and temperatures, the model is able to calculate the radiant ceiling capacity, the ceiling surface average temperature, the water exhaust temperature and the resultant temperature as a comfort indicator. The proposed model considers combined convection, the perforation effect and a detailed radiative heat exchange method for radiant ceiling systems. An example of each system considered in this study is shown, illustrating the validation of the model. A sensitivity analysis of the model is also performed.
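
    A minimal sketch of the finned-heat-exchanger idea in dry regime, assuming the water circuit exchanges heat with a room at roughly constant temperature so the effectiveness-NTU relation eps = 1 - exp(-NTU) applies; the lumped conductance UA and all operating values are invented, not taken from the paper.

      import math

      def radiant_ceiling_capacity(m_dot, cp, t_water_in, t_room, UA):
          """Return (capacity in W, water outlet temperature in degC)."""
          C = m_dot * cp                       # water capacity rate [W/K]
          ntu = UA / C
          eps = 1.0 - math.exp(-ntu)           # effectiveness, constant-temperature sink
          q = eps * C * (t_room - t_water_in)  # positive in cooling mode
          t_water_out = t_water_in + q / C
          return q, t_water_out

      # Example: chilled ceiling, 0.05 kg/s of water entering at 16 degC, room at 26 degC
      q, t_out = radiant_ceiling_capacity(0.05, 4186.0, 16.0, 26.0, UA=150.0)
      print(f"capacity = {q:.0f} W, water outlet = {t_out:.1f} degC")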

  20. The Business Model Evaluation Tool for Smart Cities: Application to SmartSantander Use Cases

    Directory of Open Access Journals (Sweden)

    Raimundo Díaz-Díaz

    2017-02-01

    Full Text Available New technologies open up the door to multiple business models applied to public services in smart cities. However, there is no commonly adopted methodology for evaluating business models in smart cities that can help both practitioners and researchers to choose the best option. This paper addresses this gap by introducing the Business Model Evaluation Tool for Smart Cities. This methodology is a simple, organized, flexible and transparent system that facilitates the work of the evaluators of potential business models. It is useful for comparing two or more business models and taking strategic decisions promptly. The method follows a prior process of content analysis and is based on the widely utilized Business Model Canvas. The evaluation method has been assessed by 11 experts and subsequently validated by applying it to the case studies of Santander's waste management and street lighting systems, which take advantage of innovative technologies commonly used in smart cities.

  1. A GUI-based Tool for Bridging the Gap between Models and Process-Oriented Studies

    Science.gov (United States)

    Kornfeld, A.; Van der Tol, C.; Berry, J. A.

    2014-12-01

    Models used for simulation of photosynthesis and transpiration by canopies of terrestrial plants typically have subroutines such as STOMATA.F90, PHOSIB.F90 or BIOCHEM.m that solve for photosynthesis and associated processes. Key parameters such as the Vmax for Rubisco and temperature response parameters are required by these subroutines. These are often taken from the literature or determined by separate analysis of gas exchange experiments. It is useful to note, however, that such subroutines can be extracted and run as standalone models to simulate leaf responses collected in gas exchange experiments. Furthermore, there are excellent non-linear fitting tools that can be used to optimize the parameter values in these models to fit the observations. Ideally the Vmax fit in this way should be the same as that determined by a separate analysis, but it may not be, because of interactions with other kinetic constants and their temperature dependence in the full subroutine. We submit that it is more useful to fit the complete model to the calibration experiments than to fit disaggregated constants separately. We designed a graphical user interface (GUI) based tool that uses gas exchange photosynthesis data to directly estimate model parameters in the SCOPE (Soil Canopy Observation, Photochemistry and Energy fluxes) model and, at the same time, allows researchers to change parameters interactively to visualize how variation in model parameters affects predicted outcomes such as photosynthetic rates, electron transport, and chlorophyll fluorescence. We have also ported some of this functionality to an Excel spreadsheet, which could be used as a teaching tool to help integrate process-oriented and model-oriented studies.
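
    The fitting step can be pictured with a hedged stand-in: here scipy's non-linear least squares adjusts the parameters of a toy rectangular-hyperbola light-response curve to synthetic gas-exchange data; the real tool fits the full SCOPE biochemistry routine instead, and all numbers below are illustrative.

      import numpy as np
      from scipy.optimize import curve_fit

      def light_response(par, a_max, quantum_yield, r_d):
          """Toy photosynthesis model: rectangular hyperbola minus dark respiration."""
          return (a_max * quantum_yield * par) / (a_max + quantum_yield * par) - r_d

      # Hypothetical observations: PAR (umol m-2 s-1) vs net assimilation
      par_obs = np.array([0, 100, 250, 500, 1000, 1500, 2000], dtype=float)
      a_obs = np.array([-1.1, 4.8, 9.5, 13.8, 17.2, 18.4, 18.9])

      popt, pcov = curve_fit(light_response, par_obs, a_obs, p0=[25.0, 0.05, 1.0])
      a_max, qy, r_d = popt
      print(f"A_max={a_max:.1f}, quantum yield={qy:.3f}, Rd={r_d:.2f}")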

  2. SIMULATION MODELLING AS A TOOL FOR PERFORMING AVAILABILITY AND SENSITIVITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    P.S. Kruger

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Simulation modelling is a general purpose tool that may be used to provide decision support in a number of application areas. It may be used to analyze, design or "optimize" manufacturing, materials handling, management, commercial and a wide variety of other systems.
    This paper will report on the design of a prototype decision support tool, based on a simulation model of a vehicle fleet availability problem. The primary purpose of the model is to serve as a tool for the evaluation of the availability of equipment under different conditions and to perform sensitivity analysis.

    AFRIKAANSE OPSOMMING: Simulation modelling is a general-purpose technique that can be used to provide decision support in a number of application areas. It may be used for the analysis, design or "optimisation" of manufacturing, materials handling, management, commercial and a wide variety of other systems.
    This paper reports on the development of a prototype decision support tool based on a simulation model of a vehicle fleet availability problem. The main objective of the model is to serve as an aid in evaluating the availability of equipment under different conditions, as well as in performing sensitivity analysis.

  3. A software tool for modification of human voxel models used for application in radiation protection

    International Nuclear Information System (INIS)

    Becker, Janine; Zankl, Maria; Petoussi-Henss, Nina

    2007-01-01

    This note describes a new software tool called 'VolumeChange' that was developed to modify the masses and locations of organs in virtual human voxel models. A voxel model is a three-dimensional representation of the human body in the form of an array of identification numbers arranged in slices, rows and columns. Each entry in this array represents a voxel; organs are represented by those voxels having the same identification number. With this tool, two human voxel models were adjusted to fit the reference organ masses of a male and a female adult, as defined by the International Commission on Radiological Protection (ICRP). The alteration of an existing voxel model is a complicated process that raises many problems which have to be solved. To resolve these intricacies in a straightforward way, a new software tool was developed and is presented here. When organs are modified, no piece of tissue, i.e. no voxel, may vanish, nor should an extra one appear; organs therefore cannot be modified without considering the neighbouring tissue. Thus, the principle of organ modification is based on the reassignment of voxels from one organ/tissue to another; actually deleting and adding voxels is only possible at the external surface, i.e. the skin. In the software tool described here, the modifications are done by semi-automatic routines but under human control. Because of the complexity of the matter, a skilled person has to validate that the applied changes to organs are anatomically reasonable. A graphical user interface was designed to provide a comfortable working process, and an adequate graphical display of the modified voxel model was developed. Single organs, organ complexes and even whole limbs can be edited with respect to volume, shape and location. (note)
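
    The reassignment principle can be sketched as follows (a conceptual illustration, not the VolumeChange code itself): to grow organ A into neighbouring tissue B, labels are only exchanged at the shared interface, so the total voxel count is preserved.

      import numpy as np
      from scipy.ndimage import binary_dilation

      def grow_organ(labels, organ_id, tissue_id, n_voxels):
          """Reassign up to n_voxels of tissue_id adjacent to organ_id."""
          organ = labels == organ_id
          # neighbouring-tissue voxels directly adjacent to the organ surface
          candidates = binary_dilation(organ) & (labels == tissue_id)
          idx = np.argwhere(candidates)[:n_voxels]
          labels[tuple(idx.T)] = organ_id
          return labels

      # Tiny 3D phantom: organ 1 embedded in tissue 2
      phantom = np.full((10, 10, 10), 2, dtype=np.int16)
      phantom[4:6, 4:6, 4:6] = 1
      print((phantom == 1).sum())             # 8 voxels before
      phantom = grow_organ(phantom, 1, 2, 12)
      print((phantom == 1).sum())             # 20 voxels after; total voxels unchanged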

  4. A Tool for Model-Based Generation of Scenario-driven Electric Power Load Profiles

    Science.gov (United States)

    Rozek, Matthew L.; Donahue, Kenneth M.; Ingham, Michel D.; Kaderka, Justin D.

    2015-01-01

    Power consumption during all phases of spacecraft flight is of great interest to the aerospace community. As a result, significant analysis effort is exerted to understand the rates of electrical energy generation and consumption under many operational scenarios of the system. Previously, no standard tool existed for creating and maintaining a power equipment list (PEL) of spacecraft components that consume power, and no standard tool existed for generating power load profiles based on this PEL information during mission design phases. This paper presents the Scenario Power Load Analysis Tool (SPLAT) as a model-based systems engineering tool aiming to solve those problems. SPLAT is a plugin for MagicDraw (No Magic, Inc.) that aids in creating and maintaining a PEL, and also generates a power and temporal variable constraint set, in Maple language syntax, based on specified operational scenarios. The constraint set can be solved in Maple to show electric load profiles (i.e. power consumption from loads over time). SPLAT creates these load profiles from three modeled inputs: 1) a list of system components and their respective power modes, 2) a decomposition hierarchy of the system into these components, and 3) the specification of at least one scenario, which consists of temporal constraints on component power modes. In order to demonstrate how this information is represented in a system model, a notional example of a spacecraft planetary flyby is introduced. This example is also used to explain the overall functionality of SPLAT, and how this is used to generate electric power load profiles. Lastly, a cursory review of the usage of SPLAT on the Cold Atom Laboratory project is presented to show how the tool was used in an actual space hardware design application.
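
    The core computation can be pictured with a small sketch (not SPLAT itself, and not its MagicDraw/Maple workflow): given a power equipment list of component modes and a scenario of timed mode changes, a piecewise-constant load profile follows directly. Component names and wattages are invented.

      pel = {  # power equipment list: component -> {mode: watts}
          "transponder": {"off": 0.0, "standby": 2.0, "transmit": 35.0},
          "camera":      {"off": 0.0, "imaging": 12.0},
      }

      # Scenario: (time in minutes, component, new mode), sorted by time
      scenario = [
          (0,  "transponder", "standby"),
          (0,  "camera",      "off"),
          (10, "camera",      "imaging"),
          (25, "transponder", "transmit"),
          (40, "camera",      "off"),
      ]

      def load_profile(pel, scenario, t_end):
          """Return (time, total watts) breakpoints for the scenario."""
          state, profile = {}, []
          for t, component, mode in scenario:
              state[component] = mode
              profile.append((t, sum(pel[c][m] for c, m in state.items())))
          profile.append((t_end, profile[-1][1]))
          return profile

      for t, watts in load_profile(pel, scenario, 60):
          print(f"t={t:3d} min  load={watts:5.1f} W")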

  5. Interdisciplinary semantic model for managing the design of a steam-assisted gravity drainage tooling system

    Directory of Open Access Journals (Sweden)

    Michael Leitch

    2018-01-01

    Full Text Available Complex engineering systems often require extensive coordination between different expert areas in order to avoid costly design iterations and rework. Cyber-physical system (CPS) engineering methods could provide valuable insights to help model these interactions and optimize the design of such systems. In this work, steam assisted gravity drainage (SAGD), a complex oil extraction process that requires deep understanding of several physical-chemical phenomena, is examined, and the complexities and interdependencies of the system are explored. Based on an established unified feature modeling scheme, a software modeling framework is proposed to manage the design process of the production tools used for SAGD oil extraction. Applying CPS methods to unify complex phenomena and engineering models, the proposed CPS model combines effective simulation with embedded knowledge of completion tooling design in order to optimize reservoir performance. The system design is expressed using graphical diagrams of the unified modelling language (UML) convention. To demonstrate the capability of this system, a distributed research group is described, and its activities are coordinated using the described CPS model.

  6. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    Full Text Available In this work, the DAE Tools modelling, simulation and optimisation software, its programming paradigms and its main features are presented. The current approaches to mathematical modelling, such as the use of modelling languages and general-purpose programming languages, are analysed. The common set of capabilities required by typical simulation software is discussed, and the shortcomings of the current approaches are recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for runtime model generation; (2) support for runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third-party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model-exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth-generation object-oriented general-purpose programming language such as Python, are discussed. The architecture and the software implementation details, as well as the types of problems that can be solved using DAE Tools software, are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as software as a service are demonstrated.

  7. A tool for urban soundscape evaluation applying Support Vector Machines for developing a soundscape classification model.

    Science.gov (United States)

    Torija, Antonio J; Ruiz, Diego P; Ramos-Ridao, Angel F

    2014-06-01

    To ensure appropriate soundscape management in urban environments, urban-planning authorities need a range of tools that enable such a task to be performed. An essential step in the management of urban areas from a sound standpoint is the evaluation of the soundscape in the area, and it has been widely acknowledged that a subjective and acoustical categorization of a soundscape is the first step in evaluating it, providing a basis for designing or adapting it to match people's expectations as well. Accordingly, this work proposes a model for the automatic classification of urban soundscapes based on underlying acoustical and perceptual criteria, intended to be used as a tool for comprehensive urban soundscape evaluation. Because of the great complexity associated with the problem, two machine learning techniques, Support Vector Machines (SVM) and Support Vector Machines trained with Sequential Minimal Optimization (SMO), are implemented to develop the classification model. The results indicate that the SMO model outperforms the SVM model in the specific task of soundscape classification. With the implementation of the SMO algorithm, the classification model achieves an outstanding performance (91.3% of instances correctly classified). © 2013 Elsevier B.V. All rights reserved.
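
    For orientation, the classification step might look like the following scikit-learn sketch; the paper used Weka-style SVM/SMO classifiers, and sklearn's SVC (built on libsvm, itself an SMO-type solver) merely stands in here. The feature matrix of acoustical/perceptual descriptors is simulated.

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 6))        # e.g. LAeq, loudness, sharpness, ...
      y = rng.integers(0, 3, size=200)     # e.g. quiet / traffic / crowd classes

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
      scores = cross_val_score(clf, X, y, cv=5)
      print(f"mean cross-validated accuracy: {scores.mean():.2%}")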

  8. KENO3D visualization tool for KENO V.a geometry models

    International Nuclear Information System (INIS)

    Bowman, S.M.; Horwedel, J.E.

    1999-01-01

    The Standardized Computer Analyses for Licensing Evaluations (SCALE) computer software system developed at Oak Ridge National Laboratory (ORNL) is widely used and accepted around the world for criticality safety analyses. SCALE includes the well-known KENO V.a three-dimensional Monte Carlo criticality computer code. Criticality safety analyses often require detailed modeling of complex geometries. Checking the accuracy of these models can be enhanced by effective visualization tools. To address this need, ORNL has recently developed a powerful state-of-the-art visualization tool called KENO3D that enables KENO V.a users to interactively display their three-dimensional geometry models. The interactive options include the following: (1) shaded or wireframe images; (2) standard views, such as top view, side view, front view, and isometric three-dimensional view; (3) rotating the model; (4) zooming in on selected locations; (5) selecting parts of the model to display; (6) editing colors and displaying legends; (7) displaying properties of any unit in the model; (8) creating cutaway views; (9) removing units from the model; and (10) printing images or saving them to common graphics formats

  9. CSML2SBML: a novel tool for converting quantitative biological pathway models from CSML into SBML.

    Science.gov (United States)

    Li, Chen; Nagasaki, Masao; Ikeda, Emi; Sekiya, Yayoi; Miyano, Satoru

    2014-07-01

    CSML and SBML are XML-based model definition standards developed with the aim of creating exchange formats for modeling, visualizing and simulating biological pathways. In this article we report the release of a format convertor for quantitative pathway models, namely CSML2SBML. It translates models encoded in CSML into SBML without loss of structural and kinetic information. The simulation and parameter estimation of the resulting SBML model can be carried out with the compliant tool CellDesigner for further analysis. The convertor is based on the standards CSML version 3.0 and SBML Level 2 Version 4. In our experiments, 11 out of 15 pathway models in the CSML model repository and 228 models in the Macrophage Pathway Knowledgebase (MACPAK) were successfully converted to SBML models. The consistency of the resulting models was validated by the libSBML Consistency Check of CellDesigner. Furthermore, the converted SBML model, assigned the kinetic parameters translated from the CSML model, reproduces with CellDesigner the same dynamics as the CSML original running on Cell Illustrator. CSML2SBML, along with instructions and examples for use, is available at http://csml2sbml.csml.org. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  10. Comprehensive Assessment of Models and Events based on Library tools (CAMEL)

    Science.gov (United States)

    Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.

    2017-12-01

    At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL and SWMF-SC+IH for the heliosphere; SWMF-GM, OpenGGCM, LFM and GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with the respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, and Heidke Skill Score for each model-data pair. The system then plots scores by event, and aggregated over all events, for all participating models and run settings. We build on past experience with model-data comparisons of magnetosphere and ionosphere model outputs in the GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). The framework can also be applied to solar-heliosphere as well as radiation belt models. CAMEL takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
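
    The categorical metrics named above follow standard definitions from a 2x2 contingency table (hits a, false alarms b, misses c, correct negatives d); the sketch below computes them for made-up counts.

      def skill_scores(a, b, c, d):
          n = a + b + c + d
          pod = a / (a + c)        # probability of detection
          pofd = b / (b + d)       # probability of false detection
          # Heidke skill score: accuracy relative to random chance
          expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
          hss = (a + d - expected) / (n - expected)
          return pod, pofd, hss

      pod, pofd, hss = skill_scores(a=42, b=8, c=11, d=139)
      print(f"POD={pod:.2f}  POFD={pofd:.2f}  HSS={hss:.2f}")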

  11. A modeling tool to support decision making in future hydropower development in Chile

    Science.gov (United States)

    Vicuna, S.; Hermansen, C.; Cerda, J. P.; Olivares, M. A.; Gomez, T. I.; Toha, E.; Poblete, D.; Mao, L.; Falvey, M. J.; Pliscoff, P.; Melo, O.; Lacy, S.; Peredo, M.; Marquet, P. A.; Maturana, J.; Gironas, J. A.

    2017-12-01

    Modeling tools support planning by providing transparent means to assess the outcomes of natural resources management alternatives within technical frameworks in the presence of conflicting objectives. Such tools, when employed to model different scenarios, complement discussion in a policy-making context. Examples of the practical use of this type of tool exist, such as Canadian public forest management, but they are not common, especially in the context of developing countries. We present a tool to support the selection from a portfolio of potential future hydropower projects in Chile. This tool, developed by a large team of researchers under the guidance of the Chilean Energy Ministry, is especially relevant in a context of evident regionalism, skepticism and changing societal values in a country that has achieved sustained growth alongside increased demands from society. The tool operates at the scale of a river reach (between 1 and 5 km long) on a domain that can be defined according to the scale needs of the related discussion; its application can vary from river basins to regions or other spatial configurations of interest. The tool addresses both the available hydropower potential and the existence (inferred or observed) of other ecological, social, cultural and productive characteristics of the territory which are valuable to society, and provides a means to evaluate their interaction. The occurrence of each of these valuable characteristics in the territory is measured by generating a presence-density score for each. Considering the level of constraint each characteristic imposes on hydropower development, the characteristics are weighted against each other and an aggregate score is computed. With this information, optimal trade-offs are computed between additional hydropower capacity and valuable local characteristics over the entire domain, using the classical 0-1 knapsack optimization algorithm. Various scenarios of different weightings and hydropower
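
    The optimization step mentioned above is the textbook 0-1 knapsack; a minimal dynamic-programming sketch with invented reach data (added capacity in MW, integer-scaled impact scores) is:

      def knapsack(capacities, impact_scores, impact_budget):
          """Return (best total capacity, chosen reach indices)."""
          n = len(capacities)
          best = [[0] * (impact_budget + 1) for _ in range(n + 1)]
          for i in range(1, n + 1):
              w, v = impact_scores[i - 1], capacities[i - 1]
              for b in range(impact_budget + 1):
                  best[i][b] = best[i - 1][b]
                  if w <= b:
                      best[i][b] = max(best[i][b], best[i - 1][b - w] + v)
          chosen, b = [], impact_budget          # backtrack the selection
          for i in range(n, 0, -1):
              if best[i][b] != best[i - 1][b]:
                  chosen.append(i - 1)
                  b -= impact_scores[i - 1]
          return best[n][impact_budget], chosen[::-1]

      cap, picks = knapsack([120, 80, 45, 200], [6, 3, 2, 9], impact_budget=11)
      print(cap, picks)   # 245 [0, 1, 2]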

  12. The Climate-Agriculture-Modeling and Decision Tool (CAMDT) for Climate Risk Management in Agriculture

    Science.gov (United States)

    Ines, A. V. M.; Han, E.; Baethgen, W.

    2017-12-01

    Advances in seasonal climate forecasts (SCFs) during the past decades have brought great potential to improve agricultural climate risk management associated with inter-annual climate variability. In spite of the popular use of crop simulation models in addressing climate risk problems, these models cannot readily take seasonal climate predictions issued in the format of tercile probabilities of the most likely rainfall categories (i.e., below-, near- and above-normal). When a skillful SCF is linked with crop simulation models, the climate information can be translated into actionable agronomic terms and thus better support strategic and tactical decisions. In other words, crop modeling connected with a given SCF allows "what-if" scenarios to be simulated with different crop choices or management practices, better informing decision makers. In this paper, we present a decision support tool, called CAMDT (Climate Agriculture Modeling and Decision Tool), which seamlessly integrates probabilistic SCFs with the DSSAT-CSM-Rice model to guide decision-makers in adopting appropriate crop and agricultural water management practices for given climatic conditions. The CAMDT has functionality to disaggregate a probabilistic SCF into daily weather realizations (using either a parametric or a non-parametric disaggregation method) and to run DSSAT-CSM-Rice with the disaggregated weather realizations. The convenient graphical user interface allows easy implementation of several "what-if" scenarios by non-technical users and visualizes the results of the scenario runs. In addition, the CAMDT translates crop model outputs into economic terms once the user provides the expected crop price and cost. The CAMDT is a practical tool for real-world applications, specifically for agricultural climate risk management in the Bicol region, Philippines, and is flexible enough to be adapted to other crops or regions of the world. CAMDT GitHub: https://github.com/Agro-Climate/CAMDT
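
    A rough sketch of the non-parametric disaggregation idea (all probabilities and analog years invented; CAMDT's actual implementation may differ): draw a rainfall category according to the tercile probabilities, then sample a historical analog season from that category as the daily weather realization.

      import numpy as np

      rng = np.random.default_rng(42)
      forecast = {"below": 0.15, "near": 0.35, "above": 0.50}  # tercile probabilities

      # Historical seasons grouped by observed rainfall tercile (analog years)
      analogs = {
          "below": [1987, 1991, 2004],
          "near":  [1989, 1996, 2001],
          "above": [1984, 1999, 2008],
      }

      def sample_realizations(n):
          categories = rng.choice(list(forecast), size=n, p=list(forecast.values()))
          return [(c, rng.choice(analogs[c])) for c in categories]

      for category, year in sample_realizations(5):
          print(f"category={category:5s}  analog year={year}")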

  13. Modeling tools for the assessment of microbiological risks during floods: a review

    Science.gov (United States)

    Collender, Philip; Yang, Wen; Stieglitz, Marc; Remais, Justin

    2015-04-01

    Floods are a major, recurring source of harm to global economies and public health. Projected increases in the frequency and intensity of heavy precipitation events under future climate change, coupled with continued urbanization in areas with high risk of floods, may exacerbate future impacts of flooding. Improved flood risk management is essential to support global development, poverty reduction and public health, and is likely to be a crucial aspect of climate change adaptation. Importantly, floods can facilitate the transmission of waterborne pathogens by changing social conditions (overcrowding among displaced populations, interruption of public health services), imposing physical challenges to infrastructure (sewerage overflow, reduced capacity to treat drinking water), and altering the fate and transport of pathogens (transport into waterways from overland flow, resuspension of settled contaminants) during and after flood conditions. Hydrological and hydrodynamic models are capable of generating quantitative characterizations of the microbiological risks associated with flooding, while accounting for these diverse and at times competing physical and biological processes. Despite a few applications of such models to the quantification of microbiological risks associated with floods, there exists limited guidance as to the relative capabilities, and limitations, of existing modeling platforms when used for this purpose. Here, we review 17 commonly used flood and water quality modeling tools that have demonstrated or implicit capabilities of mechanistically representing and quantifying microbial risk during flood conditions. We compare models with respect to their capabilities of generating outputs that describe physical and microbial conditions during floods, such as the concentration or load of non-cohesive sediments or pathogens, and the dynamics of high-flow conditions. Recommendations are presented for the application of specific modeling tools for assessing microbiological risks during floods.

  14. Scale models: A proven cost-effective tool for outage planning

    Energy Technology Data Exchange (ETDEWEB)

    Lee, R. [Commonwealth Edison Co., Morris, IL (United States)]; Segroves, R. [Sargent & Lundy, Chicago, IL (United States)]

    1995-03-01

    As generation costs for operating nuclear stations have risen, more nuclear utilities have initiated efforts to improve cost effectiveness. Nuclear plant owners are also being challenged by lower radiation exposure limits and newly revised radiation protection regulations (10 CFR 20), which place further stress on their budgets. As source term reduction activities continue to lower radiation fields, reducing the amount of time spent in radiation fields becomes one of the most cost-effective ways of reducing radiation exposure. An effective approach for minimizing time spent in radiation areas is to use a physical scale model for worker orientation, planning, and monitoring of maintenance, modifications, and outage activities. To meet the challenge of continued reduction in annual cumulative radiation exposures, new cost-effective tools are required. One field-tested and proven tool is the physical scale model.

  15. Surface Modeling of Workpiece and Tool Trajectory Planning for Spray Painting Robot

    Science.gov (United States)

    Tang, Yang; Chen, Wei

    2015-01-01

    Automated tool trajectory planning for spray-painting robots is still a challenging problem, especially for a large free-form surface. A grid approximation of a free-form surface is adopted in CAD modeling in this paper. A free-form surface model is approximated by a set of flat patches. We describe here an efficient and flexible tool trajectory optimization scheme using T-Bézier curves calculated in a new way from trigonometrical bases. The distance between the spray gun and the free-form surface along the normal vector is varied. Automotive body parts, which are large free-form surfaces, are used to test the scheme. The experimental results show that the trajectory planning algorithm achieves satisfactory performance. This algorithm can also be extended to other applications. PMID:25993663

  16. Transformation of Baumgarten's aesthetics into a tool for analysing works and for modelling

    DEFF Research Database (Denmark)

    Thomsen, Bente Dahl

    2006-01-01

      Abstract: Is this the best form, or does it need further work? The aesthetic object does not possess perfect qualities; but how do I proceed with the form? These are questions that all modellers ask themselves at some point, and with which they can grapple for days, even weeks, before the inspiration to deliver the form finally presents itself. This was the point of departure for our plan to devise a tool for analysing works and for the practical development of forms. The tool is a set of cards with suggestions for investigations that may assist the modeller in identifying the weaknesses of a form, or in becoming convinced of its strengths. The cards also contain aesthetic reflections that may serve as inspiration in the development of the form.

  17. Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education

    Science.gov (United States)

    Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki

    The current paper presents browser-based, multimedia-rich software tools and an e-learning curriculum to support the design and modeling process of power electronics circuits and to explain sometimes rather sophisticated phenomena. Two projects are discussed. The so-called Inetele project is financed by the Leonardo da Vinci programme of the European Union (EU). It is a collaborative project between numerous EU universities and institutes to develop a state-of-the-art curriculum in Electrical Engineering. Another cooperative project, with the participation of Japanese, European and Australian institutes, focuses especially on developing e-learning curriculum and interactive design and modeling tools, and furthermore on the development of a virtual laboratory. Snapshots from these two projects are presented.

  18. Finite Element Modelling of the effect of tool rake angle on tool temperature and cutting force during high speed machining of AISI 4340 steel

    Science.gov (United States)

    Sulaiman, S.; Roshan, A.; Ariffin, M. K. A.

    2013-12-01

    In this paper, a Finite Element Method (FEM) simulation based on the ABAQUS explicit software and the Johnson-Cook material model was used to simulate cutting force and tool temperature during high speed machining (HSM) of AISI 4340 steel. In this simulation work, tool rake angles ranging from 0° to 20° and cutting speeds between 300 and 550 m/min were investigated. The purpose of this simulation analysis was to find the optimum tool rake angle, at which the cutting force is smallest and the tool temperature lowest during high speed machining. Cutting forces were found to decrease as the rake angle increased in the positive direction. The optimum rake angle was observed to lie between 10° and 18°, where the cutting force decreased by about 20% for all simulated cutting speeds. In addition, increasing the tool rake angle beyond its optimum value had a negative influence on the tool's performance and led to an increase in cutting temperature. The results give a better understanding of cutting tool design for high speed machining processes.
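
    For reference, the Johnson-Cook flow stress relation on which such simulations rely has the standard form below, where A, B, C, n and m are empirical constants fitted to the material (here AISI 4340) and \dot{\varepsilon}_0 is the reference strain rate:

      \sigma = \left(A + B\,\varepsilon^{n}\right)
               \left(1 + C \ln \frac{\dot{\varepsilon}}{\dot{\varepsilon}_0}\right)
               \left(1 - (T^{*})^{m}\right),
      \qquad
      T^{*} = \frac{T - T_{\mathrm{room}}}{T_{\mathrm{melt}} - T_{\mathrm{room}}}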

  19. Reducing the operational energy demand in buildings using building information modeling tools and sustainability approaches

    OpenAIRE

    Shoubi, Mojtaba Valinejad; Shoubi, Masoud Valinejad; Bagchi, Ashutosh; Barough, Azin Shakiba

    2015-01-01

    A sustainable building is constructed of materials that could decrease environmental impacts, such as energy usage, during the lifecycle of the building. Building Information Modeling (BIM) has been identified as an effective tool for building performance analysis virtually in the design stage. The main aims of this study were to assess various combinations of materials using BIM and identify alternative, sustainable solutions to reduce operational energy consumption. The amount of energy con...

  20. Graphical surface-vegetation-atmosphere transfer (SVAT) model as a pedagogical and research tool

    OpenAIRE

    Gillies, Robert R.; Carlson, Toby N.; Ripley, David A.J.

    1998-01-01

    This paper considers, by example, the use of a Surface-Vegetation-Atmosphere Transfer (SVAT), Atmospheric Boundary Layer (ABL) model designed as a pedagogical tool. The goal of the computer software and the approach is to improve the efficiency and effectiveness of communicating often complex and mathematically based disciplines (e.g., micrometeorology, land surface processes) to the non-specialist interested in studying problems involving interactions between vegetation and the atmosphere and,...

  1. Army Sustainability Modelling Analysis and Reporting Tool Phase 1: User Manual and Results Interpretation Guide

    Science.gov (United States)

    2009-11-01

    ...Hardening the Army (HTA) proposed force structure. Following this work, the Director General Preparedness and Plans – Army (DGPP-A) approached DSTO to... ...that the different elements of the results for the corps have been identified, we can turn our attention to what the results say about the...

  2. Teaching Integrated Scope-Cost Methods with Model-based Tools

    OpenAIRE

    Peterson, Forest; Fischer, Martin; Wingate, Thomas; Seppänen, Olli; Tutti, Tomi; See, Richard

    2009-01-01

    The purpose of this paper is to outline the teaching of integrated scope-cost methods in a course on fabrication and construction planning using model-based tools. Through project-based active discovery using project documents, students create an integrated takeoff, schedule and cost estimate. The goal is to illustrate the processes and the interrelation between the professions required to effectively obtain the scope, schedule and cost of a proposed project. Students who are provided with a scope-time-cost ...

  3. Model-based fault diagnosis techniques design schemes, algorithms, and tools

    CERN Document Server

    Ding, Steven

    2008-01-01

    The objective of this book is to introduce basic model-based FDI schemes, advanced analysis and design algorithms, and the needed mathematical and control theory tools at a level suitable for graduate students and researchers as well as for engineers. It is a textbook with extensive examples and references. Most methods are given in the form of an algorithm that enables direct implementation in a program. Comparisons among different methods are included where possible.

  4. Towards a Tool-Supported Quality Model for Model-Driven Engineering

    OpenAIRE

    Mohagheghi, Parastoo

    2008-01-01

    This paper reviews definitions of model quality before introducing five properties of models that are important for building high-quality models. These are identified to be correctness, completeness, consistency, comprehensibility and confinement. We have earlier defined a quality model that separates intangible quality goals from tangible quality-carrying properties and practices that should be in place to support these properties. A part of that work was to define a metamodel for deve...

  5. Using explanatory models to derive simple tools for Advanced Life Support system studies - Crop Modelling

    Science.gov (United States)

    Cavazzoni, J.

    System-level analyses for Advanced Life Support (ALS) require mathematical models for various processes, such as biomass production and waste management, which would ideally be integrated into overall system models. Explanatory models (also referred to as mechanistic or process models) would provide the basis for a more robust system model, as these are based on an understanding of processes specific to ALS studies. However, integrating such models may not always be practicable because of their complexity, especially for initial system-level analyses where simple sub-models may be satisfactory. One way to address this is to capture important features of explanatory models in simple models that may be readily integrated for system-level analyses. In this paper, explanatory crop models were used to generate parameters and multi-variable polynomial equations for basic models that are suitable for estimating the direction and magnitude of daily changes in canopy gas-exchange, harvest index, and production scheduling due to off-nominal conditions for ALS system studies. The simplest variant of these models consists of only a few equations, and has been integrated into a top-level SIMULINK model for the Bioregenerative Planetary Life Support Systems Test Complex (BIO-Plex), a large-scale human-rated test facility under development at NASA Johnson Space Center. When included in systems studies, the simple crop models may help identify issues that need to be addressed using more detailed modeling studies and specific experiments. Similar modeling simplifications may also prove useful for other ALS sub-systems, as well as for Earth system applications.
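
    The surrogate idea can be sketched briefly (invented numbers; the paper's actual polynomials and variables differ): outputs pre-computed with the explanatory crop model are fitted with a low-order multi-variable polynomial that is cheap enough to embed in a system-level model.

      import numpy as np

      # Hypothetical (CO2, light) -> daily canopy gas-exchange samples produced
      # by the detailed explanatory model
      co2 = np.array([350, 350, 700, 700, 525, 525], dtype=float)
      light = np.array([300, 600, 300, 600, 450, 450], dtype=float)
      gas_exchange = np.array([18.0, 26.5, 24.0, 35.0, 26.0, 26.2])

      # Design matrix for a simple polynomial surrogate with an interaction term
      A = np.column_stack([np.ones_like(co2), co2, light, co2 * light])
      coef, *_ = np.linalg.lstsq(A, gas_exchange, rcond=None)

      def surrogate(c, l):
          return coef @ np.array([1.0, c, l, c * l])

      print(f"predicted gas exchange at 600 ppm, 500 umol: {surrogate(600, 500):.1f}")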

  6. An Innovative Interactive Modeling Tool to Analyze Scenario-Based Physician Workforce Supply and Demand

    Science.gov (United States)

    Gupta, Saurabh; Black-Schaffer, W. Stephen; Crawford, James M.; Gross, David; Karcher, Donald S.; Kaufman, Jill; Knapman, Doug; Prystowsky, Michael B.; Wheeler, Thomas M.; Bean, Sarah; Kumar, Paramhans; Sharma, Raghav; Chamoli, Vaibhav; Ghai, Vikrant; Gogia, Vineet; Weintraub, Sally; Cohen, Michael B.

    2015-01-01

    Effective physician workforce management requires that the various organizations comprising the House of Medicine be able to assess their current and future workforce supply. This information has direct relevance to funding of graduate medical education. We describe a dynamic modeling tool that examines how individual factors and practice variables can be used to measure and forecast the supply and demand for existing and new physician services. The system we describe, while built to analyze the pathologist workforce, is sufficiently broad and robust for use in any medical specialty. Our design provides a computer-based software model populated with data from surveys and best estimates by specialty experts about current and new activities in the scope of practice. The model describes the steps needed and data required for analysis of supply and demand. Our modeling tool allows educators and policy makers, in addition to physician specialty organizations, to assess how various factors may affect demand (and supply) of current and emerging services. Examples of factors evaluated include types of professional services (3 categories with 16 subcategories), service locations, elements related to the Patient Protection and Affordable Care Act, new technologies, aging population, and changing roles in capitated, value-based, and team-based systems of care. The model also helps identify where physicians in a given specialty will likely need to assume new roles, develop new expertise, and become more efficient in practice to accommodate new value-based payment models. PMID:28725751

  7. An Innovative Interactive Modeling Tool to Analyze Scenario-Based Physician Workforce Supply and Demand

    Directory of Open Access Journals (Sweden)

    Saurabh Gupta BPharm

    2015-10-01

    Full Text Available Effective physician workforce management requires that the various organizations comprising the House of Medicine be able to assess their current and future workforce supply. This information has direct relevance to funding of graduate medical education. We describe a dynamic modeling tool that examines how individual factors and practice variables can be used to measure and forecast the supply and demand for existing and new physician services. The system we describe, while built to analyze the pathologist workforce, is sufficiently broad and robust for use in any medical specialty. Our design provides a computer-based software model populated with data from surveys and best estimates by specialty experts about current and new activities in the scope of practice. The model describes the steps needed and data required for analysis of supply and demand. Our modeling tool allows educators and policy makers, in addition to physician specialty organizations, to assess how various factors may affect demand (and supply) of current and emerging services. Examples of factors evaluated include types of professional services (3 categories with 16 subcategories), service locations, elements related to the Patient Protection and Affordable Care Act, new technologies, aging population, and changing roles in capitated, value-based, and team-based systems of care. The model also helps identify where physicians in a given specialty will likely need to assume new roles, develop new expertise, and become more efficient in practice to accommodate new value-based payment models.

  8. Mathematical Modeling: A Tool for Optimization of Lipid Nanoparticle-Mediated Delivery of siRNA.

    Science.gov (United States)

    Mihaila, Radu; Ruhela, Dipali; Keough, Edward; Cherkaev, Elena; Chang, Silvia; Galinski, Beverly; Bartz, René; Brown, Duncan; Howell, Bonnie; Cunningham, James J

    2017-06-16

    Lipid nanoparticles (LNPs) have been used to successfully deliver small interfering RNAs (siRNAs) to target cells in both preclinical and clinical studies and currently are the leading systems for in vivo delivery. Here, we propose the use of an ordinary differential equation (ODE)-based model as a tool for optimizing LNP-mediated delivery of siRNAs. As a first step, we have used a combination of experimental and computational approaches to develop and validate a mathematical model that captures the critical features for efficient siRNA-LNP delivery in vitro. This model accurately predicts mRNA knockdown resulting from novel combinations of siRNAs and LNPs in vitro. As demonstrated, this model can be effectively used as a screening tool to select the most efficacious LNPs, which can then further be evaluated in vivo. The model serves as a starting point for the future development of next generation models capable of capturing the additional complexity of in vivo delivery. Copyright © 2017 Elena Cherkaev, Merck Sharp & Dohme Corp., a subsidiary of Merck & Co., Inc., Kenilworth, NJ USA. Published by Elsevier Inc. All rights reserved.
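
    Not the authors' model, but a generic compartmental sketch of the kind of ODE structure such a delivery model might use: LNP uptake, endosomal escape, and suppression of target mRNA by cytosolic siRNA; every rate constant below is invented.

      import numpy as np
      from scipy.integrate import solve_ivp

      k_uptake, k_escape, k_deg_sirna = 0.5, 0.05, 0.1      # 1/h, assumed
      k_syn_mrna, k_deg_mrna, k_knockdown = 1.0, 0.2, 0.8   # assumed

      def rhs(t, y):
          lnp_extra, sirna_endo, sirna_cyto, mrna = y
          d_lnp = -k_uptake * lnp_extra
          d_endo = k_uptake * lnp_extra - k_escape * sirna_endo
          d_cyto = k_escape * sirna_endo - k_deg_sirna * sirna_cyto
          d_mrna = k_syn_mrna - (k_deg_mrna + k_knockdown * sirna_cyto) * mrna
          return [d_lnp, d_endo, d_cyto, d_mrna]

      # initial state: all siRNA extracellular, mRNA at its untreated steady state
      sol = solve_ivp(rhs, (0, 48), [1.0, 0.0, 0.0, 5.0], dense_output=True)
      for ti in np.linspace(0, 48, 5):
          print(f"t={ti:4.0f} h  relative mRNA={sol.sol(ti)[3] / 5.0:.2f}")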

  9. BSim: an agent-based tool for modeling bacterial populations in systems and synthetic biology.

    Directory of Open Access Journals (Sweden)

    Thomas E Gorochowski

    Full Text Available Large-scale collective behaviors such as synchronization and coordination spontaneously arise in many bacterial populations. With systems biology attempting to understand these phenomena, and synthetic biology opening up the possibility of engineering them for our own benefit, there is growing interest in how bacterial populations are best modeled. Here we introduce BSim, a highly flexible agent-based computational tool for analyzing the relationships between single-cell dynamics and population level features. BSim includes reference implementations of many bacterial traits to enable the quick development of new models partially built from existing ones. Unlike existing modeling tools, BSim fully considers spatial aspects of a model allowing for the description of intricate micro-scale structures, enabling the modeling of bacterial behavior in more realistic three-dimensional, complex environments. The new opportunities that BSim opens are illustrated through several diverse examples covering: spatial multicellular computing, modeling complex environments, population dynamics of the lac operon, and the synchronization of genetic oscillators. BSim is open source software that is freely available from http://bsim-bccs.sf.net and distributed under the Open Source Initiative (OSI)-recognized MIT license. Developer documentation and a wide range of example simulations are also available from the website. BSim requires Java version 1.6 or higher.
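
    BSim itself is a Java framework; the Python toy below only illustrates the agent-based idea in miniature: each bacterium is an independent agent taking a biased random-walk step in three dimensions, and population-level spread emerges from the single-cell rule.

      import numpy as np

      rng = np.random.default_rng(1)

      class Bacterium:
          def __init__(self, position):
              self.position = np.asarray(position, dtype=float)

          def step(self, dt, speed, bias):
              # run-and-tumble caricature: random direction plus constant drift
              direction = rng.normal(size=3)
              direction /= np.linalg.norm(direction)
              self.position += (speed * direction + bias) * dt

      population = [Bacterium([0.0, 0.0, 0.0]) for _ in range(500)]
      for _ in range(100):                     # 100 time steps of dt = 0.1
          for cell in population:
              cell.step(dt=0.1, speed=2.0, bias=np.array([0.5, 0.0, 0.0]))

      mean_pos = np.mean([c.position for c in population], axis=0)
      print("mean displacement:", np.round(mean_pos, 2))   # drifts along +x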

  10. The mesoscale dispersion modeling system a simulation tool for development of an emergency response system

    International Nuclear Information System (INIS)

    Uliasz, M.

    1990-01-01

    The mesoscale dispersion modeling system is under continuous development. The included numerical models require further improvements and evaluation against data from meteorological and tracer field experiments. The system cannot be directly applied to real-time predictions. However, it seems to be a useful simulation tool for solving several problems related to planning the monitoring network and developing the emergency response system for a nuclear power plant located in a coastal area. The modeling system can also be applied to other environmental problems connected with air pollution dispersion in complex terrain. The presented numerical models are designed for use on personal computers and are relatively fast in comparison with similar mesoscale models developed on mainframe computers

  11. From Modelling to Execution of Enterprise Integration Scenarios: The GENIUS Tool

    Science.gov (United States)

    Scheibler, Thorsten; Leymann, Frank

    One of the predominant problems IT companies face today is Enterprise Application Integration (EAI). Most of the infrastructures built to tackle integration issues are proprietary because no standards exist for how to model, develop, and actually execute integration scenarios. EAI patterns are gaining importance for non-technical business users as a way to ease and harmonize the development of EAI scenarios. These patterns describe recurring EAI challenges and propose possible solutions in an abstract way. Therefore, one can use these patterns to describe enterprise architectures in a technology-neutral manner. However, patterns are documentation only, used by developers and systems architects to decide how to implement an integration scenario manually; they were not conceived as artefacts that can immediately be executed. This paper presents a tool supporting a method by which EAI patterns can be used to generate executable artefacts for various target platforms automatically, using a model-driven development approach, hence turning patterns into something executable. We introduce a continuous tool chain beginning at the design phase and ending with the execution of an integration solution in a completely automatic manner. For evaluation purposes we introduce a scenario demonstrating how the tool is utilized for modelling and actually executing an integration scenario.

  12. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  13. A remote sensing computer-assisted learning tool developed using the unified modeling language

    Science.gov (United States)

    Friedrich, J.; Karslioglu, M. O.

    The goal of this work has been to create an easy-to-use and simple-to-make learning tool for remote sensing at an introductory level. Many students struggle to comprehend what seems to be very basic knowledge of digital images, image processing and image arithmetic, for example. Because professional programs are generally too complex and overwhelming for beginners, and often not tailored to the specific needs of a course in terms of functionality, a computer-assisted learning (CAL) program was developed based on the unified modeling language (UML), the present standard for object-oriented (OO) system development. A major advantage of this approach is an easier transition from modeling to coding of such an application, provided modern UML tools are used. After introducing the constructed UML model, its implementation is briefly described, followed by a series of learning exercises. They illustrate how the resulting CAL tool supports students taking an introductory course in remote sensing at the authors' institution.

  14. A sensitivity driven meta-model optimisation tool for hydrological models

    Science.gov (United States)

    Oppel, Henning; Schumann, Andreas

    2017-04-01

    The calibration of rainfall-runoff models containing a high number of parameters can be done readily by the use of different calibration methods and algorithms. Monte Carlo methods, gradient-based search algorithms and others are well known and established in the hydrological sciences. Thus, calibrating a model for a desired application is not a challenging task; retaining regional comparability and process integrity, however, is a prevailing topic due to the equifinality problem. This set of issues mainly results from the overdetermination caused by the high number of parameters in rainfall-runoff models, where different parameters affect the same facet of model performance (i.e. runoff volume, variance and timing). In this study a calibration strategy is presented which considers model sensitivity as well as parameter interaction and different criteria of model performance. First, a range of valid values for each model parameter was defined and the individual effect on model performance within the defined parameter range was evaluated. With the gained knowledge, a meta-model was established that lumps different parameters affecting the same facet of model performance. Hereafter the parsimonious meta-model, in which each parameter is assigned to a nearly disjoint facet of model performance, is optimized. By retransformation of the lumped parameters to the original model, a parametrisation for the original model is obtained. An application of this routine to a set of watersheds in the eastern part of Germany demonstrates its benefits. Results of the meta-parametrised model are compared to parametrisations obtained from common calibration routines in a validation study and a process-oriented numerical experiment.
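
    The lumping-and-retransformation idea can be sketched in a few lines of code. The following Python fragment is a minimal, self-contained illustration under assumed parameter names, ranges and a linear lumping rule; it is not the authors' routine.

        import numpy as np

        # Toy setup: two parameters k1 and k2 both control the recession rate,
        # i.e. the same facet of performance, so they are lumped into a single
        # meta-parameter m in [0, 1]. Names and ranges are invented.
        ranges = {"k1": (1.0, 10.0), "k2": (5.0, 50.0)}
        t = np.linspace(0.0, 10.0, 50)
        observed = np.exp(-0.3 * t)                  # synthetic "observation"

        def retransform(m):
            """Map the meta-parameter back to original-model parameters by
            scaling each one within its valid range (linear rule assumed)."""
            return {k: lo + m * (hi - lo) for k, (lo, hi) in ranges.items()}

        def simulate(p):
            """Stand-in rainfall-runoff model: a recession governed by k1, k2."""
            return np.exp(-(1.0 / p["k1"] + 1.0 / p["k2"]) * t)

        def nse(sim):
            """Nash-Sutcliffe efficiency; 1 means a perfect fit."""
            return 1.0 - np.sum((sim - observed) ** 2) / np.sum((observed - observed.mean()) ** 2)

        # Calibrating one meta-parameter replaces a two-dimensional search.
        grid = np.linspace(0.0, 1.0, 1001)
        best = max(grid, key=lambda m: nse(simulate(retransform(m))))
        print(best, retransform(best))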

  15. The role of measurement and modelling of machine tools in improving product quality

    Directory of Open Access Journals (Sweden)

    Longstaff A.P.

    2013-01-01

    Full Text Available Manufacturing of high-quality components and assemblies is clearly recognised by industrialised nations as an important means of wealth generation. A "right first time" paradigm to producing finished components is the desirable goal to maximise economic benefits and reduce environmental impact. Such an ambition is only achievable through an accurate model of the machinery used to shape the finished article. In the first analysis, computer aided design (CAD) and computer aided manufacturing (CAM) can be used to produce an instruction list of three-dimensional coordinates and intervening tool paths to translate the intent of a design engineer into an unambiguous set of commands for a manufacturing machine. However, in order for the resultant manufacturing program to produce the desired output within the specified tolerance, the model of the machine has to be sufficiently accurate. In this paper, the spatial and temporal sources of error and various contemporary means of modelling them are discussed. Limitations and assumptions in the models are highlighted and an estimate of their impact is made. Measurement of machine tools plays a vital role in establishing the accuracy of a particular machine and calibrating its unique model, but is an often misunderstood and misapplied discipline. Typically, the individual errors of the machine will be quantified at a given moment in time, but without sufficient consideration either for the uncertainty of individual measurements or for a full appreciation of the complex interaction between each independently measured error. This paper draws on the concept of a "conformance zone", as specified in ISO 230-1:2012, to emphasise the need for a fuller understanding of the complex uncertainty-of-measurement model for a machine tool. Work towards closing the gap in this understanding is described and limitations are noted.

  16. MoManI: a tool to facilitate research, analysis, and teaching of computer models

    Science.gov (United States)

    Howells, Mark; Pelakauskas, Martynas; Almulla, Youssef; Tkaczyk, Alan H.; Zepeda, Eduardo

    2017-04-01

    Allocating limited resources efficiently is a task to which good planning and policy design aspire. This may be a non-trivial task. For example, the seventh Sustainable Development Goal (SDG) of Agenda 2030 is to provide access to affordable sustainable energy for all. On the one hand, energy is required to realise almost all other SDGs (a clinic requires electricity for fridges to store vaccines for maternal health, irrigated agriculture requires energy to pump water to crops in dry periods, etc.). On the other hand, the energy system is non-trivial: it requires the mapping of resources, their conversion into useable energy, and the machines we use to meet our needs. That requires new tools that draw from standard techniques and best-in-class models while allowing the analyst to develop new models. Thus we present the Model Management Infrastructure (MoManI). MoManI is used to develop, manage, run, and store input and results data for linear programming models. MoManI is a browser-based open source interface for systems modelling, available to various user audiences, from policy makers and planners through to academics. For example, we implement the Open Source energy Modelling System (OSeMOSYS) in MoManI. OSeMOSYS is a specialized energy model generator. A typical OSeMOSYS model would represent the current energy system of a country, region or city; in it, equations and constraints are specified and calibrated to a base year. From that base, future technologies and policy options are represented; scenarios are then designed and run. Efficient allocation of energy resources and expenditure on technology is calculated. Finally, results are visualized. At present this is done in relatively rigid interfaces or via (for some) cumbersome text files. Implementing and operating OSeMOSYS in MoManI shortens the learning curve and reduces the phobia associated with the complexity of computer modelling, thereby supporting effective capacity building activities. The novel

  17. Software tools for 3d modeling as a part of design and technology in primary school

    OpenAIRE

    Mihovec, Nastja

    2013-01-01

    There are numerous programs that enable 3D modeling. We can choose from various free programs or ones that must be paid for. Many designers and engineers use paid programs such as AutoCAD, Maya, ProEngineer, Cinema 4D, SolidWorks, etc. In their opinion these programs offer their users more than the free ones, mainly because of better modeling quality, tools, functions, ease of use, support, maintenance, etc. Free program developers try very hard to convince these users to reconsider,...

  18. Open Tools for Integrated Modelling to Understand SDG development - The OPTIMUS program

    Science.gov (United States)

    Howells, Mark; Zepeda, Eduardo; Rogner, H. Holger; Sanchez, Marco; Roehrl, Alexander; Cicowiez, Matrin; Mentis, Dimitris; Korkevelos, Alexandros; Taliotis, Constantinos; Broad, Oliver; Alfstad, Thomas

    2016-04-01

    electrification simulator; a national CLEW tool allows for the optimization of national-level integrated resource use, and Macro-CLEW presents the same while allowing for detailed economic-biophysical interactions. Finally, the open Model Management Infrastructure (MoManI) is presented, which allows for the rapid prototyping of additions to existing resource optimization tools, or of new ones. Collectively these tools provide insights into some fifteen of the SDGs and are made publicly available with support to governments and academic institutions.

  19. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database

    International Nuclear Information System (INIS)

    Quock, D.E.R.; Cianciarulo, M.B.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  20. Translating statistical species-habitat models to interactive decision support tools.

    Directory of Open Access Journals (Sweden)

    Lyndsie S Wszola

    Full Text Available Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences.

  1. Master Middle Ware: A Tool to Integrate Water Resources and Fish Population Dynamics Models

    Science.gov (United States)

    Yi, S.; Sandoval Solis, S.; Thompson, L. C.; Kilduff, D. P.

    2017-12-01

    Linking models that investigate separate components of ecosystem processes has the potential to unify messages regarding management decisions by evaluating potential trade-offs in a cohesive framework. This project aimed to improve the ability of riparian resource managers to forecast future water availability conditions and the resulting fish habitat suitability, in order to better inform their management decisions. To accomplish this goal, we developed a middleware tool capable of linking and overseeing the operations of two existing models: the water resource planning tool Water Evaluation And Planning (WEAP) and a habitat-based fish population dynamics model (WEAPhish). First, we designed the Master Middle Ware (MMW) software in Visual Basic for Applications® in one Excel® file that provided a familiar framework for both data input and output. Second, MMW was used to link and jointly operate WEAP and WEAPhish, using Visual Basic for Applications (VBA) macros to implement system-level calls that run the models. To demonstrate the utility of this approach, hydrological, biological, and middleware model components were developed for the Butte Creek basin. This tributary of the Sacramento River, California, is managed both for hydropower and for the persistence of a threatened population of spring-run Chinook salmon (Oncorhynchus tshawytscha). While we have demonstrated the use of MMW for a particular watershed and fish population, MMW can be customized for use with different rivers and fish populations, assuming basic data requirements are met. This model integration improves on ad hoc linkages for managing data transfer between software programs by providing a consistent, user-friendly, and familiar interface across different model implementations. Furthermore, the data-viewing capabilities of MMW facilitate the rapid interpretation of model results by hydrologists, fisheries biologists, and resource managers, in order to accelerate learning and management decision making.

  2. NOTATION TOOLS OF BUSINESS MODELING OF THE SERVICES ON REAL ESTATE MARKET

    Directory of Open Access Journals (Sweden)

    Mishlanova Marina Yur’evna

    2016-04-01

    Full Text Available The article is devoted to developing the main provisions of realtor business modeling. It presents the development of a notational complex involved in the design of the conceptual model, the formation of a reference model of the real estate business, and the basic rules for implementing the model. Important notational aspects of the proposed model are highlighted. The functional orientation of the real estate business towards rendering services reflects a functional approach to business modeling. To support assessment of the offered services, a nested model of the object is proposed. A reasoned functional approach using object-based elements allows optimization of the business modeling processes and assessment of the results. The article discusses functional modeling of business focused on results. Synchronizing the functional model with the models of business processes and the sub-models of objects, in particular the model of the business result, contributes to improving the notation tools. The article presents the adaptation of the business-model template to the conditions of realtor activity. The proposed reference model specifies a logical scheme of decomposition of the activity, which separates economic, social and other values. Services are decomposed into functional groups, with account for individual values and functional modules: buying and selling real estate; mortgages and loans; rent of residential and commercial property; independent evaluation of real estate; and consultations concerning real estate transactions. Focusing on the results of business processes and the performance standards of realtor organizations, a transitional notation for evaluating the efficiency of business performance is developed. The simplest method of feedback for assessing customer satisfaction and, consequently, system efficiency is offered.

  3. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment, put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve differential equation systems describing transport and decay of radionuclides. It is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results; exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation-solving capability is derived from the Matlab/Simulink software environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these software packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost-efficient and easy-to-use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and through the use of numerical equation-solving routines in Matlab. To verify Tensit's numerical correctness, the biosphere modules for dose assessment used in the earlier safety assessment project SR 97 were implemented. Results acquired for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both against another simulation tool named AMBER and against the international PSACOIN Level 1B test case. This report documents the models used for verification, with equations and parameter values, so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the

  4. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    International Nuclear Information System (INIS)

    Staub, Florian; Athron, Peter; Basso, Lorenzo; Goodsell, Mark D.; Harries, Dylan; Krauss, Manuel E.; Nickel, Kilian; Opferkuch, Toby; Ubaldi, Lorenzo; Vicente, Avelino; Voigt, Alexander

    2016-01-01

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model. (orig.)

  5. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    Energy Technology Data Exchange (ETDEWEB)

    Staub, Florian [CERN, Geneva (Switzerland). Theoretical Physics Dept.; Athron, Peter [Monash Univ., Melbourne (Australia). ARC Center of Excellence for Particle Physics at the Terascale; Basso, Lorenzo [Aix-Marseille Univ., CNRS-IN2P3, UMR 7346 (France). CPPM; and others

    2016-02-15

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model.

  6. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: a library design module, a model construction module, a simulation module, and an experimentation and analysis module. The library design module supports the building of library knowledge, including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability to compare log files.
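
    The event-driven execution described here, which runs until the event queue empties, follows the classic discrete-event loop. The Python sketch below shows that generic loop with an invented two-event example; it is only a conceptual illustration, not the patented tool's implementation.

        import heapq

        # Generic discrete-event loop: a priority queue of (time, name, action).
        queue = []

        def schedule(time, name, action):
            heapq.heappush(queue, (time, name, action))

        # Toy continuous behavior discretized as invocations with time delays:
        # opening a valve schedules a "tank_full" effect 2.5 time units later.
        schedule(0.0, "open_valve", lambda t: schedule(t + 2.5, "tank_full", lambda t2: None))
        schedule(1.0, "log_state", lambda t: None)

        clock = 0.0
        while queue:                       # run until the event queue is emptied
            clock, name, action = heapq.heappop(queue)
            print(f"t={clock:.1f}: {name}")
            action(clock)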

  7. MESSI: An engineering tool for conceptual hydrological modeling using SUPERFLEX, MOSCEM and GLUE

    Science.gov (United States)

    van Osnabrugge, Bart; Mondeel, Herman; Hrachowitz, Markus

    2014-05-01

    The progress of hydrology as a science is mentioned quite often, and indeed much theoretical research is devoted to improving hydrological rainfall-runoff (RR) modeling. At the same time, however, engineering practice lags behind this scientific progress by at least a couple of years. This research investigates how the gap can be closed. An engineering tool called Model Ensemble, Sampling, Selection and Interpretation (MESSI) is developed and tested in an engineering environment. The tool uses the model hypothesis framework SUPERFLEX to build an a priori ensemble of possible model structures for the case at hand. Then the Multi-objective Shuffled Complex Evolution Metropolis algorithm (MOSCEM) is used for sampling of the parameter space. Finally, the Generalized Likelihood Uncertainty Estimation (GLUE) methodology is used to select a posterior ensemble, which is then interpreted using the Pareto front and the generated uncertainty bounds. During the trial it was found that MESSI provides a plug-and-play method which is able to provide catchment process information, a mathematically optimal model and a measure of uncertainty based on the observations. Most importantly, it is shown that with little effort new techniques can be brought directly to the engineering arena, improving the interaction between the scientist and the engineer.
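
    The GLUE step of such a tool chain is easy to sketch. The fragment below is a minimal, self-contained Python illustration of behavioural-ensemble selection with an invented toy model, sampling scheme and likelihood threshold; it is not the MESSI code, and MOSCEM would sample the parameter space far more efficiently than the plain Monte Carlo used here.

        import numpy as np

        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 10.0, 60)
        obs = 5.0 * np.exp(-0.4 * t)             # synthetic "observed" hydrograph

        def model(k):
            """Toy single-reservoir model: linear recession with rate k."""
            return 5.0 * np.exp(-k * t)

        def nse(sim):
            return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        # Monte Carlo sampling of the parameter space.
        k_samples = rng.uniform(0.05, 1.0, 5000)
        scores = np.array([nse(model(k)) for k in k_samples])

        # GLUE: keep only "behavioural" parameter sets above a likelihood threshold.
        behavioural = k_samples[scores > 0.8]
        sims = np.array([model(k) for k in behavioural])

        # Posterior ensemble -> uncertainty bounds at each time step.
        lower, upper = np.percentile(sims, [5, 95], axis=0)
        print(len(behavioural), lower[:3], upper[:3])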

  8. SModelS: A Tool for Making Systematic Use of Simplified Models Results

    Science.gov (United States)

    Waltenberger, Wolfgang; SModelS Group

    2016-10-01

    We present an automated software tool, "SModelS", to systematically confront theories Beyond the Standard Model (BSM) with experimental data. The tool consists of a general procedure to decompose such BSM theories into their Simplified Models Spectra (SMS). In addition, SModelS features a database containing the majority of the published SMS results of CMS and ATLAS. These results consist of the 95% confidence level upper limits on signal production cross sections. The two components together allow us to quickly confront any BSM model with LHC results. As a showcase example we briefly discuss an application of our procedure to a specific supersymmetric model. One of our ongoing efforts is to extend the framework to also include efficiency maps produced by the experimental collaborations, by phenomenological groups, or possibly by ourselves. While the current implementation can handle null results only, our ultimate goal is to build the Next Standard Model in a bottom-up fashion from both negative and positive results of several experiments. The implementation is open source, written in Python, and available from http://smodels.hephy.at.
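
    The core confrontation logic amounts to comparing a predicted signal cross section against the published 95% confidence level upper limit for the matching simplified-model topology. The Python sketch below illustrates the idea with invented numbers and a hypothetical two-entry database; it does not reproduce the actual SModelS API.

        # Hypothetical upper-limit database: (topology, mass point in GeV) -> pb.
        upper_limits = {
            ("T1", (800.0, 100.0)): 0.021,   # invented value
            ("T2", (600.0, 150.0)): 0.054,   # invented value
        }

        def r_value(topology, masses, predicted_xsec_pb):
            """r = predicted cross section / experimental upper limit.
            r >= 1 means the model point is excluded at the 95% CL."""
            return predicted_xsec_pb / upper_limits[(topology, masses)]

        # A decomposed BSM point contributing to two topologies:
        for topo, masses, xsec in [("T1", (800.0, 100.0), 0.030),
                                   ("T2", (600.0, 150.0), 0.010)]:
            r = r_value(topo, masses, xsec)
            print(topo, "excluded" if r >= 1.0 else "allowed", round(r, 2))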

  9. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    Science.gov (United States)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important part of validating planned spacecraft activities, is currently carried out using a time-consuming process of mission-to-mission model implementation and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing the time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To satisfy this, a number of tree and graph visualization tools were researched, and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  10. ARCHITECTURAL FORM CREATION IN THE DESIGN STUDIO: PHYSICAL MODELING AS AN EFFECTIVE DESIGN TOOL

    Directory of Open Access Journals (Sweden)

    Wael Abdelhameed

    2011-11-01

    Full Text Available This research paper attempts to shed more light on an area of the design studio that concerns the use of physical modeling as a design medium in architectural form creation. An experiment was carried out during an architectural design studio in order not only to investigate physical modeling as a tool of form creation but also to improve the visual design thinking that students employ while using this manual tool. To achieve the research objective, a method was proposed and applied to track form creation processes, based upon three types of operation, namely: sketching transformations, divergent physical-modeling transformations, and convergent physical-modeling transformations. The method helps record the innovative transitions of form during conceptual designing in a simple way. Investigating form creation processes and the activities associated with visual design thinking enables the research to reach general conclusions about the role of physical modeling in the conceptual phase of designing, and specific conclusions about the methods used in this architectural design studio experiment.

  11. Artificial neural networks: an efficient tool for modelling and optimization of biofuel production (a mini review)

    International Nuclear Information System (INIS)

    Sewsynker-Sukai, Yeshona; Faloye, Funmilayo; Kana, Evariste Bosco Gueguim

    2016-01-01

    In view of the looming energy crisis resulting from depleting fossil fuel resources and environmental concerns over greenhouse gas emissions, the need for sustainable energy sources has secured global attention. Research is currently focused on renewable sources of energy due to their availability and environmental friendliness. Biofuel production, like other bioprocesses, is controlled by several process parameters, including pH, temperature and substrate concentration; however, the improvement of biofuel production requires a robust process model that accurately relates the effect of input variables to the process output. Artificial neural networks (ANNs) have emerged as a tool for modelling complex, non-linear processes. ANNs are applied in the prediction of various processes; they are useful for virtual experimentation and can potentially enhance bioprocess research and development. In this study, recent findings on the application of ANNs to the modelling and optimization of biohydrogen, biogas, biodiesel, microbial fuel cell technology and bioethanol are reviewed. In addition, comparative studies on the modelling efficiency of ANNs and other techniques such as response surface methodology are briefly discussed. The review highlights the efficiency of ANNs as a modelling and optimization tool in biofuel process development.
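
    As an illustration of the class of model the review surveys, the fragment below fits a small feed-forward ANN mapping three process parameters (pH, temperature, substrate concentration) to a product yield. The data are synthetic and the architecture is an arbitrary choice, not one taken from the reviewed studies.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        n = 400
        X = np.column_stack([
            rng.uniform(4.0, 8.0, n),     # pH
            rng.uniform(25.0, 40.0, n),   # temperature [degrees C]
            rng.uniform(5.0, 50.0, n),    # substrate concentration [g/L]
        ])
        # Synthetic yield with an optimum around pH 6 and 35 degrees C.
        y = (100.0 - 8.0 * (X[:, 0] - 6.0) ** 2 - 0.5 * (X[:, 1] - 35.0) ** 2
             + 0.8 * X[:, 2] + rng.normal(0.0, 2.0, n))

        scaler = StandardScaler().fit(X)
        ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
        ann.fit(scaler.transform(X), y)

        # Virtual experiment: predict the yield for an untested operating point.
        print(ann.predict(scaler.transform([[6.2, 34.0, 30.0]])))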

  12. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    Energy Technology Data Exchange (ETDEWEB)

    Staub, Florian [CERN, Theoretical Physics Department, Geneva (Switzerland); Athron, Peter [Monash University, ARC Centre of Excellence for Particle Physics at the Terascale, School of Physics, Melbourne, VIC (Australia); Basso, Lorenzo [CPPM, Aix-Marseille Universite, CNRS-IN2P3, UMR 7346, Marseille Cedex 9 (France); Goodsell, Mark D. [Sorbonne Universites, LPTHE, UMR 7589, CNRS and Universite Pierre et Marie Curie, Paris Cedex 05 (France); Harries, Dylan [The University of Adelaide, Department of Physics, ARC Centre of Excellence for Particle Physics at the Terascale, Adelaide, SA (Australia); Krauss, Manuel E.; Nickel, Kilian; Opferkuch, Toby [Bethe Center for Theoretical Physics and Physikalisches Institut der Universitaet Bonn, Bonn (Germany); Ubaldi, Lorenzo [Tel-Aviv University, Raymond and Beverly Sackler School of Physics and Astronomy, Tel Aviv (Israel); Vicente, Avelino [Instituto de Fisica Corpuscular (CSIC-Universitat de Valencia), Valencia (Spain); Voigt, Alexander [Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany)

    2016-09-15

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model. (orig.)

  13. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    International Nuclear Information System (INIS)

    Staub, Florian; Athron, Peter; Basso, Lorenzo

    2016-02-01

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model.

  14. Regulatory odour model development: Survey of modelling tools and datasets with focus on building effects

    DEFF Research Database (Denmark)

    Olesen, H. R.; Løfstrøm, P.; Berkowicz, R.

    A project within the framework of a larger research programme, Action Plan for the Aquatic Environment III (VMP III), aims at improving an atmospheric dispersion model (OML). The OML model is used for regulatory applications in Denmark, and it is the candidate model to be used also in the future in relation to odour problems due to animal farming. However, the model needs certain improvements and validation in order to be fully suited for that purpose. The report represents a survey of existing literature, models and data sets. It includes a brief overview of the state-of-the-art of atmospheric dispersion models for estimating local concentration levels in general. However, the report focuses on some particular issues which are relevant for subsequent work on odour due to animal production. An issue of primary concern is the effect that buildings (stables) have on flow and dispersion. The handling

  15. DYNAMO-HIA--a Dynamic Modeling tool for generic Health Impact Assessments.

    Directory of Open Access Journals (Sweden)

    Stefan K Lhachimi

    Full Text Available BACKGROUND: Currently, no standard tool is publicly available that allows researchers or policy-makers to quantify the impact of policies using epidemiological evidence within the causal framework of Health Impact Assessment (HIA). A standard tool should comply with three technical criteria (real-life population, dynamic projection, explicit risk-factor states) and three usability criteria (modest data requirements, rich model output, general accessibility) to be useful in the applied setting of HIA. With DYNAMO-HIA (Dynamic Modeling for Health Impact Assessment), we introduce such a generic software tool specifically designed to facilitate quantification in the assessment of the health impacts of policies. METHODS AND RESULTS: DYNAMO-HIA quantifies the impact of user-specified risk-factor changes on multiple diseases and in turn on overall population health, comparing one reference scenario with one or more intervention scenarios. The Markov-based modeling approach allows for explicit risk-factor states and simulation of a real-life population. A built-in parameter estimation module ensures that only standard population-level epidemiological evidence is required, i.e. data on incidence, prevalence, relative risks, and mortality. DYNAMO-HIA provides a rich output of summary measures--e.g. life expectancy and disease-free life expectancy--and detailed data--e.g. prevalences and mortality/survival rates--by age, sex, and risk-factor status over time. DYNAMO-HIA is controlled via a graphical user interface and is publicly available from the internet, ensuring general accessibility. We illustrate the use of DYNAMO-HIA with two example applications: a policy causing an overall increase in alcohol consumption and quantifying the disease burden of smoking. CONCLUSION: By combining modest data needs with general accessibility and user friendliness within the causal framework of HIA, DYNAMO-HIA is a potential standard tool for health impact assessment based
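
    The Markov-based projection underlying such a tool can be pictured as a cohort moving through explicit risk-factor and disease states with transition probabilities built from incidence and mortality rates. The miniature Python example below uses a three-state chain (healthy, diseased, dead) with invented annual rates; it is a conceptual illustration of this class of model, not the DYNAMO-HIA implementation.

        import numpy as np

        # States: 0 = healthy, 1 = diseased, 2 = dead; annual rates invented.
        mort_healthy, mort_diseased = 0.01, 0.05

        def project(incidence, years=30):
            """Markov projection of a closed cohort with explicit disease states."""
            P = np.array([
                [1 - incidence - mort_healthy, incidence, mort_healthy],
                [0.0, 1 - mort_diseased, mort_diseased],
                [0.0, 0.0, 1.0],
            ])
            state = np.array([1.0, 0.0, 0.0])   # the whole cohort starts healthy
            for _ in range(years):
                state = state @ P
            return state

        # Reference scenario vs. an intervention that halves disease incidence.
        ref, policy = project(0.02), project(0.01)
        print("alive after 30 years:", 1.0 - ref[2], "vs", 1.0 - policy[2])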

  16. Development of modelling method selection tool for health services management: from problem structuring methods to modelling and simulation methods.

    Science.gov (United States)

    Jun, Gyuchan T; Morris, Zoe; Eldabi, Tillal; Harper, Paul; Naseer, Aisha; Patel, Brijesh; Clarkson, John P

    2011-05-19

    There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, current use is limited, and answers to questions such as what methods to use, and when, remain somewhat underdeveloped. The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them, or better commission relevant modelling and simulation work. This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and the four input resources required (time, money, knowledge and data). The characterisation is presented in matrix form to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills, let alone money and time, are scarce. A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision making processes. In particular it addresses the issues of which method is most appropriate to which specific health services management problem, what the user might expect to obtain from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on method comparison and selection.

  17. Numerical modelling of tools steel hardening. A thermal phenomena and phase transformations

    Directory of Open Access Journals (Sweden)

    T. Domański

    2010-01-01

    Full Text Available In this paper a model of tool steel hardening that takes into consideration thermal phenomena and phase transformations in the solid state is presented. In the modelling of thermal phenomena the heat transfer equation has been solved by the Finite Element Method. The continuous heating (CHT) and continuous cooling (CCT) diagrams of the considered steel are used in the model of phase transformations. The fractions of phases transformed during continuous heating (austenite) and continuous cooling (pearlite or bainite) are described in the model by the Johnson-Mehl-Avrami formula. For heating rates >100 K/s a modified Koistinen-Marburger equation is used. The modified Koistinen-Marburger equation also identifies the fraction of martensite formed.
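
    For reference, the standard forms of the two kinetic relations named in the abstract are reproduced below; the constants b, n and k are material-specific and are not taken from the paper:

        % Johnson-Mehl-Avrami: fraction X transformed after time t
        X(t) = 1 - \exp\!\left(-b\, t^{\,n}\right)

        % Koistinen-Marburger: martensite fraction below the start temperature M_s
        X_M(T) = 1 - \exp\!\left(-k\,(M_s - T)\right), \qquad k \approx 0.011\ \mathrm{K}^{-1}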

  18. A model of integration among prediction tools: applied study to road freight transportation

    Directory of Open Access Journals (Sweden)

    Henrique Dias Blois

    Full Text Available This study developed a scenario analysis model which integrates decision-making tools for investments: prospective scenarios (Grumbach Method) and system dynamics (hard modeling), with an innovative multivariate analysis by experts. It was designed through scenario analysis and simulation, showing which events have the greatest impact on the object of study and highlighting the actions that could redirect the future of the analyzed system. Moreover, predictions can be developed from the generated scenarios. The model was validated empirically with road freight transport data from the state of Rio Grande do Sul, Brazil. The results showed that the model contributes to the analysis of investment because it identifies the probabilities of events that impact decision making and identifies priorities for action, reducing future uncertainties. Moreover, it allows an interdisciplinary discussion correlating different areas of knowledge, which is fundamental when greater consistency in creating scenarios is desired.

  19. Mathematical modelling of migration: A suitable tool for the enforcement authorities?

    DEFF Research Database (Denmark)

    Petersen, Jens Højslev; Trier, Xenia Thorsager; Fabech, B.

    2005-01-01

    A few years ago, it became accepted that the plastics industry could use migration modelling for compliance testing. When a calculation confirms that the migration of a compound from a plastic material or article is below the specific migration limit, this is considered sufficient documentation... The study investigated the possibilities of implementing migration-modelling software as a tool in official food control and possibly in improving the own-check programmes of Danish plastic-converting plants. Food inspectors from nine regional food control centres initially attended a training course in the use of a commercial modelling... A common reason was a lack of information from those in the raw material supply chain who considered their products protected by commercial confidentiality. In general, the food inspectors were in favour of using migration modelling for future control visits.

  20. An improved model for the oPtImal Measurement Probes Allocation tool

    International Nuclear Information System (INIS)

    Sterle, C.; Neto, A.C.; De Tommasi, G.

    2015-01-01

    Highlights: • The problem of optimally allocating the probes of a diagnostic system is tackled. • The problem is decomposed into two consecutive optimization problems. • Two original ILP models are proposed and sequentially solved to optimality. • The proposed ILP models improve and extend previous work in the literature. • Real-size instances have been solved to optimality with very low computation time. - Abstract: The oPtImal Measurement Probes Allocation (PIMPA) tool has recently been proposed in [1] to maximize the reliability of a tokamak diagnostic system against the failure of one or more of the processing nodes. PIMPA is based on the solution of integer linear programming (ILP) problems, and it minimizes the effect of the failure of a data acquisition component. The first formulation of the PIMPA model did not support the concept of individual slots. This work presents an improved ILP model that addresses the above-mentioned problem by taking into account all the individual probes.
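
    PIMPA's actual ILP formulation is given in [1]. As a generic illustration of this kind of model, the Python/PuLP sketch below balances probes across acquisition nodes so that the worst-case number of probes lost to a single node failure is minimized; the instance sizes and the capacity constraint are invented.

        import pulp

        # Toy instance: 6 probes, 3 acquisition nodes, each node holds <= 3 probes.
        probes, nodes, capacity = range(6), range(3), 3

        prob = pulp.LpProblem("probe_allocation", pulp.LpMinimize)
        x = pulp.LpVariable.dicts("x", (probes, nodes), cat="Binary")
        worst = pulp.LpVariable("worst_case_loss", lowBound=0)

        for p in probes:                     # every probe goes to exactly one node
            prob += pulp.lpSum(x[p][n] for n in nodes) == 1
        for n in nodes:                      # node capacity and worst-case loss
            prob += pulp.lpSum(x[p][n] for p in probes) <= capacity
            prob += pulp.lpSum(x[p][n] for p in probes) <= worst

        prob += worst                        # objective: minimize worst-case loss
        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        print([(p, n) for p in probes for n in nodes if x[p][n].value() > 0.5])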

  1. Modeling of edge effect in subaperture tool influence functions of computer controlled optical surfacing.

    Science.gov (United States)

    Wan, Songlin; Zhang, Xiangchao; He, Xiaoying; Xu, Min

    2016-12-20

    Computer controlled optical surfacing requires an accurate tool influence function (TIF) for reliable path planning and deterministic fabrication. Near the edge of the workpiece, the TIF has a nonlinear removal behavior, which causes a severe edge-roll phenomenon. In the present paper, a new edge pressure model is developed based on finite element analysis results. The model is represented as the product of a basic pressure function and a correcting function. The basic pressure distribution is calculated according to the surface shape of the polishing pad, and the correcting function is used to compensate for the errors caused by the edge effect. Practical experimental results demonstrate that the new model can accurately predict the edge TIFs for different overhang ratios. The relative error of the new edge model can be reduced to 15%.
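
    In Preston-type removal models the local removal rate is proportional to pressure and relative velocity, so the paper's pressure decomposition enters the TIF directly. A generic form, with symbols assumed rather than taken from the paper, is:

        \frac{\partial z}{\partial t} = \kappa\, P(x, y)\, V(x, y),
        \qquad P(x, y) = P_{b}(x, y)\, C(x, y)

    where κ is the Preston coefficient, V the relative velocity, P_b the basic pressure distribution derived from the pad shape, and C the edge-correcting function.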

  2. Neuro-fuzzy models as an IVIVR tool and their applicability in generic drug development.

    Science.gov (United States)

    Opara, Jerneja; Legen, Igor

    2014-03-01

    The usefulness of neuro-fuzzy (NF) models as an alternative in vitro-in vivo relationship (IVIVR) tool and as a support to quality by design (QbD) in generic drug development is presented. For drugs with complicated pharmacokinetics, immediate-release drugs or nasal sprays, the suggested level A correlations are not capable of satisfactorily describing the IVIVR. NF systems were recognized as a reasonable method in comparison to the published approaches for developing IVIVRs. Consequently, NF models were built to predict the 144 pharmacokinetic (PK) parameter ratios required for demonstration of bioequivalence (BE) in 88 pivotal BE studies. Input parameters of the models included dissolution data and their combinations in different media, presence of food, formulation strength, technology type, particle size, and spray pattern for nasal sprays. Ratios of the PK parameters Cmax or AUC were used as output variables. The prediction performance of the models was as follows: 79% of models have an acceptable external prediction error (PE) below 10%, 13% of models have an inconclusive PE between 10 and 20%, and the remaining 8% of models show an inadequate PE above 20%. Average internal predictability (LE) is 0.3%, and average external predictability of all models is 7.7%. On average, the models have acceptable internal and external predictabilities with PE lower than 10% and are therefore useful for IVIVR needs during formulation development, as a support to QbD and for the prediction of BE study outcomes.

  3. SWAT Check: A Screening Tool to Assist Users in the Identification of Potential Model Application Problems.

    Science.gov (United States)

    White, Michael J; Harmel, R Daren; Arnold, Jeff G; Williams, Jimmy R

    2014-01-01

    The Soil and Water Assessment Tool (SWAT) is a basin-scale hydrologic model developed by the United States Department of Agriculture Agricultural Research Service. SWAT's broad applicability, user-friendly model interfaces, and automatic calibration software have led to a rapid increase in the number of new users. These advancements also allow less experienced users to conduct SWAT modeling applications. In particular, the use of automated calibration software may produce simulated values that appear appropriate because they adequately mimic the measured data used in calibration and validation. Autocalibrated model applications (and often those of inexperienced modelers) may contain input data errors and inappropriate parameter adjustments not readily identified by users or the autocalibration software. The objective of this research was to develop a program to assist users in the identification of potential model application problems. The resulting "SWAT Check" is a stand-alone Microsoft Windows program that (i) reads selected SWAT output and alerts users to values outside the typical range; (ii) creates process-based figures for visualization of the appropriateness of output values, including important outputs that are commonly ignored; and (iii) detects and alerts users to common model application errors. By alerting users to potential model application problems, this software should assist the SWAT community in developing more reliable modeling applications. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  4. Modeling and Results for Creating Oblique Fields in a Magnetic Flux Leakage Survey Tool

    Science.gov (United States)

    Simek, James C.

    2010-02-01

    Integrity management programs designed to maintain safe pipeline systems quite often use survey results from in-line inspection (ILI) tools in addition to data from other sources. Commonly referred to as "smart pigs," one of the most widely used types of ILI tool is based upon the magnetic flux leakage technique, typically used to detect and quantify metal loss zones. The majority of pipelines surveyed to date have used tools with the magnetic field direction axially aligned with the length of the pipeline. In order to enable detection and quantification of extremely narrow metal loss features or certain types of weld zone anomalies, tools employing magnetic circuits that direct the magnetic fields around the pipe circumference have been designed and are used in segments where these feature categories are a primary concern. Modeling and laboratory test data of metal loss features will be used to demonstrate the response of extremely narrow metal loss zones as the features are rotated relative to the induced field direction. Based upon these results, the basis for developing a magnetizer capable of creating fields oblique to either pipeline axis will be presented, along with the magnetic field profile models of several configurations.

  5. Bayesian reliability modeling and assessment solution for NC machine tools under small-sample data

    Science.gov (United States)

    Yang, Zhaojun; Kan, Yingnan; Chen, Fei; Xu, Binbin; Chen, Chuanhai; Yang, Chuangui

    2015-11-01

    Although Markov chain Monte Carlo (MCMC) algorithms are accurate, many factors may cause instability when they are utilized in reliability analysis; such instability makes these algorithms unsuitable for widespread engineering applications. Thus, a reliability modeling and assessment solution aimed at small-sample data of numerical control (NC) machine tools is proposed on the basis of Bayesian theory. An expert-judgment process for fusing multi-source prior information is developed to obtain the prior distributions of the Weibull parameters and reduce the subjective bias of usual expert-judgment methods. The grid approximation method is applied to the two-parameter Weibull distribution to derive formulas for the parameters' posterior distributions and to overcome the computational difficulty of high-dimensional integration. The method is then applied to real data from a type of NC machine tool to implement a reliability assessment and obtain the mean time between failures (MTBF). The relative error of the proposed method is 5.8020×10⁻⁴ compared with the MTBF obtained by the MCMC algorithm. This result indicates that the proposed method is as accurate as MCMC. The newly developed solution for reliability modeling and assessment of NC machine tools under small-sample data is easy, practical, and highly suitable for widespread application in the engineering field; in addition, the solution does not reduce accuracy.
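
    Grid approximation of a two-parameter Weibull posterior is simple to illustrate. The Python sketch below evaluates the posterior on a shape/scale grid for a handful of synthetic failure times, assuming flat priors, and derives the posterior-mean MTBF; the data, grids and priors are invented and do not reproduce the paper's expert-judgment fusion step.

        import numpy as np
        from scipy.stats import weibull_min
        from scipy.special import gamma

        failures = np.array([120.0, 340.0, 560.0, 810.0, 1260.0])   # synthetic hours

        # Grid over shape (beta) and scale (eta); flat priors assumed.
        betas = np.linspace(0.5, 3.0, 120)
        etas = np.linspace(100.0, 2000.0, 150)
        B, E = np.meshgrid(betas, etas, indexing="ij")

        # Log-likelihood of the data at every grid node.
        loglik = sum(weibull_min.logpdf(t, B, scale=E) for t in failures)
        post = np.exp(loglik - loglik.max())
        post /= post.sum()                        # normalized posterior

        # MTBF of a Weibull distribution: eta * Gamma(1 + 1/beta).
        mtbf_grid = E * gamma(1.0 + 1.0 / B)
        print("posterior-mean MTBF:", float((post * mtbf_grid).sum()))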

  6. The RAMI On-line Model Checker (ROMC): A tool for the automated evaluation of canopy reflectance models.

    Science.gov (United States)

    Widlowski, J.-L.; Robustelli, M.; Taberner, M.; Pinty, B.; Rami Participants, All

    The Radiative transfer Model Intercomparison (RAMI) exercise was first launched in 1999 and then again in 2002 and 2005. RAMI aims at evaluating the performance of canopy reflectance models in the absence of any absolute reference truth. It does so by intercomparing models over a large ensemble of test cases under a variety of spectral and illumination conditions. A series of criteria can be applied to select an ensemble of mutually agreeing 3-D Monte Carlo models to provide a surrogate truth against which all other models can then be compared. We will present an overview of the RAMI activities and show how the results of the latest phase have led to the development of the RAMI On-line Model Checker (ROMC). This tool allows both model developers and users to evaluate the performance of their canopy reflectance models (a) against previous RAMI test cases whose results have already been published in the literature, and (b) against test cases that are similar to the RAMI cases but for which no results are known a priori. As such, the ROMC allows models to be debugged and/or validated autonomously on a limited number of test cases. RAMI-certified graphics that document a model's performance can be downloaded for future use in scientific presentations and/or publications.

  7. Modelling as a tool to redesign livestock farming systems: a literature review.

    Science.gov (United States)

    Gouttenoire, L; Cournut, S; Ingrand, S

    2011-12-01

    Livestock farming has recently come under close scrutiny, in response especially to environmental issues. Farmers are encouraged to redesign their livestock farming systems in depth to improve their sustainability. Assuming that modelling can be a relevant tool to address such systemic changes, we sought to answer the following question: 'How can livestock farming systems be modelled to help farmers redesign their whole farming systems?' To this end, we conducted a literature review of the models of livestock farming systems published from 2000 to mid-2009 (n = 79). We used an analysis grid based on three considerations: (i) system definition, (ii) the intended use of the model and (iii) the way in which farmers' decision-making processes were represented and how agricultural experts and farmers were involved in the modelling processes. Consistent rationales in approaches to supporting changes in livestock farming were identified in three different groups of models, covering 83% of the whole set. These could be defined according to (i) the way in which farmers' decisions were represented and (ii) the model's type of contribution to supporting changes. The first type gathered models that dynamically simulate the system according to different management options; the farmers' decision-making processes are assumed to consist in choosing certain values for management factors. Such models allow long-term simulations and endorse different disciplinary viewpoints, but farmers are only weakly involved in their design. Models of the second type can indicate the best combination of farm activities under given constraints, provided the farmers' objective is profit maximisation. However, when used to support redesigning processes, they address neither how to implement the optimal solution nor its long-term consequences. Models of the third type enable users to dynamically simulate different options for the farming system, the management of which is assumed to be planned according

  8. PeTTSy: a computational tool for perturbation analysis of complex systems biology models.

    Science.gov (United States)

    Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A

    2016-03-10

    Over the last decade, sensitivity analysis techniques have been shown to be very useful for analysing complex and high-dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and fully addresses the analysis of oscillatory systems. It examines the sensitivity of models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting: the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models, including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix, it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters. PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and
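
    The kind of parameter-perturbation analysis that PeTTSy automates can be pictured with a small self-contained sketch: solve an ODE model, perturb one parameter, and measure the effect on the solution. The toy model, parameter and finite-difference scheme below are invented for illustration; PeTTSy itself is a MATLAB toolbox with much richer outputs (period, phase, temporary perturbations).

        import numpy as np
        from scipy.integrate import solve_ivp

        def model(t, y, k_deg):
            """Toy gene-expression model: constant production, first-order decay."""
            return 2.0 - k_deg * y

        def solution(k_deg):
            sol = solve_ivp(model, (0.0, 20.0), [0.0], args=(k_deg,),
                            t_eval=np.linspace(0.0, 20.0, 200))
            return sol.y[0]

        # Finite-difference sensitivity of the trajectory to the degradation rate.
        k, dk = 0.5, 1e-4
        sens = (solution(k + dk) - solution(k - dk)) / (2.0 * dk)
        print("max |dy/dk| along the trajectory:", np.abs(sens).max())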

  9. Modeling of Neuronal Growth In Vitro: Comparison of Simulation Tools NETMORPH and CX3D

    Directory of Open Access Journals (Sweden)

    Aćimović J

    2011-01-01

    We simulate the growth of neuronal networks using the two recently published tools, NETMORPH and CX3D. The goals of the work are (1) to examine and compare the simulation tools, (2) to construct a model of growth of neocortical cultures, and (3) to characterize the changes in network connectivity during growth, using standard graph theoretic methods. Parameters for the neocortical culture are chosen after consulting both the experimental and the computational work presented in the literature. The first (three) weeks in culture are known to be a time of development of extensive dendritic and axonal arbors and establishment of synaptic connections between the neurons. We simulate the growth of networks from day 1 to day 21. It is shown that for the properly selected parameters, the simulators can reproduce the experimentally obtained connectivity. The selected graph theoretic methods can capture the structural changes during growth.

  10. Modeling of Neuronal Growth In Vitro: Comparison of Simulation Tools NETMORPH and CX3D.

    Science.gov (United States)

    Aćimović, J; Mäki-Marttunen, T; Havela, R; Teppola, H; Linne, M-L

    2011-01-01

    We simulate the growth of neuronal networks using the two recently published tools, NETMORPH and CX3D. The goals of the work are (1) to examine and compare the simulation tools, (2) to construct a model of growth of neocortical cultures, and (3) to characterize the changes in network connectivity during growth, using standard graph theoretic methods. Parameters for the neocortical culture are chosen after consulting both the experimental and the computational work presented in the literature. The first (three) weeks in culture are known to be a time of development of extensive dendritic and axonal arbors and establishment of synaptic connections between the neurons. We simulate the growth of networks from day 1 to day 21. It is shown that for the properly selected parameters, the simulators can reproduce the experimentally obtained connectivity. The selected graph theoretic methods can capture the structural changes during growth.
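
    The graph-theoretic characterisation referred to in both records can be reproduced in a few lines. The Python sketch below tracks edge count and mean clustering coefficient of a growing network over a 21-day window; the random growth rule is only a stand-in for connection lists exported from NETMORPH or CX3D.

    import random
    import networkx as nx

    random.seed(1)
    G = nx.DiGraph()
    G.add_nodes_from(range(100))  # neurons present from day 1

    for day in range(1, 22):
        # Hypothetical growth rule: a few new synaptic connections per day.
        for _ in range(20):
            pre, post = random.sample(range(100), 2)
            G.add_edge(pre, post)
        if day % 7 == 0:
            und = G.to_undirected()
            print(f"day {day:2d}: edges={G.number_of_edges():4d}, "
                  f"mean clustering={nx.average_clustering(und):.3f}")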

  11. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    Science.gov (United States)

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.

  12. ModeRNA server: an online tool for modeling RNA 3D structures.

    Science.gov (United States)

    Rother, Magdalena; Milanowska, Kaja; Puton, Tomasz; Jeleniewicz, Jaroslaw; Rother, Kristian; Bujnicki, Janusz M

    2011-09-01

    The diverse functional roles of non-coding RNA molecules are determined by their underlying structure. ModeRNA server is an online tool for RNA 3D structure modeling by the comparative approach, based on a template RNA structure and a user-defined target-template sequence alignment. It offers an option to search for potential templates, given the target sequence. The server also provides tools for analyzing, editing and formatting of RNA structure files. It facilitates the use of the ModeRNA software and offers new options in comparison to the standalone program. ModeRNA server was implemented using the Python language and the Django web framework. It is freely available at http://iimcb.genesilico.pl/modernaserver.

  13. Model-based reasoning: using visual tools to reveal student learning.

    Science.gov (United States)

    Luckie, Douglas; Harrison, Scott H; Ebert-May, Diane

    2011-03-01

    Using visual models is common in science and should become more common in classrooms. Our research group has developed and completed studies on the use of a visual modeling tool, the Concept Connector. This modeling tool consists of an online concept mapping Java applet that has automatic scoring functions we refer to as Robograder. The Concept Connector enables students in large introductory science courses to visualize their thinking through online model building. The Concept Connector's flexible scoring system, based on tested grading schemes as well as instructor input, has enabled >1,000 physiology students to build maps of their ideas about plant and animal physiology with the guidance of automatic and immediate online scoring of homework. Criterion concept maps developed by instructors in this project contain numerous expert-generated or "correct" propositions connecting two concept words together with a linking phrase. In this study, holistic algorithms were used to test automated methods of scoring concept maps that might work as well as a human grader.
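
    A concept map is essentially a set of propositions (concept, linking phrase, concept), so automated scoring reduces to comparing a student's propositions against an instructor's criterion map. The Python sketch below shows one plausible overlap-based scoring rule; the propositions and the half-credit rule are invented for illustration and are not the Concept Connector's published algorithm.

    # Criterion and student maps as sets of (concept, link, concept) triples.
    criterion = {
        ("stomata", "regulate", "transpiration"),
        ("xylem", "transports", "water"),
        ("phloem", "transports", "sugars"),
    }
    student = {
        ("xylem", "transports", "water"),
        ("stomata", "control", "transpiration"),  # right concepts, other link
    }

    def pairs(props):
        # Reduce each proposition to its pair of concepts, ignoring the link.
        return {(a, c) for a, _, c in props}

    exact = criterion & student                   # fully correct propositions
    partial = pairs(criterion) & pairs(student)   # concept pairs that match
    score = len(exact) + 0.5 * (len(partial) - len(exact))
    print(f"score: {score} out of {len(criterion)}")  # 1.5 out of 3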

  14. Theoretical Tools and Software for Modeling, Simulation and Control Design of Rocket Test Facilities

    Science.gov (United States)

    Richter, Hanz

    2004-01-01

    A rocket test stand and associated subsystems are complex devices whose operation requires that certain preparatory calculations be carried out before a test. In addition, real-time control calculations must be performed during the test, and further calculations are carried out after a test is completed. The latter may be required in order to evaluate whether a particular test conformed to specifications. These calculations are used to set valve positions, pressure setpoints, control gains and other operating parameters so that a desired system behavior is obtained and the test can be successfully carried out. Currently, calculations are made in an ad-hoc fashion, with trial-and-error procedures that may require activating the system for the sole purpose of finding the correct parameter settings. The goals of this project are to develop mathematical models, control methodologies and associated simulation environments to provide a systematic and comprehensive prediction and real-time control capability. The models and controller designs are expected to be useful in two respects: 1) As a design tool, a model is the only way to determine the effects of design choices without building a prototype, which is, in the context of rocket test stands, impracticable; 2) As a prediction and tuning tool, a good model allows one to set system parameters off-line, so that the expected system response conforms to specifications. This includes the setting of physical parameters, such as valve positions, and the configuration and tuning of any feedback controllers in the loop.

  15. Simulation models: a current indispensable tool in studies of the continuous water-soil-plant - atmosphere

    International Nuclear Information System (INIS)

    Lopez Seijas, Teresa; Gonzalez, Felicita; Cid, G.; Osorio, Maria de los A.; Ruiz, Maria Elena

    2008-01-01

    This work assesses the current use of simulation models as a useful and indispensable tool for advancing research on the processes involved in the water-soil-plant-atmosphere continuum. In recent years, the literature has reported many studies in which these modelling tools are used to support the decision-making processes of companies or organisations in the agricultural sphere, in particular for designing optimal irrigation and fertilisation management strategies for crops. Some of the latest reported applications of water and solute transfer models are summarised, mainly concerning nitrate leaching and groundwater contamination problems. Important applications of crop growth simulation models for predicting the effects of different water stress conditions on yield are also summarised, together with applications on the management of different irrigation technologies such as centre pivots, surface irrigation and drip irrigation. The main work carried out in Cuba is also referred to. (author)

  16. Concepts for a New Generation of Global Modelling Tools: Expanding our Capacity for Perception

    Directory of Open Access Journals (Sweden)

    Robert Hoffman

    2015-10-01

    It is now twenty years since the issues associated with the global 'problematique' were widely publicized in Limits to Growth, the pioneering study commissioned by the Club of Rome. In the meantime much has been written, but real action that might lead to a more harmonious and sustainable future has not been forthcoming. Indeed, there is evidence that these issues are becoming even more threatening to humankind, and an apparent inability of human societies to address the global problems of sustainability identified by the Club of Rome twenty years ago. This paper advocates the use of global modelling tools as a means of expanding our collective capacity for perception. What is proposed is not the development of another model but the establishment of a process consisting of the design and use of modelling tools to further the explication and communication of understanding, thereby facilitating both individual and societal action. The proposed approach builds upon the strength of the World Dynamics model as a communications device and seeks to take advantage of the scientific and technological advances of the past decades.

  17. Cost-benefit analysis model: A tool for area-wide fruit fly management. Procedures manual

    International Nuclear Information System (INIS)

    Enkerlin, W.; Mumford, J.; Leach, A.

    2007-03-01

    The Generic Fruit Fly Cost-Benefit Analysis Model assists in economic decision making associated with area-wide fruit fly control options. The FRUIT FLY COST-BENEFIT ANALYSIS PROGRAM (available on 1 CD-ROM from the Joint FAO/IAEA Programme of Nuclear Techniques in Food and Agriculture) is an Excel 2000 Windows-based program, for which all standard Windows and Excel conventions apply. The Model is user friendly and thus largely self-explanatory. Nevertheless, it includes a procedures manual that has been prepared to guide the user, and thus should be used together with the software. Please note that the table presenting the pest management options on the Introductory Page of the model is controlled by spin buttons and click boxes. These controls are linked to macros that hide non-relevant tables and boxes. N.B. It is important that the medium level of macro security is selected from the Tools menu of Excel; to do this, go to Tools|Macros|Security and select Medium. When the file is opened, a form will appear containing three buttons; click on the middle button, 'Enable Macros', so that the macros may be used. Ideally, the model should be used as a support tool by working groups aiming to assess the economic returns of different fruit fly control options (suppression, eradication, containment and prevention). The working group should include professionals in agriculture with experience in area-wide implementation of integrated pest management programmes, an economist or at least someone with basic knowledge of economics, and, if relevant, an entomologist with some background in the application of the sterile insect technique (SIT)

  18. Performance evaluation of paper embossing tools produced by fused deposition modelling additive manufacturing technology

    Directory of Open Access Journals (Sweden)

    Gordana Delić

    2017-12-01

    From its beginnings up to a few years ago, additive manufacturing technology was only able to produce models or prototypes of limited use, because of the mechanical properties of the materials. With the advancement and invention of new materials, this is changing. Now it is possible to create 3D prints that can be used as final products or functional tools, using technology and materials with low environmental impact. The goal of this study was to examine opportunities for the production of paper embossing tools by fused deposition modelling (FDM) 3D printing. This study emphasises the use of environmentally friendly poly-lactic acid (PLA) materials in FDM technology, contrary to the conventional method using metal alloys and acids. Embossing of line elements and letters using 3D printed embossing tools was done on six different types of paper. The embossing force was applied using a SHIMADZU EZ-LX Compact Tabletop Testing Machine. Each type of paper was repeatedly embossed using different values of embossing force (in 250 N increments, starting at 1000 N) to determine the optimal embossing force for each specific paper type. Once determined, the optimal embossing force was used on ten samples of each paper type. The results of embossing were analysed and evaluated. The analysis consisted of investigating the effects of the applied embossing force and characteristics such as paper basis weight, paper structure, surface characteristics and fibre direction of the paper. Results show that paper characteristics determine the embossing force required for achieving a good embossing result. This means that with the right amount of embossing force, letters and borderlines can be equally well formed by the embossing process regardless of paper weight, surface characteristics, etc. Embossing tools produced in this manner can be used in the case of embossing elements that are not complex. The reason for this is the limitation of FDM technology and lack of precision needed for fine

  19. Predictive Models and Tools for Screening Chemicals under TSCA: Consumer Exposure Models 1.5

    Science.gov (United States)

    CEM contains a combination of models and default parameters which are used to estimate inhalation, dermal, and oral exposures to consumer products and articles for a wide variety of product and article use categories.

  20. A tool for efficient, model-independent management optimization under uncertainty

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Barlow, Paul M.; Welter, Dave E.

    2018-01-01

    To fill a need for risk-based environmental management optimization, we have developed PESTPP-OPT, a model-independent tool for resource management optimization under uncertainty. PESTPP-OPT solves a sequential linear programming (SLP) problem and also implements (optional) efficient, “on-the-fly” (without user intervention) first-order, second-moment (FOSM) uncertainty techniques to estimate model-derived constraint uncertainty. Combined with a user-specified risk value, the constraint uncertainty estimates are used to form chance-constraints for the SLP solution process, so that any optimal solution includes contributions from model input and observation uncertainty. In this way, a “single answer” that includes uncertainty is yielded from the modeling analysis. PESTPP-OPT uses the familiar PEST/PEST++ model interface protocols, which makes it widely applicable to many modeling analyses. The use of PESTPP-OPT is demonstrated with a synthetic, integrated surface-water/groundwater model. The function and implications of chance constraints for this synthetic model are discussed.
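
    The chance-constraint idea in this record has a compact algebraic form: a model-derived constraint g(x) <= b with FOSM-estimated standard deviation sigma is tightened to g(x) <= b - z(1 - risk) * sigma. The Python sketch below applies that shift inside an ordinary linear program; the objective, constraint and numbers are invented, and the snippet does not use PESTPP-OPT's PEST-style interface.

    from scipy.optimize import linprog
    from scipy.stats import norm

    risk = 0.05               # accept a 5% chance of violating the constraint
    sigma = 2.0               # FOSM estimate of the constraint's uncertainty
    z = norm.ppf(1.0 - risk)  # one-sided safety factor, about 1.645

    # maximise 3*x1 + 2*x2 subject to x1 + x2 <= 10, tightened by z*sigma
    res = linprog(c=[-3.0, -2.0],             # linprog minimises, so negate
                  A_ub=[[1.0, 1.0]],
                  b_ub=[10.0 - z * sigma],
                  bounds=[(0, None), (0, None)])
    print(res.x, -res.fun)                    # risk-shifted optimal solution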

  1. Proc. of the Workshop on Agent Simulation : Applications, Models, and Tools, Oct. 15-16, 1999

    International Nuclear Information System (INIS)

    Macal, C. M.; Sallach, D.

    2000-01-01

    The many motivations for employing agent-based computation in the social sciences are reviewed. It is argued that there exist three distinct uses of agent modeling techniques. One such use, the simplest, is conceptually quite close to traditional simulation in operations research. This use arises when equations can be formulated that completely describe a social process, and these equations are explicitly soluble, either analytically or numerically. In the former case, the agent model is merely a tool for presenting results, while in the latter it is a novel kind of Monte Carlo analysis. A second, more commonplace usage of computational agent models arises when mathematical models can be written down but not completely solved. In this case the agent-based model can shed significant light on the solution structure, illustrate dynamical properties of the model, serve to test the dependence of results on parameters and assumptions, and be a source of counter-examples. Finally, there are important classes of problems for which writing down equations is not a useful activity. In such circumstances, resort to agent-based computational models may be the only way available to explore such processes systematically, and this constitutes a third distinct usage of such models

  2. An Executable Architecture Tool for the Modeling and Simulation of Operational Process Models

    Science.gov (United States)

    2015-03-16

    … models based on neural networks [14]–[16], or genetic algorithms [17], to represent activities in the process flow … particularly relevant to experiments and exercises. The operational views provide a logical description of the activities and information exchanged.

  3. Econometric model as a regulatory tool in electricity distribution - Case Network Performance Assessment Model

    International Nuclear Information System (INIS)

    Honkapuro, S.; Lassila, J.; Viljainen, S.; Tahvanainen, K.; Partanen, J.

    2004-01-01

    Electricity distribution companies operate as natural monopolies, since building parallel networks is not cost-effective. Monopoly companies face no pressure from open markets to keep their prices and costs at a reasonable level. Regulation of these companies is needed to prevent misuse of the monopoly position. Regulation is usually focused either on the profit of the company or on the price of electricity. In this document, the usability of an econometric model in the regulation of electricity distribution companies is evaluated. A regulation method that determines the allowed income of each company with a generic computation model can be seen as an econometric model. As a special case of an econometric model, the method called the Network Performance Assessment Model, NPAM (Naetnyttomodellen in Swedish), is analysed. NPAM was developed by the Swedish Energy Agency (STEM) for the regulation of electricity distribution companies. Both a theoretical analysis and calculations for an example network area are presented in this document to find the major directing effects of the model. The parameters of NPAM used in the calculations of this research report were dated 30 March 2004 and were the most recent available at the time the analysis was done. However, since NPAM is under development, the parameters have been changing constantly; therefore, slight changes in the results can occur if the calculations are made with the latest parameters. The main conclusions, however, are the same and do not depend on the exact parameters. (orig.)

  4. Econometric model as a regulatory tool in electricity distribution. Case network performance assessment model

    International Nuclear Information System (INIS)

    Honkapuro, S.; Lassila, J.; Viljainen, S.; Tahvanainen, K.; Partanen, J.

    2004-01-01

    Electricity distribution companies operate as natural monopolies, since building parallel networks is not cost-effective. Monopoly companies face no pressure from open markets to keep their prices and costs at a reasonable level. Regulation of these companies is needed to prevent misuse of the monopoly position. Regulation is usually focused either on the profit of the company or on the price of electricity. A regulation method that determines the allowed income of each company with a generic computation model can be seen as an econometric model. In this document, the usability of an econometric model in the regulation of electricity distribution companies is evaluated. As a special case of an econometric model, the method called the Network Performance Assessment Model, NPAM (Naetnyttomodellen in Swedish), is analysed. NPAM was developed by the Swedish Energy Agency (STEM) for the regulation of electricity distribution companies. Both a theoretical analysis and calculations for an example network area are presented in this document to find the major directing effects of the model. The parameters of NPAM used in the calculations of this research report were dated 30 March 2004 and were the most recent ones available at the time the analysis was done. However, since NPAM has been under development, the parameters have been changing constantly; therefore, slight changes might occur in the numerical results if the calculations were made with the latest set of parameters. The main conclusions, however, are the same and do not depend on the exact parameters

  5. A flexible, interactive software tool for fitting the parameters of neuronal models

    Directory of Open Access Journals (Sweden)

    Péter Friedrich

    2014-07-01

    The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problem of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential) integrate-and-fire neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting
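
    The fitting loop such a tool wraps can be sketched in a few lines of Python. Below, the parameters of a passive membrane (leaky-integrator) response are recovered from a noisy target trace with a general-purpose optimiser; the model, starting values and noise level are invented for illustration, and none of this uses Optimizer's own interface.

    import numpy as np
    from scipy.optimize import minimize

    t = np.linspace(0.0, 100.0, 500)  # time axis, ms

    def trace(tau, v_inf, v0=-70.0):
        # Passive membrane response to a current step.
        return v_inf + (v0 - v_inf) * np.exp(-t / tau)

    rng = np.random.default_rng(0)
    target = trace(12.0, -55.0) + rng.normal(0.0, 0.2, t.size)  # "data"

    def cost(params):
        return np.mean((trace(*params) - target) ** 2)  # mean squared error

    fit = minimize(cost, x0=[5.0, -60.0], method="Nelder-Mead")
    print(fit.x)  # recovered (tau, v_inf), close to (12, -55)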

  6. A finite element head and neck model as a supportive tool for deformable image registration.

    Science.gov (United States)

    Kim, Jihun; Saitou, Kazuhiro; Matuszak, Martha M; Balter, James M

    2016-07-01

    A finite element (FE) head and neck model was developed as a tool to aid investigations and development of deformable image registration and patient modeling in radiation oncology. Useful aspects of a FE model for these purposes include ability to produce realistic deformations (similar to those seen in patients over the course of treatment) and a rational means of generating new configurations, e.g., via the application of force and/or displacement boundary conditions. The model was constructed based on a cone-beam computed tomography image of a head and neck cancer patient. The three-node triangular surface meshes created for the bony elements (skull, mandible, and cervical spine) and joint elements were integrated into a skeletal system and combined with the exterior surface. Nodes were additionally created inside the surface structures which were composed of the three-node triangular surface meshes, so that four-node tetrahedral FE elements were created over the whole region of the model. The bony elements were modeled as a homogeneous linear elastic material connected by intervertebral disks. The surrounding tissues were modeled as a homogeneous linear elastic material. Under force or displacement boundary conditions, FE analysis on the model calculates approximate solutions of the displacement vector field. A FE head and neck model was constructed in which the skull, mandible, and cervical vertebrae were mechanically connected by disks. The developed FE model is capable of generating realistic deformations that are strain-free for the bony elements and of creating new configurations of the skeletal system with the surrounding tissues reasonably deformed. The FE model can generate realistic deformations for skeletal elements. In addition, the model provides a way of evaluating the accuracy of image alignment methods by producing a ground truth deformation and correspondingly simulated images. The ability to combine force and displacement conditions provides
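
    At its core, the FE analysis described above solves a linear system K u = f for nodal displacements under boundary conditions. The Python sketch below reduces this to two springs in series with one fixed end, the smallest possible linear-elastic FE solve; the stiffness and load values are illustrative only.

    import numpy as np

    k = 1.0e6  # element (spring) stiffness, N/m
    # Global stiffness matrix for nodes 0-1-2 connected in series.
    K = np.array([[ k,  -k,  0.0],
                  [-k, 2*k,  -k ],
                  [0.0, -k,   k ]])
    f = np.array([0.0, 0.0, 100.0])  # 100 N applied at the free node

    # Displacement boundary condition: node 0 is fixed, so drop its row/column.
    u_free = np.linalg.solve(K[1:, 1:], f[1:])
    print(u_free)  # nodal displacements in metres: [1e-4, 2e-4]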

  7. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  8. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  9. Stochastic ecological network occupancy (SENO) models: a new tool for modeling ecological networks across spatial scales

    Science.gov (United States)

    Lafferty, Kevin D.; Dunne, Jennifer A.

    2010-01-01

    Stochastic ecological network occupancy (SENO) models predict the probability that species will occur in a sample of an ecological network. In this review, we introduce SENO models as a means to fill a gap in the theoretical toolkit of ecologists. As input, SENO models use a topological interaction network and rates of colonization and extinction (including consumer effects) for each species. A SENO model then simulates the ecological network over time, resulting in a series of sub-networks that can be used to identify commonly encountered community modules. The proportion of time a species is present in a patch gives its expected probability of occurrence, whose sum across species gives expected species richness. To illustrate their utility, we provide simple examples of how SENO models can be used to investigate how topological complexity, species interactions, species traits, and spatial scale affect communities in space and time. They can categorize species as biodiversity facilitators, contributors, or inhibitors, making this approach promising for ecosystem-based management of invasive, threatened, or exploited species.
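
    The colonisation/extinction bookkeeping behind a SENO model fits in a short simulation. The Python sketch below runs a two-species consumer-resource "network" in which the consumer can only colonise and persist while its resource is present, and reports each species' expected probability of occurrence; the rates are invented for illustration.

    import random

    random.seed(2)
    present = {"plant": True, "herbivore": False}
    col = {"plant": 0.30, "herbivore": 0.40}  # colonisation prob. per step
    ext = {"plant": 0.05, "herbivore": 0.20}  # extinction prob. per step
    occupied = {"plant": 0, "herbivore": 0}

    steps = 50_000
    for _ in range(steps):
        # Resource (plant) colonises an empty patch or risks extinction.
        if present["plant"]:
            present["plant"] = random.random() > ext["plant"]
        else:
            present["plant"] = random.random() < col["plant"]
        # Consumer effect: the herbivore needs the plant to colonise or persist.
        if not present["plant"]:
            present["herbivore"] = False
        elif present["herbivore"]:
            present["herbivore"] = random.random() > ext["herbivore"]
        else:
            present["herbivore"] = random.random() < col["herbivore"]
        for sp in present:
            occupied[sp] += present[sp]

    # Proportion of time present = expected probability of occurrence.
    print({sp: n / steps for sp, n in occupied.items()})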

  10. Quantum discord as a tool for comparing collapse models and decoherence

    International Nuclear Information System (INIS)

    Banerjee, Shreya; Bera, Sayantani; Singh, Tejinder P.

    2016-01-01

    Highlights: • Collapse and decoherence models are compared using quantum discord. • A macroscopic entanglement experimental set-up is used for this purpose. • Detection of the above effects within present experimental times is not possible. • Bounds on the collapse parameters are obtained from this analysis. - Abstract: The quantum-to-classical transition may be caused by decoherence or by dynamical collapse of the wave-function. We propose quantum discord as a tool, (1) for comparing and contrasting the role of a collapse model (Continuous Spontaneous Localisation) and various sources of decoherence (environmental and fundamental), and (2) for detecting collapse models and fundamental decoherence in an experimentally demonstrated macroscopic entanglement. We discuss the experimental times which would lead to the detection of either Continuous Spontaneous Localisation or fundamental decoherence. We further put bounds on the collapse parameters from this experiment using quantum discord.

  11. Dynamic wind turbine models in power system simulation tool DIgSILENT

    DEFF Research Database (Denmark)

    Hansen, Anca Daniela; Iov, F.; Sørensen, Poul Ejnar

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN) and it gathers and describes a whole wind turbine model database... speed doubly-fed induction generator wind turbine concept; 3. Variable speed multi-pole permanent magnet synchronous generator wind turbine concept. These wind turbine concept models can be used and even extended for the study of different aspects, e.g. the assessment of power quality, control strategies, connection of the wind turbine to different types of grid and storage systems. Different control strategies have been developed and implemented for these wind turbine concepts, and their performance in normal or fault operation has been assessed and discussed by means of simulations. The described control

  12. Establishing a novel modeling tool: a python-based interface for a neuromorphic hardware system

    Directory of Open Access Journals (Sweden)

    Daniel Brüderle

    2009-06-01

    Neuromorphic hardware systems provide new possibilities for the neuroscience modeling community. Due to the intrinsic parallelism of the micro-electronic emulation of neural computation, such models are highly scalable without a loss of speed. However, the communities of software simulator users and of neuromorphic engineering in neuroscience are rather disjoint. We present a software concept that provides the possibility to establish such hardware devices as valuable modeling tools. It is based on the integration of the hardware interface into a simulator-independent language which allows for unified experiment descriptions that can be run on various simulation platforms without modification, implying experiment portability and a huge simplification of the quantitative comparison of hardware and simulator results. We introduce an accelerated neuromorphic hardware device and describe the implementation of the proposed concept for this system. An example setup and results acquired by utilizing both the hardware system and a software simulator are demonstrated.
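
    The portability idea, one experiment description executable on interchangeable back-ends, can be illustrated abstractly. In the Python sketch below, the "Backend" classes are hypothetical stand-ins for a software simulator and a hardware interface; the record's actual implementation builds on a simulator-independent language rather than this toy hierarchy.

    class Backend:
        def run(self, duration_ms):
            raise NotImplementedError

    class SoftwareSimulator(Backend):
        def run(self, duration_ms):
            print(f"numerically integrating {duration_ms} ms of model time")

    class NeuromorphicHardware(Backend):
        def run(self, duration_ms):
            print(f"emulating {duration_ms} ms on accelerated hardware")

    def experiment(backend):
        # One unified description; no per-platform modification needed.
        backend.run(1000.0)

    for platform in (SoftwareSimulator(), NeuromorphicHardware()):
        experiment(platform)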

  13. OCAM - A CELSS modeling tool: Description and results. [Object-oriented Controlled Ecological Life Support System Analysis and Modeling

    Science.gov (United States)

    Drysdale, Alan; Thomas, Mark; Fresa, Mark; Wheeler, Ray

    1992-01-01

    Controlled Ecological Life Support System (CELSS) technology is critical to the Space Exploration Initiative. NASA's Kennedy Space Center has been performing CELSS research for several years, developing data related to CELSS design. We have developed OCAM (Object-oriented CELSS Analysis and Modeling), a CELSS modeling tool, and have used this tool to evaluate CELSS concepts, using this data. In using OCAM, a CELSS is broken down into components, and each component is modeled as a combination of containers, converters, and gates which store, process, and exchange carbon, hydrogen, and oxygen on a daily basis. Multiple crops and plant types can be simulated. Resource recovery options modeled include combustion, leaching, enzyme treatment, aerobic or anaerobic digestion, and mushroom and fish growth. Results include printouts and time-history graphs of total system mass, biomass, carbon dioxide, and oxygen quantities; energy consumption; and manpower requirements. The contributions of mass, energy, and manpower to system cost have been analyzed to compare configurations and determine appropriate research directions.
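
    The container/converter/gate decomposition described above amounts to daily bookkeeping of element masses. The Python sketch below reduces it to a single CO2 balance between a crew and a crop; the rates are invented placeholders, not values from OCAM.

    co2 = 10.0       # container: kg of CO2 in the cabin atmosphere
    CREW_OUT = 1.0   # converter: crew respiration, kg CO2 per day
    CROP_IN = 0.9    # converter: crop photosynthesis, kg CO2 per day

    for day in range(1, 8):
        co2 += CREW_OUT            # gate: crew -> atmosphere
        co2 -= min(CROP_IN, co2)   # gate: atmosphere -> crop, supply-limited
        print(f"day {day}: CO2 = {co2:.2f} kg")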

  14. N2A: a computational tool for modeling from neurons to algorithms

    Directory of Open Access Journals (Sweden)

    Fredrick Rothganger

    2014-01-01

    The exponential increase in available neural data has combined with the exponential growth in computing (Moore's law) to create new opportunities to understand neural systems at large scale and high detail. The ability to produce large and sophisticated simulations has introduced unique challenges to neuroscientists. Computational models in neuroscience are increasingly broad efforts, often involving the collaboration of experts in different domains. Furthermore, the size and detail of models have grown to levels for which understanding the implications of variability and assumptions is no longer trivial. Here, we introduce the model design platform N2A which aims to facilitate the design and validation of biologically realistic models. N2A uses a hierarchical representation of neural information to enable the integration of models from different users. N2A streamlines computational validation of a model by natively implementing standard tools in sensitivity analysis and uncertainty quantification. The part-relationship representation allows both network-level analysis and dynamical simulations. We demonstrate how N2A can be used in a range of examples, including a simple Hodgkin-Huxley cable model, basic parameter sensitivity of an 80/20 network, and the expression of the structural plasticity of a growing dendrite and stem cell proliferation and differentiation.

  15. Urban Stormwater Management Model and Tools for Designing Stormwater Management of Green Infrastructure Practices

    Science.gov (United States)

    Haris, H.; Chow, M. F.; Usman, F.; Sidek, L. M.; Roseli, Z. A.; Norlida, M. D.

    2016-03-01

    Urbanization is growing rapidly in Malaysia. Rapid urbanization is known to have several negative impacts on the hydrological cycle, due to the decrease of pervious area and the deterioration of water quality in stormwater runoff. One of the negative impacts of urbanization is the congestion of the stormwater drainage system, a situation leading to flash flood problems and water quality degradation. There are many urban stormwater management software packages available in the market, such as the Storm Water Drainage System design and analysis program (DRAINS), Urban Drainage and Sewer Model (MOUSE), InfoWorks River Simulation (InfoWorks RS), Hydrological Simulation Program-Fortran (HSPF), Distributed Routing Rainfall-Runoff Model (DR3M), Storm Water Management Model (SWMM), XP Storm Water Management Model (XPSWMM), MIKE-SWMM, Quality-Quantity Simulators (QQS), Storage, Treatment, Overflow, Runoff Model (STORM), and Hydrologic Engineering Centre-Hydrologic Modelling System (HEC-HMS). In this paper, we briefly discuss several of these packages and their functionality, accessibility, characteristics and components in the quantity analysis of hydrological design software, and compare them with MSMA Design Aid and Database. Green Infrastructure (GI) is one of the main topics that has been widely discussed all over the world. Every development in the urban area is related to GI. GI can be defined as green areas built in developed areas, such as forests, parks, wetlands or floodways. The role of GI is to improve living standards through, for example, water filtration or flood control. Among the twenty models that were compared to MSMA SME, ten models were selected for a comprehensive review in this study. These are known to be widely accepted by water resource researchers. These ten tools are further classified into three major categories as models that address the stormwater management ability of GI in terms of quantity and quality, models that have the capability of conducting the

  16. Comparison of four modeling tools for the prediction of potential distribution for non-indigenous weeds in the United States

    Science.gov (United States)

    Magarey, Roger; Newton, Leslie; Hong, Seung C.; Takeuchi, Yu; Christie, Dave; Jarnevich, Catherine S.; Kohl, Lisa; Damus, Martin; Higgins, Steven I.; Miller, Leah; Castro, Karen; West, Amanda; Hastings, John; Cook, Gericke; Kartesz, John; Koop, Anthony

    2018-01-01

    This study compares four models for predicting the potential distribution of non-indigenous weed species in the conterminous U.S. The comparison focused on evaluating modeling tools and protocols as currently used for weed risk assessment or for predicting the potential distribution of invasive weeds. We used six weed species (three highly invasive and three less invasive non-indigenous species) that have been established in the U.S. for more than 75 years. The experiment involved providing non-U.S. location data to users familiar with one of the four evaluated techniques, who then developed predictive models that were applied to the United States without knowing the identity of the species or its U.S. distribution. We compared a simple GIS climate matching technique known as Proto3, a simple climate matching tool CLIMEX Match Climates, the correlative model MaxEnt, and a process model known as the Thornley Transport Resistance (TTR) model. Two experienced users ran each modeling tool except TTR, which had one user. Models were trained with global species distribution data excluding any U.S. data, and then were evaluated using the current known U.S. distribution. The influence of weed species identity and modeling tool on prevalence and sensitivity effects was compared using a generalized linear mixed model. The choice of modeling tool itself had low statistical significance, while weed species alone accounted for 69.1% and 48.5% of the variance for prevalence and sensitivity, respectively. These results suggest that simple modeling tools might perform as well as complex ones in the case of predicting the potential distribution of a weed not yet present in the United States. Considerations of model accuracy should also be balanced with those of reproducibility and ease of use. More important than the choice of modeling tool is the construction of robust protocols and testing both new and experienced users under blind test conditions that approximate operational conditions.

  17. GraphCrunch 2: Software tool for network modeling, alignment and clustering

    Directory of Open Access Journals (Sweden)

    Hayes Wayne

    2011-01-01

    Background: Recent advancements in experimental biotechnology have produced large amounts of protein-protein interaction (PPI) data. The topology of PPI networks is believed to have a strong link to their function. Hence, the abundance of PPI data for many organisms stimulates the development of computational techniques for the modeling, comparison, alignment, and clustering of networks. In addition, finding representative models for PPI networks will improve our understanding of the cell just as a model of gravity has helped us understand planetary motion. To decide if a model is representative, we need quantitative comparisons of model networks to real ones. However, exact network comparison is computationally intractable and therefore several heuristics have been used instead. Some of these heuristics are easily computable "network properties," such as the degree distribution, or the clustering coefficient. An important special case of network comparison is the network alignment problem. Analogous to sequence alignment, this problem asks to find the "best" mapping between regions in two networks. It is expected that network alignment might have as strong an impact on our understanding of biology as sequence alignment has had. Topology-based clustering of nodes in PPI networks is another example of an important network analysis problem that can uncover relationships between interaction patterns and phenotype. Results: We introduce the GraphCrunch 2 software tool, which addresses these problems. It is a significant extension of GraphCrunch which implements the most popular random network models and compares them with the data networks with respect to many network properties. Also, GraphCrunch 2 implements the GRAph ALigner algorithm ("GRAAL") for purely topological network alignment. GRAAL can align any pair of networks and exposes large, dense, contiguous regions of topological and functional similarities far larger than any other

  18. DESTINY: A Comprehensive Tool with 3D and Multi-Level Cell Memory Modeling Capability

    Directory of Open Access Journals (Sweden)

    Sparsh Mittal

    2017-09-01

    To enable the design of large capacity memory structures, novel memory technologies such as non-volatile memory (NVM) and novel fabrication approaches, e.g., 3D stacking and multi-level cell (MLC) design, have been explored. The existing modeling tools, however, cover only a few memory technologies, technology nodes and fabrication approaches. We present DESTINY, a tool for modeling 2D/3D memories designed using SRAM, resistive RAM (ReRAM), spin transfer torque RAM (STT-RAM), phase change RAM (PCM) and embedded DRAM (eDRAM), and 2D memories designed using spin orbit torque RAM (SOT-RAM), domain wall memory (DWM) and Flash memory. In addition to single-level cell (SLC) designs for all of these memories, DESTINY also supports modeling MLC designs for NVMs. We have extensively validated DESTINY against commercial and research prototypes of these memories. DESTINY is very useful for performing design-space exploration across several dimensions, such as optimizing for a target (e.g., latency, area or energy-delay product) for a given memory technology, choosing the suitable memory technology or fabrication method (i.e., 2D vs. 3D) for a given optimization target, etc. We believe that DESTINY will boost studies of next-generation memory architectures used in systems ranging from mobile devices to extreme-scale supercomputers. The latest source code of DESTINY is available from the following git repository: https://bitbucket.org/sparshmittal/destinyv2.
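
    Design-space exploration of the kind this record describes reduces to evaluating candidate configurations against an optimization target. The Python sketch below picks the design with the lowest energy-delay product from a hand-written candidate list; the entries are invented placeholders, not DESTINY output.

    designs = [  # (label, access latency in ns, energy per access in pJ)
        ("SRAM 2D",    1.2, 45.0),
        ("STT-RAM 2D", 3.0, 20.0),
        ("ReRAM 3D",   4.5, 12.0),
    ]
    # Energy-delay product (EDP) rewards balanced latency/energy designs.
    best = min(designs, key=lambda d: d[1] * d[2])
    print("lowest EDP:", best[0])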

  19. Basic data, computer codes and integral experiments: The tools for modelling in nuclear technology

    International Nuclear Information System (INIS)

    Sartori, E.

    2001-01-01

    When studying applications in nuclear technology, we need to understand and be able to predict the behaviour of systems manufactured by human enterprise. First, the underlying basic physical and chemical phenomena need to be understood. We then have to predict the results of the interplay of a large number of different basic events: i.e. the macroscopic effects. In order to build confidence in our modelling capability, we then need to compare these results against measurements carried out on such systems. The different levels of modelling require the solution of different types of equations using different types of parameters. The tools required for carrying out a complete validated analysis are: - the basic nuclear or chemical data; - the computer codes; and - the integral experiments. This article describes the role each component plays in a computational scheme designed for modelling purposes. It also describes which tools have been developed and are internationally available. The role that the OECD/NEA Data Bank, the Radiation Shielding Information Computational Center (RSICC) and the IAEA Nuclear Data Section play in making these elements available to the community of scientists and engineers is described. (author)

  20. Validation of Align Technology's Treat III digital model superimposition tool and its case application.

    Science.gov (United States)

    Miller, R J; Kuo, E; Choi, W

    2003-01-01

    An assessment of the efficacy and accuracy of three-dimensional computer-based predictive orthodontic systems requires that new methods of treatment analysis be developed and validated. Invisalign is a digitally fabricated, removable orthodontic appliance that has been commercially available since 1999. It is made up of two main components: 1) computerized graphical images of a patient's teeth moving through a series of stages from initial to final position; 2) pressure-formed clear plastic appliances made from stereolithography models of the images in the first component. The manufacturer of Invisalign (Align Technology, Inc.) has created a software tool that can be used to superimpose digital models to evaluate treatment outcomes in three dimensions. Using this software, research was conducted to determine whether a single operator could repeatedly superimpose two identical digital models using 12 selected points from the palatal rugae over 10 trials. The tool was then applied to one subject's orthodontic treatment. EXPERIMENT VARIABLES: The output from this tool includes rotations, translations and morphological changes. For this study, translations and rotations were chosen. The results showed that the digital superimposition was reproducible, and that after multiple trials the superimposition error decreased. The average error in x, y, z, Rx, Ry and Rz after 10 trials was determined to approach approximately 0.2 mm in translation and less than 1 degree in rotation, with standard deviations of 0.15 mm and 0.7 degrees, respectively. The treatment outcome of a single Invisalign-treated bicuspid extraction case was also evaluated tooth-by-tooth in the x, y, z, Rx, Ry and Rz dimensions. Using the palate as a stable reference seemed to work well, and the evaluation of the single case showed that many, but not all, of the planned movements occurred.
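
    Superimposing two digital models on stable landmarks is, mathematically, a rigid point-set registration. The Python sketch below recovers the rotation and translation between two sets of 12 "rugae" points with the standard Kabsch (SVD) algorithm; the tool's own algorithm is not published in this record, so this is only an analogy with synthetic points.

    import numpy as np

    def kabsch(P, Q):
        """Rotation R and translation t minimising ||R @ P_i + t - Q_i||."""
        Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
        U, _, Vt = np.linalg.svd(Pc.T @ Qc)
        d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = Q.mean(axis=0) - R @ P.mean(axis=0)
        return R, t

    rng = np.random.default_rng(0)
    P = rng.normal(size=(12, 3))                    # 12 landmark points, model A
    true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(true_R) < 0:                   # make it a proper rotation
        true_R = -true_R
    Q = P @ true_R.T + np.array([0.2, -0.1, 0.05])  # model B: rotated + shifted
    R, t = kabsch(P, Q)
    print(np.allclose(R @ P.T + t[:, None], Q.T))   # True: transform recovered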

  1. Danish heat atlas as a support tool for energy system models

    International Nuclear Information System (INIS)

    Petrovic, Stefan N.; Karlsson, Kenneth B.

    2014-01-01

    Highlights: • A GIS method for calculating the costs of district heating expansion is presented. • A high socio-economic potential for district heating is identified within urban areas. • A method for coupling a heat atlas and a TIMES optimization model is proposed. • The presented methods can be used for any geographical region worldwide. - Abstract: In the past four decades following the global oil crisis in 1973, Denmark has implemented remarkable changes in its energy sector, mainly due to energy conservation measures on the demand side and energy efficiency improvements on the supply side. Nowadays, capital-intensive infrastructure investments, such as the expansion of district heating networks and the introduction of significant heat saving measures, require a highly detailed decision-support tool. The Danish heat atlas provides a highly detailed database with extensive information about more than 2.5 million buildings in Denmark. Energy system analysis tools incorporate environmental, economic, energy and engineering analysis of future energy systems and are considered crucial for the quantitative assessment of transitional scenarios towards future milestones, such as the EU 2020 goals and Denmark's goal of achieving a fossil-free society after 2050. The present paper shows how the Danish heat atlas can be used to provide inputs to energy system models, especially for the analysis of heat saving measures within the building stock and the expansion of district heating networks. As a result, marginal cost curves are created, approximated and prepared for use in an optimization energy system model. Moreover, it is concluded that a heat atlas can contribute as a tool for data storage and visualisation of results
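
    A marginal cost curve of the kind mentioned above is simply the available heat-saving (or connection) options sorted by unit cost and accumulated. The Python sketch below builds one from a few invented (cost, savings) pairs standing in for aggregated heat atlas output.

    measures = [  # (unit cost in EUR/MWh saved, achievable savings in GWh/yr)
        (20.0, 50.0), (65.0, 30.0), (35.0, 40.0), (90.0, 15.0),
    ]
    measures.sort()  # cheapest savings are exploited first
    cumulative = 0.0
    for cost, savings in measures:
        cumulative += savings
        print(f"up to {cumulative:.0f} GWh/yr available at <= {cost:.0f} EUR/MWh")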

  2. Cost Benefit Analysis Modeling Tool for Electric vs. ICE Airport Ground Support Equipment – Development and Results

    Energy Technology Data Exchange (ETDEWEB)

    James Francfort; Kevin Morrow; Dimitri Hochard

    2007-02-01

    This report documents efforts to develop a computer tool for modeling the economic payback of comparable airport ground support equipment (GSE) propelled by either electric motors or gasoline and diesel engines. The types of GSE modeled are pushback tractors, baggage tractors, and belt loaders. The GSE modeling tool includes an emissions module that estimates the amount of tailpipe emissions saved by replacing internal combustion engine GSE with electric GSE. This report contains the modeling assumptions, methodology, a user's manual, and modeling results. The model was developed based on the operations of two airlines at four United States airports.
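
    The payback arithmetic at the heart of such a tool is simple to state. The Python sketch below computes a simple payback period for an electric pushback tractor; every figure is an invented placeholder, not data from the report.

    electric_premium = 40000.0   # extra purchase cost of the electric unit, USD
    fuel_saving = 9000.0         # annual diesel minus electricity cost, USD/yr
    maintenance_saving = 3000.0  # annual maintenance savings, USD/yr

    annual_saving = fuel_saving + maintenance_saving
    payback_years = electric_premium / annual_saving
    print(f"simple payback: {payback_years:.1f} years")  # 3.3 years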

  3. Pharmacokinetic-pharmacodynamic modeling of dopamine D2 receptor occupancy in humans using Bayesian modeling tools

    NARCIS (Netherlands)

    Johnson, Martin; Mafirakureva, Nyashadzaishe; Kozielska, Magdalena; Pilla Reddy, Venkatesh; Vermeulen, An; Liu, Jing; de Greef, Rik; Groothuis, Genoveva; Danhof, Meindert; Proost, Johannes

    2011-01-01

    Objectives: Blockade of dopamine-2 receptors is the key pharmacological component to the antipsychotic efficacy of both the typical and atypical antipsychotics (1). A pharmacokinetic-pharmacodynamic (PK-PD) modeling approach was used to describe the relationship between the plasma concentration of

  4. VIRTUAL MODELING OF A NUMERICAL CONTROL MACHINE TOOL USED FOR COMPLEX MACHINING OPERATIONS

    Directory of Open Access Journals (Sweden)

    POPESCU Adrian

    2015-11-01

    This paper presents the 3D virtual model of the numerical control machine Modustar 100, in terms of machine elements. This is a CNC machine of modular construction, with all components allowing assembly in various configurations. The paper focuses on the design, in CATIA V5, of the subassemblies specific to the numerically controlled axes, which contain the drive kinematic chains of the translation modules that provide motion along the X, Y and Z axes. Machine tool development for high-speed and highly precise cutting demands the employment of advanced simulation techniques, which is reflected in the total development cost of the machine.

  5. Uncle Tony's computer: order-of-magnitude modelling as a screening tool in environmental analysis

    CSIR Research Space (South Africa)

    Scholes, RJ

    2002-09-01

    Full Text Available : order-of-magnitude modelling as a screening tool in environmental analysis R.J. Scholes Introduction When Tony Starfield took up his position at the University of Minnesota, he wanted to stay in contact with his young nephews in South Africa... study. The method assumes a good, but not highly quantitative, knowledge of the system, and does not apply to cumulative effects. *CSIR Environmentek, P.O. Box 395, Pretoria 0001, South Africa. E-mail: bscholes@csir.co.za. production sources or loss...

  6. Models, methods and software tools to evaluate the quality of informational and educational resources

    International Nuclear Information System (INIS)

    Gavrilov, S.I.

    2011-01-01

    The paper studies modern methods and tools to evaluate the quality of data systems, which allows determining the specific features of informational and educational resources (IER). The author has developed a model of IER quality management at all stages of the life cycle and an integrated multi-level hierarchical system of IER quality assessment, taking into account both information properties and targeted resource assignment. The author presents a mathematical and algorithmic justification of solving the problem of IER quality management, and offers a data system to assess IER quality

  7. Space Weather Data Dissemination Tools from the Community Coordinated Modeling Center

    Science.gov (United States)

    Donti, N.; Berrios, D.; Boblitt, J.; LaSota, J.; Maddox, M. M.; Mullinix, R.; Hesse, M.

    2011-12-01

    The Community Coordinated Modeling Center (CCMC) at NASA Goddard Space Flight Center has developed new space weather data dissemination products. These include a Java-based conversion software for space weather simulation data, an interactive and customizable timeline tool for time series data, and Android phone and tablet versions of the NASA Space Weather App for mobile devices. We highlight the new features of all the updated services, discuss the back-end capabilities required to realize these services, and talk about future services in development.

  8. ANALYTICAL MODEL FOR LATHE TOOL DISPLACEMENTS CALCULUS IN THE MANUFACTURING PROCESS

    Directory of Open Access Journals (Sweden)

    Catălin ROŞU

    2014-01-01

    In this paper, we present an analytical model for calculating lathe tool displacements in the manufacturing process. We present the methodology for the displacement calculus step by step and, in the end, insert these relations into a program for automatic calculation and draw the conclusions. Only the effects of the bending moments are taken into account (because these produce the largest displacements). The simplifying assumptions and the calculus relations for the displacements (linear and angular ones) are presented in an original way.
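
    For a lathe tool treated as a cantilever of overhang L under a cutting force F, the dominant bending deflection is delta = F*L^3 / (3*E*I). The Python sketch below evaluates it with illustrative numbers, not values taken from the paper.

    F = 1500.0           # cutting force component, N
    L = 0.05             # tool overhang, m
    E = 210e9            # Young's modulus of tool steel, Pa
    b = h = 0.02         # square shank cross-section, m
    I = b * h**3 / 12.0  # second moment of area, m^4

    delta = F * L**3 / (3.0 * E * I)  # cantilever tip deflection
    print(f"tip deflection: {delta * 1e6:.1f} micrometres")  # about 22.3 um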

  9. Infrastructure requirement of knowledge management system model of statistical learning tool (SLT) for education community

    Science.gov (United States)

    Abdullah, Rusli; Samah, Bahaman Abu; Bolong, Jusang; D'Silva, Jeffrey Lawrence; Shaffril, Hayrol Azril Mohamed

    2014-09-01

    Today, teaching and learning (T&L) using technology as tool is becoming more important especially in the field of statistics as a part of the subject matter in higher education system environment. Eventhough, there are many types of technology of statistical learnig tool (SLT) which can be used to support and enhance T&L environment, however, there is lack of a common standard knowledge management as a knowledge portal for guidance especially in relation to infrastructure requirement of SLT in servicing the community of user (CoU) such as educators, students and other parties who are interested in performing this technology as a tool for their T&L. Therefore, there is a need of a common standard infrastructure requirement of knowledge portal in helping CoU for managing of statistical knowledge in acquiring, storing, desseminating and applying of the statistical knowedge for their specific purposes. Futhermore, by having this infrastructure requirement of knowledge portal model of SLT as a guidance in promoting knowledge of best practise among the CoU, it can also enhance the quality and productivity of their work towards excellence of statistical knowledge application in education system environment.

  10. A tool to convert CAD models for importation into Geant4

    Science.gov (United States)

    Vuosalo, C.; Carlsmith, D.; Dasu, S.; Palladino, K.; LUX-ZEPLIN Collaboration

    2017-10-01

    The engineering design of a particle detector is usually performed in a Computer Aided Design (CAD) program, and simulation of the detector’s performance can be done with a Geant4-based program. However, transferring the detector design from the CAD program to Geant4 can be laborious and error-prone. SW2GDML is a tool that reads a design in the popular SOLIDWORKS CAD program and outputs Geometry Description Markup Language (GDML), used by Geant4 for importing and exporting detector geometries. Other methods for outputting CAD designs are available, such as the STEP format, and tools exist to convert these formats into GDML. However, these conversion methods produce very large and unwieldy designs composed of tessellated solids that can reduce Geant4 performance. In contrast, SW2GDML produces compact, human-readable GDML that employs standard geometric shapes rather than tessellated solids. This paper will describe the development and current capabilities of SW2GDML and plans for its enhancement. The aim of this tool is to automate importation of detector engineering models into Geant4-based simulation programs to support rapid, iterative cycles of detector design, simulation, and optimization.
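
    As an illustration of the compactness argument, the following Python sketch emits a GDML fragment for a parametric box solid using the standard-library xml.etree module. The element and attribute names follow the public GDML schema, but the solid, volume, and material values are invented; a tessellated export of the same part could require thousands of facet elements.

        import xml.etree.ElementTree as ET

        gdml = ET.Element("gdml")
        solids = ET.SubElement(gdml, "solids")
        # A parametric box: three numbers instead of thousands of facets.
        ET.SubElement(solids, "box", name="vessel_wall",
                      x="100.0", y="50.0", z="20.0", lunit="mm")

        structure = ET.SubElement(gdml, "structure")
        vol = ET.SubElement(structure, "volume", name="vessel_wall_log")
        ET.SubElement(vol, "materialref", ref="G4_STAINLESS-STEEL")
        ET.SubElement(vol, "solidref", ref="vessel_wall")

        ET.dump(gdml)  # print the human-readable XML fragment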

  11. 3D geological and hydrogeological modeling as design tools for the Conawapa generating station

    Energy Technology Data Exchange (ETDEWEB)

    Mann, J.; Sharif, S.; Smith, B. [KGS Group, Winnipeg, MB (Canada); Cook, G.N.; Osiowy, B.J. [Manitoba Hydro, Winnipeg, MB (Canada)

    2008-07-01

    Following the project's suspension in the early 1990s, part of Manitoba Hydro's recommitment study involved digital modeling of geological and hydrogeological data for the foundation design and analysis of the proposed Conawapa generating station in northern Manitoba. Three-dimensional geological and hydrogeological models have been developed to consolidate and improve the designer's ability to understand all of the information, and to assist in developing engineering alternatives which will improve the overall confidence of the design. The tools are also being leveraged for use in environmental studies. This paper provided an overview of the Conawapa site and 3-dimensional modeling goals. It described the geology and hydrogeology of the Conawapa site as well as the bedrock structure and Karst development. The paper also presented the central concepts of 3-dimensional modeling studies, including the flow of information from database to modeling software platforms. The construction of the Conawapa geological model was also presented, with particular reference to an overview of the MVS software; mesh design; and model buildup logic. The construction of the Conawapa hydrogeological model was discussed in terms of the finite element code FEFLOW software; conceptual model design; and initial observations of Conawapa groundwater flow modeling. It was concluded that recent advancement and application of 3-dimensional geological visualization software to engineering and environmental projects, including at the future Conawapa site using MVS and FEFLOW, have shown that complicated geological data can be organized, displayed, and analysed in a systematic way, to improve site visualization, understanding, and data relationships. 19 refs., 9 figs.

  12. LANL12-RS-108J Report on Device Modeler Testing of the Device Modeler Tool Kit. DMTK in FY14

    Energy Technology Data Exchange (ETDEWEB)

    Temple, Brian Allen [Los Alamos National Laboratory (LANL), Los Alamos, NM (United States); Pimentel, David A. [Los Alamos National Laboratory (LANL), Los Alamos, NM (United States)

    2014-09-28

    This document covers the testing and modification of the Device Modeler Tool Kit (DMTK) for project LANL12-RS-108J in FY14. The testing comprised different device modelers and trainees using DMTK on the secure network on a few test problems, most of which have been synthetic data problems. There has been a local secure-network training drill in which one of the trainees used DMTK with real data, and DMTK has also been used on a laptop for a deployed real-data training drill. Once DMTK is transferred to the home team, it will be used for more training drills (TDs) involving real data.

  13. Validated assessment tool paves the way for standardized evaluation of trainees on anastomotic models.

    Science.gov (United States)

    Duran, Cassidy A; Shames, Murray; Bismuth, Jean; Lee, Jason T

    2014-01-01

    Simulation modules allow for the safe practice of certain techniques and are becoming increasingly important in the shift toward education for integrated vascular residents. There is an unquestionable need to standardize the evaluation of trainees on these simulation models to assure their impact and effectiveness. We sought to validate such an assessment tool for a basic open vascular technique. Vascular fellows, integrated vascular residents, and general surgery residents attending Society for Clinical Vascular Surgery, Introduction to Academic Vascular Surgery, and Methodist Boot Camp in 2012 were asked to participate in an assessment model using multiple anastomotic models and given 20 minutes to complete an end-to-side anastomosis. Trained vascular faculty evaluated subjects using an assessment tool that included a 25-point checklist and a graded overall global rating scale (GRS) on a 5-point Likert scale with 8 parameters. Self-assessment using the GRS was performed by 20 trainees. Reliability and construct validity were evaluated. Ninety-two trainees were assessed. There was excellent agreement between assessors on 21 of the 25 items, with 2 items found not to be relevant for the bench-top model. Graders agreed that the checklist was prohibitively cumbersome to use. Scores on the global assessments correlated with experience and were higher for the senior trainees, with median global summary scores increasing by postgraduate year. Reliability was confirmed through interrater correlation and internal consistency. Internal consistency was 0.92 for the GRS. There was poor correlation between grades given by the expert observers and the self-assessment from the trainee, but good correlation between scores assigned by faculty. Assessment of appropriate hemostasis was poor, which likely reflects the difficulty of evaluating this parameter in the current inanimate model. Performance on an open simulation model evaluated by a standardized global rating scale
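
    For readers unfamiliar with the internal-consistency statistic quoted above, the sketch below computes Cronbach's alpha from a trainees-by-items matrix of Likert ratings; the simulated scores are invented and merely show the mechanics of the calculation.

        import numpy as np

        def cronbach_alpha(scores):
            """scores: trainees x items matrix of ratings."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_var / total_var)

        rng = np.random.default_rng(0)
        ability = rng.normal(3.0, 1.0, size=(40, 1))            # latent skill
        ratings = np.clip(ability + rng.normal(0, 0.5, (40, 8)), 1, 5)
        print(round(cronbach_alpha(ratings), 2))                # high alpha expected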

  14. Online and Certifiable Spectroscopy Courses Using Information and Communication Tools. a Model for Classrooms and Beyond

    Science.gov (United States)

    Krishnan, Mangala Sunder

    2015-06-01

    Online education tools and flipped (reverse) class models for teaching and learning, and pedagogic and andragogic approaches to self-learning, have become quite mature in the last few years because of the revolution in video, interactive software and social learning tools. Open Educational Resources of dependable quality and variety are also becoming available throughout the world, making the current era truly a renaissance period for higher education using the Internet. In my presentation, I shall highlight structured online course content preparation in several areas of spectroscopy and also the design and development of virtual lab tools and kits for studying optical spectroscopy. Both elementary and advanced courses on molecular spectroscopy are currently under development jointly with researchers in other institutions in India. I would like to explore participation from teachers throughout the world in the teaching-learning process using flipped class methods for topics such as experimental and theoretical microwave spectroscopy of semi-rigid and non-rigid molecules, molecular complexes and aggregates. In addition, courses in Raman and infrared spectroscopy experimentation and advanced electronic spectroscopy courses are also envisaged for free, online access. The National Programme on Technology Enhanced Learning (NPTEL) and the National Mission on Education through Information and Communication Technology (NMEICT) are two large Government of India funded initiatives for producing certified and self-learning courses with financial support for moderated discussion forums. The learning tools and interactive presentations so developed can be used in classrooms throughout the world using the flipped mode of teaching. They are very much sought after by learners and researchers who are in other areas of learning but want to contribute to research and development through inter-disciplinary learning. NPTEL is currently experimenting with Massive Open Online Course (MOOC

  15. A Microscale Modeling Tool for the Design and Optimization of Solid Oxide Fuel Cells

    Directory of Open Access Journals (Sweden)

    Shixue Liu

    2009-06-01

    Full Text Available A two dimensional numerical model of a solid oxide fuel cell (SOFC with electrode functional layers is presented. The model incorporates the partial differential equations for mass transport, electric conduction and electrochemical reactions in the electrode functional layers, the anode support layer, the cathode current collection layer and at the electrode/electrolyte interfaces. A dusty gas model is used in modeling the gas transport in porous electrodes. The model is capable of providing results in good agreement with the experimental I-V relationship. Numerical examples are presented to illustrate the applications of this numerical model as a tool for the design and optimization of SOFCs. For a stack assembly of a pitch width of 2 mm and an interconnect-electrode contact resistance of 0.025 Ωcm2, a typical SOFC stack cell should consist of a rib width of 0.9 mm, a cathode current collection layer thickness of 200–300 μm, a cathode functional layer thickness of 20–40 μm, and an anode functional layer thickness of 10–20 μm in order to achieve optimal performance.

  16. Structure Based Thermostability Prediction Models for Protein Single Point Mutations with Machine Learning Tools.

    Directory of Open Access Journals (Sweden)

    Lei Jia

    Full Text Available The thermostability of protein point mutations is a common concern in protein engineering. An application which predicts the thermostability of mutants can be helpful for guiding the decision making process in protein design via mutagenesis. An in silico point mutation scanning method is frequently used to find "hot spots" in proteins for focused mutagenesis. ProTherm (http://gibk26.bio.kyutech.ac.jp/jouhou/Protherm/protherm.html) is a public database that contains experimentally measured thermostability for thousands of protein mutants. Two data sets based on two differently measured thermostability properties of protein single point mutations, namely the unfolding free energy change (ddG) and the melting temperature change (dTm), were obtained from this database. Folding free energy changes calculated with Rosetta, structural information about the point mutations, and amino acid physical properties were used for building thermostability prediction models with informatics modeling tools. Five supervised machine learning methods (support vector machine, random forests, artificial neural network, naïve Bayes classifier, and K nearest neighbor) as well as partial least squares regression were used for building the prediction models. Binary and ternary classification as well as regression models were built and evaluated. Data set redundancy and balancing, the reverse mutations technique, feature selection, and comparison to other published methods are discussed. The Rosetta-calculated folding free energy change ranked as the most influential feature in all prediction models. Other descriptors also made significant contributions to increasing the accuracy of the prediction models.
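
    A minimal sketch of the binary-classification setup described above, assuming a hypothetical CSV export of ProTherm-style records: a Rosetta folding free energy change plus simple amino acid property descriptors predict whether a mutation is destabilizing. The column names, the 1 kcal/mol threshold, and the choice of random forests for the demonstration are all illustrative assumptions, not the authors' pipeline.

        import pandas as pd
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        df = pd.read_csv("protherm_mutations.csv")        # hypothetical export
        X = df[["rosetta_ddg", "hydrophobicity_change",
                "volume_change", "rel_solvent_access"]]
        y = (df["exp_ddg"] > 1.0).astype(int)             # 1 = destabilizing

        clf = RandomForestClassifier(n_estimators=500, random_state=0)
        print(cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean())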

  17. TS07D Empirical Geomagnetic Field Model as a Space Weather Tool

    Science.gov (United States)

    Sharp, N. M.; Stephens, G. K.; Sitnov, M. I.

    2011-12-01

    Empirical modeling and forecasting of the geomagnetic field is a key element of space weather research. A dramatic increase in the amount of data available for the terrestrial magnetosphere has required a new generation of empirical models with large numbers of degrees of freedom and sophisticated data-mining techniques. A set of the corresponding data binning, fitting and visualization procedures, known as the TS07D model, is now available at http://geomag_field.jhuapl.edu/model/ and is used for detailed investigation of storm-scale phenomena in the magnetosphere. However, the transformation of this research model into a practical space weather application, which implies running it extensively for validation and interaction with other space weather codes, requires its presentation in the form of a single state-of-the-art code, well documented and optimized for the highest performance. To this end, the model is implemented in the Java programming language with an extensive self-sufficient library and a set of optimization tools, including multi-thread operations that assume the use of the code on multi-core computers and clusters. The results of validating the new code and optimizing its binning, fitting and visualization parts are presented, and examples of processed storms are discussed.

  18. Multirule Based Diagnostic Approach for the Fog Predictions Using WRF Modelling Tool

    Directory of Open Access Journals (Sweden)

    Swagata Payra

    2014-01-01

    Full Text Available The prediction of fog onset remains difficult despite the progress in numerical weather prediction. It is a complex process and requires adequate representation of the local perturbations in weather prediction models. It mainly depends upon microphysical and mesoscale processes that act within the boundary layer. This study utilizes a multirule based diagnostic (MRD approach using postprocessing of the model simulations for fog predictions. The empiricism involved in this approach is mainly to bridge the gap between mesoscale and microscale variables, which are related to mechanism of the fog formation. Fog occurrence is a common phenomenon during winter season over Delhi, India, with the passage of the western disturbances across northwestern part of the country accompanied with significant amount of moisture. This study implements the above cited approach for the prediction of occurrences of fog and its onset time over Delhi. For this purpose, a high resolution weather research and forecasting (WRF model is used for fog simulations. The study involves depiction of model validation and postprocessing of the model simulations for MRD approach and its subsequent application to fog predictions. Through this approach model identified foggy and nonfoggy days successfully 94% of the time. Further, the onset of fog events is well captured within an accuracy of 30–90 minutes. This study demonstrates that the multirule based postprocessing approach is a useful and highly promising tool in improving the fog predictions.
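
    A hedged sketch of what a multirule diagnostic looks like when applied to WRF-style near-surface fields: fog is flagged only where several thresholds are met simultaneously. The variable names and threshold values below are illustrative assumptions, not the rules calibrated for Delhi in the study.

        import numpy as np

        def fog_flag(rh2m, wind10m, t2m, td2m):
            """Boolean fog mask from 2-m and 10-m model diagnostics."""
            dewpoint_depression = t2m - td2m
            return (rh2m > 95.0) & (wind10m < 3.0) & (dewpoint_depression < 1.0)

        # Example: one small grid of hourly diagnostics (invented values)
        rh = np.array([[97.0, 88.0], [99.0, 96.0]])      # %
        ws = np.array([[1.2, 4.0], [0.8, 2.5]])          # m/s
        t = np.array([[279.0, 281.0], [278.5, 279.2]])   # K
        td = np.array([[278.6, 277.0], [278.2, 278.9]])  # K
        print(fog_flag(rh, ws, t, td))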

  19. Phase Behavior Modeling of Asphaltene Precipitation for Heavy Crudes: A Promising Tool Along with Experimental Data

    Science.gov (United States)

    Tavakkoli, M.; Kharrat, R.; Masihi, M.; Ghazanfari, M. H.; Fadaei, S.

    2012-12-01

    Thermodynamic modeling is known as a promising tool for phase behavior modeling of asphaltene precipitation under different conditions such as pressure depletion and CO2 injection. In this work, a thermodynamic approach is used for modeling the phase behavior of asphaltene precipitation. The precipitated asphaltene phase is represented by an improved solid model, while the oil and gas phases are modeled with an equation of state. The PR-EOS was used to perform flash calculations. Then, the onset point and the amount of precipitated asphaltene were predicted. A computer code based on an improved solid model has been developed and used for predicting asphaltene precipitation data for one of Iranian heavy crudes, under pressure depletion and CO2 injection conditions. A significant improvement has been observed in predicting the asphaltene precipitation data under gas injection conditions. Especially for the maximum value of asphaltene precipitation and for the trend of the curve after the peak point, good agreement was observed. For gas injection conditions, comparison of the thermodynamic micellization model and the improved solid model showed that the thermodynamic micellization model cannot predict the maximum of precipitation as well as the improved solid model. The non-isothermal improved solid model has been used for predicting asphaltene precipitation data under pressure depletion conditions. The pressure depletion tests were done at different levels of temperature and pressure, and the parameters of a non-isothermal model were tuned using three onset pressures at three different temperatures for the considered crude. The results showed that the model is highly sensitive to the amount of solid molar volume along with the interaction coefficient parameter between the asphaltene component and light hydrocarbon components. Using a non-isothermal improved solid model, the asphaltene phase envelope was developed. It has been revealed that at high temperatures, an
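
    The flash calculations mentioned above rest on solving the Peng-Robinson cubic for the compressibility factor; the single-component sketch below shows that step with the textbook PR coefficients. A real asphaltene model would add mixing rules for the crude composition and the solid-phase fugacity balance; the methane properties used here are only a convenient example.

        import numpy as np

        R = 8.314  # J/(mol K)

        def pr_z_factors(T, P, Tc, Pc, omega):
            """Real positive roots of the Peng-Robinson cubic in Z."""
            kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
            alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc)))**2
            a = 0.45724 * R**2 * Tc**2 / Pc * alpha
            b = 0.07780 * R * Tc / Pc
            A = a * P / (R * T)**2
            B = b * P / (R * T)
            # Z^3 - (1-B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
            coeffs = [1.0, -(1.0 - B), A - 3.0 * B**2 - 2.0 * B,
                      -(A * B - B**2 - B**3)]
            roots = np.roots(coeffs)
            return sorted(z.real for z in roots
                          if abs(z.imag) < 1e-9 and z.real > 0)

        # Methane at reservoir-like conditions (Tc = 190.6 K, Pc = 4.599 MPa)
        print(pr_z_factors(350.0, 20e6, 190.6, 4.599e6, 0.011))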

  20. Air Quality Forecasts Using the NASA GEOS Model: A Unified Tool from Local to Global Scales

    Science.gov (United States)

    Knowland, E. Emma; Keller, Christoph; Nielsen, J. Eric; Orbe, Clara; Ott, Lesley; Pawson, Steven; Saunders, Emily; Duncan, Bryan; Cook, Melanie; Liu, Junhua

    2017-01-01

    We provide an introduction to a new high-resolution (0.25 degree) global composition forecast produced by NASA's Global Modeling and Assimilation Office. The NASA Goddard Earth Observing System version 5 (GEOS-5) model has been expanded to provide global near-real-time forecasts of atmospheric composition at a horizontal resolution of 0.25 degrees (approximately 25 km). Previously, this combination of detailed chemistry and resolution was only provided by regional models. This system combines the operational GEOS-5 weather forecasting model with the state-of-the-science GEOS-Chem chemistry module (version 11) to provide detailed chemical analysis of a wide range of air pollutants such as ozone, carbon monoxide, nitrogen oxides, and fine particulate matter (PM2.5). These forecasts have the highest resolution of any current, publicly available global composition forecast. Evaluation and validation of modeled trace gases and aerosols compared to surface and satellite observations will be presented for constituents relevant to health-related air quality standards. Comparisons of modeled trace gases and aerosols against satellite observations show that the model produces realistic concentrations of atmospheric constituents in the free troposphere. Model comparisons against surface observations highlight the model's capability to capture the diurnal variability of air pollutants under a variety of meteorological conditions. The GEOS-5 composition forecasting system offers a new tool for scientists and the public health community, and is being developed jointly with several government and non-profit partners. Potential applications include air quality warnings, flight campaign planning and exposure studies using the archived analysis fields.

  1. Climate modeling - a tool for the assessment of the paleodistribution of source and reservoir rocks

    Energy Technology Data Exchange (ETDEWEB)

    Roscher, M.; Schneider, J.W. [Technische Univ. Bergakademie Freiberg (Germany). Inst. fuer Geologie; Berner, U. [Bundesanstalt fuer Geowissenschaften und Rohstoffe, Hannover (Germany). Referat Organische Geochemie/Kohlenwasserstoff-Forschung

    2008-10-23

    In an on-going project of BGR and TU Bergakademie Freiberg, numeric paleo-climate modeling is used as a tool for assessing the paleo-distribution of organic-rich deposits as well as of reservoir rocks. This modeling approach is based on new ideas concerning the formation of the Pangea supercontinent. The new plate tectonic concept is supported by paleomagnetic data, as it fits the 95% confidence interval of published data. Six Permocarboniferous time slices (340, 320, 300, 290, 270, 255 Ma) were chosen within a first paleo-climate modeling approach, as they represent the most important changes of the Late Paleozoic climate development. The digital maps have a resolution of 2.8° x 2.8° (T42), suitable for high-resolution climate modeling using the PLASIM model. CO2 concentrations of the paleo-atmosphere and paleo-insolation values have been estimated by published methods. For the purpose of validation, quantitative model output had to be transformed into qualitative parameters in order to compare digital data with the qualitative data of geologic indicators. The model output of surface temperatures and precipitation was therefore converted into climate zones. The reconstructed occurrences of geological indicators like aeolian sands, evaporites, reefs, coals, oil source rocks, tillites, phosphorites and cherts were then compared to the computed paleo-climate zones. Examples from the Permian Pangea show a very good agreement between model results and geological indicators. From the modeling approach we are able to identify climatic processes which lead to the deposition of hydrocarbon source and reservoir rocks. The regional assessment of such atmospheric processes may be used to identify the paleo-distribution of organic-rich deposits or rock types suitable to form hydrocarbon reservoirs. (orig.)

  2. COMBINE*: An integrated opto-mechanical tool for laser performance modeling

    Science.gov (United States)

    Rehak, M.; Di Nicola, J. M.

    2015-02-01

    Accurate modeling of thermal, mechanical and optical processes is important for achieving reliable, high-performance high energy lasers such as those at the National Ignition Facility [1] (NIF). The need for this capability is even more critical for high average power, high repetition rate applications. Modeling the effects of stresses and temperature fields on optical properties allows for optimal design of optical components and, more generally, of the architecture of the laser system itself. Stresses change the indices of refraction and induce inhomogeneities and anisotropy. We present a modern, integrated analysis tool that efficiently produces reliable results that are used in our laser propagation tools such as VBL [5]. COMBINE is built on and supplants the existing legacy tools developed for the previous generations of lasers at LLNL, but also uses the commercially available mechanical finite element codes ANSYS and COMSOL (including computational fluid dynamics). The COMBINE code computes birefringence and wave front distortions due to mechanical stresses on lenses and slabs of arbitrary geometry. The stresses calculated typically originate from mounting support, vacuum load, gravity, heat absorption and/or the attendant cooling. Of particular importance are the depolarization and detuning effects of nonlinear crystals due to thermal loading. Results are given in the form of Jones matrices, depolarization maps and wave front distributions. An incremental evaluation of Jones matrices and ray propagation in a 3D mesh with a stress and temperature field is performed. Wavefront and depolarization maps are available at the optical aperture and at slices within the optical element. The suite is validated, user friendly, supported, documented and amenable to collaborative development. * COMBINE stands for Code for Opto-Mechanical Birefringence Integrated Numerical Evaluations.
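
    To make the incremental Jones-matrix evaluation concrete, the sketch below accumulates per-segment linear-retarder matrices along one ray and reads off the depolarized fraction of an initially horizontal polarization. The retardances and orientations are invented sample values, not COMBINE output.

        import numpy as np

        def retarder(delta, theta):
            """Jones matrix of a linear retarder (retardance delta, axis angle theta)."""
            c, s = np.cos(theta), np.sin(theta)
            R = np.array([[c, -s], [s, c]])
            D = np.diag([np.exp(-1j * delta / 2), np.exp(1j * delta / 2)])
            return R @ D @ R.T            # R(theta) D R(-theta)

        J = np.eye(2, dtype=complex)
        for delta, theta in [(0.02, 0.1), (0.05, 0.3), (0.01, -0.2)]:
            J = retarder(delta, theta) @ J   # accumulate along the ray

        E_out = J @ np.array([1.0, 0.0])     # horizontally polarized input
        depol = abs(E_out[1])**2 / np.sum(np.abs(E_out)**2)
        print(f"depolarized fraction: {depol:.2e}")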

  3. Modeling decision making as a support tool for policy making on renewable energy development

    International Nuclear Information System (INIS)

    Cannemi, Marco; García-Melón, Mónica; Aragonés-Beltrán, Pablo; Gómez-Navarro, Tomás

    2014-01-01

    This paper presents the findings of a study on decision making models for the analysis of capital-risk investors' preferences regarding biomass power plant projects. The aim of the work is to improve the support tools for policy makers in the field of renewable energy development. The Analytic Network Process (ANP) helps to better understand capital-risk investors' preferences towards different kinds of biomass-fueled power plants. The results of the research allow the public administration to better foresee investors' reactions to the incentive system, or to modify the incentive system to better drive investors' decisions. Changing the incentive system is seen as a major risk by investors. Therefore, the public administration must design better and longer-term incentive systems, forecasting market reactions. For that, two scenarios have been designed, one showing a typical decision making process and another proposing an improved decision making scenario. A case study conducted in Italy has revealed that ANP allows understanding of how capital-risk investors interpret the situation and make decisions when investing in biomass power plants; the differences between the interests of the public administration and those of the promoters; how decision making could be influenced by adding new decision criteria; and which case would be ranked best according to the decision models. - Highlights: • We applied ANP to investors' preferences on biomass power plant projects. • The aim is to improve the advising tools for renewable energy policy making. • A case study has been carried out with the help of two experts. • We designed two scenarios: decision making as it is and how it could be improved. • Results prove ANP is a fruitful tool enhancing participation and transparency
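
    As a flavour of the arithmetic underneath such models, the sketch below computes the priority vector of a single pairwise-comparison matrix, the step that AHP and ANP share, as its principal eigenvector; full ANP would go on to assemble such local priorities into a supermatrix. The 3x3 judgements (e.g. incentive stability vs. feedstock cost vs. plant size) are invented.

        import numpy as np

        A = np.array([[1.0,   3.0, 5.0],      # reciprocal judgement matrix
                      [1/3.0, 1.0, 2.0],
                      [1/5.0, 1/2.0, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)           # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                          # normalized criterion weights
        print("criterion weights:", w.round(3))

        # Saaty's consistency index flags incoherent judgements.
        n = A.shape[0]
        CI = (eigvals.real[k] - n) / (n - 1)
        print("consistency index:", round(CI, 4))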

  4. Operation room tool handling and miscommunication scenarios: an object-process methodology conceptual model.

    Science.gov (United States)

    Wachs, Juan P; Frenkel, Boaz; Dori, Dov

    2014-11-01

    Errors in the delivery of medical care are the principal cause of inpatient mortality and morbidity, accounting for around 98,000 deaths in the United States of America (USA) annually. Ineffective team communication, especially in the operation room (OR), is a major root of these errors. This miscommunication can be reduced by analyzing and constructing a conceptual model of communication and miscommunication in the OR. We introduce the principles underlying Object-Process Methodology (OPM)-based modeling of the intricate interactions between the surgeon and the surgical technician while handling surgical instruments in the OR. This model is a software- and hardware-independent description of the agents engaged in communication events, their physical activities, and their interactions. The model enables assessing whether the task-related objectives of the surgical procedure were achieved and completed successfully and what errors can occur during the communication. The facts used to construct the model were gathered from observations of various types of miscommunication in the operating room and of their outcomes. The model takes advantage of the compact ontology of OPM, which comprises stateful objects - things that exist physically or informatically, and processes - things that transform objects by creating them, consuming them or changing their state. The modeled communication modalities are verbal and non-verbal, and errors are modeled as processes that deviate from the "sunny day" scenario. Using OPM's refinement mechanism of in-zooming, key processes are drilled into and elaborated, along with the objects that are required as agents or instruments, or that these processes transform. The model was developed through an iterative process of observation, modeling, group discussions, and simplification. The model faithfully represents the processes related to tool handling that take place in an OR during an operation. The specification is at

  5. Nongeneric tool support for model-driven product development; Werkzeugunterstuetzung fuer die modellbasierte Produktentwicklung. Maschinenlesbare Spezifikationen selbst erstellen

    Energy Technology Data Exchange (ETDEWEB)

    Bock, C. [Technische Univ. Kaiserslautern (Germany). Lehrstuhl fuer Produktionsautomatisierung; Zuehlke, D. [Technische Univ. Kaiserslautern (Germany). Lehrstuhl fuer Produktionsautomatisierung; Deutsches Forschungszentrum fuer Kuenstliche Intelligenz (DFKI), Kaiserslautern (DE). Zentrum fuer Mensch-Maschine-Interaktion (ZMMI)

    2006-07-15

    A well-defined specification process is a central success factor in human-machine interface development. Consequently, in interdisciplinary development teams, specification documents are an important communication instrument. In order to replace today's typically paper-based specifications and to leverage the benefits of their electronic equivalents, developers demand comprehensive and applicable computer-based tool kits. Manufacturers' increasing awareness of appropriate tool support is causing alternative approaches to tool kit creation to emerge. This article therefore introduces meta-modelling as a promising way to create non-generic tool support with justifiable effort, enabling manufacturers to take advantage of electronic specifications in product development processes.

  6. Modelling tools for integrating geological, geophysical and contamination data for characterization of groundwater plumes

    DEFF Research Database (Denmark)

    Balbarini, Nicola

    Contaminated sites are a major issue threatening the environment and human health. The large number of contaminated sites requires cost-effective investigations to perform risk assessment and prioritize the sites that need remediation. Contaminated soil and groundwater investigations rely... on borehole investigations to collect the geological, hydrological, and contaminant data. These data are integrated in conceptual and mathematical models describing the lithology, the groundwater flow, and the distribution of contaminant concentrations. Models are needed to analyze the potential risks to all... receptors, including streams. Key risk assessment parameters, such as contaminant mass discharge estimates, and tools are then used to evaluate the risk. The cost of drilling often makes investigations of large and/or deep contaminant plumes unfeasible. For this reason, it is important to develop cost...

  7. Configurational entropy as a tool to select a physical thick brane model

    Science.gov (United States)

    Chinaglia, M.; Cruz, W. T.; Correa, R. A. C.; de Paula, W.; Moraes, P. H. R. S.

    2018-04-01

    We analyze braneworld scenarios via a configurational entropy (CE) formalism. Braneworld scenarios have drawn attention mainly because they can explain the hierarchy problem and unify the fundamental forces through a symmetry breaking procedure. Those scenarios localize matter on a (3 + 1) hypersurface, the brane, which is embedded in a higher dimensional space, the bulk. Novel analytical braneworld models, in which the warp factor depends on a free parameter n, were recently reported in the literature. In this article we provide a way to constrain this parameter through the relation between information and the dynamics of a system described by the CE. We demonstrate that in some cases the CE is an important tool for identifying the most probable physical system among all the possibilities. In addition, we show that the highest CE is correlated with a tachyonic sector of the configuration, where the solutions for the corresponding model are dynamically unstable.

  8. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    Energy Technology Data Exchange (ETDEWEB)

    JUDI, DAVID [Los Alamos National Laboratory; KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY [Los Alamos National Laboratory; BERSCHEID, ALAN [Los Alamos National Laboratory

    2007-01-17

    This paper describes an automated Channel Morphology Tool (CMT) developed in the ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with rapid cross-section generation, the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels, using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.
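
    The core geometric operation of the CMT can be imitated in a few lines: sample a DEM with bilinear interpolation along a line perpendicular to the centerline to obtain station elevations. The synthetic valley DEM and the scipy-based sampler below are illustrative stand-ins for the ArcGIS geoprocessing the tool actually wraps.

        import numpy as np
        from scipy.ndimage import map_coordinates

        # Synthetic DEM: a straight valley with its thalweg along column 50.
        dem = np.fromfunction(lambda r, c: 90.0 + 0.5 * np.abs(c - 50),
                              (100, 100))

        def cross_section(dem, center_rc, normal, half_width, n_stations,
                          cell=5.0):
            """Station (distance, elevation) pairs along a line through center_rc."""
            s = np.linspace(-half_width, half_width, n_stations)   # metres
            rows = center_rc[0] + s * normal[0] / cell
            cols = center_rc[1] + s * normal[1] / cell
            z = map_coordinates(dem, [rows, cols], order=1)        # bilinear
            return np.column_stack([s, z])

        print(cross_section(dem, (50, 50), normal=(0.0, 1.0),
                            half_width=100.0, n_stations=9))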

  9. Methodology for the National Water Savings Model and Spreadsheet Tool Commercial/Institutional

    Energy Technology Data Exchange (ETDEWEB)

    Chan, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Long, Tim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Alison [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Melody, Moya [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-01-01

    Lawrence Berkeley National Laboratory (LBNL) has developed a mathematical model to quantify the water and monetary savings attributable to the United States Environmental Protection Agency’s (EPA’s) WaterSense labeling program for commercial and institutional products. The National Water Savings–Commercial/Institutional (NWS-CI) model is a spreadsheet tool with which the EPA can evaluate the success of its program for encouraging buyers in the commercial and institutional (CI) sectors to purchase more water-efficient products. WaterSense has begun by focusing on three water-using products commonly used in the CI sectors: flushometer valve toilets, urinals, and pre-rinse spray valves. To estimate the savings attributable to WaterSense for each of the three products, LBNL applies an accounting method to national product shipments and lifetimes to estimate the shipments of each product.
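
    A toy version of that accounting method, with all numbers invented: the installed stock of labeled units in a given year is the sum of past shipments still within their service life, and annual savings are the stock times an assumed per-unit saving.

        shipments = {2008: 10_000, 2009: 15_000, 2010: 22_000, 2011: 30_000}
        LIFETIME_YEARS = 10          # assumed product service life
        SAVINGS_PER_UNIT = 4.5e3     # assumed gallons saved per unit per year

        def stock(year):
            """Units shipped in earlier years and still in service."""
            return sum(n for y, n in shipments.items()
                       if y <= year < y + LIFETIME_YEARS)

        for year in range(2008, 2012):
            print(year, stock(year) * SAVINGS_PER_UNIT, "gallons/yr")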

  10. A Prospective Validation Study of a Rainbow Model of Integrated Care Measurement Tool in Singapore.

    Science.gov (United States)

    Nurjono, Milawaty; Valentijn, Pim P; Bautista, Mary Ann C; Wei, Lim Yee; Vrijhoef, Hubertus Johannes Maria

    2016-04-08

    The conceptual ambiguity of the integrated care concept precludes a full understanding of what constitutes a well-integrated health system, posing a significant challenge in measuring the level of integrated care. Most available measures have been developed from a disease-specific perspective and only measure certain aspects of integrated care. Based on the Rainbow Model of Integrated Care, which provides a detailed description of the complex concept of integrated care, a measurement tool has been developed to assess integrated care within a care system as a whole gathered from healthcare providers' and managerial perspectives. This paper describes the methodology of a study seeking to validate the Rainbow Model of Integrated Care measurement tool within and across the Singapore Regional Health System. The Singapore Regional Health System is a recent national strategy developed to provide a better-integrated health system to deliver seamless and person-focused care to patients through a network of providers within a specified geographical region. The validation process includes the assessment of the content of the measure and its psychometric properties. If the measure is deemed to be valid, the study will provide the first opportunity to measure integrated care within Singapore Regional Health System with the results allowing insights in making recommendations for improving the Regional Health System and supporting international comparison.

  11. Field modeling for transcranial magnetic stimulation: A useful tool to understand the physiological effects of TMS?

    Science.gov (United States)

    Thielscher, Axel; Antunes, Andre; Saturnino, Guilherme B

    2015-01-01

    Electric field calculations based on numerical methods and increasingly realistic head models are more and more used in research on Transcranial Magnetic Stimulation (TMS). However, they are still far from being established as standard tools for planning and analysis in practical applications of TMS. Here, we start by delineating three main challenges that need to be addressed to unravel their full potential. These comprise (i) identifying and dealing with the model uncertainties, (ii) establishing a clear link between the induced fields and the physiological stimulation effects, and (iii) improving the usability of the tools for field calculation to the level that they can be easily used by non-experts. We then introduce a new version of our pipeline for field calculations (www.simnibs.org) that substantially simplifies setting up and running TMS and tDCS simulations based on Finite-Element Methods (FEM). We conclude with a brief outlook on how the new version of SimNIBS can help to target the above identified challenges.

  12. An Interactive Tool for Automatic Predimensioning and Numerical Modeling of Arch Dams

    Directory of Open Access Journals (Sweden)

    D. J. Vicente

    2017-01-01

    Full Text Available The construction of double-curvature arch dams is an attractive solution from an economic viewpoint due to the reduced volume of concrete necessary for their construction as compared to conventional gravity dams. Due to their complex geometry, many criteria have arisen for their design. However, the most widespread methods are based on recommendations in traditional technical documents that do not take into account the possibilities of computer-aided design. In this paper, an innovative software tool to design FEM models of double-curvature arch dams is presented. Several capabilities are provided: simplified geometry creation (interesting for academic purposes), preliminary geometrical design, highly detailed model construction, and stochastic calculation (introducing the uncertainty associated with material properties and other parameters). This paper focuses especially on geometrical issues, describing the functionalities of the tool and the fundamentals of the design procedure with regard to the following aspects: topography, reference cylinder, excavation depth, crown cantilever thickness and curvature, horizontal arch curvature, excavation and concrete mass volume, and additional elements such as joints or spillways. Examples of application to two Spanish dams are presented and the results obtained are analyzed.

  13. In search of highly effective modeling tools for the CAD of photonic devices and components

    Science.gov (United States)

    Forastiere, Michele A.; Righini, Giancarlo C.; Bellanca, Gaetano; Tartarini, Giovanni; Bassi, Paolo

    2001-11-01

    The search for a reliable, low-cost, general-application modeling tool has been assuming a growing importance among integrated optics theoreticians. For example, finite-difference (FD) based algorithms have given rise to commercial photonic CAD software programs that are less expensive, from both the financial and the computational points of view, than finite-element-method (FEM) based ones. On the other hand, the former show some computational drawbacks that prevent them from being considered truly general-purpose, while the latter provide extremely reliable modeling tools. Recently, a few numerical techniques (alternative to both FD and FEM methods) have been proposed, mainly with a view to improving flexibility and reducing computational cost. In particular, methods based on the Galerkin approach and Krylov reduction have proven particularly effective for the solution of the Helmholtz equation in a very wide class of integrated optical structures. Moreover, these methods are very promising from the point of view of reliability and are computationally inexpensive. Here, we present the implementation of one such numerical technique, the so-called Arnoldi-Galerkin method, together with that of a home-made FEM software program. A comparison with the results from other algorithms is shown as well.

  14. Literature review report on atomistic modeling tools for FeCrAl alloys

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yongfeng [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schwen, Daniel [Idaho National Lab. (INL), Idaho Falls, ID (United States); Martinez, Enrique [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-12-01

    This report summarizes the literature review results on atomistic tools, particularly interatomic potentials used in molecular dynamics simulations, for FeCrAl ternary alloys. FeCrAl has recently been identified as a possible cladding concept for accident tolerant fuels because of its superior corrosion resistance. Along with several other concepts, an initial evaluation and recommendation are desired for FeCrAl before it is used in realistic fuels. For this purpose, sufficient understanding of the in-reactor behavior of FeCrAl needs to be gained in a relatively short timeframe, and multiscale modeling and simulation have been selected as an efficient means to supplement experiments and in-reactor testing for better understanding of FeCrAl. Given the limited knowledge of FeCrAl alloys, the multiscale modeling approach relies on atomistic simulations to obtain the missing material parameters and properties. As a first step, atomistic tools have to be identified, and this is the purpose of the present report. It was noticed during the literature survey that no interatomic potentials are currently available for FeCrAl. Here, we summarize the interatomic potentials available for FeCr alloys for possible molecular dynamics studies using FeCr as a surrogate material. Other atomistic methods such as lattice kinetic Monte Carlo are also included in this report. A couple of research topics at the atomic scale are suggested based on the literature survey.

  15. Tools for beach health data management, data processing, and predictive model implementation

    Science.gov (United States)

    2013-01-01

    This fact sheet describes utilities created for the management of recreational waters, providing efficient data management, data aggregation, and predictive modeling, as well as a prototype geographic information system (GIS)-based tool for data visualization and summary. All of these utilities were developed to assist beach managers in making decisions to protect public health. The Environmental Data Discovery and Transformation (EnDDaT) Web service identifies, compiles, and sorts environmental data from a variety of sources that help to define climatic, hydrologic, and hydrodynamic characteristics, including multiple data sources within the U.S. Geological Survey and the National Oceanic and Atmospheric Administration. The Great Lakes Beach Health Database (GLBH-DB) and Web application was designed to provide a flexible input, export, and storage platform for beach water quality and sanitary survey monitoring data to complement beach monitoring programs within the Great Lakes. A real-time predictive modeling strategy was implemented by combining the capabilities of EnDDaT and the GLBH-DB for timely, automated prediction of beach water quality. The GIS-based tool was developed to map beaches based on their physical and biological characteristics, and was shared with multiple partners to provide concepts and information for future Web-accessible beach data outlets.

  16. ACORNS: a tool for the visualisation and modelling of atypical development.

    Science.gov (United States)

    Moore, D G; George, R

    2011-10-01

    Across many academic disciplines visualisation and notation systems are used for modelling data and developing theory, but in child development visual models are not widely used; yet researchers and students of developmental difficulties may benefit from a visualisation and notation system which can clearly map developmental outcomes and trajectories, and convey hypothesised dynamic causal pathways. Such a system may help understanding of existing accounts and be a tool for developing new theories. We first present criteria that need to be met in order to provide fully nuanced visualisations of development, and discuss strengths and weaknesses of the visualisation system proposed by Morton. Secondly, we present a tool we have designed to give more precise accounts of development while also being accessible, intuitive and visually appealing. We have called this an Accessible Cause-Outcome Representation and Notation System (ACORNS). This system provides a framework for clear mapping and modelling of developmental sequences, illustrating more precisely how functions change over time, how factors interact with the environment, and the absolute and relative nature of causal outcomes. We provide a new template, a set of rules for the appropriate use of boxes and arrows, and a set of visually accessible indicators that can be used to show more precisely relative rates, degrees and variance of functioning over different capacities at different time points. We have designed ACORNS to give a precise and clear visualisation of how development unfolds; allowing the representation of less 'static' and more transactional models of developmental difficulties. We hope ACORNS will help students, clinicians and theoreticians across disciplines to better represent nuances of debates, and be a seed for the development of new theory. © 2011 The Authors. Journal of Intellectual Disability Research © 2011 Blackwell Publishing Ltd.

  17. The Use Of Computational Human Performance Modeling As Task Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo; David Gertman

    2012-07-01

    During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.
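
    The task-network idea reduces to sampling durations and error chances over a sequence of steps; the Monte Carlo sketch below shows the mechanics with an invented four-step sequence and made-up numbers, not the ATR procedure data or the discrete-event package used in the study.

        import random

        TASKS = [  # (name, mean duration s, sd, p_error) - all invented
            ("walk to canal", 40.0, 5.0, 0.001),
            ("grasp tool",    15.0, 3.0, 0.005),
            ("move element",  90.0, 20.0, 0.02),
            ("inspect",       60.0, 10.0, 0.01),
        ]

        def run_once(rng):
            """One simulated procedure: total time and whether any step failed."""
            t, failed = 0.0, False
            for _, mu, sd, p in TASKS:
                t += max(0.0, rng.gauss(mu, sd))
                failed = failed or (rng.random() < p)
            return t, failed

        rng = random.Random(42)
        results = [run_once(rng) for _ in range(100_000)]
        mean_t = sum(t for t, _ in results) / len(results)
        p_err = sum(f for _, f in results) / len(results)
        print(f"mean duration {mean_t:.0f} s, error probability {p_err:.4f}")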

  18. Gbm.auto: A software tool to simplify spatial modelling and Marine Protected Area planning.

    Directory of Open Access Journals (Sweden)

    Simon Dedman

    Full Text Available Marine resource managers and scientists often advocate spatial approaches to manage data-poor species. Existing spatial prediction and management techniques are either insufficiently robust, struggle with sparse input data, or make suboptimal use of multiple explanatory variables. Boosted Regression Trees feature excellent performance and are well suited to modelling the distribution of data-limited species, but are extremely complicated and time-consuming to learn and use, hindering access for a wide potential user base and therefore limiting uptake and usage. We have built a software suite in R which integrates pre-existing functions with new tailor-made functions to automate the processing and predictive mapping of species abundance data: by automating and greatly simplifying Boosted Regression Tree spatial modelling, the gbm.auto R package suite makes this powerful statistical modelling technique more accessible to potential users in the ecological and modelling communities. The package and its documentation allow the user to generate maps of predicted abundance, visualise the representativeness of those abundance maps, and plot the relative influence of explanatory variables and their relationship to the response variables. Databases of the processed model objects and a report explaining all the steps taken within the model are also generated. The package includes a previously unavailable Decision Support Tool which combines estimated escapement biomass (the percentage of an exploited population which must be retained each year to conserve it) with the predicted abundance maps to generate maps showing the location and size of habitat that should be protected to conserve the target stocks (candidate MPAs), based on stakeholder priorities, such as the minimisation of fishing effort displacement. By bridging the gap between advanced statistical methods for species distribution modelling and conservation science, management and policy, these
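
    gbm.auto itself is an R package; as a language-neutral illustration of the workflow it automates, the Python sketch below fits a boosted-tree abundance model on survey points and predicts over an environmental grid to produce an abundance map. The file and column names are invented, and this is not the package's own code.

        import pandas as pd
        from sklearn.ensemble import GradientBoostingRegressor

        samples = pd.read_csv("survey_points.csv")     # hypothetical survey data
        features = ["depth", "sediment_grain", "temperature", "salinity"]
        model = GradientBoostingRegressor(n_estimators=1000, learning_rate=0.01,
                                          max_depth=3, subsample=0.5)
        model.fit(samples[features], samples["cpue"])  # catch per unit effort

        grid = pd.read_csv("environment_grid.csv")     # lon, lat + same features
        grid["predicted_abundance"] = model.predict(grid[features])
        grid.to_csv("abundance_map.csv", index=False)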

  19. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    International Nuclear Information System (INIS)

    Ahmadi, Rouhollah; Khamehchi, Ehsan

    2013-01-01

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques that call for an approximate forward model (filter) for the integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data

  20. Educational tool for modeling and simulation of a closed regenerative life support system

    Science.gov (United States)

    Arai, Tatsuya; Fanchiang, Christine; Aoki, Hirofumi; Newman, Dava J.

    For long term missions on the Moon and Mars, regenerative life support systems emerge as a promising key technology for sustaining successful exploration with reduced re-supply logistics and cost. The purpose of this study was to create a simple model of a regenerative life support system which allows preliminary investigation of system responses. A simplified regenerative life support system was modeled with MATLAB Simulink™. Mass flows in the system were simplified to carbon, water, oxygen and carbon dioxide. The subsystems included crew members, animals, a plant module, and a waste processor, which exchanged mass into and out of mass reservoirs. Preliminary numerical simulations were carried out to observe system responses. The simplified life support system model allowed preliminary investigation of the system response to perturbations such as an increased or decreased number of crew members. The model is simple and flexible enough to add new components, and can also numerically predict non-linear subsystem functions and responses. Future work includes practical issues such as energy efficiency, air leakage, nutrition, and plant growth modeling. The model functions as an effective teaching tool about how a regenerative advanced life support system works.
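
    A hedged, minimal analogue of such a model: a two-reservoir O2/CO2 balance in which the crew consumes oxygen and a plant module converts CO2 back at a fixed rate, stepped forward in time. All rates and initial masses are invented, and a real model would also track carbon and water as the abstract describes.

        import numpy as np

        def simulate(days=30, dt=0.01, crew=4):
            o2, co2 = 100.0, 5.0                 # kg in the cabin reservoirs
            O2_PER_CREW = 0.84                   # kg O2/day consumed per person
            PLANT_RATE = 3.0                     # kg CO2/day the plants can fix
            hist = []
            for step in range(int(days / dt)):
                resp = crew * O2_PER_CREW * dt           # crew respiration
                conv = min(PLANT_RATE * dt, co2)         # plants limited by CO2
                o2 += conv * (32.0 / 44.0) - resp        # mole-for-mole O2 return
                co2 += resp * (44.0 / 32.0) - conv       # exhaled CO2 minus uptake
                hist.append((step * dt, o2, co2))
            return np.array(hist)

        out = simulate()
        print("final O2, CO2 (kg):", out[-1, 1:].round(2))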