WorldWideScience

Sample records for integrated modeling tool

  1. Integrating a Decision Management Tool with UML Modeling Tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

… by proposing potential subsequent design issues. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, these decisions are typically not connected to the models created during … integration of formerly disconnected tools improves tool usability as well as decision maker productivity …

  2. Fish habitat simulation models and integrated assessment tools

    International Nuclear Information System (INIS)

    Harby, A.; Alfredsen, K.

    1999-01-01

Because of human development, water use is growing in importance, and this worldwide trend is leading to an increasing number of user conflicts, with a strong need for assessment tools to measure the impacts both on the ecosystem and on the different users and user groups. The quantitative tools must allow a comparison of alternatives, different user groups, etc., and the tools must be integrated, since impact assessments involve different disciplines. Fish species, especially young ones, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured; impacts on user groups are not included. The paper concentrates on fish habitat simulation models, with methods and examples from Norway, and includes some ideas on integrated modelling tools for impact assessment studies. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge in hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of using two- and three-dimensional models include: they do not need any calibration, just validation; they are predictive; and they can be more cost effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored to a multi-disciplinary study. Model choice should be based on the available data and possible data acquisition, the available manpower, computer, and software resources, and the required output and its accuracy. 58 refs

  3. From Modelling to Execution of Enterprise Integration Scenarios: The GENIUS Tool

    Science.gov (United States)

    Scheibler, Thorsten; Leymann, Frank

One of the predominant problems IT companies are facing today is Enterprise Application Integration (EAI). Most of the infrastructures built to tackle integration issues are proprietary because no standards exist for how to model, develop, and actually execute integration scenarios. EAI patterns are gaining importance among non-technical business users as a way to ease and harmonize the development of EAI scenarios. These patterns describe recurring EAI challenges and propose possible solutions in an abstract way. Therefore, one can use those patterns to describe enterprise architectures in a technology-neutral manner. However, patterns are documentation only, used by developers and systems architects to decide how to implement an integration scenario manually. Thus, patterns were not originally conceived as artefacts to be executed directly. This paper presents a tool supporting a method by which EAI patterns can be used to automatically generate executable artefacts for various target platforms using a model-driven development approach, hence turning patterns into something executable. To this end, we introduce a continuous tool chain beginning at the design phase and ending with the execution of an integration solution in a completely automatic manner. For evaluation purposes we introduce a scenario demonstrating how the tool is utilized for modelling and actually executing an integration scenario.

  4. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

The amount of data available on the Internet is continuously increasing; consequently, there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used. … In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling … an important part of this technique is the attachment of OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration …

  5. Integrated Wind Power Planning Tool

    DEFF Research Database (Denmark)

    Rosgaard, M. H.; Hahmann, Andrea N.; Nielsen, T. S.

This poster describes the status, as of April 2012, of the Public Service Obligation (PSO) funded project PSO 10464 "Integrated Wind Power Planning Tool". The project goal is to integrate a mesoscale numerical weather prediction (NWP) model with a statistical tool in order to better predict short-term power variation from offshore wind farms, as well as to conduct forecast error assessment studies in preparation for later implementation of such a feature in an existing simulation model. The addition of a forecast error estimation feature will further increase the value of this tool, as it …

  6. Extending the Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2016-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power of the model for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be…

  7. The Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2015-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be targeted for…

  8. Integrated Wind Power Planning Tool

    DEFF Research Database (Denmark)

    Rosgaard, M. H.; Giebel, Gregor; Nielsen, T. S.

    2012-01-01

This poster presents the current state of the public service obligation (PSO) funded project PSO 10464, with the working title "Integrated Wind Power Planning Tool". The project commenced October 1, 2011, and the goal is to integrate a numerical weather prediction (NWP) model with purely … model to be developed in collaboration with ENFOR A/S, a Danish company that specialises in forecasting and optimisation for the energy sector. This integrated prediction model will allow for the description of the expected variability in wind power production in the coming hours to days, accounting …

  9. Tools and Models for Integrating Multiple Cellular Networks

    Energy Technology Data Exchange (ETDEWEB)

    Gerstein, Mark [Yale Univ., New Haven, CT (United States). Gerstein Lab.

    2015-11-06

In this grant, we have systematically investigated the integrated networks, which are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of the integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open-source, they are available to download from GitHub, and they can be incorporated into the Knowledgebase. Here, we summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we have developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network. The algorithm is generally applicable to regulatory networks in prokaryotes, yeast, and higher organisms. Integrated datasets are extremely beneficial in understanding the biology of a system in a compact manner due to the combination of multiple layers of information. Therefore, for Aim 2 of this grant, we have developed several tools and carried out analyses for integrating system-wide genomic information. To make use of the structural data, we have developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we have organized E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies. We then correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed …
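
As an illustration of the general idea of network hierarchy (not the algorithm published in [1]), the sketch below assigns levels to a toy directed regulatory network by a topological sweep; the gene names and edges are invented for the example.

```python
# Minimal sketch: assign hierarchical levels to nodes of a directed (acyclic)
# regulatory network by the length of the longest regulatory chain above them.
# Illustration only, not the published hierarchy algorithm.
from collections import defaultdict, deque

def hierarchy_levels(edges):
    """edges: iterable of (regulator, target) pairs; returns {node: level}."""
    succ = defaultdict(list)
    indeg = defaultdict(int)
    nodes = set()
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
        nodes.update((u, v))

    # Kahn-style topological sweep; level = longest chain of regulators above a node.
    level = {n: 0 for n in nodes}
    queue = deque(n for n in nodes if indeg[n] == 0)
    while queue:
        u = queue.popleft()
        for v in succ[u]:
            level[v] = max(level[v], level[u] + 1)
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return level

# Toy network: one master regulator, two middle managers, two workhorse genes.
edges = [("crp", "fis"), ("crp", "araC"), ("fis", "gyrA"), ("araC", "araB")]
print(sorted(hierarchy_levels(edges).items()))
# [('araB', 2), ('araC', 1), ('crp', 0), ('fis', 1), ('gyrA', 2)]
```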

  10. Tav4SB: integrating tools for analysis of kinetic models of biological systems.

    Science.gov (United States)

    Rybiński, Mikołaj; Lula, Michał; Banasik, Paweł; Lasota, Sławomir; Gambin, Anna

    2012-04-05

Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases integration of software tools for life science research and provides a common workflow-based framework for computational experiments in Biology. The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations, which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow the user to perform numerical simulations or model checking of, respectively, deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of the possibilities offered by our project we apply multi-parameter sensitivity analysis. To visualize the results of model analysis a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project's Web page: http://bioputer.mimuw.edu.pl/tav4sb/. The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as the accessibility and usability of remote services.
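
As a rough illustration of the kind of numerical experiment Tav4SB wraps as Web services, the sketch below runs a one-at-a-time parameter sensitivity scan on a toy kinetic model directly with SciPy; the model, parameter names and values are invented, and the Tav4SB Web service API itself is not used here.

```python
# Local sketch of a multi-parameter sensitivity scan over a toy kinetic model,
# the kind of experiment Tav4SB automates as a workflow. Illustration only.
import numpy as np
from scipy.integrate import solve_ivp

def michaelis_menten(t, y, vmax, km):
    s = y[0]
    return [-vmax * s / (km + s)]        # substrate consumption rate

def remaining_substrate(vmax, km, s0=10.0, t_end=5.0):
    sol = solve_ivp(michaelis_menten, (0.0, t_end), [s0], args=(vmax, km))
    return sol.y[0, -1]                   # substrate left at t_end

# One-at-a-time sensitivity: perturb each parameter by +10% around a baseline.
baseline = {"vmax": 1.0, "km": 2.0}
ref = remaining_substrate(**baseline)
for name in baseline:
    perturbed = dict(baseline, **{name: baseline[name] * 1.1})
    out = remaining_substrate(**perturbed)
    print(f"{name}: relative output change = {(out - ref) / ref:+.3f}")
```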

  11. Open Tools for Integrated Modelling to Understand SDG development - The OPTIMUS program

    Science.gov (United States)

    Howells, Mark; Zepeda, Eduardo; Rogner, H. Holger; Sanchez, Marco; Roehrl, Alexander; Cicowiez, Matrin; Mentis, Dimitris; Korkevelos, Alexandros; Taliotis, Constantinos; Broad, Oliver; Alfstad, Thomas

    2016-04-01

… electrification simulator; a national CLEW tool allows for the optimization of national-level integrated resource use, and Macro-CLEW presents the same while allowing for detailed economic-biophysical interactions. Finally, the open Model Management Infrastructure (MoManI) is presented, which allows for the rapid prototyping of additions to existing resource optimization tools, or of entirely new ones. Collectively these tools provide insights into some fifteen of the SDGs and are made publicly available with support to governments and academic institutions.

  12. A Prospective Validation Study of a Rainbow Model of Integrated Care Measurement Tool in Singapore.

    Science.gov (United States)

    Nurjono, Milawaty; Valentijn, Pim P; Bautista, Mary Ann C; Wei, Lim Yee; Vrijhoef, Hubertus Johannes Maria

    2016-04-08

The conceptual ambiguity of the integrated care concept precludes a full understanding of what constitutes a well-integrated health system, posing a significant challenge in measuring the level of integrated care. Most available measures have been developed from a disease-specific perspective and only measure certain aspects of integrated care. Based on the Rainbow Model of Integrated Care, which provides a detailed description of the complex concept of integrated care, a measurement tool has been developed to assess integrated care within a care system as a whole, from healthcare providers' and managers' perspectives. This paper describes the methodology of a study seeking to validate the Rainbow Model of Integrated Care measurement tool within and across the Singapore Regional Health System. The Singapore Regional Health System is a recent national strategy developed to provide a better-integrated health system to deliver seamless and person-focused care to patients through a network of providers within a specified geographical region. The validation process includes the assessment of the content of the measure and its psychometric properties. If the measure is deemed to be valid, the study will provide the first opportunity to measure integrated care within the Singapore Regional Health System, with the results providing insights for making recommendations to improve the Regional Health System and supporting international comparison.

  13. A web GIS based integrated flood assessment modeling tool for coastal urban watersheds

    Science.gov (United States)

    Kulkarni, A. T.; Mohanty, J.; Eldho, T. I.; Rao, E. P.; Mohan, B. K.

    2014-03-01

Urban flooding has become an increasingly important issue in many parts of the world. In this study, an integrated flood assessment model (IFAM) is presented for coastal urban flood simulation. A web-based GIS framework has been adopted to organize the spatial datasets for the study area considered and to run the model within this framework. The integrated flood model consists of a mass balance based 1-D overland flow model, a 1-D finite element based channel flow model based on the diffusion wave approximation and a quasi 2-D raster flood inundation model based on the continuity equation. The model code is written in MATLAB and the application is integrated within a web GIS server product, viz. Web Gram Server™ (WGS), developed at IIT Bombay using Java, JSP and jQuery technologies. Its user interface is developed using OpenLayers and the attribute data are stored in the MySQL open source DBMS. The model is integrated within WGS and is called via JavaScript. The application has been demonstrated for two coastal urban watersheds of Navi Mumbai, India. Simulated flood extents for the extreme rainfall event of 26 July 2005 in the two urban watersheds of Navi Mumbai city are presented and discussed. The study demonstrates the effectiveness of the flood simulation tool in a web GIS environment to facilitate data access and visualization of GIS datasets and simulation results.

  14. The Overture Initiative Integrating Tools for VDM

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Battle, Nick; Ferreira, Miguel

    2010-01-01

    Overture is a community-based initiative that aims to develop a common open-source platform integrating a range of tools for constructing and analysing formal models of systems using VDM. The mission is to both provide an industrial-strength tool set for VDM and also to provide an environment...

  15. Activity-Centred Tool Integration

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius

    2003-01-01

This paper is concerned with integration of heterogeneous tools for system development. We argue that such tools should support concrete activities (e.g., programming, unit testing, conducting workshops) in contrast to abstract concerns (e.g., analysis, design, implementation). A consequence of this is that tools — or components — that support activities well should be integrated in ad-hoc, dynamic, and heterogeneous ways. We present a peer-to-peer architecture for this based on type-based publish-subscribe and give an example of its use.
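
A minimal, single-process sketch of type-based publish-subscribe is shown below; it illustrates components subscribing to event types rather than to each other, but it does not model the distributed peer-to-peer aspect of the paper's architecture, and the event classes are invented for the example.

```python
# Minimal in-process sketch of type-based publish/subscribe: subscribers
# register for an event *type*, and any published instance of that type
# (or a subtype) is delivered to them.
from dataclasses import dataclass
from collections import defaultdict
from typing import Callable, Type

class EventBus:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, event_type: Type, handler: Callable) -> None:
        self._subs[event_type].append(handler)

    def publish(self, event) -> None:
        # Deliver to handlers registered for the event's class or any base class.
        for event_type, handlers in self._subs.items():
            if isinstance(event, event_type):
                for handler in handlers:
                    handler(event)

@dataclass
class ModelChanged:            # e.g. raised by a modeling-editor component
    element: str

@dataclass
class ClassRenamed(ModelChanged):
    new_name: str

bus = EventBus()
bus.subscribe(ModelChanged, lambda e: print("test runner sees change to", e.element))
bus.publish(ClassRenamed(element="Order", new_name="PurchaseOrder"))
```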

  16. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  17. Integrating decision management with UML modeling concepts and tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    2009-01-01

… but also for guiding the user by proposing subsequent decisions. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, the decisions are typically not connected to these models … of formerly disconnected tools could improve tool usability as well as decision maker productivity …

  18. WINS. Market Simulation Tool for Facilitating Wind Energy Integration

    Energy Technology Data Exchange (ETDEWEB)

    Shahidehpour, Mohammad [Illinois Inst. of Technology, Chicago, IL (United States)

    2012-10-30

Integrating 20% or more wind energy into the system and transmitting large amounts of wind energy over long distances will require a decision-making capability that can handle very large scale power systems with tens of thousands of buses and lines. There is a need to explore innovative analytical and implementation solutions for continuing reliable operations with the most economical integration of additional wind energy in power systems. A number of wind integration solution paths involve the adoption of new operating policies, dynamic scheduling of wind power across interties, pooling integration services, and adopting new transmission scheduling practices. Such practices can be examined with the decision tool developed by this project. This project developed a very efficient decision tool called Wind INtegration Simulator (WINS) and applied WINS to facilitate wind energy integration studies. WINS focused on augmenting the existing power utility capabilities to support collaborative planning, analysis, and wind integration project implementations. WINS also has the capability of simulating energy storage facilities so that feasibility studies of integrated wind energy system applications can be performed for systems with high wind energy penetrations. The development of WINS represents a major expansion of a very efficient decision tool called POwer Market Simulator (POMS), which was developed by IIT and has been used extensively for power system studies for decades. Specifically, WINS provides the following advantages: (1) an integrated framework is included in WINS for the comprehensive modeling of DC transmission configurations, including mono-pole, bi-pole, tri-pole, back-to-back, and multi-terminal connections, as well as AC/DC converter models including current source converters (CSC) and voltage source converters (VSC); (2) an existing shortcoming of traditional decision tools for wind integration is the limited availability of user interface, i.e., decision …

  19. Gsflow-py: An integrated hydrologic model development tool

    Science.gov (United States)

    Gardner, M.; Niswonger, R. G.; Morton, C.; Henson, W.; Huntington, J. L.

    2017-12-01

    Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHM) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorologic parameters over the model domain is difficult in complex terrain at the model resolution scale, but is necessary to drive realistic simulations. Historically, development of input data for IHM models has required extensive GIS and computer programming expertise which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverages, and meteorological distribution over the model domain.
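
One of the laborious steps such a toolkit automates is distributing station meteorology over the model grid; a minimal sketch of inverse-distance weighting is shown below. It is an illustration of that step only, with invented station data, and is not the gsflow-py implementation.

```python
# Illustrative sketch (not gsflow-py code): distribute station precipitation
# onto a regular model grid by inverse-distance weighting.
import numpy as np

def idw_grid(station_xy, station_values, grid_x, grid_y, power=2.0):
    """Return a (ny, nx) array of values interpolated from the stations."""
    xx, yy = np.meshgrid(grid_x, grid_y)
    weighted_sum = np.zeros_like(xx, dtype=float)
    weight_total = np.zeros_like(xx, dtype=float)
    for (sx, sy), val in zip(station_xy, station_values):
        dist = np.hypot(xx - sx, yy - sy)
        w = 1.0 / np.maximum(dist, 1e-6) ** power   # avoid divide-by-zero at a station
        weighted_sum += w * val
        weight_total += w
    return weighted_sum / weight_total

# Three hypothetical stations (x, y in model coordinates) with daily precip in mm.
stations = [(0.0, 0.0), (10.0, 0.0), (5.0, 8.0)]
precip = [12.0, 3.0, 7.5]
grid = idw_grid(stations, precip, grid_x=np.linspace(0, 10, 5), grid_y=np.linspace(0, 8, 4))
print(np.round(grid, 1))
```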

  20. Laboratory informatics tools integration strategies for drug discovery: integration of LIMS, ELN, CDS, and SDMS.

    Science.gov (United States)

    Machina, Hari K; Wild, David J

    2013-04-01

    There are technologies on the horizon that could dramatically change how informatics organizations design, develop, deliver, and support applications and data infrastructures to deliver maximum value to drug discovery organizations. Effective integration of data and laboratory informatics tools promises the ability of organizations to make better informed decisions about resource allocation during the drug discovery and development process and for more informed decisions to be made with respect to the market opportunity for compounds. We propose in this article a new integration model called ELN-centric laboratory informatics tools integration.

  1. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database

    International Nuclear Information System (INIS)

    Quock, D.E.R.; Cianciarulo, M.B.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  2. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  3. Examination of the low frequency limit for helicopter noise data in the Federal Aviation Administration's Aviation Environmental Design Tool and Integrated Noise Model

    Science.gov (United States)

    2010-04-19

The Federal Aviation Administration (FAA) aircraft noise modeling tools Aviation Environmental Design Tool (AEDT) and Integrated Noise Model (INM) do not currently consider noise below 50 Hz in their computations. This paper describes a preliminary ...

  4. Integrated Control Modeling for Propulsion Systems Using NPSS

    Science.gov (United States)

    Parker, Khary I.; Felder, James L.; Lavelle, Thomas M.; Withrow, Colleen A.; Yu, Albert Y.; Lehmann, William V. A.

    2004-01-01

The Numerical Propulsion System Simulation (NPSS), an advanced engineering simulation environment used to design and analyze aircraft engines, has been enhanced by integrating control development tools into it. One of these tools is a generic controller interface that allows NPSS to communicate with control development software environments such as MATLAB and EASY5. The other tool is a linear model generator (LMG) that gives NPSS the ability to generate linear, time-invariant state-space models. Integrating these tools into NPSS enables it to be used for control system development. This paper will discuss the development and integration of these tools into NPSS. In addition, it will show a comparison of transient model results of a generic, dual-spool, military-type engine model that has been implemented in NPSS and Simulink. It will also show the linear model generator's ability to approximate the dynamics of a nonlinear NPSS engine model.

  5. A database of immunoglobulins with integrated tools: DIGIT.

    KAUST Repository

    Chailyan, Anna; Tramontano, Anna; Marcatili, Paolo

    2011-01-01

    The DIGIT (Database of ImmunoGlobulins with Integrated Tools) database (http://biocomputing.it/digit) is an integrated resource storing sequences of annotated immunoglobulin variable domains and enriched with tools for searching and analyzing them. The annotations in the database include information on the type of antigen, the respective germline sequences and on pairing information between light and heavy chains. Other annotations, such as the identification of the complementarity determining regions, assignment of their structural class and identification of mutations with respect to the germline, are computed on the fly and can also be obtained for user-submitted sequences. The system allows customized BLAST searches and automatic building of 3D models of the domains to be performed.

  7. An integrated development environment for PMESII model authoring, integration, validation, and debugging

    Science.gov (United States)

    Pioch, Nicholas J.; Lofdahl, Corey; Sao Pedro, Michael; Krikeles, Basil; Morley, Liam

    2007-04-01

    To foster shared battlespace awareness in Air Operations Centers supporting the Joint Forces Commander and Joint Force Air Component Commander, BAE Systems is developing a Commander's Model Integration and Simulation Toolkit (CMIST), an Integrated Development Environment (IDE) for model authoring, integration, validation, and debugging. CMIST is built on the versatile Eclipse framework, a widely used open development platform comprised of extensible frameworks that enable development of tools for building, deploying, and managing software. CMIST provides two distinct layers: 1) a Commander's IDE for supporting staff to author models spanning the Political, Military, Economic, Social, Infrastructure, Information (PMESII) taxonomy; integrate multiple native (third-party) models; validate model interfaces and outputs; and debug the integrated models via intuitive controls and time series visualization, and 2) a PMESII IDE for modeling and simulation developers to rapidly incorporate new native simulation tools and models to make them available for use in the Commander's IDE. The PMESII IDE provides shared ontologies and repositories for world state, modeling concepts, and native tool characterization. CMIST includes extensible libraries for 1) reusable data transforms for semantic alignment of native data with the shared ontology, and 2) interaction patterns to synchronize multiple native simulations with disparate modeling paradigms, such as continuous-time system dynamics, agent-based discrete event simulation, and aggregate solution methods such as Monte Carlo sampling over dynamic Bayesian networks. This paper describes the CMIST system architecture, our technical approach to addressing these semantic alignment and synchronization problems, and initial results from integrating Political-Military-Economic models of post-war Iraq spanning multiple modeling paradigms.

  8. A tool to guide the process of integrating health system responses to public health problems

    Directory of Open Access Journals (Sweden)

    Tilahun Nigatu Haregu

    2015-06-01

An integrated model of health system responses to public health problems is considered to be the most preferable approach. Accordingly, there are several models that stipulate what an integrated architecture should look like. However, tools that can guide the overall process of integration are lacking. This tool is designed to guide the entire process of integration of health system responses to major public health problems. It is developed by taking into account the contexts of health systems of developing countries and the emerging double burden of chronic diseases in these settings. Chronic diseases – HIV/AIDS and NCDs – represented the evidence base for the development of the model. System-level horizontal integration of health system responses was considered in the development of this tool.

  9. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

Atsushi Fukushima

    2014-11-01

One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome, together with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolisms.

  10. Modelling Machine Tools using Structure Integrated Sensors for Fast Calibration

    Directory of Open Access Journals (Sweden)

    Benjamin Montavon

    2018-02-01

Monitoring of the relative deviation between commanded and actual tool tip position, which limits the volumetric performance of the machine tool, enables the use of contemporary methods of compensation to reduce tolerance mismatch and the uncertainties of on-machine measurements. The development of a primarily optical sensor setup capable of being integrated into the machine structure without limiting its operating range is presented. The use of a frequency-modulating interferometer and photosensitive arrays in combination with a Gaussian laser beam allows for fast and automated online measurements of the axes’ motion errors and thermal conditions with comparable accuracy, lower cost, and smaller dimensions as compared to state-of-the-art optical measuring instruments for offline machine tool calibration. The development is tested through simulation of the sensor setup based on raytracing and Monte-Carlo techniques.

  11. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

To deal with the complexity of operating and supervising large scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. Therefore the requirements on software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates and thus simplifies the use of third party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change in the code can produce collateral effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been …
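
A hedged sketch of the CI-side trigger logic described here: re-run the verification cases whenever PLC sources change, and fail the build on a violation. The plcverif-cli command name, its flags, and the directory layout are assumptions for illustration, not the real PLCverif or Jenkins interface.

```python
# Hedged sketch of a CI step: if PLC code changed in the triggering commit,
# re-run every verification case and fail the build on any violation.
# "plcverif-cli" and its options are hypothetical placeholders.
import pathlib
import subprocess
import sys

PLC_SOURCES = pathlib.Path("plc_src")
CASES = sorted(pathlib.Path("verif_cases").glob("*.case"))

def changed_plc_files():
    """Files touched by the commit that triggered this build (git-based)."""
    out = subprocess.run(
        ["git", "diff", "--name-only", "HEAD~1", "HEAD"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()
    return [f for f in out if f.startswith(str(PLC_SOURCES))]

def main():
    if not changed_plc_files():
        print("No PLC code changed; skipping verification.")
        return 0
    failures = 0
    for case in CASES:
        # Hypothetical command line; replace with the real model-checker call.
        result = subprocess.run(["plcverif-cli", "--case", str(case)])
        if result.returncode != 0:
            print(f"Requirement violated: {case.name}")
            failures += 1
    return 1 if failures else 0

if __name__ == "__main__":
    sys.exit(main())
```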

  12. Cockpit System Situational Awareness Modeling Tool

    Science.gov (United States)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  13. Basic data, computer codes and integral experiments: The tools for modelling in nuclear technology

    International Nuclear Information System (INIS)

    Sartori, E.

    2001-01-01

When studying applications in nuclear technology we need to understand and be able to predict the behavior of systems manufactured by human enterprise. First, the underlying basic physical and chemical phenomena need to be understood. We then have to predict the results of the interplay of a large number of different basic events: i.e. the macroscopic effects. In order to build confidence in our modelling capability, we then need to compare these results against measurements carried out on such systems. The different levels of modelling require the solution of different types of equations using different types of parameters. The tools required for carrying out a complete validated analysis are: the basic nuclear or chemical data; the computer codes; and the integral experiments. This article describes the role each component plays in a computational scheme designed for modelling purposes. It also describes which tools have been developed and are internationally available. The role that the OECD/NEA Data Bank, the Radiation Shielding Information Computational Center (RSICC), and the IAEA Nuclear Data Section play in making these elements available to the community of scientists and engineers is described. (author)

  14. A model of integration among prediction tools: applied study to road freight transportation

    Directory of Open Access Journals (Sweden)

    Henrique Dias Blois

Abstract This study has developed a scenario analysis model that integrates decision-making tools for investments: prospective scenarios (Grumbach Method) and system dynamics (hard modeling), combined with multivariate analysis of expert opinion. It was designed through the analysis and simulation of scenarios, showed which events are the most striking in the object of study, and highlighted the actions that could redirect the future of the analyzed system. Moreover, predictions can be developed from the generated scenarios. The model has been validated empirically with road freight transport data from the state of Rio Grande do Sul, Brazil. The results showed that the model contributes to the analysis of investment because it identifies probabilities of events that impact decision making, and identifies priorities for action, reducing uncertainties about the future. Moreover, it allows an interdisciplinary discussion that correlates different areas of knowledge, which is fundamental when greater consistency in creating scenarios is desired.
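
For readers unfamiliar with the "hard modeling" side, the sketch below is a minimal stock-and-flow simulation of freight capacity responding to demand. Every parameter and figure is invented for illustration; none of the Rio Grande do Sul data or the study's actual model structure is used.

```python
# Minimal stock-and-flow sketch of a system-dynamics freight model.
# All parameters are invented for illustration only.
fleet_capacity = 100.0      # stock: tonnes of road freight capacity
demand = 90.0               # stock: tonnes of freight demand
years = 10

for year in range(1, years + 1):
    demand_growth = demand * 0.05                            # 5 %/yr demand growth
    investment = max(0.0, demand - fleet_capacity) * 0.5     # react to any shortfall
    depreciation = fleet_capacity * 0.03                     # 3 %/yr fleet retirement
    demand += demand_growth
    fleet_capacity += investment - depreciation
    utilisation = min(1.0, demand / fleet_capacity)
    print(f"year {year:2d}: demand={demand:6.1f}  capacity={fleet_capacity:6.1f}  "
          f"utilisation={utilisation:4.0%}")
```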

  15. An integrated computational tool for precipitation simulation

    Science.gov (United States)

    Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.

    2011-07-01

    Computer aided materials design is of increasing interest because the conventional approach solely relying on experimentation is no longer viable within the constraint of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus accelerating the development of materials. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.

  16. An architecture for integration of multidisciplinary models

    DEFF Research Database (Denmark)

    Belete, Getachew F.; Voinov, Alexey; Holst, Niels

    2014-01-01

Integrating multidisciplinary models requires linking models that may operate at different temporal and spatial scales; that were developed using different methodologies, tools and techniques; that have different levels of complexity; and that are calibrated for different ranges of inputs and outputs, etc. On the other hand …, Enterprise Application Integration, and Integration Design Patterns. We developed an architecture for a multidisciplinary model integration framework that brings these three aspects of integration together. A service-oriented, platform-independent architecture enables the establishment of loosely coupled …

  17. Knowledge Management tools integration within DLR's concurrent engineering facility

    Science.gov (United States)

    Lopez, R. P.; Soragavi, G.; Deshmukh, M.; Ludtke, D.

The complexity of space endeavors has increased the need for Knowledge Management (KM) tools. The concept of KM involves not only the electronic storage of knowledge, but also the process of making this knowledge available, reusable and traceable. Establishing a KM concept within the Concurrent Engineering Facility (CEF) has been a research topic of the German Aerospace Centre (DLR). This paper presents the current KM tools of the CEF: the Software Platform for Organizing and Capturing Knowledge (S.P.O.C.K.), the data model Virtual Satellite (VirSat), and the Simulation Model Library (SimMoLib), and how their usage has improved the Concurrent Engineering (CE) process. This paper also presents the lessons learned from the introduction of KM practices into the CEF and elaborates a roadmap for the further development of KM in CE activities at DLR. The results of the application of the Knowledge Management tools have shown the potential of merging the three software platforms and their functionalities as the next step towards the full integration of KM practices into the CE process. VirSat will remain the main software platform used within a CE study, and S.P.O.C.K. and SimMoLib will be integrated into VirSat. These tools will support the data model as a reference and documentation source, and will provide access to simulation and calculation models. The use of KM tools in the CEF aims to become a basic practice during the CE process. The establishment of this practice will result in a much more extensive knowledge and experience exchange within the Concurrent Engineering environment and, consequently, higher quality in the design of space systems.

  18. The systems integration operations/logistics model as a decision-support tool

    International Nuclear Information System (INIS)

    Miller, C.; Vogel, L.W.; Joy, D.S.

    1989-01-01

    Congress has enacted legislation specifying Yucca Mountain, Nevada, for characterization as the candidate site for the disposal of spent fuel and high-level wastes and has authorized a monitored retrievable storage (MRS) facility if one is warranted. Nevertheless, the exact configuration of the facilities making up the Federal Waste Management System (FWMS) was not specified. This has left the Office of Civilian Radioactive Waste Management (OCRWM) the responsibility for assuring the design of a safe and reliable disposal system. In order to assist in the analysis of potential configuration alternatives, operating strategies, and other factors for the FWMS and its various elements, a decision-support tool known as the systems integration operations/logistics model (SOLMOD) was developed. SOLMOD is a discrete event simulation model that emulates the movement and interaction of equipment and radioactive waste as it is processed through the FWMS - from pickup at reactor pools to emplacement. The model can be used to measure the impacts of different operating schedules and rules, system configurations, and equipment and other resource availabilities on the performance of processes comprising the FWMS and how these factors combine to determine overall system performance. SOLMOD can assist in identifying bottlenecks and can be used to assess capacity utilization of specific equipment and staff as well as overall system resilience

  19. Integrated catchment modelling in a Semi-arid area

    CSIR Research Space (South Africa)

    Bugan, Richard DH

    2010-09-01

… will increasingly need water quality and quantity management tools to be able to make informed decisions. Integrated catchment modelling (ICM) is regarded as being a valuable tool for integrated water resource management. It enables officials and scientists to make …

  20. Critical chain project management and drum-buffer-rope tools integration in construction industry - case study

    Directory of Open Access Journals (Sweden)

    Piotr Cyplik

    2012-03-01

Background: The concept of integrating theory of constraints tools in reorganizing the management system of a mechanical engineering company is presented in this article. The main aim of the concept is to enable the enterprise to satisfy customers' expectations at reasonable costs, which allows for making a profit and creating an agile enterprise in the long run. Methods: Due to the individual character of the production process and service process in the analyzed company, the described concept uses theory of constraints project management (CCPM) and manufacturing (DBR) tools. The authors use the concept of performance levels to build an integration tool focused on the interaction and collaboration between different departments. The integration tool has been developed and verified in a Polish manufacturing company. Results: In the described model, a tool compatible with CCPM operates on the level of the customer service process, while the shop floor is controlled based on the DBR method. The authors hold that the integration between TOC tools is of key importance. The integration of TOC tools dedicated to managing customer service and to shop floor scheduling and control requires a mechanism for repeatedly transmitting information between them. This mechanism has been developed. Conclusions: The conducted research showed that the developed tool integrating CCPM and DBR had a positive impact on the enterprise performance. It enables improving the company's performance in meeting target group requirements by focusing on enhancing the efficiency of processes running in the company and tasks processed at particular work stations. The described model has been successfully implemented in one of the Polish mechanical engineering companies.

  1. INTEGRATION OF COST MODELS AND PROCESS SIMULATION TOOLS FOR OPTIMUM COMPOSITE MANUFACTURING PROCESS

    Energy Technology Data Exchange (ETDEWEB)

    Pack, Seongchan [General Motors; Wilson, Daniel [General Motors; Aitharaju, Venkat [General Motors; Kia, Hamid [General Motors; Yu, Hang [ESI, Group.; Doroudian, Mark [ESI Group

    2017-09-05

Manufacturing cost of resin transfer molded composite parts is significantly influenced by the cycle time, which is strongly related to the time for both filling and curing of the resin in the mold. The time for filling can be optimized by various injection strategies, and by suitably reducing the length of the resin flow distance during the injection. The curing time can be reduced by the usage of faster-curing resins, but this requires high-pressure injection equipment, which is capital intensive. Predictive manufacturing simulation tools recently developed for composite materials are able to provide various scenarios of processing conditions virtually, well in advance of manufacturing the parts. In the present study, we integrate cost models with process simulation tools to study the influence of various parameters such as injection strategies, injection pressure, compression control to minimize high-pressure injection, resin curing rate, and demold time on the manufacturing cost as affected by the annual part volume. A representative automotive component was selected for the study and the results are presented in this paper.
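
A minimal sketch of how such a coupling might look: a per-part cost model that consumes cycle-time outputs (fill, cure, demold) from a process simulation together with the annual part volume. All numbers, rates, and parameter names below are illustrative assumptions, not values from the study.

```python
# Hedged sketch of coupling a part-cost model to process-simulation outputs.
# Fill/cure/demold times would come from the RTM process simulation; every
# number here is illustrative only.
def cost_per_part(fill_time_min, cure_time_min, demold_time_min, annual_volume,
                  tooling_cost=250_000.0, press_rate_per_hr=120.0, material_cost=85.0):
    cycle_hr = (fill_time_min + cure_time_min + demold_time_min) / 60.0
    machine_cost = cycle_hr * press_rate_per_hr          # per part
    amortised_tooling = tooling_cost / annual_volume     # per part
    return material_cost + machine_cost + amortised_tooling

# Two virtual processing scenarios, e.g. standard resin vs. fast-cure resin.
for label, fill, cure in [("standard resin", 4.0, 20.0), ("fast-cure resin", 4.0, 6.0)]:
    for volume in (5_000, 50_000):
        print(f"{label:15s} @ {volume:6d} parts/yr: "
              f"${cost_per_part(fill, cure, 2.0, volume):7.2f} per part")
```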

  2. The Integrated Medical Model: A Risk Assessment and Decision Support Tool for Human Space Flight Missions

    Science.gov (United States)

    Kerstman, Eric L.; Minard, Charles; FreiredeCarvalho, Mary H.; Walton, Marlei E.; Myers, Jerry G., Jr.; Saile, Lynn G.; Lopez, Vilma; Butler, Douglas J.; Johnson-Throop, Kathy A.

    2011-01-01

This slide presentation reviews the Integrated Medical Model (IMM) and its use as a risk assessment and decision support tool for human space flight missions. The IMM is an integrated, quantified, evidence-based decision support tool useful to NASA crew health and mission planners. It is intended to assist in optimizing crew health, safety and mission success within the constraints of the space flight environment for in-flight operations. It uses ISS data to assist in planning for the Exploration Program and is not intended to assist in post-flight research. The IMM was used to update the Probabilistic Risk Assessment (PRA) for the purpose of updating forecasts for the conditions requiring evacuation (EVAC) or Loss of Crew Life (LOC) for the ISS. The IMM validation approach includes comparison with actual events and involves both qualitative and quantitative approaches. The results of these comparisons are reviewed. Another use of the IMM is to optimize the medical kits, taking into consideration the specific mission and the crew profile. An example of the use of the IMM to optimize the medical kits is reviewed.

  3. Development Life Cycle and Tools for XML Content Models

    Energy Technology Data Exchange (ETDEWEB)

    Kulvatunyou, Boonserm [ORNL; Morris, Katherine [National Institute of Standards and Technology (NIST); Buhwan, Jeong [POSTECH University, South Korea; Goyal, Puja [National Institute of Standards and Technology (NIST)

    2004-11-01

Many integration projects today rely on shared semantic models based on standards represented using Extensible Markup Language (XML) technologies. Shared semantic models typically evolve and require maintenance. In addition, to promote interoperability and reduce integration costs, the shared semantics should be reused as much as possible. Semantic components must be consistent and valid in terms of agreed-upon standards and guidelines. In this paper, we describe an activity model for creation, use, and maintenance of a shared semantic model that is coherent and supports efficient enterprise integration. We then use this activity model to frame our research and the development of tools to support those activities. We provide overviews of these tools primarily in the context of the W3C XML Schema. At present, we focus our work on the W3C XML Schema as the representation of choice, due to its extensive adoption by industry.

  4. Evaluation of Oracle Big Data Integration Tools

    OpenAIRE

    Urhan, Harun; Baranowski, Zbigniew

    2015-01-01

The project's objective is to evaluate Oracle's Big Data Integration Tools. The project covers evaluation of two of Oracle's tools: Oracle Data Integrator Application Adapters for Hadoop, used to load data from Oracle Database to Hadoop, and Oracle SQL Connectors for HDFS, used to query data stored on a Hadoop file system with SQL statements executed on an Oracle Database.

  5. VLM Tool for IDS Integration

    Directory of Open Access Journals (Sweden)

    Cǎtǎlin NAE

    2010-03-01

This paper is dedicated to a very specific type of analysis tool (VLM - Vortex Lattice Method) to be integrated in an IDS - Integrated Design System - tailored for use by the small aircraft industry. The major interest is to have the possibility to simulate, at very low computational cost, a preliminary set of aerodynamic characteristics covering basic global aerodynamic characteristics (Lift, Drag, Pitching Moment) and aerodynamic derivatives for longitudinal and lateral-directional stability analysis. This work enables fast investigations of the influence of configuration changes in a very efficient computational environment. Using experimental data and/or CFD information for a specific calibration of the VLM method, the reliability of the analysis may be increased so that a first ("iteration zero") aerodynamic evaluation of the preliminary 3D configuration is possible. The output of this tool is the basic-state aerodynamics and the associated stability and control derivatives, as well as a complete set of information on specific loads on major airframe components. The major interest in using and validating this type of method comes from the possibility to integrate it as a tool in an IDS system for the conceptual design phase, as considered for development in the CESAR project (IP, EU FP6).

  6. Master Middle Ware: A Tool to Integrate Water Resources and Fish Population Dynamics Models

    Science.gov (United States)

    Yi, S.; Sandoval Solis, S.; Thompson, L. C.; Kilduff, D. P.

    2017-12-01

Linking models that investigate separate components of ecosystem processes has the potential to unify messages regarding management decisions by evaluating potential trade-offs in a cohesive framework. This project aimed to improve the ability of riparian resource managers to forecast future water availability conditions and resultant fish habitat suitability, in order to better inform their management decisions. To accomplish this goal, we developed a middleware tool that is capable of linking and overseeing the operations of two existing models: a water resource planning tool, the Water Evaluation and Planning (WEAP) model, and a habitat-based fish population dynamics model (WEAPhish). First, we designed the Master Middle Ware (MMW) software in Visual Basic for Applications® in one Excel® file that provided a familiar framework for both data input and output. Second, MMW was used to link and jointly operate WEAP and WEAPhish, using Visual Basic for Applications (VBA) macros to implement system-level calls to run the models. To demonstrate the utility of this approach, hydrological, biological, and middleware model components were developed for the Butte Creek basin. This tributary of the Sacramento River, California, is managed for both hydropower and the persistence of a threatened population of spring-run Chinook salmon (Oncorhynchus tshawytscha). While we have demonstrated the use of MMW for a particular watershed and fish population, MMW can be customized for use with different rivers and fish populations, assuming basic data requirements are met. This model integration improves on ad hoc linkages for managing data transfer between software programs by providing a consistent, user-friendly, and familiar interface across different model implementations. Furthermore, the data-viewing capabilities of MMW facilitate the rapid interpretation of model results by hydrologists, fisheries biologists, and resource managers, in order to accelerate learning and management decision …
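
The actual MMW is implemented in VBA inside a single Excel workbook; the sketch below only illustrates the general middleware pattern in Python. The run_weap/run_weaphish command names, the file layout, and the "wua" output column are hypothetical placeholders, not the real WEAP or WEAPhish interfaces.

```python
# Hedged sketch of a middleware driver: run the water-planning model, pass its
# flow output to the fish population model, then summarise habitat results.
# All command names, paths, and column names are hypothetical.
import csv
import subprocess

def run_weap(scenario):
    """Run the water-planning model for a scenario; return the flow output path."""
    subprocess.run(["weap_batch", "--scenario", scenario], check=True)  # hypothetical CLI
    return f"outputs/{scenario}_flows.csv"

def run_weaphish(flow_csv):
    """Run the fish population model on simulated flows; return the habitat results path."""
    subprocess.run(["weaphish", "--flows", flow_csv], check=True)       # hypothetical CLI
    return flow_csv.replace("_flows", "_habitat")

def mean_wua(habitat_csv):
    """Average weighted usable area across the simulated period."""
    with open(habitat_csv) as fh:
        rows = [float(r["wua"]) for r in csv.DictReader(fh)]
    return sum(rows) / len(rows)

for scenario in ("baseline", "hydropower_reop"):
    habitat = run_weaphish(run_weap(scenario))
    print(scenario, "mean WUA:", mean_wua(habitat))
```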

  7. An Integrated Simulation Tool for Modeling the Human Circulatory System

    Science.gov (United States)

    Asami, Ken'ichi; Kitamura, Tadashi

    This paper presents an integrated simulation of the circulatory system in physiological movement. The large circulatory system model includes principal organs and functional units in modules in which comprehensive physiological changes such as nerve reflexes, temperature regulation, acid/base balance, O2/CO2 balance, and exercise are simulated. A beat-by-beat heart model, in which the corresponding electrical circuit problems are solved by a numerical analytic method, enables calculation of pulsatile blood flow to the major organs. The integration of different perspectives on physiological changes makes this simulation model applicable for the microscopic evaluation of blood flow under various conditions in the human body.
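
As a much simplified illustration of the electrical-circuit analogy that such a beat-by-beat model solves numerically, the sketch below integrates a two-element Windkessel model with SciPy. The parameter values are textbook-order guesses chosen for illustration, not taken from the paper's far more detailed model.

```python
# Minimal two-element Windkessel sketch of the circuit analogy for pulsatile
# arterial pressure. All parameter values are rough, illustrative choices.
import numpy as np
from scipy.integrate import solve_ivp

R = 1.0    # peripheral resistance (mmHg*s/mL)
C = 1.5    # arterial compliance   (mL/mmHg)
T = 0.8    # cardiac period (s), ~75 beats per minute

def inflow(t):
    """Pulsatile aortic inflow: half-sine ejection during systole, zero in diastole."""
    phase = t % T
    return 400.0 * np.sin(np.pi * phase / (0.3 * T)) if phase < 0.3 * T else 0.0

def dp_dt(t, p):
    # C dP/dt = Q_in(t) - P/R
    return [(inflow(t) - p[0] / R) / C]

sol = solve_ivp(dp_dt, (0.0, 10 * T), [80.0], max_step=0.005)
print(f"pressure range over the last beats: {sol.y[0][-160:].min():.0f}"
      f"-{sol.y[0][-160:].max():.0f} mmHg")
```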

  8. Model Identification of Integrated ARMA Processes

    Science.gov (United States)

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
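
    SCAN and ESACF are SAS procedures, but the underlying task — choosing the differencing order and the ARMA orders of an integrated series — can be sketched with generic tools. The example below uses Python's statsmodels to difference until a unit-root test suggests stationarity and then picks (p, q) by AIC; it is an analogous automated identification pass, not an implementation of SCAN or ESACF.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.arima.model import ARIMA

# Illustrative order-selection pass for an integrated series: difference until
# an ADF test suggests stationarity, then pick (p, q) by AIC. This mimics the
# goal of SCAN/ESACF (choosing an ARIMA order) but is not those procedures.
def identify_arima(y, max_d=2, max_p=3, max_q=3):
    d = 0
    z = np.asarray(y, dtype=float)
    while d < max_d and adfuller(z)[1] > 0.05:  # p-value of the unit-root test
        z = np.diff(z)
        d += 1
    best = None
    for p in range(max_p + 1):
        for q in range(max_q + 1):
            try:
                fit = ARIMA(y, order=(p, d, q)).fit()
            except Exception:
                continue
            if best is None or fit.aic < best[0]:
                best = (fit.aic, (p, d, q))
    return best[1]

# Example with a simulated random walk plus noise:
rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=300)) + rng.normal(scale=0.5, size=300)
print(identify_arima(series))  # expected to report d = 1
```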

  9. Integration issues of information engineering based I-CASE tools

    OpenAIRE

    Kurbel, Karl; Schnieder, Thomas

    1994-01-01

    Problems and requirements regarding integration of methods and tools across phases of the software-development life cycle are discussed. Information engineering (IE) methodology and I-CASE (integrated CASE) tools supporting IE claim to have an integrated view across major stages of enterprise-wide information-system development: information strategy planning, business area analysis, system design, and construction. In the main part of this paper, two comprehensive I-CASE tools, ADW (Applicati...

  10. Bond graphs : an integrating tool for design of mechatronic systems

    International Nuclear Information System (INIS)

    Ould Bouamama, B.

    2011-01-01

    The bond graph is a powerful tool well known for dynamic modelling of multiphysical systems: it is the only modelling technique that automatically generates state-space or nonlinear models using dedicated software tools (CAMP-G, 20-Sim, Symbols, Dymola, ...). Recently, several fundamental theories have been developed for using a bond graph model not only for modelling but also as a real integrated tool from conceptual ideas to the optimal practical realization of a mechatronic system. This keynote presents a synthesis of those new theories, which exploit particular properties (causal, structural and behavioural) of this graphical methodology. Based on a pedagogical example, it will be shown how, starting from a physical system (not a transfer function or state equation) and using only one representation (the bond graph), the following results can be obtained: modelling (formal generation of state equations), control analysis (observability, controllability, structural I/O decouplability, dynamic decoupling, ...), diagnosis analysis (automatic generation of robust fault indicators, sensor placement, structural diagnosability) and, finally, sizing of actuators. The presentation will be illustrated by real industrial applications. Limits and perspectives of bond graph theory conclude the keynote.

  11. Tool Integration: Experiences and Issues in Using XMI and Component Technology

    DEFF Research Database (Denmark)

    Damm, Christian Heide; Hansen, Klaus Marius; Thomsen, Michael

    2000-01-01

    of conflicting data models, and provide architecture for doing so, based on component technology and XML Metadata Interchange. As an example, we discuss the implementation of an electronic whiteboard tool, Knight, which adds support for creative and collaborative object-oriented modeling to existing Computer-Aided...... Software Engineering through integration using our proposed architecture....

  12. Integrated Exoplanet Modeling with the GSFC Exoplanet Modeling & Analysis Center (EMAC)

    Science.gov (United States)

    Mandell, Avi M.; Hostetter, Carl; Pulkkinen, Antti; Domagal-Goldman, Shawn David

    2018-01-01

    Our ability to characterize the atmospheres of extrasolar planets will be revolutionized by JWST, WFIRST and future ground- and space-based telescopes. In preparation, the exoplanet community must develop an integrated suite of tools with which we can comprehensively predict and analyze observations of exoplanets, in order to characterize the planetary environments and ultimately search them for signs of habitability and life. The GSFC Exoplanet Modeling and Analysis Center (EMAC) will be a web-accessible high-performance computing platform with science support for modelers and software developers to host and integrate their scientific software tools, with the goal of leveraging the scientific contributions from the entire exoplanet community to improve our interpretations of future exoplanet discoveries. Our suite of models will include stellar models, models for star-planet interactions, atmospheric models, planet system science models, telescope models, instrument models, and finally models for retrieving signals from observational data. By integrating this suite of models, the community will be able to self-consistently calculate the emergent spectra from the planet, whether from emission, scattering, or in transmission, and use these simulations to model the performance of current and new telescopes and their instrumentation. The EMAC infrastructure will not only provide a repository for planetary and exoplanetary community models, modeling tools and inter-model comparisons, but it will include a "run-on-demand" portal with each software tool hosted on a separate virtual machine. The EMAC system will eventually include a means of running or “checking in” new model simulations that are in accordance with the community-derived standards. Additionally, the results of inter-model comparisons will be used to produce open source publications that quantify the model comparisons and provide an overview of community consensus on model uncertainties on the climates of

  13. Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis

    Directory of Open Access Journals (Sweden)

    Esther Suter

    2017-11-01

    Full Text Available Background: Despite far-reaching support for integrated care, conceptualizing and measuring integrated care remains challenging. This knowledge synthesis aimed to identify indicator domains and tools to measure progress towards integrated care. Methods: We used an established framework and a Delphi survey with integration experts to identify relevant measurement domains. For each domain, we searched and reviewed the literature for relevant tools. Findings: From 7,133 abstracts, we retrieved 114 unique tools. We found many quality tools to measure care coordination, patient engagement and team effectiveness/performance. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals and resource allocation. The search yielded 12 tools that measure overall integration or three or more indicator domains. Discussion: Our findings highlight a continued gap in tools to measure foundational components that support integrated care. In the absence of such targeted tools, “overall integration” tools may be useful for a broad assessment of the overall state of a system. Conclusions: Continued progress towards integrated care depends on our ability to evaluate the success of strategies across different levels and contexts. This study has identified 114 tools that measure integrated care across 16 domains, supporting efforts towards a unified measurement framework.

  14. Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis

    Science.gov (United States)

    Oelke, Nelly D.; da Silva Lima, Maria Alice Dias; Stiphout, Michelle; Janke, Robert; Witt, Regina Rigatto; Van Vliet-Brown, Cheryl; Schill, Kaela; Rostami, Mahnoush; Hepp, Shelanne; Birney, Arden; Al-Roubaiai, Fatima; Marques, Giselda Quintana

    2017-01-01

    Background: Despite far-reaching support for integrated care, conceptualizing and measuring integrated care remains challenging. This knowledge synthesis aimed to identify indicator domains and tools to measure progress towards integrated care. Methods: We used an established framework and a Delphi survey with integration experts to identify relevant measurement domains. For each domain, we searched and reviewed the literature for relevant tools. Findings: From 7,133 abstracts, we retrieved 114 unique tools. We found many quality tools to measure care coordination, patient engagement and team effectiveness/performance. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals and resource allocation. The search yielded 12 tools that measure overall integration or three or more indicator domains. Discussion: Our findings highlight a continued gap in tools to measure foundational components that support integrated care. In the absence of such targeted tools, “overall integration” tools may be useful for a broad assessment of the overall state of a system. Conclusions: Continued progress towards integrated care depends on our ability to evaluate the success of strategies across different levels and contexts. This study has identified 114 tools that measure integrated care across 16 domains, supporting efforts towards a unified measurement framework. PMID:29588637

  15. An introduction to Space Weather Integrated Modeling

    Science.gov (United States)

    Zhong, D.; Feng, X.

    2012-12-01

    The need for a software toolkit that integrates space weather models and data is one of many challenges we face when applying the models to space weather forecasting. To meet this challenge, we have developed Space Weather Integrated Modeling (SWIM), which is capable of analysis and visualization of the results from a diverse set of space weather models. SWIM has a modular design and is written in Python, using NumPy, matplotlib, and the Visualization ToolKit (VTK). SWIM provides a data management module to read a variety of spacecraft data products and the specific data format of the Solar-Interplanetary Conservation Element/Solution Element MHD model (SIP-CESE MHD model) for the study of solar-terrestrial phenomena. Data analysis, visualization and graphical user interface modules are also provided to run the integrated models and visualize the 2-D and 3-D data sets interactively in a user-friendly way. With these tools we can rapidly analyse the model results, either locally or remotely, for example by extracting data at specific locations in time-sequence data sets, plotting interplanetary magnetic field lines, multi-slicing of solar wind speed, volume rendering of solar wind density, animating time-sequence data sets, and comparing model results with observational data. To speed up the analysis, an in-situ visualization interface is used to support visualizing the data 'on the fly'. We also accelerated some critical, time-consuming analysis and visualization methods with the aid of GPUs and multi-core CPUs. We have used this tool to visualize the data of the SIP-CESE MHD model in real time, and integrated the database model of shock arrival, the shock propagation model, the Dst forecasting model and the SIP-CESE MHD model developed by the SIGMA Weather Group at the State Key Laboratory of Space Weather/CAS.
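
    A tiny example of the kind of data-extraction step described above (pulling the value at one grid point out of a time sequence of 3-D fields and plotting a slice) is sketched below with NumPy and matplotlib. The arrays are synthetic stand-ins; a real SWIM workflow would read SIP-CESE MHD model output and could render the 3-D fields with VTK instead.

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy stand-in for a time sequence of 3-D solar-wind speed fields: extract the
# time series at one grid point and a 2-D slice at one instant. Array shapes
# and values are synthetic; a real dataset would be loaded from model output.
nt, nx, ny, nz = 24, 32, 32, 32
speed = 400.0 + 50.0 * np.random.rand(nt, nx, ny, nz)   # km/s, synthetic

i, j, k = 16, 16, 16               # grid indices of the probe location
probe_series = speed[:, i, j, k]   # value at one point through time

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 3.5))
ax1.plot(probe_series)
ax1.set(xlabel="time index", ylabel="speed (km/s)", title="probe point")
ax2.imshow(speed[-1, :, :, k], origin="lower")
ax2.set(title="equatorial slice, last frame")
plt.tight_layout()
plt.show()
```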

  16. Data assimilation in integrated hydrological modelling

    DEFF Research Database (Denmark)

    Rasmussen, Jørn

    Integrated hydrological models are useful tools for water resource management and research, and advances in computational power and the advent of new observation types has resulted in the models generally becoming more complex and distributed. However, the models are often characterized by a high...... degree of parameterization which results in significant model uncertainty which cannot be reduced much due to observations often being scarce and often taking the form of point measurements. Data assimilation shows great promise for use in integrated hydrological models , as it allows for observations...... to be efficiently combined with models to improve model predictions, reduce uncertainty and estimate model parameters. In this thesis, a framework for assimilating multiple observation types and updating multiple components and parameters of a catchment scale integrated hydrological model is developed and tested...

  17. Spatial Modelling Tools to Integrate Public Health and Environmental Science, Illustrated with Infectious Cryptosporidiosis.

    Science.gov (United States)

    Lal, Aparna

    2016-02-02

    Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change.

  18. Spatial Modelling Tools to Integrate Public Health and Environmental Science, Illustrated with Infectious Cryptosporidiosis

    Directory of Open Access Journals (Sweden)

    Aparna Lal

    2016-02-01

    Full Text Available Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change.

  19. Integrating Computational Science Tools into a Thermodynamics Course

    Science.gov (United States)

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired to computer simulations to implement these modules have a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.

  20. Validation of TGLF in C-Mod and DIII-D using machine learning and integrated modeling tools

    Science.gov (United States)

    Rodriguez-Fernandez, P.; White, Ae; Cao, Nm; Creely, Aj; Greenwald, Mj; Grierson, Ba; Howard, Nt; Meneghini, O.; Petty, Cc; Rice, Je; Sciortino, F.; Yuan, X.

    2017-10-01

    Predictive models for steady-state and perturbative transport are necessary to support burning plasma operations. A combination of machine learning algorithms and integrated modeling tools is used to validate TGLF in C-Mod and DIII-D. First, a new code suite, VITALS, is used to compare SAT1 and SAT0 models in C-Mod. VITALS exploits machine learning and optimization algorithms for the validation of transport codes. Unlike SAT0, the SAT1 saturation rule contains a model to capture cross-scale turbulence coupling. Results show that SAT1 agrees better with experiments, further confirming that multi-scale effects are needed to model heat transport in C-Mod L-modes. VITALS will next be used to analyze past data from DIII-D: L-mode ``Shortfall'' plasma and ECH swing experiments. A second code suite, PRIMA, allows for integrated modeling of the plasma response to Laser Blow-Off cold pulses. Preliminary results show that SAT1 qualitatively reproduces the propagation of cold pulses after LBO injections and SAT0 does not, indicating that cross-scale coupling effects play a role in the plasma response. PRIMA will be used to ``predict-first'' cold pulse experiments using the new LBO system at DIII-D, and analyze existing ECH heat pulse data. Work supported by DE-FC02-99ER54512, DE-FC02-04ER54698.

  1. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work...... of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted.......Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools...

  2. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    International Nuclear Information System (INIS)

    Ahmadi, Rouhollah; Khamehchi, Ehsan

    2013-01-01

    Conditioning stochastic simulations are very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike the previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space where all the primary and secondary data are easily mapped onto. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data

  3. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com [Amirkabir University of Technology, PhD Student at Reservoir Engineering, Department of Petroleum Engineering (Iran, Islamic Republic of); Khamehchi, Ehsan [Amirkabir University of Technology, Faculty of Petroleum Engineering (Iran, Islamic Republic of)

    2013-12-15

    Conditioning stochastic simulations are very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike the previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space where all the primary and secondary data are easily mapped onto. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  4. SIRSALE: integrated video database management tools

    Science.gov (United States)

    Brunie, Lionel; Favory, Loic; Gelas, J. P.; Lefevre, Laurent; Mostefaoui, Ahmed; Nait-Abdesselam, F.

    2002-07-01

    Video databases became an active field of research during the last decade. The main objective of such systems is to provide users with capabilities to search, access and play back distributed stored video data in the same friendly way as they do for traditional distributed databases. Hence, such systems need to deal with hard issues: (a) video documents generate huge volumes of data and are time sensitive (streams must be delivered at a specific bitrate), and (b) the content of video data is very hard to extract automatically and needs to be annotated manually. To cope with these issues, many approaches have been proposed in the literature, including data models, query languages, video indexing, etc. In this paper, we present SIRSALE: a set of video database management tools that allow users to manipulate video documents and streams stored in large distributed repositories. All the proposed tools are based on generic models that can be customized for specific applications using ad hoc adaptation modules. More precisely, SIRSALE allows users to: (a) browse video documents by structures (sequences, scenes, shots) and (b) query the video database content by using a graphical tool adapted to the nature of the target video documents. This paper also presents an annotation interface which allows archivists to describe the content of video documents. All these tools are coupled to a video player integrating remote VCR functionalities and are based on active network technology. We also present how dedicated active services allow optimized transport of video streams (with Tamanoir active nodes). We then describe experiments using SIRSALE on an archive of news video and soccer matches. The system has been demonstrated to professionals with positive feedback. Finally, we discuss open issues and present some perspectives.

  5. Integrated Visualization Environment for Science Mission Modeling, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA is emphasizing the use of larger, more integrated models in conjunction with systems engineering tools and decision support systems. These tools place a...

  6. Integrating Risk Analyses and Tools at the DOE Hanford Site

    International Nuclear Information System (INIS)

    LOBER, R.W.

    2002-01-01

    Risk assessment and environmental impact analysis at the U.S. Department of Energy (DOE) Hanford Site in Washington State have made significant progress in refining the strategy for using risk analysis to support closing of several hundred waste sites plus 149 single-shell tanks at the Hanford Site. A Single-Shell Tank System Closure Work Plan outlines the current basis for closing the single-shell tank systems. An analogous site approach has been developed to address closure of aggregated groups of similar waste sites. Because of the complexity, decision time frames, proximity of non-tank farm waste sites to tank farms, scale, and regulatory considerations, various projects are providing integrated assessments to support risk analyses and decision-making. Projects and the tools that are being developed and applied at Hanford to support retrieval and cleanup decisions include: (1) Life Cycle Model (LCM) and Risk Receptor Model (RRM)--A site-level set of tools to support strategic analyses through scoping-level risk management to assess different alternatives and options for tank closure. (2) Systems Assessment Capability for Integrated Groundwater/Vadose Zone (SAC) and the Site-Wide Groundwater Model (SWGM)--A site-wide groundwater modeling system coupled with a risk-based uncertainty analysis of inventory, vadose zone, groundwater, and river interactions for evaluating cumulative impacts from individual and aggregate waste sites. (3) Retrieval Performance Evaluation (RPE)--A site-specific, risk-based methodology developed to evaluate performance of waste retrieval, leak detection and closure on a tank-specific basis as a function of past tank leaks, potential leakage during retrieval operations, and remaining residual waste inventories following completion of retrieval operations. (4) Field Investigation Report (FIR)--A corrective action program to investigate the nature and extent of past tank leaks through characterization activities and assess future impacts to

  7. Model-based setup assistant for progressive tools

    Science.gov (United States)

    Springer, Robert; Gräler, Manuel; Homberg, Werner; Henke, Christian; Trächtler, Ansgar

    2018-05-01

    In the field of production systems, globalization and technological progress lead to increasing requirements regarding part quality, delivery time and costs. Hence, today's production is challenged much more than a few years ago: it has to be very flexible and produce small batch sizes economically to satisfy consumers' demands and avoid unnecessary stock. Furthermore, a trend towards increasing functional integration continues to drive the ongoing miniaturization of sheet metal components. In the electric connectivity industry, for example, miniaturized connectors are manufactured by progressive tools, which are usually used for very large batches. These tools are installed in mechanical presses and then set up by a technician, who has to manually adjust a wide range of punch-bending operations. Disturbances like material thickness, temperature, lubrication or tool wear complicate the setup procedure. In view of the increasing demand for production flexibility, this time-consuming process has to be carried out more and more often. In this paper, a new approach for a model-based setup assistant is proposed as a solution, applied by way of example to a progressive tool. First, progressive tools and, more specifically, their setup process are described, and the challenges are pointed out. As a result, a systematic process to set up the machines is introduced. Next, the process is investigated with an FE analysis regarding the effects of the disturbances. In the following step, design of experiments is used to systematically develop a regression model of the system's behaviour. This model is integrated within an optimization in order to calculate optimal machine parameters and the necessary subsequent adjustment of the progressive tool in response to the disturbances. Finally, the assistant is tested in a production environment and the results are discussed.
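
    The "regression model integrated within an optimization" step can be illustrated with a small sketch: fit a quadratic surrogate to design-of-experiments samples and then search for settings whose predicted response hits a target. The process function, parameter names, and bounds below are synthetic placeholders, not the paper's FE model or press parameters.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative surrogate-plus-optimization loop: fit a quadratic regression to
# design-of-experiments samples of a process response, then search for machine
# settings that reach a target response. The "process" is a synthetic stand-in
# for FE results; variable names are placeholders.
rng = np.random.default_rng(1)

def process(x):            # x = [punch stroke, bending angle] (placeholder names)
    return 2.0 + 0.8 * x[0] - 0.5 * x[1] + 0.3 * x[0] * x[1] + 0.1 * x[1] ** 2

# 1) DoE: sample the process on a small grid (would be FE runs in practice).
X = np.array([[a, b] for a in np.linspace(0, 2, 5) for b in np.linspace(0, 2, 5)])
y = np.array([process(x) for x in X]) + rng.normal(scale=0.01, size=len(X))

# 2) Quadratic regression surrogate fitted by least squares.
def features(x):
    return np.array([1.0, x[0], x[1], x[0] * x[1], x[0] ** 2, x[1] ** 2])

coef, *_ = np.linalg.lstsq(np.vstack([features(x) for x in X]), y, rcond=None)
surrogate = lambda x: features(x) @ coef

# 3) Optimization: find settings whose predicted response matches a target value.
target = 3.0
res = minimize(lambda x: (surrogate(x) - target) ** 2, x0=[1.0, 1.0],
               bounds=[(0, 2), (0, 2)])
print("suggested settings:", res.x, "predicted response:", surrogate(res.x))
```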

  8. CaliBayes and BASIS: integrated tools for the calibration, simulation and storage of biological simulation models.

    Science.gov (United States)

    Chen, Yuhui; Lawless, Conor; Gillespie, Colin S; Wu, Jake; Boys, Richard J; Wilkinson, Darren J

    2010-05-01

    Dynamic simulation modelling of complex biological processes forms the backbone of systems biology. Discrete stochastic models are particularly appropriate for describing sub-cellular molecular interactions, especially when critical molecular species are thought to be present at low copy-numbers. For example, these stochastic effects play an important role in models of human ageing, where ageing results from the long-term accumulation of random damage at various biological scales. Unfortunately, realistic stochastic simulation of discrete biological processes is highly computationally intensive, requiring specialist hardware, and can benefit greatly from parallel and distributed approaches to computation and analysis. For these reasons, we have developed the BASIS system for the simulation and storage of stochastic SBML models together with associated simulation results. This system is exposed as a set of web services to allow users to incorporate its simulation tools into their workflows. Parameter inference for stochastic models is also difficult and computationally expensive. The CaliBayes system provides a set of web services (together with an R package for consuming these and formatting data) which addresses this problem for SBML models. It uses a sequential Bayesian MCMC method, which is powerful and flexible, providing very rich information. However this approach is exceptionally computationally intensive and requires the use of a carefully designed architecture. Again, these tools are exposed as web services to allow users to take advantage of this system. In this article, we describe these two systems and demonstrate their integrated use with an example workflow to estimate the parameters of a simple model of Saccharomyces cerevisiae growth on agar plates.
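
    The discrete stochastic simulation described above is typically some variant of Gillespie's stochastic simulation algorithm. The sketch below shows the direct method for a single-species birth-death process; the reaction network and rate constants are illustrative and unrelated to any particular BASIS or CaliBayes model.

```python
import numpy as np

# Minimal Gillespie (direct-method) sketch for a single-species birth-death
# process, the kind of discrete stochastic kinetics run for low-copy-number
# models. Rate constants and the reaction network are illustrative only.
def gillespie_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=100.0, seed=0):
    rng = np.random.default_rng(seed)
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        a1, a2 = k_birth, k_death * x     # reaction propensities
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)    # waiting time to the next reaction
        x += 1 if rng.random() < a1 / a0 else -1   # choose birth or death
        times.append(t)
        counts.append(x)
    return np.array(times), np.array(counts)

t, x = gillespie_birth_death()
print("final copy number:", x[-1], "(stationary mean is k_birth/k_death = 100)")
```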

  9. Climbing the ladder: capability maturity model integration level 3

    Science.gov (United States)

    Day, Bryce; Lutteroth, Christof

    2011-02-01

    This article details the attempt to form a complete workflow model for an information and communication technologies (ICT) company in order to achieve a Capability Maturity Model Integration (CMMI) maturity rating of 3. During this project, business processes across the company's core and auxiliary sectors were documented and extended using modern enterprise modelling tools and The Open Group Architecture Framework (TOGAF) methodology. Different challenges were encountered with regard to process customisation and tool support for enterprise modelling. In particular, there were problems with the reuse of process models, the integration of different project management methodologies and the integration of the Rational Unified Process development process framework that had to be solved. We report on these challenges and the perceived effects of the project on the company. Finally, we point out research directions that could help to improve the situation in the future.

  10. Techniques to Access Databases and Integrate Data for Hydrologic Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Whelan, Gene; Tenney, Nathan D.; Pelton, Mitchell A.; Coleman, Andre M.; Ward, Duane L.; Droppo, James G.; Meyer, Philip D.; Dorow, Kevin E.; Taira, Randal Y.

    2009-06-17

    This document addresses techniques to access and integrate data for defining site-specific conditions and behaviors associated with ground-water and surface-water radionuclide transport applicable to U.S. Nuclear Regulatory Commission reviews. Environmental models typically require input data from multiple internal and external sources that may include, but are not limited to, stream and rainfall gage data, meteorological data, hydrogeological data, habitat data, and biological data. These data may be retrieved from a variety of organizations (e.g., federal, state, and regional) and source types (e.g., HTTP, FTP, and databases). Available data sources relevant to hydrologic analyses for reactor licensing are identified and reviewed. The data sources described can be useful to define model inputs and parameters, including site features (e.g., watershed boundaries, stream locations, reservoirs, site topography), site properties (e.g., surface conditions, subsurface hydraulic properties, water quality), and site boundary conditions, input forcings, and extreme events (e.g., stream discharge, lake levels, precipitation, recharge, flood and drought characteristics). Available software tools for accessing established databases, retrieving the data, and integrating it with models were identified and reviewed. The emphasis in this review was on existing software products with minimal required modifications to enable their use with the FRAMES modeling framework. The ability of four of these tools to access and retrieve the identified data sources was reviewed. These four software tools were the Hydrologic Data Acquisition and Processing System (HDAPS), Integrated Water Resources Modeling System (IWRMS) External Data Harvester, Data for Environmental Modeling Environmental Data Download Tool (D4EM EDDT), and the FRAMES Internet Database Tools. The IWRMS External Data Harvester and the D4EM EDDT were identified as the most promising tools based on their ability to access and
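
    As a generic illustration of the retrieval step such tools automate, the sketch below downloads a hypothetical CSV of stream gauge data over HTTP and reads it into a table for use as model input. The URL and column names are placeholders; the tools actually reviewed (HDAPS, the IWRMS External Data Harvester, D4EM EDDT, and the FRAMES Internet Database Tools) each have their own interfaces.

```python
import csv
import io
import urllib.request

# Hypothetical example: fetch a CSV of daily stream discharge from a web service
# and turn it into a list of (date, discharge) pairs ready for a hydrologic model.
# The URL and column names are placeholders, not a real agency endpoint.
URL = "https://example.org/gauges/12345678/daily.csv"

def fetch_discharge(url=URL):
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8")
    rows = csv.DictReader(io.StringIO(text))
    # Expect columns named "date" and "discharge_cms" in the placeholder file.
    return [(row["date"], float(row["discharge_cms"])) for row in rows]

# series = fetch_discharge()
# print(series[:5])
```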

  11. Techniques to Access Databases and Integrate Data for Hydrologic Modeling

    International Nuclear Information System (INIS)

    Whelan, Gene; Tenney, Nathan D.; Pelton, Mitchell A.; Coleman, Andre M.; Ward, Duane L.; Droppo, James G.; Meyer, Philip D.; Dorow, Kevin E.; Taira, Randal Y.

    2009-01-01

    This document addresses techniques to access and integrate data for defining site-specific conditions and behaviors associated with ground-water and surface-water radionuclide transport applicable to U.S. Nuclear Regulatory Commission reviews. Environmental models typically require input data from multiple internal and external sources that may include, but are not limited to, stream and rainfall gage data, meteorological data, hydrogeological data, habitat data, and biological data. These data may be retrieved from a variety of organizations (e.g., federal, state, and regional) and source types (e.g., HTTP, FTP, and databases). Available data sources relevant to hydrologic analyses for reactor licensing are identified and reviewed. The data sources described can be useful to define model inputs and parameters, including site features (e.g., watershed boundaries, stream locations, reservoirs, site topography), site properties (e.g., surface conditions, subsurface hydraulic properties, water quality), and site boundary conditions, input forcings, and extreme events (e.g., stream discharge, lake levels, precipitation, recharge, flood and drought characteristics). Available software tools for accessing established databases, retrieving the data, and integrating it with models were identified and reviewed. The emphasis in this review was on existing software products with minimal required modifications to enable their use with the FRAMES modeling framework. The ability of four of these tools to access and retrieve the identified data sources was reviewed. These four software tools were the Hydrologic Data Acquisition and Processing System (HDAPS), Integrated Water Resources Modeling System (IWRMS) External Data Harvester, Data for Environmental Modeling Environmental Data Download Tool (D4EM EDDT), and the FRAMES Internet Database Tools. The IWRMS External Data Harvester and the D4EM EDDT were identified as the most promising tools based on their ability to access and

  12. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for BSA technique is proposed. Three popular BSA techniques such as frequency ratio, weights-of-evidence, and evidential belief function models are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and is created by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
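
    The simplest of the three techniques, the frequency ratio, is easy to state concretely: for each class of a factor map, divide the share of hazard cells falling in that class by the share of all cells in that class. The sketch below computes it with NumPy on tiny synthetic rasters; the BSM tool itself performs the equivalent bookkeeping on ArcMAP layers.

```python
import numpy as np

# Minimal frequency-ratio sketch: for each class of a categorical factor map
# (e.g., slope class), FR = share of hazard cells in the class divided by the
# share of all cells in the class. Inputs here are small synthetic rasters.
def frequency_ratio(factor, hazard):
    factor = np.asarray(factor).ravel()
    hazard = np.asarray(hazard).ravel().astype(bool)
    fr = {}
    for cls in np.unique(factor):
        in_cls = factor == cls
        pct_hazard = hazard[in_cls].sum() / max(hazard.sum(), 1)
        pct_area = in_cls.sum() / factor.size
        fr[int(cls)] = pct_hazard / pct_area if pct_area > 0 else np.nan
    return fr

factor_map = np.array([[1, 1, 2, 3], [1, 2, 2, 3], [3, 3, 2, 1]])  # e.g. slope classes
hazard_map = np.array([[0, 1, 1, 0], [0, 1, 0, 0], [1, 0, 0, 0]])  # hazard inventory
print(frequency_ratio(factor_map, hazard_map))  # FR > 1 marks hazard-prone classes
```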

  13. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  14. Integrated Decision Tools for Sustainable Watershed/Ground Water and Crop Health using Predictive Weather, Remote Sensing, and Irrigation Decision Tools

    Science.gov (United States)

    Jones, A. S.; Andales, A.; McGovern, C.; Smith, G. E. B.; David, O.; Fletcher, S. J.

    2017-12-01

    US agricultural and government lands have a unique co-dependent relationship, particularly in the Western US. More than 30% of all irrigated US agricultural output comes from lands sustained by the Ogallala Aquifer in the western Great Plains. Six US Forest Service National Grasslands reside within the aquifer region, consisting of over 375,000 ha (3,759 km2) of USFS-managed lands. Likewise, National Forest lands are the headwaters of many intensive agricultural regions. Our Ogallala Aquifer team is enhancing crop irrigation decision tools with predictive weather and remote sensing data to better manage water for irrigated crops within these regions. An integrated multi-model software framework is used to link irrigation decision tools, resulting in positive management benefits for natural water resources. Teams and teams-of-teams can build upon these multi-disciplinary, multi-faceted modeling capabilities. For example, the CSU Catalyst for Innovative Partnerships program has formed a new multidisciplinary team that will address "Rural Wealth Creation", focusing on the many integrated links between economics, agricultural production and management, natural resource availability, and key social aspects of government policy recommendations. By enhancing tools like these with predictive weather and other related data (such as in situ measurements, hydrologic models, remotely sensed data sets, and, in the near future, links to agro-economic and life cycle assessment models), this work demonstrates an integrated, data-driven future vision of inter-meshed dynamic systems that can address challenging multi-system problems. We will present the current state of the work and opportunities for future involvement.

  15. Separations and safeguards model integration.

    Energy Technology Data Exchange (ETDEWEB)

    Cipiti, Benjamin B.; Zinaman, Owen

    2010-09-01

    Research and development of advanced reprocessing plant designs can greatly benefit from the development of a reprocessing plant model capable of transient solvent extraction chemistry. This type of model can be used to optimize the operations of a plant as well as the designs for safeguards, security, and safety. Previous work has integrated a transient solvent extraction simulation module, based on the Solvent Extraction Process Having Interaction Solutes (SEPHIS) code developed at Oak Ridge National Laboratory, with the Separations and Safeguards Performance Model (SSPM) developed at Sandia National Laboratories, as a first step toward creating a more versatile design and evaluation tool. The goal of this work was to strengthen the integration by linking more variables between the two codes. The results from this integrated model show expected operational performance through plant transients. Additionally, ORIGEN source term files were integrated into the SSPM to provide concentration, radioactivity, neutron emission rate, and thermal power data for various spent fuels. These data were used to generate measurement blocks that can determine the radioactivity, neutron emission rate, or thermal power of any stream or vessel in the plant model. This work also examined how the code could be expanded to integrate other separation steps and how to benchmark the results against other data. Recommendations for future work will be presented.

  16. Tools for integrating environmental objectives into policy and practice: What works where?

    Energy Technology Data Exchange (ETDEWEB)

    Runhaar, Hens

    2016-07-15

    An abundance of approaches, strategies, and instruments – in short: tools – have been developed that intend to stimulate or facilitate the integration of a variety of environmental objectives into development planning, national or regional sectoral policies, international agreements, business strategies, etc. These tools include legally mandatory procedures, such as Environmental Impact Assessment and Strategic Environmental Assessment; more voluntary tools such as environmental indicators developed by scientists and planning tools; green budgeting, etc. A relatively underexplored question is what integration tool fits what particular purposes and contexts, in short: “what works where?”. This paper intends to contribute to answering this question, by first providing conceptual clarity about what integration entails, by suggesting and illustrating a classification of integration tools, and finally by summarising some of the lessons learned about how and why integration tools are (not) used and with what outcomes, particularly in terms of promoting the integration of environmental objectives.

  17. Tools for integrating environmental objectives into policy and practice: What works where?

    International Nuclear Information System (INIS)

    Runhaar, Hens

    2016-01-01

    An abundance of approaches, strategies, and instruments – in short: tools – have been developed that intend to stimulate or facilitate the integration of a variety of environmental objectives into development planning, national or regional sectoral policies, international agreements, business strategies, etc. These tools include legally mandatory procedures, such as Environmental Impact Assessment and Strategic Environmental Assessment; more voluntary tools such as environmental indicators developed by scientists and planning tools; green budgeting, etc. A relatively underexplored question is what integration tool fits what particular purposes and contexts, in short: “what works where?”. This paper intends to contribute to answering this question, by first providing conceptual clarity about what integration entails, by suggesting and illustrating a classification of integration tools, and finally by summarising some of the lessons learned about how and why integration tools are (not) used and with what outcomes, particularly in terms of promoting the integration of environmental objectives.

  18. Modeling tools for an Integrated River-Delta-Sea system investigation: the Pan-European Research Infrastructure DANUBIUS-RI philosophy

    Science.gov (United States)

    Umgiesser, Georg; Bellafiore, Debora; De Pascalis, Francesca; Icke, Joost; Stanica, Adrian

    2017-04-01

    The DANUBIUS Research Infrastructure (DANUBIUS-RI) is a new initiative to address the challenges and opportunities of research on large river-sea (RS) systems. DANUBIUS-RI is a distributed pan-European RI that will provide a platform for interdisciplinary research. It will deal with RS investigation through facilities and expertise from a large number of European institutions, becoming a 'one-stop shop' for knowledge exchange in managing RS systems and spanning the range from freshwater to marine research. Globally, RS systems are complex and dynamic, with huge environmental, social and economic value. They are poorly understood but under increasing pressure from pollution, hydraulic engineering, water supply, energy, flood control and erosion. RS systems in Europe are among the most impacted globally, after centuries of industrialisation, urbanisation and agricultural intensification. Improved understanding is essential to avoid irreversible degradation and to enable restoration. Among a number of other facilities concerning observations, analyses and impact evaluation, DANUBIUS-RI will provide a modelling node offering integrated, up-to-date tools at locations of high scientific importance and opportunity, covering RS systems from the source (the upper parts of rivers and mountain lakes) to the transition with coastal seas. Modelling will be one of the major services provided by DANUBIUS-RI, relying on inputs from the whole RI. RS systems are challenging from a modelling point of view because of their complex morphology and the wide temporal and spatial range of processes occurring. Scale interaction plays a central role, considering the different hydro-eco-morphological processes at the large (basin) and small (local coast, rivers, lagoons) scales. Currently, different model applications are made for the different geographical domains, and also for subsets of the processes. For instance, there are separate models for rainfall runoff in the catchment, a sewer model for the

  19. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    Science.gov (United States)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important part of validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementation and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing the time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To satisfy this, a number of tree and graph visualization tools were researched, and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  20. Integrating New Technologies and Existing Tools to Promote Programming Learning

    Directory of Open Access Journals (Sweden)

    Álvaro Santos

    2010-04-01

    Full Text Available In recent years, many tools have been proposed to reduce the programming learning difficulties felt by many students. Our group has contributed to this effort through the development of several tools, such as VIP, SICAS, OOP-Anim, SICAS-COL and H-SICAS. Even though we had some positive results, the utilization of these tools doesn't seem to significantly reduce weaker students' difficulties. These students need stronger support to motivate them to get engaged in learning activities, inside and outside the classroom. Nowadays, many technologies are available to create contexts that may help to accomplish this goal. We consider that a promising path goes through the integration of solutions. In this paper we analyze the features, strengths and weaknesses of the tools developed by our group. Based on these considerations we present a new environment, integrating different types of pedagogical approaches, resources, tools and technologies for programming learning support. With this environment, currently under development, it will be possible to review contents and lessons, based on video and screen captures. Support for collaborative tasks is another key point, improving and stimulating different models of teamwork. The platform will also allow the creation of various alternative models (learning objects) for the same subject, enabling personalized learning paths adapted to each student's knowledge level, needs and preferred learning styles. The learning sequences will work as a study organizer, following a suitable taxonomy, according to students' cognitive skills. Although the main goal of this environment is to support students with more difficulties, it will provide a set of resources supporting the learning of more advanced topics. Software engineering techniques and representations, object orientation and event programming are features that will be available in order to promote the learning progress of students.

  1. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    preferences, goals and processes from their interaction with a computer-aided design tool, and suggests methods and domains within game development where such a model can be applied. We describe how designer modeling could be integrated with current work on automated and mixed-initiative content creation......With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer’s......, and envision future directions which focus on personalizing the processes to a designer’s particular wishes....

  2. Integration of tools for binding archetypes to SNOMED CT.

    Science.gov (United States)

    Sundvall, Erik; Qamar, Rahil; Nyström, Mikael; Forss, Mattias; Petersson, Håkan; Karlsson, Daniel; Ahlfeldt, Hans; Rector, Alan

    2008-10-27

    The Archetype formalism and the associated Archetype Definition Language have been proposed as an ISO standard for specifying models of components of electronic healthcare records as a means of achieving interoperability between clinical systems. This paper presents an archetype editor with support for manual or semi-automatic creation of bindings between archetypes and terminology systems. Lexical and semantic methods are applied in order to obtain automatic mapping suggestions. Information visualisation methods are also used to assist the user in exploration and selection of mappings. An integrated tool for archetype authoring, semi-automatic SNOMED CT terminology binding assistance and terminology visualization was created and released as open source. Finding the right terms to bind is a difficult task but the effort to achieve terminology bindings may be reduced with the help of the described approach. The methods and tools presented are general, but here only bindings between SNOMED CT and archetypes based on the openEHR reference model are presented in detail.
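
    The lexical part of the mapping-suggestion step can be illustrated with a toy example: rank candidate term descriptions by string similarity to an archetype node label. The sketch below uses Python's difflib and a made-up candidate list; it is only a stand-in for the lexical methods in the described editor, and no SNOMED CT content is included.

```python
from difflib import SequenceMatcher

# Toy lexical-matching step of the kind used to propose terminology bindings:
# rank candidate terms by string similarity to an archetype node label.
# The candidate list is a tiny placeholder, not SNOMED CT content.
def suggest_bindings(node_label, candidate_terms, top_n=3):
    scored = [(SequenceMatcher(None, node_label.lower(), term.lower()).ratio(), term)
              for term in candidate_terms]
    return sorted(scored, reverse=True)[:top_n]

candidates = ["Systolic blood pressure", "Diastolic blood pressure",
              "Body temperature", "Heart rate"]
for score, term in suggest_bindings("Systolic pressure", candidates):
    print(f"{score:.2f}  {term}")
```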

  3. Advancing representation of hydrologic processes in the Soil and Water Assessment Tool (SWAT) through integration of the TOPographic MODEL (TOPMODEL) features

    Science.gov (United States)

    Chen, J.; Wu, Y.

    2012-01-01

    This paper presents a study of the integration of the Soil and Water Assessment Tool (SWAT) model and the TOPographic MODEL (TOPMODEL) features for enhancing the physical representation of hydrologic processes. In SWAT, four hydrologic processes, which are surface runoff, baseflow, groundwater re-evaporation and deep aquifer percolation, are modeled by using a group of empirical equations. The empirical equations usually constrain the simulation capability of relevant processes. To replace these equations and to model the influences of topography and water table variation on streamflow generation, the TOPMODEL features are integrated into SWAT, and a new model, the so-called SWAT-TOP, is developed. In the new model, the process of deep aquifer percolation is removed, the concept of groundwater re-evaporation is refined, and the processes of surface runoff and baseflow are remodeled. Consequently, three parameters in SWAT are discarded, and two new parameters to reflect the TOPMODEL features are introduced. SWAT-TOP and SWAT are applied to the East River basin in South China, and the results reveal that, compared with SWAT, the new model can provide a more reasonable simulation of the hydrologic processes of surface runoff, groundwater re-evaporation, and baseflow. This study provides evidence that an established hydrologic model can be further improved by integrating the features of another model, which is a possible way to enhance our understanding of the workings of catchments.
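
    The TOPMODEL features referred to above centre on a few standard relations: the topographic wetness index, the local saturation deficit derived from the catchment-average deficit, and an exponential-store baseflow law. The sketch below implements these textbook equations in Python with placeholder parameter values; it illustrates the concept only and is not the SWAT-TOP code.

```python
import numpy as np

# Core TOPMODEL relations (illustrative implementation, placeholder parameters):
#   topographic index:        lambda_i = ln(a_i / tan(beta_i))
#   local saturation deficit: S_i = S_mean + m * (lambda_mean - lambda_i)
#   baseflow:                 Q_b = Q0 * exp(-S_mean / m)
def topmodel_step(a, tan_beta, S_mean, m=0.03, Q0=1e-3):
    lam = np.log(a / tan_beta)                 # topographic wetness index
    S_local = S_mean + m * (lam.mean() - lam)  # local deficits (m of water)
    saturated = S_local <= 0.0                 # cells producing saturation-excess runoff
    Q_base = Q0 * np.exp(-S_mean / m)          # exponential-store baseflow (m per step)
    return saturated.mean(), Q_base

# Example: upslope area per unit contour length (m) and slope for a few cells.
a = np.array([50.0, 200.0, 1000.0, 5000.0])
tan_beta = np.array([0.10, 0.08, 0.05, 0.02])
sat_frac, qb = topmodel_step(a, tan_beta, S_mean=0.02)
print(f"saturated fraction: {sat_frac:.2f}, baseflow: {qb:.2e} m per step")
```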

  4. Force feedback facilitates multisensory integration during robotic tool use

    NARCIS (Netherlands)

    Sengül, A.; Rognini, G.; van Elk, M.; Aspell, J.E.; Bleuler, H.; Blanke, O.

    2013-01-01

    The present study investigated the effects of force feedback in relation to tool use on the multisensory integration of visuo-tactile information. Participants learned to control a robotic tool through a surgical robotic interface. Following tool-use training, participants performed a crossmodal

  5. Modelling of Tool Wear and Residual Stress during Machining of AISI H13 Tool Steel

    Science.gov (United States)

    Outeiro, José C.; Umbrello, Domenico; Pina, José C.; Rizzuti, Stefania

    2007-05-01

    Residual stresses can enhance or impair the ability of a component to withstand loading conditions in service (fatigue, creep, stress corrosion cracking, etc.), depending on their nature: compressive or tensile, respectively. This poses enormous problems in structural assembly, as it affects the structural integrity of the whole part. In addition, tool wear issues are of critical importance in manufacturing, since they affect component quality, tool life and machining cost. Therefore, prediction and control of both tool wear and the residual stresses in machining are absolutely necessary. In this work, a two-dimensional finite element model using an implicit Lagrangian formulation with automatic remeshing was applied to simulate the orthogonal cutting process of AISI H13 tool steel. To validate this model, the predicted and experimentally measured chip geometry, cutting forces, temperatures, tool wear and residual stresses in the machined layers were compared. The proposed FE model allowed us to investigate the influence of tool geometry, cutting regime parameters and tool wear on the residual stress distribution in the machined surface and subsurface of AISI H13 tool steel. The results obtained permit the conclusion that, in order to reduce the magnitude of surface residual stresses, the cutting speed should be increased, the uncut chip thickness (or feed) should be reduced, and machining with honed tools having large cutting edge radii produces better results than with chamfered tools. Moreover, increasing tool wear increases the magnitude of surface residual stresses.

  6. Clinical results of HIS, RIS, PACS integration using data integration CASE tools

    Science.gov (United States)

    Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.

    1995-05-01

    Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly, with limited retrieval functionality. Software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications) and easy system maintenance (excellent documentation, easy-to-perform changes, and a centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the 'middleware' in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous databases and communication protocols.

  7. GAPIT: genome association and prediction integrated tool.

    Science.gov (United States)

    Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu

    2012-09-15

    Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.

  8. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    Full Text Available System integration is an important element in the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated to confirm whether it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper presents the Data Extraction, Recording and Analysis Tool (DERAT) for analyzing the integrated performance of the combat system, covering the functional definition, architecture and effectiveness of the DERAT, and presents the test results.

  9. An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System

    Science.gov (United States)

    1989-09-01

    AFIT/GCA/LSQ/89S-5: An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System. Thesis by Caroline L. Hanson, Major, USAF (Ohio). (The remainder of this record is unrecoverable OCR residue from the scanned cover pages.)

  10. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

    This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer-specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration with a major international engineering company.

  11. Integrated Data Visualization and Virtual Reality Tool

    Science.gov (United States)

    Dryer, David A.

    1998-01-01

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design and changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

  12. Integrated modeling: a look back

    Science.gov (United States)

    Briggs, Clark

    2015-09-01

    This paper discusses applications and implementation approaches used for integrated modeling of structural systems with optics over the past 30 years. While much of the development work focused on control system design, significant contributions were made in system modeling and computer-aided design (CAD) environments. Early work appended handmade line-of-sight models to traditional finite element models, such as the optical spacecraft concept from the ACOSS program. The IDEAS2 computational environment built in support of Space Station collected a wider variety of existing tools around a parametric database. Later, IMOS supported interferometer and large telescope mission studies at JPL with MATLAB modeling of structural dynamics, thermal analysis, and geometric optics. IMOS's predecessor was a simple FORTRAN command line interpreter for LQG controller design with additional functions that built state-space finite element models. Specialized language systems such as CAESY were formulated and prototyped to provide more complex object-oriented functions suited to control-structure interaction. A more recent example of optical modeling directly in mechanical CAD is used to illustrate possible future directions. While the value of directly posing the optical metric in system dynamics terms is well understood today, the potential payoff is illustrated briefly via project-based examples. It is quite likely that integrated structure thermal optical performance (STOP) modeling could be accomplished in a commercial off-the-shelf (COTS) tool set. The work flow could be adopted, for example, by a team developing a small high-performance optical or radio frequency (RF) instrument.

  13. THE MANAGEMENT ACCOUNTING TOOLS AND THE INTEGRATED REPORTING

    Directory of Open Access Journals (Sweden)

    Gabriel JINGA

    2015-04-01

    Full Text Available In recent years, stakeholders have been asking for other information to be published alongside the financial information, such as risk reporting, intangibles, and social and environmental accounting. The type of corporate reporting which incorporates the elements enumerated above is integrated reporting. In this article, we argue that the information disclosed in integrated reports is prepared by management accounting, not only by financial accounting. Thus, we search for the management accounting tools which are used by companies that prepare integrated reports. In order to do this, we analytically reviewed all the reports available on the website of a selected company. Our results show that the company uses most of the management accounting tools mentioned in the literature review.

  14. MMM: A toolbox for integrative structure modeling.

    Science.gov (United States)

    Jeschke, Gunnar

    2018-01-01

    Structural characterization of proteins and their complexes may require the integration of restraints from various experimental techniques. MMM (Multiscale Modeling of Macromolecules) is a Matlab-based open-source modeling toolbox for this purpose, with a particular emphasis on distance distribution restraints obtained from electron paramagnetic resonance experiments on spin-labelled proteins and nucleic acids, and on their combination with atomistic structures of domains or whole protomers, small-angle scattering data, secondary structure information, homology information, and elastic network models. MMM integrates not only various types of restraints but also various existing modeling tools, by providing a common graphical user interface to them. The types of restraints that can support such modeling and the available model types are illustrated by recent application examples. © 2017 The Protein Society.

  15. Development of an integrated cost model for nuclear plant decommissioning

    International Nuclear Information System (INIS)

    Amos, G.; Roy, R.

    2003-01-01

    A need for an integrated cost estimating tool for nuclear decommissioning and the associated waste processing and storage facilities for Intermediate Level Waste (ILW) was defined during the author's recent MSc studies. In order to close the defined gap, a prototype tool was developed using logically derived CERs and cost driver variables. The challenge in developing it was to produce a model that could generate realistic cost estimates from the limited amount of historic cost data available for analysis. The model is an Excel-based tool supported by three-point risk estimating output and is suitable for producing strategic or optional cost estimates (±30%) early in the conceptual stage of a decommissioning project. The model was validated using a small number of case studies supported by expert opinion. The model provides an enhanced approach for integrated decommissioning estimates, which can be produced concurrently with strategic options analysis on a nuclear site.

  16. Biological data integration: wrapping data and tools.

    Science.gov (United States)

    Lacroix, Zoé

    2002-06-01

    Nowadays scientific data is inevitably digital and stored in a wide variety of formats in heterogeneous systems. Scientists need to access an integrated view of remote or local heterogeneous data sources with advanced tools for data access, analysis, and visualization. Building a digital library for scientific data requires accessing and manipulating data extracted from flat files or databases, documents retrieved from the Web, as well as data generated by software. We present an approach to wrapping web data sources, databases, flat files, or data generated by tools through a database view mechanism. Generally, a wrapper has two tasks: it first sends a query to the source to retrieve data and, second, builds the expected output with respect to the virtual structure. Our wrappers are composed of a retrieval component, based on an intermediate object view mechanism called search views that maps the source capabilities to attributes, and an eXtensible Markup Language (XML) engine, respectively, to perform these two tasks. The originality of the approach consists of: 1) a generic view mechanism to seamlessly access data sources with limited capabilities and 2) the ability to wrap data sources as well as the useful specific tools they may provide. Our approach has been developed and demonstrated as part of a multidatabase system supporting queries via uniform object protocol model (OPM) interfaces.

  17. Competency-based evaluation tools for integrative medicine training in family medicine residency: a pilot study

    Directory of Open Access Journals (Sweden)

    Schneider Craig

    2007-04-01

    Full Text Available Abstract Background As more integrative medicine educational content is integrated into conventional family medicine teaching, the need for effective evaluation strategies grows. Through the Integrative Family Medicine program, a six-site pilot program of a four-year residency training model combining integrative medicine and family medicine training, we have developed and tested a set of competency-based evaluation tools to assess residents' skills in integrative medicine history-taking and treatment planning. This paper presents the results from the implementation of direct observation and treatment plan evaluation tools, as well as the results of two Objective Structured Clinical Examinations (OSCEs) developed for the program. Methods The direct observation (DO) and treatment plan (TP) evaluation tools developed for the IFM program were implemented by faculty at each of the six sites during the PGY-4 year (n = 11 on DO and n = 8 on TP). The OSCE I was implemented first in 2005 (n = 6), revised and then implemented with a second class of IFM participants in 2006 (n = 7). OSCE II was implemented in fall 2005 with only one class of IFM participants (n = 6). Data from the initial implementation of these tools are described using descriptive statistics. Results Results from the implementation of these tools at the IFM sites suggest that we need more emphasis in our curriculum on incorporating spirituality into history-taking and treatment planning, and more training for IFM residents on effective assessment of readiness for change and strategies for delivering integrative medicine treatment recommendations. Focusing our OSCE assessment more narrowly on integrative medicine history-taking skills was much more effective in delineating strengths and weaknesses in our residents' performance than using the OSCE for both integrative and more basic communication competencies. Conclusion As these tools are refined further they will be of value both in improving

  18. Making eco logic and models work : An integrative approach to lake ecosystem modelling

    NARCIS (Netherlands)

    Kuiper, Jan Jurjen

    2016-01-01

    Dynamical ecosystem models are important tools that can help ecologists understand complex systems, and turn understanding into predictions of how these systems respond to external changes. This thesis revolves around PCLake, an integrated ecosystem model of shallow lakes that is used by both

  19. Consequence Based Design. An approach for integrating computational collaborative models (Integrated Dynamic Models) in the building design phase

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    relies on various advancements in the area of integrated dynamic models. It also relies on the application and test of the approach in practice to evaluate the Consequence based design and the use of integrated dynamic models. As a result, the Consequence based design approach has been applied in five...... and define new ways to implement integrated dynamic models for the following project. In parallel, seven different developments of new methods, tools and algorithms have been performed to support the application of the approach. The developments concern: Decision diagrams – to clarify goals and the ability...... affect the design process and collaboration between building designers and simulationists. Within the limits of applying the approach of Consequence based design to five case studies, followed by documentation based on interviews, surveys and project related documentations derived from internal reports...

  20. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.
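
    The record mentions an inverse problem of inferring facility conditions from collected observations but gives no algorithmic detail. The sketch below illustrates one generic way such an inference could be posed, as a Bayesian update over hypothetical facility states; the states, observables and likelihoods are invented and do not come from the cited framework.

```python
# Hedged sketch of the "inverse problem" mentioned above: inferring which
# operating mode a monitored facility is in from a set of observations.
# States, observables, and likelihoods are entirely hypothetical; they only
# illustrate a Bayesian-update style of reasoning, not the paper's framework.
states = ["declared_operation", "off_normal_operation"]
prior = {"declared_operation": 0.9, "off_normal_operation": 0.1}

# P(observation | state) for two illustrative observables.
likelihood = {
    ("thermal_signature_high", "declared_operation"): 0.30,
    ("thermal_signature_high", "off_normal_operation"): 0.80,
    ("shipment_frequency_high", "declared_operation"): 0.20,
    ("shipment_frequency_high", "off_normal_operation"): 0.60,
}

def posterior(observations, prior, likelihood):
    """Multiply the prior by each observation's likelihood and renormalise."""
    post = dict(prior)
    for obs in observations:
        for s in post:
            post[s] *= likelihood[(obs, s)]
    total = sum(post.values())
    return {s: p / total for s, p in post.items()}

print(posterior(["thermal_signature_high", "shipment_frequency_high"], prior, likelihood))
```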

  1. Integration of facility modeling capabilities for nuclear nonproliferation analysis

    International Nuclear Information System (INIS)

    Garcia, Humberto; Burr, Tom; Coles, Garill A.; Edmunds, Thomas A.; Garrett, Alfred; Gorensek, Maximilian; Hamm, Luther; Krebs, John; Kress, Reid L.; Lamberti, Vincent; Schoenwald, David; Tzanos, Constantine P.; Ward, Richard C.

    2012-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  2. Integration Of Facility Modeling Capabilities For Nuclear Nonproliferation Analysis

    International Nuclear Information System (INIS)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  3. Computational Design Tools for Integrated Design

    DEFF Research Database (Denmark)

    Holst, Malene Kirstine; Kirkegaard, Poul Henning

    2010-01-01

    In an architectural conceptual sketching process, where an architect is working with the initial ideas for a design, the process is characterized by three phases: sketching, evaluation and modification. Basically, the architect needs to address three areas in the conceptual sketching phase: aesthetical, functional and technical requirements. The aim of the present paper is to address the problem of a vague or non-existent link between the digital conceptual design tools used by architects and designers and engineering analysis and simulation tools. Based on an analysis of the architectural design process, different digital design methods are related to tasks in an integrated design process.

  4. Integrating environmental component models. Development of a software framework

    NARCIS (Netherlands)

    Schmitz, O.

    2014-01-01

    Integrated models consist of interacting component models that represent various natural and social systems. They are important tools to improve our understanding of environmental systems, to evaluate cause–effect relationships of human–natural interactions, and to forecast the behaviour of

  5. Integrated Modelling - the next steps (Invited)

    Science.gov (United States)

    Moore, R. V.

    2010-12-01

    Integrated modelling (IM) has made considerable advances over the past decade but it has not yet been taken up as an operational tool in the way that its proponents had hoped. The reasons why will be discussed in Session U17. This talk will propose topics for a research and development programme and suggest an institutional structure which, together, could overcome the present obstacles. Their combined aim would be first to make IM into an operational tool useable by competent public authorities and commercial companies and, in time, to see it evolve into the modelling equivalent of Google Maps, something accessible and useable by anyone with a PC or an iPhone and an internet connection. In a recent study, a number of government agencies, water authorities and utilities applied integrated modelling to operational problems. While the project demonstrated that IM could be used in an operational setting and had benefit, it also highlighted the advances that would be required for its widespread uptake. These were: greatly improving the ease with which models could be a) made linkable, b) linked and c) run; developing a methodology for applying integrated modelling; developing practical options for calibrating and validating linked models; addressing the science issues that arise when models are linked; extending the range of modelling concepts that can be linked; enabling interface standards to pass uncertainty information; making the interface standards platform independent; extending the range of platforms to include those for high performance computing; developing the concept of modelling components as web services; separating simulation code from the model’s GUI, so that all the results from the linked models can be viewed through a single GUI; developing scenario management systems so that there is an audit trail of the version of each model and dataset used in each linked model run. In addition to the above, there is a need to build a set of integrated

  6. Mixed-Dimensionality VLSI-Type Configurable Tools for Virtual Prototyping of Biomicrofluidic Devices and Integrated Systems

    Science.gov (United States)

    Makhijani, Vinod B.; Przekwas, Andrzej J.

    2002-10-01

    This report presents results of a DARPA/MTO Composite CAD Project aimed at developing a comprehensive microsystem CAD environment, CFD-ACE+ Multiphysics, for bio and microfluidic devices and complete microsystems. The project began in July 1998, and was a three-year team effort between CFD Research Corporation, California Institute of Technology (CalTech), University of California, Berkeley (UCB), and Tanner Research, with Mr. Don Verlee from Abbott Labs participating as a consultant on the project. The overall objective of this project was to develop, validate and demonstrate several applications of a user-configurable VLSI-type mixed-dimensionality software tool for the design of biomicrofluidic devices and integrated systems. The developed tool would provide high-fidelity 3-D multiphysics modeling capability, 1-D fluidic circuit modeling, a SPICE interface for system-level simulations, and mixed-dimensionality design. It would combine tools for layouts and process fabrication, geometric modeling, and automated grid generation, and interfaces to EDA tools (e.g. Cadence) and MCAD tools (e.g. ProE).
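
    The 1-D fluidic circuit capability mentioned above rests on the usual electrical analogy for laminar flow in microchannels. The sketch below shows that analogy (Hagen-Poiseuille resistance, series channels) with illustrative dimensions; it is not part of the CFD-ACE+ tool set.

```python
# Minimal sketch of the "1-D fluidic circuit" idea: microchannels treated like
# resistors in an electrical network (Hagen-Poiseuille law), solved for flow.
# Channel dimensions and the fluid viscosity below are illustrative only.
import math

MU_WATER = 1.0e-3  # dynamic viscosity of water [Pa*s], room temperature

def channel_resistance(length_m, diameter_m, viscosity=MU_WATER):
    """Hydraulic resistance of a circular channel: R = 128*mu*L / (pi*d^4)."""
    return 128.0 * viscosity * length_m / (math.pi * diameter_m ** 4)

def series_flow(pressure_drop_pa, resistances):
    """Series channels behave like series resistors: Q = dP / sum(R)."""
    return pressure_drop_pa / sum(resistances)

if __name__ == "__main__":
    # Two channels in series: 10 mm and 5 mm long, 100 um diameter, 1 kPa drop.
    r1 = channel_resistance(10e-3, 100e-6)
    r2 = channel_resistance(5e-3, 100e-6)
    q = series_flow(1000.0, [r1, r2])
    print(f"flow rate: {q:.3e} m^3/s ({q * 6e7:.4f} mL/min)")
```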

  7. Challenges in horizontal model integration.

    Science.gov (United States)

    Kolczyk, Katrin; Conradi, Carsten

    2016-03-11

    Systems Biology has motivated dynamic models of important intracellular processes at the pathway level, for example, in signal transduction and cell cycle control. To answer important biomedical questions, however, one has to go beyond the study of isolated pathways towards the joint study of interacting signaling pathways or the joint study of signal transduction and cell cycle control. Thereby the reuse of established models is preferable, as it will generally reduce the modeling effort and increase the acceptance of the combined model in the field. Obtaining a combined model can be challenging, especially if the submodels are large and/or come from different working groups (as is generally the case, when models stored in established repositories are used). To support this task, we describe a semi-automatic workflow based on established software tools. In particular, two frequent challenges are described: identification of the overlap and subsequent (re)parameterization of the integrated model. The reparameterization step is crucial, if the goal is to obtain a model that can reproduce the data explained by the individual models. For demonstration purposes we apply our workflow to integrate two signaling pathways (EGF and NGF) from the BioModels Database.
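
    The abstract identifies two recurring steps: finding the overlap between submodels and reparameterizing the combined model. The sketch below illustrates only the first step on two invented toy "pathway" models; a real workflow would operate on SBML models and follow the overlap detection with re-estimation of the conflicting parameters against data.

```python
# Hedged sketch of the two challenges named above: finding the overlap between
# two submodels and flagging the parameters that must be re-estimated in the
# combined model. Species and parameter names are invented for illustration.
egf_model = {
    "species": {"EGF", "EGFR", "Ras", "ERK"},
    "parameters": {"k_bind_egf": 0.1, "k_act_ras": 0.5, "k_act_erk": 1.2},
}
ngf_model = {
    "species": {"NGF", "TrkA", "Ras", "ERK"},
    "parameters": {"k_bind_ngf": 0.2, "k_act_ras": 0.8, "k_act_erk": 0.9},
}

def find_overlap(m1, m2):
    """Shared species are merge points; parameters defined in both submodels
    with different values are the ones that need reparameterization."""
    shared_species = m1["species"] & m2["species"]
    conflicting = {
        name: (m1["parameters"][name], m2["parameters"][name])
        for name in m1["parameters"].keys() & m2["parameters"].keys()
        if m1["parameters"][name] != m2["parameters"][name]
    }
    return shared_species, conflicting

shared, conflicts = find_overlap(egf_model, ngf_model)
print("merge points (shared species):", shared)
print("parameters to re-estimate against the combined data:", conflicts)
```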

  8. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. In total, the concept of managing the information available in this data repository is known as Business Intelligence or BI. This paper describes the concepts used in Business Intelligence, their importance to modern Radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.

  9. Optimal Vehicle Design Using the Integrated System and Cost Modeling Tool Suite

    Science.gov (United States)

    2010-08-01

    Extraction residue from the report's tool-suite overview. The recoverable content lists the modules and tools of the integrated suite: space vehicle design (SMAD), space vehicle propulsion, orbit propagation, space vehicle costing (ACEIT), small-satellite development and production cost, O&M cost, radiation exposure, radiation detector response, and reliability/availability/risk modules, supported by tools such as CEA, an SRM model, POST, an inflation model, rotor blade design, Microsoft Project, ATSV, STK, and SOAP.

  10. Effect of different machining processes on the tool surface integrity and fatigue life

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Chuan Liang [College of Mechanical and Electrical Engineering, Nanchang University, Nanchang (China); Zhang, Xianglin [School of Materials Science and Engineering, Huazhong University of Science and Technology, Wuhan (China)

    2016-08-15

    Ultra-precision grinding, wire-cut electro-discharge machining and lapping are often used to machine the tools in the fine blanking industry, and the surface integrity produced by these machining processes is of great concern in the research field. To study the effect of the machined surface integrity on fine blanking tool life, the surface integrity of different tool materials under different processing conditions and its influence on fatigue life were thoroughly analyzed in the present study. The results show that the surface integrity of different materials differed considerably under the same processing condition. For the same tool material, the surface integrity also differed considerably across processing conditions and strongly influenced the fatigue life.

  11. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    Science.gov (United States)

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations on the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results on different datasets and configurations. All this variation makes it complicated for researchers to decide which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute metagenome analysis tools and integrate their results. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms across different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available on the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics .
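
    The integration step described above relies on the co-occurrence of organisms across tools. The sketch below illustrates that principle on invented toy profiles (keep taxa supported by at least two tools, average and renormalise); it is a simplification, not the actual MetaMeta merging algorithm.

```python
# Hedged sketch of the integration idea: organisms reported by several tools
# (co-occurrence) are kept with more confidence than organisms reported by only
# one. The profiles below are toy data.
from collections import defaultdict

profiles = {
    "tool_a": {"E. coli": 0.40, "B. subtilis": 0.35, "S. aureus": 0.25},
    "tool_b": {"E. coli": 0.55, "B. subtilis": 0.30, "P. putida": 0.15},
    "tool_c": {"E. coli": 0.50, "B. subtilis": 0.40, "S. aureus": 0.10},
}

def integrate(profiles, min_support=2):
    """Keep taxa reported by at least `min_support` tools, average their
    abundances, and renormalise the resulting profile."""
    support = defaultdict(list)
    for profile in profiles.values():
        for taxon, abundance in profile.items():
            support[taxon].append(abundance)
    kept = {t: sum(a) / len(a) for t, a in support.items() if len(a) >= min_support}
    total = sum(kept.values())
    return {t: a / total for t, a in sorted(kept.items(), key=lambda x: -x[1])}

print(integrate(profiles))  # P. putida (a single-tool call) is dropped
```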

  12. Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration.

    Science.gov (United States)

    Cosco, Francesco; Garre, Carlos; Bruno, Fabio; Muzzupappa, Maurizio; Otaduy, Miguel A

    2013-01-01

    Visuo-haptic mixed reality consists of adding to a real scene the ability to see and touch virtual objects. It requires the use of see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. Unfortunately, the use of commodity haptic devices poses obstruction and misalignment issues that complicate the correct integration of a virtual tool and the user's real hand in the mixed reality scene. In this work, we propose a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, using commodity haptic devices, and with a visually consistent integration of the user's hand and the virtual tool. We discuss the visual obstruction and misalignment issues introduced by commodity haptic devices, and then propose a solution that relies on four simple technical steps: color-based segmentation of the hand, tracking-based segmentation of the haptic device, background repainting using image-based models, and misalignment-free compositing of the user's hand. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects and interact with them in the context of a real scene, and we have evaluated the impact on user performance of obstruction and misalignment correction.

  13. How to define the tool kit for the corrective maintenance service? : a tool kit definition model under the service performance criterion

    NARCIS (Netherlands)

    Chen, Denise

    2009-01-01

    Currently, the rules for defining tool kits vary and are oriented mainly towards the engineer's perspective. However, defining a tool kit is a trade-off problem between cost and service performance. This project is designed to develop a model that can integrate the engineer's preferences

  14. Enabling Integrated Decision Making for Electronic-Commerce by Modelling an Enterprise's Sharable Knowledge.

    Science.gov (United States)

    Kim, Henry M.

    2000-01-01

    An enterprise model, a computational model of knowledge about an enterprise, is a useful tool for integrated decision-making by e-commerce suppliers and customers. Sharable knowledge, once represented in an enterprise model, can be integrated by the modeled enterprise's e-commerce partners. Presents background on enterprise modeling, followed by…

  15. An Integrated Tool for Calculating and Reducing Institution Carbon and Nitrogen Footprints

    Science.gov (United States)

    Galloway, James N.; Castner, Elizabeth A.; Andrews, Jennifer; Leary, Neil; Aber, John D.

    2017-01-01

    Abstract The development of nitrogen footprint tools has allowed a range of entities to calculate and reduce their contribution to nitrogen pollution, but these tools represent just one aspect of environmental pollution. For example, institutions have been calculating their carbon footprints to track and manage their greenhouse gas emissions for over a decade. This article introduces an integrated tool that institutions can use to calculate, track, and manage their nitrogen and carbon footprints together. It presents the methodology for the combined tool, describes several metrics for comparing institution nitrogen and carbon footprint results, and discusses management strategies that reduce both the nitrogen and carbon footprints. The data requirements for the two tools overlap substantially, although integrating the two tools does necessitate the calculation of the carbon footprint of food. Comparison results for five institutions suggest that the institution nitrogen and carbon footprints correlate strongly, especially in the utilities and food sectors. Scenario analyses indicate benefits to both footprints from a range of utilities and food footprint reduction strategies. Integrating these two footprints into a single tool will account for a broader range of environmental impacts, reduce data entry and analysis, and promote integrated management of institutional sustainability. PMID:29350217
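
    The combined tool works because the nitrogen and carbon footprints are driven by largely the same activity data. The sketch below illustrates that structure with placeholder activity data and emission factors; the calculator's published coefficients and sector breakdown are richer.

```python
# Hedged sketch of the combined-footprint idea: the same activity data
# (energy use, food purchases) are multiplied by nitrogen and carbon factors
# so both footprints are tracked from one data set. All factors below are
# placeholders, not the calculator's published coefficients.
activities = {                      # one year of illustrative institutional data
    "electricity_kwh": 5_000_000,
    "natural_gas_therms": 300_000,
    "beef_kg": 40_000,
}
factors = {                         # (kg N released, kg CO2e emitted) per unit
    "electricity_kwh": (0.0002, 0.4),
    "natural_gas_therms": (0.0001, 5.3),
    "beef_kg": (0.15, 27.0),
}

def footprints(activities, factors):
    n_total = sum(amount * factors[k][0] for k, amount in activities.items())
    c_total = sum(amount * factors[k][1] for k, amount in activities.items())
    return n_total, c_total

n_kg, co2e_kg = footprints(activities, factors)
print(f"nitrogen footprint: {n_kg / 1000:.1f} t N, carbon footprint: {co2e_kg / 1000:.0f} t CO2e")
```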

  16. [Development method of healthcare information system integration based on business collaboration model].

    Science.gov (United States)

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information construction due to the complexity of the healthcare environment. Currently, during the process of healthcare information system integration, the people participating in an integration project usually communicate via free-format documents, which impairs the efficiency and adaptability of integration. This paper proposes a method that uses Business Process Model and Notation (BPMN) to model integration requirements and automatically transform them into executable integration configurations. Based on this method, a tool was developed to model integration requirements and transform them into integration configurations. In addition, an integration case in a radiology scenario was used to verify the method.

  17. SWIM (Soil and Water Integrated Model)

    Energy Technology Data Exchange (ETDEWEB)

    Krysanova, V; Wechsung, F; Arnold, J; Srinivasan, R; Williams, J

    2000-12-01

    The model SWIM (Soil and Water Integrated Model) was developed in order to provide a comprehensive GIS-based tool for hydrological and water quality modelling in mesoscale and large river basins (from 100 to 10,000 km²) which can be parameterised using regionally available information. The model was developed mainly for use in Europe and the temperate zone, though its application in other regions is possible as well. SWIM is based on two previously developed tools - SWAT and MATSALU (see more explanations in section 1.1). The model integrates hydrology, vegetation, erosion, and nutrient dynamics at the watershed scale. SWIM has a three-level disaggregation scheme 'basin - sub-basins - hydrotopes' and is coupled to the Geographic Information System GRASS (GRASS, 1993). A robust approach is suggested for nitrogen and phosphorus modelling in mesoscale watersheds. SWIM runs under the UNIX environment. Model testing and validation were performed sequentially for hydrology, crop growth, nitrogen and erosion in a number of mesoscale watersheds in the German part of the Elbe drainage basin. A comprehensive scheme of spatial disaggregation into sub-basins and hydrotopes, combined with a reasonable restriction on sub-basin area, allows the assessment of water resources and water quality with SWIM in mesoscale river basins. The modest data requirements represent an important advantage of the model. Direct connection to land use and climate data makes it possible to use the model for analysing the impacts of climate change and land use change on hydrology, agricultural production, and water quality. (orig.)
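
    The three-level disaggregation scheme mentioned in the abstract can be pictured as area-weighted aggregation from hydrotopes to sub-basins to the basin. The sketch below illustrates only that bookkeeping with invented numbers; SWIM itself also routes flows between sub-basins.

```python
# Hedged sketch of the basin - sub-basins - hydrotopes scheme: hydrotope
# results are area-weighted up to sub-basins and then to the basin.
# Structure and numbers are illustrative; routing between sub-basins is omitted.
from dataclasses import dataclass

@dataclass
class Hydrotope:
    area_km2: float
    runoff_mm: float      # simulated runoff depth for the time step

@dataclass
class SubBasin:
    hydrotopes: list

    def area_weighted_runoff(self):
        area = sum(h.area_km2 for h in self.hydrotopes)
        return sum(h.runoff_mm * h.area_km2 for h in self.hydrotopes) / area, area

basin = [
    SubBasin([Hydrotope(12.0, 8.0), Hydrotope(5.0, 15.0)]),
    SubBasin([Hydrotope(20.0, 3.0)]),
]
runoffs = [sb.area_weighted_runoff() for sb in basin]
basin_area = sum(a for _, a in runoffs)
basin_runoff = sum(r * a for r, a in runoffs) / basin_area
print(f"basin-average runoff: {basin_runoff:.2f} mm over {basin_area:.0f} km2")
```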

  18. Integrated groundwater resource management in Indus Basin using satellite gravimetry and physical modeling tools.

    Science.gov (United States)

    Iqbal, Naveed; Hossain, Faisal; Lee, Hyongki; Akhter, Gulraiz

    2017-03-01

    Reliable and frequent information on groundwater behavior and dynamics is very important for effective groundwater resource management at appropriate spatial scales. This information is rarely available in developing countries and thus poses a challenge for groundwater managers. In situ data and groundwater modeling tools are limited in their ability to cover large domains. Remote sensing technology can now be used to continuously collect information on the hydrological cycle in a cost-effective way. This study evaluates the effectiveness of a remote sensing integrated physical modeling approach for groundwater management in the Indus Basin. Gravity Recovery and Climate Experiment (GRACE) satellite-based gravity anomalies from 2003 to 2010 were processed to generate monthly groundwater storage changes using the Variable Infiltration Capacity (VIC) hydrologic model. Groundwater storage is the key parameter of interest for groundwater resource management. The spatial and temporal patterns in groundwater storage (GWS) are useful for devising appropriate groundwater management strategies. GRACE-estimated GWS information with large-scale coverage is valuable for basin-scale monitoring and decision making. This frequently available information is found useful for identifying groundwater recharge areas and groundwater storage depletion, and for pinpointing the areas where groundwater sustainability is at risk. The GWS anomalies were found to agree favorably with groundwater model simulations from Visual MODFLOW and with in situ data. Mostly, moderate to severe GWS depletion is observed, putting the sustainability of this groundwater resource at risk. For sustainable groundwater management, the region needs to implement groundwater policies and adopt water conservation techniques.
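
    The abstract does not spell out how the GRACE gravity anomalies and the VIC model are combined, but a common formulation subtracts the modelled non-groundwater stores from GRACE total water storage. The sketch below shows that arithmetic with invented monthly values; it illustrates the general approach rather than the study's actual processing chain.

```python
# Hedged sketch of how groundwater storage (GWS) anomalies are commonly
# separated from GRACE total water storage (TWS) using a land-surface model:
# GWS = TWS - (soil moisture + snow + surface water). Numbers are invented.
months = ["2009-06", "2009-07", "2009-08"]
tws_anomaly_cm = [4.0, 1.5, -2.0]            # from GRACE gravity solutions
soil_moisture_anomaly_cm = [2.5, 1.0, -0.5]  # from a land-surface model such as VIC
snow_anomaly_cm = [0.2, 0.0, 0.0]
surface_water_anomaly_cm = [0.3, 0.2, 0.1]

gws_anomaly_cm = [
    tws - (sm + sn + sw)
    for tws, sm, sn, sw in zip(
        tws_anomaly_cm, soil_moisture_anomaly_cm, snow_anomaly_cm, surface_water_anomaly_cm
    )
]
for month, gws in zip(months, gws_anomaly_cm):
    print(f"{month}: groundwater storage anomaly = {gws:+.1f} cm equivalent water height")
```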

  19. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation.

    Directory of Open Access Journals (Sweden)

    Huu-Tho Nguyen

    Full Text Available Globalization of business and competitiveness in manufacturing have forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision made using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of the attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is illustrated by a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool providing effective multi-attribute decision-making for evaluating machine tools in an uncertain environment.
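
    As a rough illustration of the AHP-to-COPRAS chain described above, the sketch below derives attribute weights from a crisp pairwise comparison matrix (geometric-mean approximation) and ranks alternatives with COPRAS. The criteria, matrix and scores are invented, and the fuzzy linguistic layer used in the paper is omitted.

```python
# Hedged, crisp (non-fuzzy) sketch: AHP weights via row geometric means, then a
# COPRAS ranking of machine-tool alternatives. All inputs are invented.
import math

def ahp_weights(pairwise):
    """Approximate AHP priority vector via row geometric means, normalised."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

def copras(matrix, weights, benefit):
    """COPRAS: column-normalise, weight, split into benefit/cost sums, rank."""
    cols = list(zip(*matrix))
    col_sums = [sum(c) for c in cols]
    norm = [[x / col_sums[j] * weights[j] for j, x in enumerate(row)] for row in matrix]
    s_plus = [sum(x for j, x in enumerate(row) if benefit[j]) for row in norm]
    s_minus = [sum(x for j, x in enumerate(row) if not benefit[j]) for row in norm]
    sum_s_minus = sum(s_minus)
    inv_sum = sum(1.0 / s for s in s_minus)
    q = [sp + sum_s_minus / (sm * inv_sum) for sp, sm in zip(s_plus, s_minus)]
    q_max = max(q)
    return [qi / q_max * 100.0 for qi in q]   # utility degree, 100% = best

# Criteria: accuracy, flexibility (benefit); cost, power use (cost criteria).
pairwise = [
    [1.0, 2.0, 3.0, 4.0],
    [0.5, 1.0, 2.0, 3.0],
    [1 / 3, 0.5, 1.0, 2.0],
    [0.25, 1 / 3, 0.5, 1.0],
]
weights = ahp_weights(pairwise)
alternatives = ["Machine A", "Machine B", "Machine C"]
scores = [   # accuracy, flexibility, cost (k$), power (kW)
    [8.0, 7.0, 120.0, 15.0],
    [6.0, 9.0, 90.0, 12.0],
    [9.0, 6.0, 150.0, 18.0],
]
utility = copras(scores, weights, benefit=[True, True, False, False])
for name, u in sorted(zip(alternatives, utility), key=lambda x: -x[1]):
    print(f"{name}: utility degree {u:.1f}%")
```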

  20. Ground Vehicle System Integration (GVSI) and Design Optimization Model

    National Research Council Canada - National Science Library

    Horton, William

    1996-01-01

    This report documents the Ground Vehicle System Integration (GVSI) and Design Optimization Model. GVSI is a top-level analysis tool designed to support engineering tradeoff studies and vehicle design optimization efforts...

  1. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    Over the last 30 years, numerous model-driven software generation systems have been offered, targeting problems of development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims regarding Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development with CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending, for the most advanced ones, with a code generator (possibly using a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from the manually created ones, and a transformation definition editor to define new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  2. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates a document analysis method and an ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks consistently, from knowledge acquisition through knowledge verification

  3. Adaptation in integrated assessment modeling: where do we stand?

    OpenAIRE

    Patt, A.; van Vuuren, D.P.; Berkhout, F.G.H.; Aaheim, A.; Hof, A.F.; Isaac, M.; Mechler, R.

    2010-01-01

    Adaptation is an important element on the climate change policy agenda. Integrated assessment models, which are key tools to assess climate change policies, have begun to address adaptation, either by including it implicitly in damage cost estimates, or by making it an explicit control variable. We analyze how modelers have chosen to describe adaptation within an integrated framework, and suggest many ways they could improve the treatment of adaptation by considering more of its bottom-up cha...

  4. Integrating the strengths of cognitive emotion models with traditional HCI analysis tools

    OpenAIRE

    Springett, Mark; Law, Effie Lai-Chong; Coulson, Mark

    2015-01-01

    This paper reports an attempt to integrate key concepts from cognitive models of emotion into cognitive models of interaction established in the HCI literature. The aim is to transfer the strengths of interaction models to the analysis of affect-critical systems in games, e-commerce and education, thereby increasing their usefulness in these systems, where affect is increasingly recognised as a key success factor. Concepts from Scherer’s appraisal model and stimulus evaluation checks, along with a fr...

  5. An integrated model of the lithium/thionyl chloride battery

    Energy Technology Data Exchange (ETDEWEB)

    Jungst, R.G.; Nagasubramanian, G.; Ingersoll, D.; O'Gorman, C.C.; Paez, T.L. [Sandia National Labs., Albuquerque, NM (United States)]; Jain, M.; Weidner, J.W. [Univ. of South Carolina, Columbia, SC (United States)]

    1998-06-08

    The desire to reduce the time and cost of design engineering on new components or to validate existing designs in new applications is stimulating the development of modeling and simulation tools. The authors are applying a model-based design approach to low and moderate rate versions of the Li/SOCl₂ D-size cell with success. Three types of models are being constructed and integrated to achieve maximum capability and flexibility in the final simulation tool. A phenomenology based electrochemical model links performance and the cell design, chemical processes, and material properties. An artificial neural network model improves computational efficiency and fills gaps in the simulation capability when fundamental cell parameters are too difficult to measure or the forms of the physical relationships are not understood. Finally, a PSpice-based model provides a simple way to test the cell under realistic electrical circuit conditions. Integration of these three parts allows a complete link to be made between fundamental battery design characteristics and the performance of the rest of the electrical subsystem.

  6. Design Tools for Integrated Asynchronous Electronic Circuits

    National Research Council Canada - National Science Library

    Martin, Alain

    2003-01-01

    ..., simulation, verification, at the logical and physical levels. Situs has developed a business model for the commercialization of the CAD tools, and has designed the prototype of the tool suite based on this business model and the Caltech approach...

  7. An integrated urban drainage system model for assessing renovation scheme.

    Science.gov (United States)

    Dong, X; Zeng, S; Chen, J; Zhao, D

    2012-01-01

    Due to sustained economic growth in China over the last three decades, urbanization has been on a rapidly expanding track. In recent years, regional industrial relocation has also accelerated across the country, from the east coast to the inland west. These changes have led to a large-scale redesign of urban infrastructures, including the drainage system. To help move the reconstructed infrastructures towards better sustainability, a tool is required for assessing the efficiency and environmental performance of different renovation schemes. This paper presents an integrated dynamic modeling tool consisting of three models describing the sewer, the wastewater treatment plant (WWTP) and the receiving water body, respectively. Three auxiliary modules were also incorporated to conceptualize the model, calibrate the simulations, and analyze the results. The developed integrated modeling tool was applied to a case study in Shenzhen City, which is one of China's most dynamic cities and faces considerable challenges from environmental degradation. The renovation scheme proposed to improve the environmental performance of Shenzhen City's urban drainage system was modeled and evaluated. The simulation results provided suggestions for further improvement of the renovation scheme.
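
    The abstract describes chaining a sewer model, a WWTP model and a receiving-water model. The sketch below shows the kind of per-time-step coupling this implies, with a capacity-limited plant and a mixing mass balance at the outfall; all flows, capacities and efficiencies are illustrative and not taken from the Shenzhen case study.

```python
# Hedged sketch of chaining the three sub-models per time step: sewer flow
# feeds the treatment plant, flow above plant capacity bypasses untreated, and
# the receiving-water concentration follows from mixing with the river flow.
def wwtp(inflow_m3s, cod_in_mgL, capacity_m3s=2.0, removal=0.85):
    """Treat up to `capacity`; the rest bypasses with no COD removal."""
    treated = min(inflow_m3s, capacity_m3s)
    bypass = inflow_m3s - treated
    cod_load = treated * cod_in_mgL * (1 - removal) + bypass * cod_in_mgL
    return inflow_m3s, cod_load / inflow_m3s  # outflow and mixed effluent COD

def river(effluent_m3s, effluent_cod_mgL, upstream_m3s=8.0, upstream_cod_mgL=5.0):
    """Fully mixed mass balance at the outfall."""
    flow = effluent_m3s + upstream_m3s
    cod = (effluent_m3s * effluent_cod_mgL + upstream_m3s * upstream_cod_mgL) / flow
    return cod

# Dry-weather vs. storm-driven sewer flows (m3/s) with the same sewage COD.
for label, sewer_flow in [("dry weather", 1.2), ("storm event", 4.5)]:
    outflow, effluent_cod = wwtp(sewer_flow, cod_in_mgL=400.0)
    print(f"{label}: river COD downstream = {river(outflow, effluent_cod):.1f} mg/L")
```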

  8. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry have accumulated more than 15 years of history already. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditionally reserved for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide a comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable, model for information access. The expected future convergence between cheminformatics and bioinformatics databases presents new challenges for the management and analysis of large data sets. Web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, reproducibility of results, curation and crowdsourcing utilities, collaborative sharing and secure access.
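
    The model-building workflow these platforms expose (descriptors in, cross-validated model out) can be summarised in a few lines. The sketch below uses random placeholder descriptors and labels with scikit-learn; a real workflow would compute descriptors from chemical structures and use a curated toxicity data set.

```python
# Hedged sketch of descriptor-based predictive toxicology model building:
# a descriptor matrix, a classifier, and cross-validated performance.
# Descriptor values and labels are random placeholders, not real data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_compounds, n_descriptors = 200, 16
X = rng.normal(size=(n_compounds, n_descriptors))   # placeholder descriptors
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=n_compounds) > 0).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"5-fold ROC AUC: {scores.mean():.2f} +/- {scores.std():.2f}")
```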

  9. The systems integration modeling system

    International Nuclear Information System (INIS)

    Danker, W.J.; Williams, J.R.

    1990-01-01

    This paper discusses the systems integration modeling system (SIMS), an analysis tool for the detailed evaluation of the structure and related performance of the Federal Waste Management System (FWMS) and its interface with waste generators. Its use for evaluations in support of system-level decisions on FWMS configurations, the allocation, sizing, balancing and integration of functions among elements, and the establishment of system-preferred waste selection and sequencing methods and other operating strategies is presented. SIMS includes major analysis submodels which quantify the detailed characteristics of individual waste items, loaded casks and waste packages, simulate the detailed logistics of handling and processing discrete waste items and packages, and perform detailed cost evaluations

  10. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
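
    IVSEM's headline output is an integrated probability of detection across the four sensor technologies. One simple combination rule, assuming independent subsystems, is 1 − Π(1 − p_i); the sketch below uses that rule with invented subsystem values, whereas IVSEM's actual treatment (thresholds, location accuracy, evasion scenarios) is richer.

```python
# Hedged sketch of one way an integrated monitoring model can combine
# subsystem results: if each technology detects an event independently with
# probability p_i, the integrated probability of detection is 1 - prod(1 - p_i).
# The per-technology values below are illustrative only.
def integrated_detection(probabilities):
    missed = 1.0
    for p in probabilities:
        missed *= (1.0 - p)     # event missed only if every subsystem misses it
    return 1.0 - missed

subsystems = {"seismic": 0.70, "infrasound": 0.40, "radionuclide": 0.55, "hydroacoustic": 0.10}
print("per-technology:", subsystems)
print(f"integrated probability of detection: {integrated_detection(subsystems.values()):.3f}")
```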

  11. ANTHEM2000TM: Integration of the ANTHEM Thermal Hydraulic Model in the ROSETM Environment

    International Nuclear Information System (INIS)

    Boire, R.; Nguyen, M; Salim, G.

    1999-01-01

    ROSE TM is an object-oriented, visual programming environment used for many applications, including the development of power plant simulators. ROSE provides an integrated suite of tools for the creation, calibration, test, integration, configuration management and documentation of process, electrical and I and C models. CAE recently undertook an ambitious project to integrate its two-phase thermal hydraulic model ANTHEM TM into the ROSE environment. ANTHEM is a non-equilibrium, non-homogeneous model based on the drift flux formalism. CAE has used the model in numerous two-phase applications for nuclear and fossil power plant simulators. The integration of ANTHEM into ROSE brings the full power of visual programming to two-phase modeling applications. Features include graphical model building, calibration tools, a superior test environment and process visualisation. In addition, the integration of ANTHEM into ROSE makes it possible to easily apply the fidelity of ANTHEM to BOP applications. This paper describes the implementation of the ANTHEM model within the ROSE environment and gives examples of its use. (author)

  12. Integration of g4tools in Geant4

    International Nuclear Information System (INIS)

    Hřivnáčová, Ivana

    2014-01-01

    g4tools, which originated as part of the inlib and exlib packages, provides a very light and easy-to-install set of C++ classes that can be used to perform analysis in a Geant4 batch program. It allows users to create and manipulate histograms and ntuples and to write them in the supported file formats (ROOT, AIDA XML, CSV and HBOOK). It is integrated in Geant4 through analysis manager classes, thus providing a uniform interface to the g4tools objects and also hiding the differences between the classes for the different supported output formats. Moreover, additional features, such as histogram activation or support for Geant4 units, are implemented in the analysis classes in response to user requests. A set of Geant4 user interface commands allows the user to create histograms and set their properties interactively or in Geant4 macros. g4tools was first introduced in the Geant4 9.5 release, where its use was demonstrated in one basic example, and it is already used in the majority of the Geant4 examples within the Geant4 9.6 release. In this paper, we give an overview and the present status of the integration of g4tools in Geant4 and report on upcoming new features.

  13. ExEP yield modeling tool and validation test results

    Science.gov (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with the physics-based tests such as photometry and integration-time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4 m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters, such as working angle, photon counts and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, the test revealed problems in the L2 halo orbit and in the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers of EXOSIMS and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.
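
    In the spirit of the unit-test framework mentioned above, the following sketch checks a simple photometric calculation against an analytic expectation. The photon_count function and the numbers in it are hypothetical stand-ins, not EXOSIMS code or its actual test cases.

        # Hypothetical example of a physics-based unit test; not taken from EXOSIMS.
        import unittest

        def photon_count(flux_ph_m2_s, aperture_area_m2, throughput, exposure_s):
            """Expected photon count for a source of given flux (idealized)."""
            return flux_ph_m2_s * aperture_area_m2 * throughput * exposure_s

        class TestPhotometry(unittest.TestCase):
            def test_known_case(self):
                # 100 ph/m^2/s, 12.57 m^2 collecting area, 20% throughput, 1000 s
                counts = photon_count(100.0, 12.57, 0.2, 1000.0)
                self.assertAlmostEqual(counts, 251400.0, places=1)

        if __name__ == "__main__":
            unittest.main()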

  14. A model for integrated dictionaries of fixed expressions

    DEFF Research Database (Denmark)

    Bergenholtz, Henning; Bothma, Theo; Gouws, Rufus

    2011-01-01

    This paper discusses a project for the creation of a theoretical model for integrated e-dictionaries, illustrated by means of an e-information tool for the presentation and treatment of fixed expressions, using Afrikaans as the example language. To achieve this, a database of fixed expressions...

  15. Integration of supervisory control synthesis in model-based systems engineering

    NARCIS (Netherlands)

    Baeten, J.C.M.; van de Mortel - Fronczak, J.M.; Rooda, J.E.

    2016-01-01

    Increasing system complexity, time to market and development costs reduction place higher demands on engineering processes. Formal models play an important role here because they enable the use of various model-based analyses and early integration techniques and tools. Engineering processes based on

  16. Integrated Land-Water-Energy assessment using the Foreseer Tool

    Science.gov (United States)

    Allwood, Julian; Konadu, Dennis; Mourao, Zenaida; Lupton, Rick; Richards, Keith; Fenner, Richard; Skelton, Sandy; McMahon, Richard

    2016-04-01

    This study presents an integrated energy and resource modelling and visualisation approach, Foreseer™, which characterises the interdependencies and evaluates the land and water requirements of energy system pathways. The Foreseer Tool maps linked energy, water and land resource futures by outputting a set of Sankey diagrams for energy, water and land, showing the flow from basic resource (e.g. coal, surface water, and forested land) through transformations (e.g. fuel refining and desalination) to final services (e.g. sustenance, hygiene and transportation). By 'mapping' resources in this way, policy-makers can more easily understand the competing uses of a resource through the identification of the services it delivers (e.g. food production, landscaping, energy), the potential opportunities for improving the management of the resource, and the connections with other resources which are often overlooked in a traditional sector-based management strategy. This paper presents a case study of the UK Carbon Plan and highlights the need for integrated resource planning and policy development.
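
    A Sankey diagram of resource-to-service flows of the kind produced by Foreseer can be sketched with a generic plotting library. In the example below the resource names and flow values are invented and plotly is used purely for illustration; the tool's own implementation is not described in the abstract.

        # Illustrative Sankey diagram of resource-to-service flows, in the spirit of
        # the Foreseer outputs described above; names and values are invented.
        import plotly.graph_objects as go

        labels = ["Coal", "Surface water", "Fuel refining", "Desalination",
                  "Transportation", "Hygiene"]
        links = dict(
            source=[0, 1, 2, 3],        # indices into labels
            target=[2, 3, 4, 5],
            value=[8.0, 5.0, 6.5, 4.0], # arbitrary flow magnitudes
        )
        fig = go.Figure(go.Sankey(node=dict(label=labels), link=links))
        fig.show()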

  17. An integrated simulation tool for analyzing the Operation and Interdependency of Natural Gas and Electric Power Systems

    OpenAIRE

    PAMBOUR Kwabena A.; CAKIR BURCIN; BOLADO LAVIN Ricardo; DIJKEMA Gerard

    2016-01-01

    In this paper, we present an integrated simulation tool for analyzing the interdependency of natural gas and electric power systems in terms of security of energy supply. In the first part, we develop mathematical models for the individual systems. In the second part, we identify the interconnections between both systems and propose a method for coupling the two models into a combined simulation model. Next, we develop the algorithm for solving the combined system and integrate this algorithm into a simulation softwa...
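
    One common way to couple two such system models is to alternate solves until the exchanged quantities stop changing. The sketch below illustrates that idea only; the two solver stubs, their numbers and the convergence tolerance are placeholders and do not reproduce the algorithm developed in the paper.

        # Conceptual coupling loop between a gas model and a power model; the two
        # "solve" functions are placeholders, not the simulation tool described above.

        def solve_power_system(gas_available_mw):
            # Stub: dispatchable gas-fired generation limited by fuel availability.
            return min(gas_available_mw, 300.0)

        def solve_gas_system(gas_fired_generation_mw):
            # Stub: deliverable fuel (in MW-equivalent) given the generator offtake.
            return 350.0 - 0.1 * gas_fired_generation_mw

        gas_available = 350.0
        for iteration in range(50):
            generation = solve_power_system(gas_available)
            new_gas_available = solve_gas_system(generation)
            if abs(new_gas_available - gas_available) < 1e-6:   # converged
                break
            gas_available = new_gas_available

        print(iteration, round(generation, 3), round(gas_available, 3))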

  18. An Integrated Framework to Specify Domain-Specific Modeling Languages

    DEFF Research Database (Denmark)

    Zarrin, Bahram; Baumeister, Hubert

    2018-01-01

    In this paper, we propose an integrated framework that can be used by DSL designers to implement their desired graphical domain-specific languages. This framework relies on Microsoft DSL Tools, a meta-modeling framework to build graphical domain-specific languages, and an extension of ForSpec, a logic-based specification language. The drawback of MS DSL Tools is that it does not provide a formal and rigorous approach for semantics specifications. In this framework, we use Microsoft DSL Tools to define the metamodel and graphical notations of DSLs, and an extended version of ForSpec as a formal language to define their semantics. Integrating these technologies under the umbrella of the Microsoft Visual Studio IDE allows DSL designers to utilize a single development environment for developing their desired domain-specific languages.

  19. Integrative structure modeling with the Integrative Modeling Platform.

    Science.gov (United States)

    Webb, Benjamin; Viswanath, Shruthi; Bonomi, Massimiliano; Pellarin, Riccardo; Greenberg, Charles H; Saltzberg, Daniel; Sali, Andrej

    2018-01-01

    Building models of a biological system that are consistent with the myriad data available is one of the key challenges in biology. Modeling the structure and dynamics of macromolecular assemblies, for example, can give insights into how biological systems work, evolved, might be controlled, and even designed. Integrative structure modeling casts the building of structural models as a computational optimization problem, for which information about the assembly is encoded into a scoring function that evaluates candidate models. Here, we describe our open source software suite for integrative structure modeling, Integrative Modeling Platform (https://integrativemodeling.org), and demonstrate its use. © 2017 The Protein Society.
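
    The abstract describes encoding information about an assembly into a scoring function that evaluates candidate models. The toy sketch below illustrates that idea with harmonic distance restraints; the coordinates and restraints are invented and this is not IMP code.

        # Toy scoring function in the spirit of integrative modeling: the score sums
        # violations of distance restraints; coordinates and restraints are invented.
        import math

        def score(coords, restraints):
            """coords: {particle: (x, y, z)}; restraints: (p1, p2, target_distance)."""
            total = 0.0
            for p1, p2, target in restraints:
                d = math.dist(coords[p1], coords[p2])
                total += (d - target) ** 2          # harmonic penalty
            return total

        candidate = {"A": (0.0, 0.0, 0.0), "B": (3.0, 4.0, 0.0)}
        restraints = [("A", "B", 5.0)]
        print(score(candidate, restraints))         # 0.0: restraint satisfied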

  20. Integrated Model of Bioenergy and Agriculture System

    DEFF Research Database (Denmark)

    Sigurjonsson, Hafthor Ægir; Elmegaard, Brian; Clausen, Lasse Røngaard

    2015-01-01

    Due to the increased burden on the environment caused by human activities, industrial ecology designs are gaining more attention. In that perspective, an environmentally effective integration of bioenergy and agriculture systems has significant potential. This work introduces a modeling approach that builds on Life Cycle Inventory and carries out Life Cycle Impact Assessment for a consequential Life Cycle Assessment on integrated bioenergy and agriculture systems. The model framework is built in Python, which connects various freely available software tools that handle different aspects of the overall model. C-TOOL and Yasso07 are used in the carbon balance of agriculture, Dynamic Network Analysis is used for the energy simulation, and Brightway2 is used to build a Life Cycle Inventory compatible database and process it for various impact assessment methods. The model is successfully...
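
    A hedged sketch of the kind of Life Cycle Assessment step that Brightway2 performs in such a framework is shown below; the project, database, activity and impact method names are placeholders, not the setup used in the paper.

        # Hedged sketch of an LCA calculation with Brightway2; all names below are
        # placeholders, not the paper's actual project, database or method.
        import brightway2 as bw

        bw.projects.set_current("bioenergy_agriculture_demo")   # hypothetical project
        bw.bw2setup()                                            # install base data and methods

        db = bw.Database("my_foreground")                        # hypothetical database
        activity = db.get("straw_to_energy")                     # hypothetical activity code

        lca = bw.LCA({activity: 1.0}, ("IPCC 2013", "climate change", "GWP 100a"))
        lca.lci()      # build the life cycle inventory
        lca.lcia()     # apply the impact assessment method
        print(lca.score)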

  1. Knowledge modelling and reliability processing: presentation of the Figaro language and associated tools

    International Nuclear Information System (INIS)

    Bouissou, M.; Villatte, N.; Bouhadana, H.; Bannelier, M.

    1991-12-01

    EDF has been developing for several years an integrated set of knowledge-based and algorithmic tools for the automation of reliability assessment of complex (especially sequential) systems. In this environment, the reliability expert has at his disposal powerful software tools for qualitative and quantitative processing; in addition, he has various means to generate automatically the inputs for these tools through the acquisition of graphical data. The development of these tools has been based on FIGARO, a specific language which was built to obtain a homogeneous system modelling. Various compilers and interpreters translate a FIGARO model into conventional models, such as fault trees, Markov chains and Petri nets. In this report, we introduce the main features of the FIGARO language, illustrating them with examples

  2. A Model-Driven Visualization Tool for Use with Model-Based Systems Engineering Projects

    Science.gov (United States)

    Trase, Kathryn; Fink, Eric

    2014-01-01

    Model-Based Systems Engineering (MBSE) promotes increased consistency between a system's design and its design documentation through the use of an object-oriented system model. The creation of this system model facilitates data presentation by providing a mechanism from which information can be extracted by automated manipulation of model content. Existing MBSE tools enable model creation, but are often too complex for the unfamiliar model viewer to easily use. These tools do not yet provide many opportunities for easing into the development and use of a system model when system design documentation already exists. This study creates a Systems Modeling Language (SysML) Document Traceability Framework (SDTF) for integrating design documentation with a system model, and develops an Interactive Visualization Engine for SysML Tools (InVEST), which exports consistent, clear, and concise views of SysML model data. These exported views are each meaningful to a variety of project stakeholders with differing subjects of concern and depth of technical involvement. InVEST allows a model user to generate multiple views and reports from a MBSE model, including wiki pages and interactive visualizations of data. System data can also be filtered to present only the information relevant to the particular stakeholder, resulting in a view that is both consistent with the larger system model and other model views. Viewing the relationships between system artifacts and documentation, and filtering through data to see specialized views, improves the value of the system as a whole, as data becomes information.

  3. Development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool: An Evidence-Based Model for School Garden Integration.

    Science.gov (United States)

    Burt, Kate Gardner; Koch, Pamela; Contento, Isobel

    2017-10-01

    Researchers have established the benefits of school gardens on students' academic achievement, dietary outcomes, physical activity, and psychosocial skills, yet limited research has been conducted about how school gardens become institutionalized and sustained. Our aim was to develop a tool that captures how gardens are effectively established, integrated, and sustained in schools. We conducted a sequential, exploratory, mixed-methods study. Participants were identified with the help of Grow To Learn, the organization coordinating the New York City school garden initiative, and recruited via e-mail. A stratified, purposeful sample of 21 New York City elementary and middle schools participated in this study throughout the 2013/2014 school year. The sample was stratified by garden budget and purposeful in that each of the schools' gardens was determined to be well integrated and sustained. The processes and strategies used by school gardeners to establish well-integrated school gardens were assessed via data collected from surveys, interviews, observations, and concept mapping. Descriptive statistics as well as multidimensional scaling and hierarchical cluster analysis were used to examine the survey and concept mapping data. Qualitative data analysis consisted of thematic coding, pattern matching, explanation building and cross-case synthesis. Nineteen components within four domains of school garden integration were found through the mixed-methods concept mapping analysis. When the analyses of other data were combined, relationships between domains and components emerged. These data resulted in the development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool. When schools with integrated and sustained gardens were studied, patterns emerged about how gardeners achieve institutionalization through different combinations of critical components. These patterns are best described by the GREEN Tool, the first framework to identify how to

  4. Integrated Environmental Assessment Modelling

    Energy Technology Data Exchange (ETDEWEB)

    Guardanz, R; Gimeno, B S; Bermejo, V; Elvira, S; Martin, F; Palacios, M; Rodriguez, E; Donaire, I [Ciemat, Madrid (Spain)

    2000-07-01

    This report describes the results of the Spanish participation in the project Coupling CORINAIR data to cost-effect emission reduction strategies based on critical threshold (EU/LIFE97/ENV/FIN/336). The subproject has focused on three tasks: developing tools to improve knowledge of the spatial and temporal details of emissions of air pollutants in Spain; exploiting existing experimental information on plant response to air pollutants in temperate ecosystems; and integrating these findings in a modelling framework that can assess with more accuracy the impact of air pollutants on temperate ecosystems. The results obtained during the execution of this project have significantly improved the models of the impact of alternative emission control strategies on ecosystems and crops in the Iberian Peninsula. (Author) 375 refs.

  5. Data Integration Tool: From Permafrost Data Translation Research Tool to A Robust Research Application

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.

    2016-12-01

    The United States National Science Foundation funded PermaData project, led by the National Snow and Ice Data Center (NSIDC) with a team from the Global Terrestrial Network for Permafrost (GTN-P), aimed to improve permafrost data access and discovery. We developed a Data Integration Tool (DIT) to significantly reduce the manual processing time needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the GTN-P. We leverage these data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps, or operations, called widgets. Each widget performs a specific operation, such as reading data, multiplying by a constant, sorting, plotting, or writing data. DIT allows the user to select and order the widgets as desired to meet their specific needs. Originally it was written to capture a scientist's personal, iterative data manipulation and quality control process of visually and programmatically iterating through inconsistent input data, examining it to find problems, adding operations to address the problems, and rerunning until the data could be translated into the GTN-P standard format. Iterative development of this tool led first to a Fortran/Python hybrid and then, with consideration of users, licensing, version control, packaging, and workflow, to a publicly available, robust, usable application. Transitioning to Python allowed the use of open-source frameworks for the workflow core and integration with a JavaScript graphical workflow interface. DIT is targeted to automatically handle 90% of the data processing for field scientists, modelers, and non-discipline scientists. It is available as an open-source tool on GitHub, packaged for a subset of Mac, Windows, and UNIX systems as a desktop application with a graphical workflow manager. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, and to nearly translate three datasets
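
    The widget idea described above can be sketched as a list of small operations applied in user-chosen order. The example below is illustrative only; the widget names and the pipeline are not DIT's actual API.

        # Minimal sketch of a widget-style workflow in the spirit of DIT: each widget
        # is a small operation and the user chains them in the desired order.
        # The widget names and the pipeline are illustrative, not DIT's actual API.

        def read_csv_widget(path):
            with open(path) as f:
                header = f.readline().strip().split(",")
                return [dict(zip(header, line.strip().split(","))) for line in f]

        def multiply_widget(records, field, constant):
            for r in records:
                r[field] = float(r[field]) * constant
            return records

        def sort_widget(records, field):
            return sorted(records, key=lambda r: r[field])

        def run_pipeline(path, widgets):
            data = read_csv_widget(path)
            for widget, kwargs in widgets:
                data = widget(data, **kwargs)
            return data

        # Hypothetical usage: rescale a depth column, then sort by site name.
        # pipeline = [(multiply_widget, {"field": "depth_cm", "constant": 0.01}),
        #             (sort_widget, {"field": "site"})]
        # records = run_pipeline("permafrost.csv", pipeline)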

  6. Freiburg RNA Tools: a web server integrating INTARNA, EXPARNA and LOCARNA.

    Science.gov (United States)

    Smith, Cameron; Heyne, Steffen; Richter, Andreas S; Will, Sebastian; Backofen, Rolf

    2010-07-01

    The Freiburg RNA tools web server integrates three tools for the advanced analysis of RNA in a common web-based user interface. The tools IntaRNA, ExpaRNA and LocARNA support the prediction of RNA-RNA interaction, exact RNA matching and alignment of RNA, respectively. The Freiburg RNA tools web server and the software packages of the stand-alone tools are freely accessible at http://rna.informatik.uni-freiburg.de.

  7. Tools for Resilience Management: Multidisciplinary Development of State-and-Transition Models for Northwest Colorado

    Directory of Open Access Journals (Sweden)

    Emily J. Kachergis

    2013-12-01

    Full Text Available Building models is an important way of integrating knowledge. Testing and updating models of social-ecological systems can inform management decisions and, ultimately, improve resilience. We report on the outcomes of a six-year, multidisciplinary model development process in the sagebrush steppe, USA. We focused on creating state-and-transition models (STMs), conceptual models of ecosystem change that represent nonlinear dynamics and are being adopted worldwide as tools for managing ecosystems. STM development occurred in four steps with four distinct sets of models: (1) local knowledge elicitation using semistructured interviews; (2) ecological data collection using an observational study; (3) model integration using participatory workshops; and (4) model simplification upon review of the literature by a multidisciplinary team. We found that different knowledge types are ultimately complementary. Many of the benefits of the STM-building process flowed from the knowledge integration steps, including improved communication, identification of uncertainties, and production of more broadly credible STMs that can be applied in diverse situations. The STM development process also generated hypotheses about sagebrush steppe dynamics that could be tested by future adaptive management and research. We conclude that multidisciplinary development of STMs has great potential for producing credible, useful tools for managing resilience of social-ecological systems. Based on this experience, we outline a streamlined, participatory STM development process that integrates multiple types of knowledge and incorporates adaptive management.

  8. A new tool for man/machine integration

    International Nuclear Information System (INIS)

    Sommer, W.C.

    1981-01-01

    A popular term within the nuclear power industry today, as a result of TMI, is man/machine interface. It has been determined that greater acknowledgement of this interface is necessary within the industry to integrate the design and operational aspects of a system. What is required is an operational tool that can be used early in the engineering stages of a project and passed on later to those who will be responsible for operating that particular system. This paper discusses one such fundamental operations tool that is applied to a process system, its display devices, and its operator actions in a methodical fashion to integrate the machine for man's understanding and proper use. This new tool, referred to as an Operational Schematic, is shown and described. Briefly, it unites, in one location, the important operational display devices with the system process devices. A man can now see the beginning and end of each information and control loop to better understand its function within the system. A method is presented whereby, in designing for operability, the schematic is utilized in three phases. The method results in two basic documents: one describes "what" is to be operated and the other "how" it is to be operated. This integration concept has now considered the hardware spectrum from sensor to display and operated the display (on paper) to confirm its operability. Now that the design aspects are complete, the later-in-time operational aspects need to be addressed for the man using the process system. Training personnel in operating and testing the process system is as important as the original design. To accomplish these activities, documents are prepared to instruct personnel how to operate (and test) the system under a variety of circumstances.

  9. NASA Space Radiation Program Integrative Risk Model Toolkit

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk models toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), the NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with an opportunity for hands-on demonstrations. Brief descriptions of the tools are: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of a beam transported to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling of the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for the automated count; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  10. Integrative change model in psychotherapy: Perspectives from Indian thought.

    Science.gov (United States)

    Manickam, L S S

    2013-01-01

    Different psychotherapeutic approaches claim positive changes in patients as a result of therapy. Explanations related to the change process have led to different change models. Some of the change models are experimentally oriented whereas some are theoretical. Apart from the core behavioral, psychodynamic, humanistic, cognitive and spiritually oriented models, there are specific models within psychotherapy that explain the change process. The integrative theory of the person as depicted in Indian thought provides a common ground for the integration of various therapies. An integrative model of change based on Indian thought, with specific reference to psychological concepts in the Upanishads, Ayurveda, the Bhagavad Gita and Yoga, is presented. Appropriate psychological tools may be developed in order to help clinicians choose the techniques that match the problem and the origin of the dimension. Explorations have to be conducted to develop more techniques that are culturally appropriate and clinically useful. Research has to be initiated to validate the identified concepts.

  11. Uni- and omnidirectional simulation tools for integrated optics

    NARCIS (Netherlands)

    Stoffer, Remco

    2001-01-01

    This thesis presents several improvements on simulation methods in integrated optics, as well as some new methods. Both uni- and omnidirectional tools are presented; for the unidirectional methods, the emphasis is on higher-order accuracy; for the omnidirectional methods, the boundary conditions are

  12. State-of-the-art Review : Vol. 2B. Methods and Tools for Designing Integrated Building Concepts

    DEFF Research Database (Denmark)

    van der Aa, Ad; Andresen, Inger; Asada, Hideo

    of integrated building concepts and responsive building elements. Finally, the report gives a description of uncertainty modelling in building performance assessment. The descriptions of the design methods and tools include an explanation of how the methods may be applied, any experiences gained by using...

  13. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. The new direct method of calculating the first-order sensitivity coefficients, which applies sparse matrix technology to chemical kinetics, is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity coefficient equations, and their analytical Jacobian expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines in packages, such as SLODE, modified MA28 and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
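
    The direct method described above integrates the model equation together with its sensitivity equations ds/dt = J s + df/dk. The sketch below applies that idea to an invented first-order reaction A -> B, using a dense SciPy solver as a stand-in for the sparse, Gear-type machinery in the package.

        # Illustrative forward (direct) sensitivity calculation for a toy first-order
        # reaction A -> B with rate constant k; the chemistry is invented and the
        # dense solver stands in for the sparse, Gear-type machinery described above.
        import numpy as np
        from scipy.integrate import solve_ivp

        k = 0.5  # rate constant (1/s)

        def rhs(t, z):
            a, s = z                      # concentration of A and its sensitivity da/dk
            dadt = -k * a                 # model equation
            dsdt = -k * s - a             # sensitivity equation: J*s + df/dk
            return [dadt, dsdt]

        sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0], rtol=1e-8, atol=1e-10)
        a_end, s_end = sol.y[:, -1]
        # Analytic check: a = exp(-k t), da/dk = -t exp(-k t)
        print(a_end, s_end, -5.0 * np.exp(-k * 5.0))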

  14. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level," modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0

  15. MoGIRE: A Model for Integrated Water Management

    Science.gov (United States)

    Reynaud, A.; Leenhardt, D.

    2008-12-01

    Climate change and growing water needs have resulted, in many parts of the world, in water scarcity problems that must be managed by public authorities. Hence, policy-makers are more and more often asked to define and to implement water allocation rules between competing users. This requires the development of new tools for designing those rules under various context scenarios (climatic, agronomic, economic). Although models have been developed for each type of water use, very few integrated frameworks link these different uses, even though such an integrated approach is essential for designing regional water and land policies. The lack of such integrated models can be explained by the difficulty of integrating models developed by very different disciplines and by the problem of scale change (collecting data over large areas, arbitrating between the computational tractability of models and their level of aggregation). However, modelers are increasingly asked to deal with large basin scales while analyzing policy impacts at a very high level of detail. These conflicting objectives require new modeling tools. The CALVIN economically driven optimization model developed for managing water in California is a good example of this type of framework (Draper et al., 2003). Recent reviews of the literature on integrated water management at the basin level include Letcher et al. (2007) and Cai (2008). We present here an original framework for integrated water management at the river basin scale called MoGIRE ("Modèle pour la Gestion Intégrée de la Ressource en Eau"). It is intended to optimize water use at the river basin level and to evaluate scenarios (agronomic, climatic or economic) for better planning of agricultural and non-agricultural water use. MoGIRE includes a nodal representation of the water network. Agricultural, urban and environmental water uses are also represented using mathematical programming and econometric approaches. The model then
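
    The nodal, optimization-based representation mentioned above can be illustrated with a toy allocation problem: deliver a limited supply to competing users so as to maximise economic benefit. The benefits, demands and supply figure below are invented and the linear program is only a sketch, not MoGIRE's formulation.

        # Toy water-allocation problem in the spirit of MoGIRE: allocate a limited
        # supply between agricultural and urban users to maximise benefit.
        # Benefits, demands and the supply figure are invented for illustration.
        from scipy.optimize import linprog

        benefit_per_unit = [0.3, 0.5]        # agriculture, urban (value per Mm3)
        max_demand = [60.0, 40.0]            # upper bounds on deliveries (Mm3)
        available_supply = 80.0              # total water at the node (Mm3)

        result = linprog(
            c=[-b for b in benefit_per_unit],          # maximise => minimise negative
            A_ub=[[1.0, 1.0]], b_ub=[available_supply],
            bounds=list(zip([0.0, 0.0], max_demand)),
        )
        print(result.x)     # optimal deliveries, e.g. [40., 40.]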

  16. Risk Informed Design Using Integrated Vehicle Rapid Assessment Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — A successful proof of concept was performed in FY 2012 integrating the Envision tool for parametric estimates of vehicle mass and the Rapid Response Risk Assessment...

  17. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanics analysis. The applicability of the engineering integrity assessment system MASI for the evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which restrict its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of ductile failure behaviour of cracked structures. (author)

  18. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish a workflow for, and demonstrate, a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  19. GEMMER: GEnome-wide tool for Multi-scale Modeling data Extraction and Representation for Saccharomyces cerevisiae.

    Science.gov (United States)

    Mondeel, Thierry D G A; Crémazy, Frédéric; Barberis, Matteo

    2018-02-01

    Multi-scale modeling of biological systems requires integration of various information about genes and proteins that are connected together in networks. Spatial, temporal and functional information is available; however, it is still a challenge to retrieve and explore this knowledge in an integrated, quick and user-friendly manner. We present GEMMER (GEnome-wide tool for Multi-scale Modelling data Extraction and Representation), a web-based data-integration tool that facilitates high quality visualization of physical, regulatory and genetic interactions between proteins/genes in Saccharomyces cerevisiae. GEMMER creates network visualizations that integrate information on function, temporal expression, localization and abundance from various existing databases. GEMMER supports modeling efforts by effortlessly gathering this information and providing convenient export options for images and their underlying data. GEMMER is freely available at http://gemmer.barberislab.com. Source code, written in Python, JavaScript library D3js, PHP and JSON, is freely available at https://github.com/barberislab/GEMMER. M.Barberis@uva.nl. Supplementary data are available at Bioinformatics online. © The Author(s) 2018. Published by Oxford University Press.

  20. Towards integrated solutions for water, energy, and land using an integrated nexus modeling framework

    Science.gov (United States)

    Wada, Y.

    2017-12-01

    Humanity has already reached or even exceeded the Earth's carrying capacity. Growing needs for food, energy and water will only exacerbate existing challenges over the next decades. Consequently, the acceptance of "business as usual" is eroding and we are being challenged to adopt new, more integrated, and more inclusive development pathways that avoid dangerous interference with the local environment and global planetary boundaries. This challenge is embodied in the United Nations' Sustainable Development Goals (SDGs), which endeavor to set a global agenda for moving towards more sustainable development strategies. To improve and sustain human welfare, it is critical that access to modern, reliable, and affordable water, energy, and food is expanded and maintained. The Integrated Solutions for Water, Energy, and Land (IS-WEL) project has been launched by IIASA, together with the Global Environment Facility (GEF) and the United Nations Industrial Development Organization (UNIDO). This project focuses on the water-energy-land nexus in the context of other major global challenges such as urbanization, environmental degradation, and equitable and sustainable futures. It develops a consistent framework for looking at the water-energy-land nexus and identifies strategies for achieving the needed transformational outcomes through an advanced assessment framework. A multi-scalar approach is being developed that aims to combine global and regional integrated assessment tools with local stakeholder knowledge in order to identify robust solutions to energy, water, food, and ecosystem security in selected regions of the world. These are regions facing multiple energy, water and land use challenges and rapid demographic and economic changes, and are hardest hit by increasing climate variability and change. This project combines the global integrated assessment model (MESSAGE) with the global land (GLOBIOM) and water (Community Water Model) models, respectively, and the integrated

  1. The Integrated Medical Model: A Risk Assessment and Decision Support Tool for Space Flight Medical Systems

    Science.gov (United States)

    Kerstman, Eric; Minard, Charles; Saile, Lynn; deCarvalho, Mary Freire; Myers, Jerry; Walton, Marlei; Butler, Douglas; Iyengar, Sriram; Johnson-Throop, Kathy; Baumann, David

    2009-01-01

    The Integrated Medical Model (IMM) is a decision support tool that is useful to mission planners and medical system designers in assessing risks and designing medical systems for space flight missions. The IMM provides an evidence based approach for optimizing medical resources and minimizing risks within space flight operational constraints. The mathematical relationships among mission and crew profiles, medical condition incidence data, in-flight medical resources, potential crew functional impairments, and clinical end-states are established to determine probable mission outcomes. Stochastic computational methods are used to forecast probability distributions of crew health and medical resource utilization, as well as estimates of medical evacuation and loss of crew life. The IMM has been used in support of the International Space Station (ISS) medical kit redesign, the medical component of the ISS Probabilistic Risk Assessment, and the development of the Constellation Medical Conditions List. The IMM also will be used to refine medical requirements for the Constellation program. The IMM outputs for ISS and Constellation design reference missions will be presented to demonstrate the potential of the IMM in assessing risks, planning missions, and designing medical systems. The implementation of the IMM verification and validation plan will be reviewed. Additional planned capabilities of the IMM, including optimization techniques and the inclusion of a mission timeline, will be discussed. Given the space flight constraints of mass, volume, and crew medical training, the IMM is a valuable risk assessment and decision support tool for medical system design and mission planning.
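
    The stochastic forecasting idea can be illustrated with a toy Monte Carlo simulation of mission medical outcomes; the condition list, incidence rates and evacuation rule below are invented and are not the IMM's actual data or logic.

        # Illustrative Monte Carlo sketch of mission medical outcomes; the condition
        # list, incidence rates and evacuation rule are invented and are not the
        # Integrated Medical Model's actual data or logic.
        import random

        incidence_per_mission = {"minor_injury": 0.30, "kidney_stone": 0.02}
        evacuation_conditions = {"kidney_stone"}

        def simulate_mission(rng):
            events = [c for c, p in incidence_per_mission.items() if rng.random() < p]
            evacuated = any(c in evacuation_conditions for c in events)
            return events, evacuated

        rng = random.Random(42)
        n = 100_000
        evac = sum(simulate_mission(rng)[1] for _ in range(n))
        print("estimated evacuation probability:", evac / n)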

  2. Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model

    Science.gov (United States)

    Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance

    2014-01-01

    Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...
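
    As an illustration of what a variance-based global sensitivity analysis looks like in code, the sketch below uses SALib with three placeholder parameters and a toy response; the abstract does not state which GSA method was applied to DRAINMOD-FOREST, so Sobol analysis is an assumption made purely for demonstration.

        # Sketch of a variance-based global sensitivity analysis using SALib; the
        # parameter names and the toy model stand in for DRAINMOD-FOREST, whose
        # actual parameters and GSA method are not reproduced here.
        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        problem = {
            "num_vars": 3,
            "names": ["drainage_coeff", "leaf_area_index", "n_mineralization_rate"],
            "bounds": [[0.5, 2.0], [1.0, 6.0], [0.01, 0.1]],
        }

        X = saltelli.sample(problem, 1024)
        Y = X[:, 0] * X[:, 1] + 10.0 * X[:, 2]     # placeholder for a model run
        Si = sobol.analyze(problem, Y)
        print(Si["S1"])                             # first-order sensitivity indices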

  3. Adaptation in integrated assessment modeling: where do we stand?

    NARCIS (Netherlands)

    Patt, A.; van Vuuren, D.P.; Berkhout, F.G.H.; Aaheim, A.; Hof, A.F.; Isaac, M.; Mechler, R.

    2010-01-01

    Adaptation is an important element on the climate change policy agenda. Integrated assessment models, which are key tools to assess climate change policies, have begun to address adaptation, either by including it implicitly in damage cost estimates, or by making it an explicit control variable. We

  4. Tools for macromolecular model building and refinement into electron cryo-microscopy reconstructions

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Alan; Long, Fei; Nicholls, Robert A.; Toots, Jaan; Emsley, Paul; Murshudov, Garib, E-mail: garib@mrc-lmb.cam.ac.uk [MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge CB2 0QH (United Kingdom)

    2015-01-01

    A description is given of new tools to facilitate model building and refinement into electron cryo-microscopy reconstructions. The recent rapid development of single-particle electron cryo-microscopy (cryo-EM) now allows structures to be solved by this method at resolutions close to 3 Å. Here, a number of tools to facilitate the interpretation of EM reconstructions with stereochemically reasonable all-atom models are described. The BALBES database has been repurposed as a tool for identifying protein folds from density maps. Modifications to Coot, including new Jiggle Fit and morphing tools and improved handling of nucleic acids, enhance its functionality for interpreting EM maps. REFMAC has been modified for optimal fitting of atomic models into EM maps. As external structural information can enhance the reliability of the derived atomic models, stabilize refinement and reduce overfitting, ProSMART has been extended to generate interatomic distance restraints from nucleic acid reference structures, and a new tool, LIBG, has been developed to generate nucleic acid base-pair and parallel-plane restraints. Furthermore, restraint generation has been integrated with visualization and editing in Coot, and these restraints have been applied to both real-space refinement in Coot and reciprocal-space refinement in REFMAC.

  5. Tools for macromolecular model building and refinement into electron cryo-microscopy reconstructions

    International Nuclear Information System (INIS)

    Brown, Alan; Long, Fei; Nicholls, Robert A.; Toots, Jaan; Emsley, Paul; Murshudov, Garib

    2015-01-01

    A description is given of new tools to facilitate model building and refinement into electron cryo-microscopy reconstructions. The recent rapid development of single-particle electron cryo-microscopy (cryo-EM) now allows structures to be solved by this method at resolutions close to 3 Å. Here, a number of tools to facilitate the interpretation of EM reconstructions with stereochemically reasonable all-atom models are described. The BALBES database has been repurposed as a tool for identifying protein folds from density maps. Modifications to Coot, including new Jiggle Fit and morphing tools and improved handling of nucleic acids, enhance its functionality for interpreting EM maps. REFMAC has been modified for optimal fitting of atomic models into EM maps. As external structural information can enhance the reliability of the derived atomic models, stabilize refinement and reduce overfitting, ProSMART has been extended to generate interatomic distance restraints from nucleic acid reference structures, and a new tool, LIBG, has been developed to generate nucleic acid base-pair and parallel-plane restraints. Furthermore, restraint generation has been integrated with visualization and editing in Coot, and these restraints have been applied to both real-space refinement in Coot and reciprocal-space refinement in REFMAC

  6. Integrated fuel-cycle models for fast breeder reactors

    International Nuclear Information System (INIS)

    Ott, K.O.; Maudlin, P.J.

    1981-01-01

    Breeder-reactor fuel-cycle analysis can be divided into four different areas or categories. The first category concerns questions about the spatial variation of the fuel composition for single loading intervals. Questions about the variations in the fuel composition over several cycles represent a second category. Third, there is a need for a determination of the breeding capability of the reactor. The fourth category concerns the investigation of breeding and long-term fuel logistics. Two fuel-cycle models used to answer questions in the third and fourth areas are presented. The space- and time-dependent actinide balance, coupled with criticality and fuel-management constraints, is the basis for both the Discontinuous Integrated Fuel-Cycle Model and the Continuous Integrated Fuel-Cycle Model. The results of the continuous model are compared with results obtained from detailed two-dimensional space and multigroup depletion calculations. The continuous model yields nearly the same results as the detailed calculation, at a comparatively insignificant fraction of the computational effort needed for the detailed calculation. Thus, the integrated model presented is an accurate tool for answering questions concerning reactor breeding capability and long-term fuel logistics. (author)

  7. An online model composition tool for system biology models.

    Science.gov (United States)

    Coskun, Sarp A; Cicek, A Ercument; Lai, Nicola; Dash, Ranjan K; Ozsoyoglu, Z Meral; Ozsoyoglu, Gultekin

    2013-09-05

    There are multiple representation formats for Systems Biology computational models, and the Systems Biology Markup Language (SBML) is one of the most widely used. SBML is used to capture, store, and distribute computational models by Systems Biology data sources (e.g., the BioModels Database) and researchers. Therefore, there is a need for all-in-one web-based solutions that support advanced SBML functionalities such as uploading, editing, composing, visualizing, simulating, querying, and browsing computational models. We present the design and implementation of the Model Composition Tool (Interface) within the PathCase-SB (PathCase Systems Biology) web portal. The tool helps users compose systems biology models to facilitate the complex process of merging systems biology models. We also present three tools that support the model composition tool, namely: (1) the Model Simulation Interface, which generates a visual plot of the simulation according to the user's input; (2) the iModel Tool, a platform for users to upload their own models to compose; and (3) the SimCom Tool, which provides a side-by-side comparison of models being composed in the same pathway. Finally, we provide a web site that hosts BioModels Database models and a separate web site that hosts SBML Test Suite models. The model composition tool (and the other three tools) can be used with little or no knowledge of the SBML document structure. For this reason, students or anyone who wants to learn about systems biology will benefit from the described functionalities. SBML Test Suite models will be a nice starting point for beginners, and, for more advanced purposes, users will be able to access and employ models of the BioModels Database as well.
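
    Programmatic model composition typically starts from parsing the SBML documents to be merged. The sketch below reads two models with libSBML and lists the species a composition step would need to reconcile; the file names are placeholders and this is not the PathCase-SB composition algorithm.

        # Minimal sketch of reading two SBML models and listing the species that a
        # composition step would need to reconcile; file names are placeholders and
        # this is not the PathCase-SB composition algorithm.
        import libsbml

        def species_ids(path):
            document = libsbml.readSBML(path)
            model = document.getModel()
            return {model.getSpecies(i).getId() for i in range(model.getNumSpecies())}

        glycolysis = species_ids("glycolysis.xml")
        tca_cycle = species_ids("tca_cycle.xml")
        print("shared species to merge:", glycolysis & tca_cycle)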

  8. Integrated digital planning and management tools - the example of buildings; Integrierte digitale Planungs- und Management-Werkzeuge - am Beispiel von Gebaeuden

    Energy Technology Data Exchange (ETDEWEB)

    Gauchel, J. [WIBc, Objektorientierte Systeme fuer Gebaeudeplanung und -management, Karlsruhe (Germany)

    1995-12-31

    In chapter 4 of the anthology about building control, integrated digital planning and management tools are described. Integrated planning and management tools are task-oriented data processing applications which are integrated into a common data management system and a common user interface. The following aspects are discussed: realistic modelling, CAD systems, object-oriented system architectures, and building management. (BWI)

  9. Integrated waste management and the tool of life cycle inventory : a route to sustainable waste management

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, F.R.; White, P.R. [Procter and Gamble Newcastle Technical Centre, Newcastle (United Kingdom). Corporate Sustainable Development

    2000-07-01

    An overall approach to municipal waste management which integrates sustainable development principles was discussed. The three elements of sustainability that have to be balanced are environmental effectiveness, economic affordability and social acceptability. An integrated waste management (IWM) system considers different treatment options and deals with the entire waste stream. Life cycle inventory (LCI) and life cycle assessment (LCA) are used to determine the environmental burdens associated with IWM systems. LCIs for waste management are currently available for use in Europe, the United States, Canada and elsewhere. LCI is being used by waste management companies to assess the environmental attributes of future contract tenders. The models are used as benchmarking tools to assess the current environmental profile of a waste management system. They are also a comparative planning and communication tool. The authors are currently looking into publishing, at a future date, the experience of users of this LCI environmental management tool. 12 refs., 3 figs.

  10. An Integrated Development Tool for a safety application using FBD language

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jun; Lee, Jang Soo; Lee, Dong Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-05-15

    With the digitalization of nuclear instrumentation and control (I and C) systems, the application program responsible for the safety functions of nuclear I and C systems shall ensure the robustness of the safety function through development, testing, and validation roles over the software life cycle. The importance of software in nuclear systems increases continuously. Integrated engineering tools for developing, testing, and validating safety application programs must handle increasingly complex parts among the many components within nuclear digital I and C systems. This paper introduces the integrated engineering tool (SafeCASE-PLC) developed by our project. SafeCASE-PLC is a software engineering tool for developing, testing, and validating the nuclear application program executed in an automatic controller.

  11. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level," modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  12. Omics Informatics: From Scattered Individual Software Tools to Integrated Workflow Management Systems.

    Science.gov (United States)

    Ma, Tianle; Zhang, Aidong

    2017-01-01

    Omic data analyses pose great informatics challenges. As an emerging subfield of bioinformatics, omics informatics focuses on analyzing multi-omic data efficiently and effectively, and is gaining momentum. There are two underlying trends in the expansion of the omics informatics landscape: the explosion of scattered individual omics informatics tools, each of which focuses on a specific task in both single- and multi-omic settings, and the fast-evolving integrated software platforms such as workflow management systems that can assemble multiple tools into pipelines and streamline integrative analysis for complicated tasks. In this survey, we give a holistic view of omics informatics, from scattered individual informatics tools to integrated workflow management systems. We not only outline the landscape and challenges of omics informatics, but also sample a number of widely used and cutting-edge algorithms in omics data analysis to give readers a fine-grained view. We survey various workflow management systems (WMSs), classify them into three levels of WMSs from simple software toolkits to integrated multi-omic analytical platforms, and point out the emerging needs for developing intelligent workflow management systems. We also discuss the challenges, strategies and some existing work in the systematic evaluation of omics informatics tools. We conclude by providing future perspectives on emerging fields and new frontiers in omics informatics.

  13. OPTIMAL MODEL OF FUNCTIONING OF OLERICULTURE: VERTICAL INTEGRATION, AGRICULTURAL FILIERES, CLUSTERS

    Directory of Open Access Journals (Sweden)

    Y. B. Mindlin

    2016-01-01

    Full Text Available The goal of the present paper is to identify the optimal strategy of development of the Russian olericulture in order to substitute imported products and to build up logistic and transport infrastructure. Existing problems of the Russian olericulture are described. It is demonstrated that these problems can be solved on the basis of big integrated structures. Formation of these structures can be based on hierarchical (vertical integration) or networking (agricultural filieres or clusters) models. A comparative analysis of these models of development of olericulture is made. Advantages and inconveniences of each model are described. It is demonstrated that sustainable development of the Russian olericulture can be ensured only by a combination of hierarchical and networking tools. Vertical integration will help to reach a quick increase of production, while networking models are necessary for the inclusion of small producers into production chains, development of the product range and development of supporting industries. Networking models are also necessary for social tasks. It means that the optimal strategy of development of the Russian olericulture should be based on a combination of networking and hierarchical tools. This combination is necessary for agricultural corporations as well as for the Russian olericulture in general.

  14. Integrated city as a model for a new wave urban tourism

    Science.gov (United States)

    Ariani, V.

    2018-03-01

    Cities are a major player in urban tourism destinations. The massive movement of urban tourists pushes cities with similar characteristics to compete with one another. A new framework model for new wave urban tourism is therefore crucial to give more experience to the tourist and more value to the city itself. The integrated city is the answer for creating a new model of an urban tourism destination. The purpose of this preliminary research is to define an integrated city framework for urban tourism development. It provides a rationale for tourism planners pursuing an innovative approach, competitive advantages, and a general urban tourism destination model. The methodology applied in this research includes a desk survey, literature review and focus group discussion. A conceptual framework is proposed, discussed and exemplified. The framework model adopts a place-based approach to the tourism destination and suggests an integrated city model for urban tourism development. This model is a tool for strategy making in re-inventing the integrated city as an urban tourism destination.

  15. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools
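
    A software connector in the sense used above maps the fields of a tool-specific record onto terms of a shared reference ontology and applies transformation rules to the exchanged data. The following is a minimal sketch of that idea; the ontology terms and the precision rule are invented for illustration and are not taken from the reference ontology proposed in the paper.

      # Hypothetical mapping from tool-specific field names to reference-ontology terms.
      ONTOLOGY_MAPPING = {
          "probe_id": "gexp:ProbeIdentifier",
          "signal": "gexp:ExpressionLevel",
      }

      # Transformation rules applied when data crosses the connector.
      TRANSFORMATIONS = {
          "gexp:ExpressionLevel": lambda value: round(float(value), 3),   # normalise precision
      }

      def connect(record):
          """Translate one tool-specific record into ontology-annotated form."""
          translated = {}
          for field, value in record.items():
              term = ONTOLOGY_MAPPING.get(field)
              if term is None:
                  continue                        # field has no semantic counterpart; drop it
              rule = TRANSFORMATIONS.get(term, lambda v: v)
              translated[term] = rule(value)
          return translated

      print(connect({"probe_id": "AFFX-001", "signal": "12.3456"}))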

  16. Integration of numerical analysis tools for automated numerical optimization of a transportation package design

    International Nuclear Information System (INIS)

    Witkowski, W.R.; Eldred, M.S.; Harding, D.C.

    1994-01-01

    The use of state-of-the-art numerical analysis tools to determine the optimal design of a radioactive material (RAM) transportation container is investigated. The design of a RAM package's components involves a complex coupling of structural, thermal, and radioactive shielding analyses. The final design must adhere to very strict design constraints. The current technique used by cask designers is uncoupled and involves designing each component separately with respect to its driving constraint. With the use of numerical optimization schemes, the complex couplings can be considered directly, and the performance of the integrated package can be maximized with respect to the analysis conditions. This can lead to more efficient package designs. Thermal and structural accident conditions are analyzed in the shape optimization of a simplified cask design. In this paper, details of the integration of numerical analysis tools, development of a process model, nonsmoothness difficulties with the optimization of the cask, and preliminary results are discussed
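
    The coupled optimization described above can be illustrated with a toy version of the problem: choose a wall thickness that minimizes package mass subject to simplified thermal and structural accident-condition constraints. The response functions and limits below are crude stand-ins for the finite element analyses used in the study, and SciPy's SLSQP driver plays the role of the numerical optimization scheme.

      from scipy.optimize import minimize

      RADIUS, LENGTH, DENSITY = 0.5, 2.0, 7800.0      # m, m, kg/m^3 (illustrative values)

      def mass(t):
          """Objective: mass of a cylindrical shell of thickness t[0]."""
          return DENSITY * LENGTH * 3.14159 * ((RADIUS + t[0]) ** 2 - RADIUS ** 2)

      def thermal_margin(t):
          # Stand-in for a fire-accident thermal analysis: thicker wall, lower inner temperature.
          peak_temp = 800.0 - 1500.0 * t[0]
          return 400.0 - peak_temp                     # must stay >= 0 (400 C limit, illustrative)

      def structural_margin(t):
          # Stand-in for a drop-accident stress analysis: thicker wall, lower stress.
          stress = 50.0 / t[0]
          return 350.0 - stress                        # must stay >= 0 (350 MPa limit, illustrative)

      result = minimize(mass, x0=[0.2], bounds=[(0.05, 0.5)], method="SLSQP",
                        constraints=[{"type": "ineq", "fun": thermal_margin},
                                     {"type": "ineq", "fun": structural_margin}])
      print(result.x, mass(result.x))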

  17. Repository Integration Program: RIP performance assessment and strategy evaluation model theory manual and user's guide

    International Nuclear Information System (INIS)

    1995-11-01

    This report describes the theory and capabilities of RIP (Repository Integration Program). RIP is a powerful and flexible computational tool for carrying out probabilistic integrated total system performance assessments for geologic repositories. The primary purpose of RIP is to provide a management tool for guiding system design and site characterization. In addition, the performance assessment model (and the process of eliciting model input) can act as a mechanism for integrating the large amount of available information into a meaningful whole (in a sense, allowing one to keep the "big picture" and the ultimate aims of the project clearly in focus). Such an integration is useful both for project managers and project scientists. RIP is based on a "top down" approach to performance assessment that concentrates on the integration of the entire system, and utilizes relatively high-level descriptive models and parameters. The key point in the application of such a "top down" approach is that the simplified models and associated high-level parameters must incorporate an accurate representation of their uncertainty. RIP is designed in a very flexible manner such that details can be readily added to various components of the model without modifying the computer code. Uncertainty is also handled in a very flexible manner, and both parameter and model (process) uncertainty can be explicitly considered. Uncertainty is propagated through the integrated PA model using an enhanced Monte Carlo method. RIP must rely heavily on subjective assessment (expert opinion) for much of its input. The process of eliciting the high-level input parameters required for RIP is critical to its successful application. As a result, in order for any project to successfully apply a tool such as RIP, an enormous amount of communication and cooperation must exist between the data collectors, the process modelers, and the performance assessment modelers.
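
    The enhanced Monte Carlo propagation described above can be sketched in a few lines: high-level input parameters are drawn from elicited distributions, pushed through a simplified total-system model, and summarized as a distribution of the performance measure. The release model and parameter distributions below are deliberately crude placeholders, not the RIP model itself.

      import numpy as np

      rng = np.random.default_rng(42)
      N = 10_000                                        # number of Monte Carlo realizations

      # Elicited high-level parameter distributions (illustrative values only).
      container_lifetime = rng.lognormal(mean=np.log(1000.0), sigma=0.5, size=N)    # years
      travel_time = rng.lognormal(mean=np.log(5000.0), sigma=1.0, size=N)           # years
      dilution_factor = rng.uniform(1e3, 1e5, size=N)

      def peak_dose(lifetime, travel, dilution, horizon=10_000.0):
          """Toy total-system measure: zero if releases arrive after the compliance horizon."""
          arrival = lifetime + travel
          return np.where(arrival < horizon, 1.0 / dilution, 0.0)

      doses = peak_dose(container_lifetime, travel_time, dilution_factor)
      print("mean:", doses.mean(), "95th percentile:", np.percentile(doses, 95))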

  18. Theory, modeling, and integrated studies in the Arase (ERG) project

    Science.gov (United States)

    Seki, Kanako; Miyoshi, Yoshizumi; Ebihara, Yusuke; Katoh, Yuto; Amano, Takanobu; Saito, Shinji; Shoji, Masafumi; Nakamizo, Aoi; Keika, Kunihiro; Hori, Tomoaki; Nakano, Shin'ya; Watanabe, Shigeto; Kamiya, Kei; Takahashi, Naoko; Omura, Yoshiharu; Nose, Masahito; Fok, Mei-Ching; Tanaka, Takashi; Ieda, Akimasa; Yoshikawa, Akimasa

    2018-02-01

    Understanding of underlying mechanisms of drastic variations of the near-Earth space (geospace) is one of the current focuses of magnetospheric physics. The science target of the geospace research project Exploration of energization and Radiation in Geospace (ERG) is to understand the geospace variations with a focus on the relativistic electron acceleration and loss processes. In order to achieve the goal, the ERG project consists of three parts: the Arase (ERG) satellite, ground-based observations, and theory/modeling/integrated studies. The role of the theory/modeling/integrated studies part is to promote relevant theoretical and simulation studies as well as integrated data analysis to combine different kinds of observations and modeling. Here we provide technical reports on simulation and empirical models related to the ERG project together with their roles in the integrated studies of dynamic geospace variations. The simulation and empirical models covered include the radial diffusion model of the radiation belt electrons, the GEMSIS-RB and RBW models, the CIMI model with the global MHD simulation REPPU, the GEMSIS-RC model, a plasmasphere thermosphere model, self-consistent wave-particle interaction simulations (electron hybrid code and ion hybrid code), the ionospheric electric potential (GEMSIS-POT) model, and SuperDARN electric field models with data assimilation. ERG (Arase) science center tools to support integrated studies with various kinds of data are also briefly introduced.
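
    As a hedged illustration of the first model in the list, radial diffusion of the radiation belt electron phase space density f(L, t) is commonly written as df/dt = L^2 d/dL (D_LL / L^2 df/dL), and a minimal explicit finite-difference solver for it looks like the following. The grid, diffusion coefficient and boundary values are arbitrary illustrations, not the settings of the GEMSIS or ERG models.

      import numpy as np

      L = np.linspace(3.0, 7.0, 41)                   # L-shell grid
      dL = L[1] - L[0]
      f = np.exp(-((L - 4.0) / 0.5) ** 2)             # initial phase space density (arbitrary units)
      D_LL = 1e-8 * L ** 10                           # illustrative diffusion coefficient, 1/day

      dt = 0.2 * dL ** 2 / D_LL.max()                 # explicit stability limit, days
      for _ in range(5000):
          flux = D_LL / L ** 2 * np.gradient(f, dL)   # (D_LL / L^2) * df/dL
          f += dt * L ** 2 * np.gradient(flux, dL)    # L^2 * d/dL of that flux
          f[0], f[-1] = 0.0, 1.0                      # fixed inner and outer boundary values

      print("peak f:", f.max(), "at L =", L[np.argmax(f)])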

  19. Integrating satellite imagery with simulation modeling to improve burn severity mapping

    Science.gov (United States)

    Eva C. Karau; Pamela G. Sikkink; Robert E. Keane; Gregory K. Dillon

    2014-01-01

    Both satellite imagery and spatial fire effects models are valuable tools for generating burn severity maps that are useful to fire scientists and resource managers. The purpose of this study was to test a new mapping approach that integrates imagery and modeling to create more accurate burn severity maps. We developed and assessed a statistical model that combines the...

  20. IDMT, Integrated Decommissioning Management Tools

    International Nuclear Information System (INIS)

    Alemberti, A.; Castagna, P.; Marsiletti, M.; Orlandi, S.; Perasso, L.; Susco, M.

    2005-01-01

    Nuclear Power Plant decommissioning requires a number of demolition activities related to civil works and systems as well as the construction of temporary facilities used for treatment and conditioning of the dismantled parts. The presence of a radiological, potentially hazardous environment, due to the specific configuration and history of the plant, requires a professional, expert and qualified approach approved by the national safety authority. Dismantling activities must be designed, planned and analysed in detail during an evaluation phase, taking into account different scenarios generated by possible dismantling sequences and specific waste treatments to be implemented. The optimisation process of the activities becomes very challenging given the requirement to minimise the radiological impact on exposed workers and the public during normal and accident conditions. While remote-operated equipment and waste treatment and conditioning facilities may be designed with this primary goal in mind, a centralised management system and corresponding software tools also have to be designed and operated in order to guarantee the fulfilment of the imposed limits as well as the traceability of wastes. Ansaldo Nuclear Division has been strongly involved in the development of a qualified and certified software environment to manage the most critical activities of a decommissioning project. The IDMT system (Integrated Decommissioning Management Tools) provides a set of stand-alone, user-friendly applications able to work in an integrated configuration to guarantee waste identification and traceability during the treatment and conditioning process, as well as location and identification at the Final Repository site. Additionally, the system can be used to identify, analyse and compare different specific operating scenarios to be optimised in terms of both economic and radiological considerations. The paper provides an overview of the different phases of

  1. Integrated Monitoring and Modeling of Carbon Dioxide Leakage Risk Using Remote Sensing, Ground-Based Monitoring, Atmospheric Models and Risk-Indexing Tools

    Science.gov (United States)

    Burton, E. A.; Pickles, W. L.; Gouveia, F. J.; Bogen, K. T.; Rau, G. H.; Friedmann, J.

    2006-12-01

    estimating its associated risk, spatially and temporally. This requires integration of subsurface, surface and atmospheric data and models. To date, we have developed techniques to map risk based on predicted atmospheric plumes and GIS/MT (meteorologic-topographic) risk-indexing tools. This methodology was derived from study of large CO2 releases from an abandoned well penetrating a natural CO2 reservoir at Crystal Geyser, Utah. This integrated approach will provide a powerful tool to screen for high-risk zones at proposed sequestration sites, to design and optimize surface networks for site monitoring and/or to guide setting science-based regulatory compliance requirements for monitoring sequestration sites, as well as to target critical areas for first responders should a catastrophic-release event occur. This work was performed under the auspices of the U.S. Dept. of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.

  2. Data Integration for the Generation of High Resolution Reservoir Models

    Energy Technology Data Exchange (ETDEWEB)

    Albert Reynolds; Dean Oliver; Gaoming Li; Yong Zhao; Chaohui Che; Kai Zhang; Yannong Dong; Chinedu Abgalaka; Mei Han

    2009-01-07

    The goal of this three-year project was to develop a theoretical basis and practical technology for the integration of geologic, production and time-lapse seismic data in a way that makes best use of the information for reservoir description and reservoir performance predictions. The methodology and practical tools for data integration that were developed in this research project have been incorporated into computational algorithms that are feasible for large scale reservoir simulation models. As the integration of production and seismic data require calibrating geological/geostatistical models to these data sets, the main computational tool is an automatic history matching algorithm. The following specific goals were accomplished during this research. (1) We developed algorithms for calibrating the location of the boundaries of geologic facies and the distribution of rock properties so that production and time-lapse seismic data are honored. (2) We developed and implemented specific procedures for conditioning reservoir models to time-lapse seismic data. (3) We developed and implemented algorithms for the characterization of measurement errors which are needed to determine the relative weights of data when conditioning reservoir models to production and time-lapse seismic data by automatic history matching. (4) We developed and implemented algorithms for the adjustment of relative permeability curves during the history matching process. (5) We developed algorithms for production optimization which accounts for geological uncertainty within the context of closed-loop reservoir management. (6) To ensure the research results will lead to practical public tools for independent oil companies, as part of the project we built a graphical user interface for the reservoir simulator and history matching software using Visual Basic.
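
    Automatic history matching of the kind developed here adjusts model parameters until simulated production matches observed data, with the characterized measurement errors setting the relative weight of each datum. A minimal weighted least-squares sketch follows; the exponential-decline "reservoir" is a placeholder for a full reservoir simulation model, and all numbers are invented.

      import numpy as np
      from scipy.optimize import least_squares

      t_obs = np.array([30.0, 90.0, 180.0, 365.0])        # days
      q_obs = np.array([950.0, 820.0, 700.0, 520.0])      # observed production rates, stb/day
      sigma = np.array([25.0, 25.0, 40.0, 40.0])          # measurement-error standard deviations

      def simulate(params, t):
          """Placeholder 'reservoir simulator': exponential decline with rate q0 and decline D."""
          q0, D = params
          return q0 * np.exp(-D * t)

      def weighted_residuals(params):
          # Measurement errors weight the data: noisier observations count less in the match.
          return (simulate(params, t_obs) - q_obs) / sigma

      fit = least_squares(weighted_residuals, x0=[1000.0, 0.001])
      print("calibrated q0 and D:", fit.x)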

  3. Latest Community Coordinated Modeling Center (CCMC) services and innovative tools supporting the space weather research and operational communities.

    Science.gov (United States)

    Mendoza, A. M. M.; Rastaetter, L.; Kuznetsova, M. M.; Mays, M. L.; Chulaki, A.; Shim, J. S.; MacNeice, P. J.; Taktakishvili, A.; Collado-Vega, Y. M.; Weigand, C.; Zheng, Y.; Mullinix, R.; Patel, K.; Pembroke, A. D.; Pulkkinen, A. A.; Boblitt, J. M.; Bakshi, S. S.; Tsui, T.

    2017-12-01

    The Community Coordinated Modeling Center (CCMC), with the fundamental goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research, has been serving as an integral hub for over 15 years, providing invaluable resources to both the space weather scientific and operational communities. CCMC has developed and provided to the scientific community innovative web-based points of access, including: the Runs-On-Request System, providing unprecedented global access to the largest collection of state-of-the-art solar and space physics models; Integrated Space Weather Analysis (iSWA), a powerful dissemination system for space weather information; advanced online visualization and analysis tools for more accurate interpretation of model results; standard data formats for simulation data downloads; and mobile apps to view space weather data anywhere. In addition to supporting research and performing model evaluations, CCMC also supports space science education by hosting summer students through local universities. In this poster, we will showcase CCMC's latest innovative tools and services, including the tools that have revolutionized the way we do research and improved our operational space weather capabilities. CCMC's free tools and resources are all publicly available online (http://ccmc.gsfc.nasa.gov).

  4. QFD: a methodological tool for integration of ergonomics at the design stage.

    Science.gov (United States)

    Marsot, Jacques

    2005-03-01

    As a marked increase in the number of musculoskeletal disorders was noted in many industrialized countries and more specifically in companies that require the use of hand tools, the French National Research and Safety Institute launched in 1999 a research program on the topic of integrating ergonomics into hand tool design. After a brief review of the problems of integrating ergonomics at the design stage, the paper shows how the "Quality Function Deployment" method has been applied to the design of a boning knife and it highlights the difficulties encountered. Then, it demonstrates how this method can be a methodological tool geared to greater ergonomics consideration in product design.

  5. A Practical Review of Integrated Urban Water Models: Applications as Decision Support Tools and Beyond

    Science.gov (United States)

    Mosleh, L.; Negahban-Azar, M.

    2017-12-01

    Integrated urban water management has become a necessity due to the high rate of urbanization, water scarcity, and climate variability. Climate and demographic changes, shifting social attitudes toward water usage, and insufficiencies in system resilience increase the pressure on water resources. Alongside water management, the modeling of urban water systems has progressed from the traditional view to comprise alternatives such as decentralized water and wastewater systems, fit-for-purpose practice, graywater/rainwater reuse, and green infrastructure. While there are review papers available focusing on the technical part of the models, they seem to be more beneficial for model developers. Some of the models analyze a number of scenarios considering factors such as climate change and demography and their future impacts. However, others only focus on the quality and quantity of water in a supply/demand approach, for example optimizing the size of water or wastewater storage, characterizing the supply and quantity of urban stormwater and wastewater, and linking sources of water to demand. A detailed and practical comparison of such models has become a necessity for practitioners and policy makers. This research compares more than seven of the most commonly used integrated urban water cycle models and critically reviews their capabilities, input requirements, outputs and applications. The output of such a detailed comparison will help policy makers in the decision process in the built environment to compare and choose the models that best meet their goals. The results of this research show that we need a transition from developing/using integrated water cycle models to integrated system models which incorporate urban water infrastructures and ecological and economic factors. Such models can help decision makers reflect other important criteria while keeping the focus on urban water management. The research also showed that there is a need for exploring

  6. A System Dynamics Model for Integrated Decision Making ...

    Science.gov (United States)

    EPA’s Sustainable and Healthy Communities Research Program (SHC) is conducting transdisciplinary research to inform and empower decision-makers. EPA tools and approaches are being developed to enable communities to effectively weigh and integrate human health, socioeconomic, environmental, and ecological factors into their decisions to promote community sustainability. To help achieve this goal, EPA researchers have developed systems approaches to account for the linkages among resources, assets, and outcomes managed by a community. System dynamics (SD) is a member of the family of systems approaches and provides a framework for dynamic modeling that can assist with assessing and understanding complex issues across multiple dimensions. To test the utility of such tools when applied to a real-world situation, the EPA has developed a prototype SD model for community sustainability using the proposed Durham-Orange Light Rail Project (D-O LRP) as a case study.The EPA D-O LRP SD modeling team chose the proposed D-O LRP to demonstrate that an integrated modeling approach could represent the multitude of related cross-sectoral decisions that would be made and the cascading impacts that could result from a light rail transit system connecting Durham and Chapel Hill, NC. In keeping with the SHC vision described above, the proposal for the light rail is a starting point solution for the more intractable problems of population growth, unsustainable land use, environmenta
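
    System dynamics models of this kind track stocks and flows over time, and numerically they reduce to integrating coupled rate equations. The two-stock sketch below (population and developed land, with a land-scarcity feedback) is purely illustrative; the coefficients are invented and are not taken from the D-O LRP model.

      # Simple Euler integration of a two-stock system dynamics model.
      population = 150_000.0          # people (illustrative)
      developed_land = 20_000.0       # acres (illustrative)
      dt, years = 0.25, 30

      for _ in range(int(years / dt)):
          growth = 0.015 * population                          # net in-migration and births, people/yr
          development = 0.08 * growth                          # acres converted per year of growth
          density_feedback = 1.0 - developed_land / 60_000.0   # land scarcity slows development
          population += dt * growth
          developed_land += dt * development * max(density_feedback, 0.0)

      print(round(population), round(developed_land))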

  7. Model integration and a theory of models

    OpenAIRE

    Dolk, Daniel R.; Kottemann, Jeffrey E.

    1993-01-01

    Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integr...

  8. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    Science.gov (United States)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  9. The integration of FMEA with other problem solving tools: A review of enhancement opportunities

    Science.gov (United States)

    Ng, W. C.; Teh, S. Y.; Low, H. C.; Teoh, P. C.

    2017-09-01

    Failure Mode Effect Analysis (FMEA) is one of the most effective and accepted problem solving (PS) tools for most of the companies in the world. Since FMEA was first introduced in 1949, practitioners have implemented FMEA in various industries for their quality improvement initiatives. However, studies have shown that there are drawbacks that hinder the effectiveness of FMEA for continuous quality improvement from product design to manufacturing. Therefore, FMEA is integrated with other PS tools such as the inventive problem solving methodology (TRIZ), Quality Function Deployment (QFD), Root Cause Analysis (RCA) and the seven basic tools of quality to address the drawbacks. This study begins by identifying the drawbacks in FMEA. A comprehensive literature review on the integration of FMEA with other tools is carried out to categorise the integrations based on the drawbacks identified. The three categories are inefficiency of failure analysis, psychological inertia and neglect of customers’ perspective. This study concludes by discussing the gaps and opportunities in the integration for future research.

  10. Parametric design and analysis framework with integrated dynamic models

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    2014-01-01

    of building energy and indoor environment, are generally confined to late in the design process. Consequence based design is a framework intended for the early design stage. It involves interdisciplinary expertise that secures validity and quality assurance with a simulationist while sustaining autonomous control with the building designer. Consequence based design is defined by the specific use of integrated dynamic modeling, which includes the parametric capabilities of a scripting tool and building simulation features of a building performance simulation tool. The framework can lead to enhanced

  11. Advertising Can Be an Effective Integrated Marketing Tool

    Science.gov (United States)

    Lauer, Larry D.

    2007-01-01

    Advertising will not undermine the critical thinking of consumers when it is combined with other communication media, and when it is truthful. In fact, it can provide clarity about the competitive advantage of individual institutions and aid an individual's ability to choose wisely. Advertising is just one of the tools in the integrated marketing…

  12. Computer-aided operations engineering with integrated models of systems and operations

    Science.gov (United States)

    Malin, Jane T.; Ryan, Dan; Fleming, Land

    1994-01-01

    CONFIG 3 is a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, by supporting isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operation of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. Integration is supported among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. Support is provided for integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems. CONFIG supports abstracted qualitative and symbolic modeling, for early conceptual design. System models are component structure models with operating modes, with embedded time-related behavior models. CONFIG supports failure modeling and modeling of state or configuration changes that result in dynamic changes in dependencies among components. Operations and procedure models are activity structure models that interact with system models. CONFIG is designed to support evaluation of system operability, diagnosability and fault tolerance, and analysis of the development of system effects of problems over time, including faults, failures, and procedural or environmental difficulties.
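
    The component-with-operating-modes view described above can be sketched as a small discrete event simulation: each component carries a mode, mode changes are scheduled as timed events, and a failure is simply a mode transition that triggers further events in dependent components. The component names, times and dependency table below are invented for illustration and do not reproduce CONFIG itself.

      import heapq

      class Component:
          def __init__(self, name, mode="off"):
              self.name, self.mode = name, mode

      COMPONENTS = {n: Component(n) for n in ("pump", "valve", "controller")}
      DEPENDENCIES = {"pump": ["controller"]}        # failure of the pump degrades the controller

      def simulate(events):
          """Process (time, component, new_mode) events in time order."""
          heapq.heapify(events)
          while events:
              time, name, new_mode = heapq.heappop(events)
              component = COMPONENTS[name]
              print(f"t={time:5.1f}  {component.name}: {component.mode} -> {new_mode}")
              component.mode = new_mode
              if new_mode == "failed":               # propagate the effect of a failure over time
                  for dependent in DEPENDENCIES.get(name, []):
                      heapq.heappush(events, (time + 1.0, dependent, "degraded"))

      simulate([(0.0, "pump", "on"), (5.0, "valve", "open"), (42.0, "pump", "failed")])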

  13. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report

    Science.gov (United States)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; KramerWhite, Julie A.; Labbe, Steve G.; Rotter, Hank A.

    2007-01-01

    In Spring of 2005, the NASA Engineering Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and other tools are empirically-derived. Each tool was created for a specific use and timeframe, including certification, real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC teams conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  14. Useful tools for non-linear systems: Several non-linear integral inequalities

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mohammadpour, A.; Mesiar, Radko; Vaezpour, M. S.

    2013-01-01

    Roč. 49, č. 1 (2013), s. 73-80 ISSN 0950-7051 R&D Projects: GA ČR GAP402/11/0378 Institutional support: RVO:67985556 Keywords : Monotone measure * Comonotone functions * Integral inequalities * Universal integral Subject RIV: BA - General Mathematics Impact factor: 3.058, year: 2013 http://library.utia.cas.cz/separaty/2013/E/mesiar-useful tools for non-linear systems several non-linear integral inequalities.pdf

  15. Model-based sensorimotor integration for multi-joint control: development of a virtual arm model.

    Science.gov (United States)

    Song, D; Lan, N; Loeb, G E; Gordon, J

    2008-06-01

    An integrated, sensorimotor virtual arm (VA) model has been developed and validated for simulation studies of control of human arm movements. Realistic anatomical features of shoulder, elbow and forearm joints were captured with a graphic modeling environment, SIMM. The model included 15 musculotendon elements acting at the shoulder, elbow and forearm. Muscle actions on joints were evaluated by SIMM generated moment arms that were matched to experimentally measured profiles. The Virtual Muscle (VM) model contained appropriate admixture of slow and fast twitch fibers with realistic physiological properties for force production. A realistic spindle model was embedded in each VM with inputs of fascicle length, gamma static (gamma(stat)) and dynamic (gamma(dyn)) controls and outputs of primary (I(a)) and secondary (II) afferents. A piecewise linear model of Golgi Tendon Organ (GTO) represented the ensemble sampling (I(b)) of the total muscle force at the tendon. All model components were integrated into a Simulink block using a special software tool. The complete VA model was validated with open-loop simulation at discrete hand positions within the full range of alpha and gamma drives to extrafusal and intrafusal muscle fibers. The model behaviors were consistent with a wide variety of physiological phenomena. Spindle afferents were effectively modulated by fusimotor drives and hand positions of the arm. These simulations validated the VA model as a computational tool for studying arm movement control. The VA model is available to researchers at website http://pt.usc.edu/cel .

  16. Vertically Integrated Models for Carbon Storage Modeling in Heterogeneous Domains

    Science.gov (United States)

    Bandilla, K.; Celia, M. A.

    2017-12-01

    Numerical modeling is an essential tool for studying the impacts of geologic carbon storage (GCS). Injection of carbon dioxide (CO2) into deep saline aquifers leads to multi-phase flow (injected CO2 and resident brine), which can be described by a set of three-dimensional governing equations, including mass-balance equation, volumetric flux equations (modified Darcy), and constitutive equations. This is the modeling approach on which commonly used reservoir simulators such as TOUGH2 are based. Due to the large density difference between CO2 and brine, GCS models can often be simplified by assuming buoyant segregation and integrating the three-dimensional governing equations in the vertical direction. The integration leads to a set of two-dimensional equations coupled with reconstruction operators for vertical profiles of saturation and pressure. Vertically-integrated approaches have been shown to give results of comparable quality as three-dimensional reservoir simulators when applied to realistic CO2 injection sites such as the upper sand wedge at the Sleipner site. However, vertically-integrated approaches usually rely on homogeneous properties over the thickness of a geologic layer. Here, we investigate the impact of general (vertical and horizontal) heterogeneity in intrinsic permeability, relative permeability functions, and capillary pressure functions. We consider formations involving complex fluvial deposition environments and compare the performance of vertically-integrated models to full three-dimensional models for a set of hypothetical test cases consisting of high permeability channels (streams) embedded in a low permeability background (floodplains). The domains are randomly generated assuming that stream channels can be represented by sinusoidal waves in the plan-view and by parabolas for the streams' cross-sections. Stream parameters such as width, thickness and wavelength are based on values found at the Ketzin site in Germany. Results from the
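
    As a hedged sketch of what vertical integration means here, the three-dimensional mass balance for the CO2 phase is integrated over the formation thickness H; under the buoyant-segregation (vertical equilibrium) assumption this yields a two-dimensional equation in vertically integrated variables. The notation below is generic and not tied to any particular simulator.

      % 3-D mass balance for the CO2 phase (s_c = saturation, u_c = Darcy flux, q_c = source):
      %   \partial_t(\phi \rho_c s_c) + \nabla \cdot (\rho_c \mathbf{u}_c) = \rho_c q_c
      % Integrating over the thickness H and assuming vertical equilibrium gives
      \frac{\partial}{\partial t}\left( \Phi \rho_c S_c \right)
        + \nabla_{\parallel} \cdot \left( \rho_c \mathbf{U}_c \right) = \rho_c Q_c ,
      \qquad
      \Phi S_c = \int_0^H \phi \, s_c \, \mathrm{d}z , \qquad
      \mathbf{U}_c = \int_0^H \mathbf{u}_c \, \mathrm{d}z .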

  17. Integrating research tools to support the management of social-ecological systems under climate change

    Science.gov (United States)

    Miller, Brian W.; Morisette, Jeffrey T.

    2014-01-01

    Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.

  18. Developing an integration tool for soil contamination assessment

    Science.gov (United States)

    Anaya-Romero, Maria; Zingg, Felix; Pérez-Álvarez, José Miguel; Madejón, Paula; Kotb Abd-Elmabod, Sameh

    2015-04-01

    In the last decades, huge soil areas have been negatively influenced or altered in multiple forms. Soils and, consequently, underground water have been contaminated by the accumulation of contaminants from agricultural activities (fertilizers and pesticides), industrial activities (harmful material dumping, sludge, fly ash) and urban activities (hydrocarbons, metals from vehicle traffic, urban waste dumping). In the framework of the RECARE project, local partners across Europe are focusing on a wide range of soil threats, such as soil contamination, and aiming to develop effective prevention, remediation and restoration measures by designing and applying targeted land management strategies (van Lynden et al., 2013). In this context, the Guadiamar Green Corridor (Southern Spain) was used as a case study, aiming to obtain soil data and new information in order to assess soil contamination. The main threat in the Guadiamar valley is soil contamination after a mine spill that occurred in April 1998. About four hm3 of acid waters and two hm3 of mud, rich in heavy metals, were released into the Agrio and Guadiamar rivers, affecting more than 4,600 ha of agricultural and pasture land. The main trace elements contaminating soil and water were As, Cd, Cu, Pb, Tl and Zn. The objective of the present research is to develop informatics tools that integrate soil databases, models and interactive platforms for soil contamination assessment. Preliminary results relate to the compilation of harmonized databases including geographical, hydro-meteorological, soil and socio-economic variables based on spatial analysis and stakeholder consultation. Further research will address modelling and upscaling at the European level, in order to obtain a scientific-technical predictive tool for the assessment of soil contamination.

  19. Experiences in applying Bayesian integrative models in interdisciplinary modeling: the computational and human challenges

    DEFF Research Database (Denmark)

    Kuikka, Sakari; Haapasaari, Päivi Elisabet; Helle, Inari

    2011-01-01

    We review the experience obtained in using integrative Bayesian models in interdisciplinary analysis focusing on sustainable use of marine resources and environmental management tasks. We have applied Bayesian models to both fisheries and environmental risk analysis problems. Bayesian belief...... be time consuming and research projects can be difficult to manage due to unpredictable technical problems related to parameter estimation. Biology, sociology and environmental economics have their own scientific traditions. Bayesian models are becoming traditional tools in fisheries biology, where...

  20. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    National Research Council Canada - National Science Library

    Petre, P; Visher, J; Shringarpure, R; Valley, F; Swaminathan, M

    2005-01-01

    Automated design tools and integrated design flow methodologies were developed that demonstrated more than an order-of-magnitude reduction in cycle time and cost for mixed signal (digital/analog/RF

  1. A Quasiphysics Intelligent Model for a Long Range Fast Tool Servo

    Science.gov (United States)

    Liu, Qiang; Zhou, Xiaoqin; Lin, Jieqiong; Xu, Pengzi; Zhu, Zhiwei

    2013-01-01

    Accurately modeling the dynamic behaviors of a fast tool servo (FTS) is one of the key issues in the ultraprecision positioning of the cutting tool. Herein, a quasiphysics intelligent model (QPIM) integrating a linear physics model (LPM) and a radial basis function (RBF) based neural model (NM) is developed to accurately describe the dynamic behaviors of a voice coil motor (VCM) actuated long range fast tool servo (LFTS). To identify the parameters of the LPM, a novel Opposition-based Self-adaptive Replacement Differential Evolution (OSaRDE) algorithm is proposed, which has been shown to converge faster without compromising solution quality and to outperform similar evolutionary algorithms considered for comparison. The modeling errors of the LPM and the QPIM are investigated by experiments. The modeling error of the LPM presents an obvious trend component, which is about ±1.15% of the full span range, verifying the efficiency of the proposed OSaRDE algorithm for system identification. As for the QPIM, the trend component in the residual error of the LPM can be well suppressed, and the error of the QPIM remains at the noise level. All the results verify the efficiency and superiority of the proposed modeling and identification approaches. PMID:24163627
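
    The quasiphysics structure described above amounts to a linear physics model whose residual is absorbed by a radial basis function network. The sketch below illustrates that two-step idea on synthetic data; the actuator response, kernel width and network size are arbitrary placeholders, and this is not the published OSaRDE identification procedure.

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic 'measured' response of an actuator: linear gain plus a nonlinear residual.
      u = np.linspace(0.0, 1.0, 200)                            # normalised drive signal
      y = 8.0 * u + 0.4 * np.sin(6.0 * u) + 0.02 * rng.standard_normal(u.size)

      # Step 1: linear physics model (least-squares gain).
      k = np.dot(u, y) / np.dot(u, u)
      residual = y - k * u

      # Step 2: RBF network fitted to the residual of the linear model.
      centers = np.linspace(0.0, 1.0, 15)
      width = 0.08
      Phi = np.exp(-((u[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))
      weights, *_ = np.linalg.lstsq(Phi, residual, rcond=None)

      def quasiphysics_model(u_new):
          u_new = np.atleast_1d(u_new)
          phi = np.exp(-((u_new[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))
          return k * u_new + phi @ weights

      print("max residual error:", np.max(np.abs(quasiphysics_model(u) - y)))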

  2. An integrated knowledge-based and optimization tool for the sustainable selection of wastewater treatment process concepts

    DEFF Research Database (Denmark)

    Castillo, A.; Cheali, Peam; Gómez, V.

    2016-01-01

    The increasing demand on wastewater treatment plants (WWTPs) has involved an interest in improving the alternative treatment selection process. In this study, an integrated framework including an intelligent knowledge-based system and superstructure-based optimization has been developed and applied...... to a real case study. Hence, a multi-criteria analysis together with mathematical models is applied to generate a ranked short-list of feasible treatments for three different scenarios. Finally, the uncertainty analysis performed allows for increasing the quality and robustness of the decisions considering...... benefit and synergy is achieved when both tools are integrated because expert knowledge and expertise are considered together with mathematical models to select the most appropriate treatment alternative...

  3. Graph and model transformation tools for model migration : empirical results from the transformation tool contest

    NARCIS (Netherlands)

    Rose, L.M.; Herrmannsdoerfer, M.; Mazanek, S.; Van Gorp, P.M.E.; Buchwald, S.; Horn, T.; Kalnina, E.; Koch, A.; Lano, K.; Schätz, B.; Wimmer, M.

    2014-01-01

    We describe the results of the Transformation Tool Contest 2010 workshop, in which nine graph and model transformation tools were compared for specifying model migration. The model migration problem—migration of UML activity diagrams from version 1.4 to version 2.2—is non-trivial and practically

  4. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    Full Text Available The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  5. Repository Integration Program: RIP performance assessment and strategy evaluation model theory manual and user's guide

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-11-01

    This report describes the theory and capabilities of RIP (Repository Integration Program). RIP is a powerful and flexible computational tool for carrying out probabilistic integrated total system performance assessments for geologic repositories. The primary purpose of RIP is to provide a management tool for guiding system design and site characterization. In addition, the performance assessment model (and the process of eliciting model input) can act as a mechanism for integrating the large amount of available information into a meaningful whole (in a sense, allowing one to keep the "big picture" and the ultimate aims of the project clearly in focus). Such an integration is useful both for project managers and project scientists. RIP is based on a "top down" approach to performance assessment that concentrates on the integration of the entire system, and utilizes relatively high-level descriptive models and parameters. The key point in the application of such a "top down" approach is that the simplified models and associated high-level parameters must incorporate an accurate representation of their uncertainty. RIP is designed in a very flexible manner such that details can be readily added to various components of the model without modifying the computer code. Uncertainty is also handled in a very flexible manner, and both parameter and model (process) uncertainty can be explicitly considered. Uncertainty is propagated through the integrated PA model using an enhanced Monte Carlo method. RIP must rely heavily on subjective assessment (expert opinion) for much of its input. The process of eliciting the high-level input parameters required for RIP is critical to its successful application. As a result, in order for any project to successfully apply a tool such as RIP, an enormous amount of communication and cooperation must exist between the data collectors, the process modelers, and the performance assessment modelers.

  6. Testing Predictive Models of Technology Integration in Mexico and the United States

    Science.gov (United States)

    Velazquez, Cesareo Morales

    2008-01-01

    Data from Mexico City, Mexico (N = 978) and from Texas, USA (N = 932) were used to test the predictive validity of the teacher professional development component of the Will, Skill, Tool Model of Technology Integration in a cross-cultural context. Structural equation modeling (SEM) was used to test the model. Analyses of these data yielded…

  7. Watershed modeling tools and data for prognostic and diagnostic

    Science.gov (United States)

    Chambel-Leitao, P.; Brito, D.; Neves, R.

    2009-04-01

    's widely used in the world. Watershed models can be characterized by the high number of processes they simulate. The estimation of these processes is also data intensive, requiring data on topography, land use / land cover, agriculture practices, soil type, precipitation, temperature, relative humidity, wind and radiation. Every year new data are made available, namely by satellite, which has allowed improving the quality of model input and also the calibration of the models (Galvão et. al, 2004b). Tools to cope with the vast amount of data have been developed: data formatting, data retrieving, data bases, metadata bases. The high number of processes simulated in watershed models makes them very wide in terms of output. The SWAT model outputs were modified to produce MOHID compliant result files (time series and HDF). These changes maintained the integrity of the original model, thus guaranteeing that results remain equal to the original version of SWAT. This made it possible to output results in MOHID format and to immediately process them with MOHID visualization and data analysis tools (Chambel-Leitão et. al 2007; Trancoso et. al, 2009). In addition, SWAT was modified to produce result files in HDF5 format, which allows the visualization of watershed properties (modeled by SWAT) in animated maps using MOHID GIS. The modified version of SWAT described here has been applied to various national and European projects. Results of the application of this modified version of SWAT to estimate hydrology and nutrient loads to estuaries and water bodies will be shown (Chambel-Leitão, 2008; Yarrow & Chambel-Leitão 2008; Chambel-Leitão et. al 2008; Yarrow & P. Chambel-Leitão, 2007; Coelho et. al., 2008). Keywords: Watershed models, SWAT, MOHID LAND, Hydrology, Nutrient Loads Arnold, J. G. and Fohrer, N. (2005). SWAT2000: current capabilities and research opportunities in applied watershed modeling. Hydrol. Process. 19, 563

  8. Molecular dynamics simulation of subnanometric tool-workpiece contact on a force sensor-integrated fast tool servo for ultra-precision microcutting

    International Nuclear Information System (INIS)

    Cai, Yindi; Chen, Yuan-Liu; Shimizu, Yuki; Ito, So; Gao, Wei; Zhang, Liangchi

    2016-01-01

    Highlights: • Subnanometric contact between a diamond tool and a copper workpiece surface is investigated by MD simulation. • A multi-relaxation time technique is proposed to eliminate the influence of the atom vibrations. • The accuracy of the elastic-plastic transition contact depth estimation is improved by observing the residual defects. • The simulation results are beneficial for optimization of the next-generation microcutting instruments. - Abstract: This paper investigates the contact characteristics between a copper workpiece and a diamond tool in a force sensor-integrated fast tool servo (FS-FTS) for single point diamond microcutting and in-process measurement of ultra-precision surface forms of the workpiece. Molecular dynamics (MD) simulations are carried out to identify the subnanometric elastic-plastic transition contact depth, at which the plastic deformation in the workpiece is initiated. This critical depth can be used to optimize the FS-FTS as well as the cutting/measurement process. It is clarified that the vibrations of the copper atoms in the MD model have a great influence on the subnanometric MD simulation results. A multi-relaxation time method is then proposed to reduce the influence of the atom vibrations based on the fact that the dominant vibration component has a certain period determined by the size of the MD model. It is also identified that for a subnanometric contact depth, the position of the tool tip for the contact force to be zero during the retracting operation of the tool does not correspond to the final depth of the permanent contact impression on the workpiece surface. The accuracy for identification of the transition contact depth is then improved by observing the residual defects on the workpiece surface after the tool retracting.

  9. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    Science.gov (United States)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on the measured cutting force signals. A statistical-based method called the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz) was used for developing a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out using a CNC turning machine Colchester Master Tornado T4 in dry cutting condition. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, and each has its own I-kaz 3D coefficient. This coefficient was examined and its relationship with the flank wear land (VB) was determined. A regression model was developed from this relationship, and the results of the regression model show that the I-kaz 3D coefficient value decreases as tool wear increases. The result is then used for real-time tool wear monitoring.
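
    The monitoring idea can be sketched independently of the proprietary I-kaz formulation: compute a kurtosis-flavoured statistic for each cutting-force segment and regress flank wear VB against it. The signal model, the coefficient definition and the wear values below are synthetic placeholders (the direction of the synthetic trend is arbitrary), not the published I-kaz 3D coefficient.

      import numpy as np

      rng = np.random.default_rng(1)

      def kurtosis_coefficient(signal):
          """Simple kurtosis-based statistic of one force-signal segment (illustrative, not I-kaz)."""
          x = signal - signal.mean()
          return np.mean(x ** 4) / np.mean(x ** 2) ** 2

      # Synthetic force segments: here wear adds impulsive content to the signal.
      vb = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])       # flank wear, mm
      coeffs = []
      for wear in vb:
          segment = rng.standard_normal(2048) + wear * 40.0 * (rng.random(2048) < 0.01)
          coeffs.append(kurtosis_coefficient(segment))

      # Linear regression mapping the coefficient to VB for on-line wear estimation.
      slope, intercept = np.polyfit(coeffs, vb, 1)
      print(f"VB ~ {slope:.3f} * coeff + {intercept:.3f}")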

  10. A review of computer tools for analysing the integration of renewable energy into various energy systems

    DEFF Research Database (Denmark)

    Connolly, D.; Lund, Henrik; Mathiesen, Brian Vad

    2010-01-01

    This paper includes a review of the different computer tools that can be used to analyse the integration of renewable energy. Initially 68 tools were considered, but 37 were included in the final analysis which was carried out in collaboration with the tool developers or recommended points of contact. The results in this paper provide the information necessary to identify a suitable energy tool for analysing the integration of renewable energy into various energy-systems under different objectives. It is evident from this paper that there is no energy tool that addresses all issues related to integrating renewable energy, but instead the ‘ideal’ energy tool is highly dependent on the specific objectives that must be fulfilled. The typical applications for the 37 tools reviewed (from analysing single-building systems to national energy-systems), combined with numerous other factors...

  11. Integration of Web 2.0 Tools in Learning a Programming Course

    Science.gov (United States)

    Majid, Nazatul Aini Abd

    2014-01-01

    Web 2.0 tools are expected to assist students to acquire knowledge effectively in their university environment. However, the lack of effort from lecturers in planning the learning process can make it difficult for the students to optimize their learning experiences. The aim of this paper is to integrate Web 2.0 tools with learning strategy in…

  12. A model for flexible tools used in minimally invasive medical virtual environments.

    Science.gov (United States)

    Soler, Francisco; Luzon, M Victoria; Pop, Serban R; Hughes, Chris J; John, Nigel W; Torres, Juan Carlos

    2011-01-01

    Within the limits of current technology, many applications of a virtual environment will trade-off accuracy for speed. This is not an acceptable compromise in a medical training application where both are essential. Efficient algorithms must therefore be developed. The purpose of this project is the development and validation of a novel physics-based real-time tool manipulation model, which is easy to integrate into any medical virtual environment that requires support for the insertion of long flexible tools into complex geometries. This encompasses medical specialities such as vascular interventional radiology, endoscopy, and laparoscopy, where training, prototyping of new instruments/tools and mission rehearsal can all be facilitated by using an immersive medical virtual environment. Our model recognises and accurately uses patient-specific data and adapts to the geometrical complexity of the vessel in real time.

  13. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    Science.gov (United States)

    Franco, P.; Estrems, M.; Faura, F.

    2007-04-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on cutting tool surface in turning processes due to wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along cutting tool surface can be analysed, and the worn surface shape during the workpiece machining can be determined. The proposed model analyses the gradual degradation of cutting tool during turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained for description of material loss on cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from bibliography and experimental observation for AISI 4340 steel turning with WC-Co cutting tools.

  14. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    International Nuclear Information System (INIS)

    Franco, P.; Estrems, M.; Faura, F.

    2007-01-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on cutting tool surface in turning processes due to wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along cutting tool surface can be analysed, and the worn surface shape during the workpiece machining can be determined. The proposed model analyses the gradual degradation of cutting tool during turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained for description of material loss on cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from bibliography and experimental observation for AISI 4340 steel turning with WC-Co cutting tools
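
    A common way to express such a model in code is to integrate a wear-rate law over cutting time and report the wear-land width VB. The rate law and constants below are generic placeholders rather than the adhesion, abrasion and fracture terms of the model above.

      import numpy as np

      def flank_wear(cutting_time_min, cutting_speed=200.0, dt=0.1):
          """Integrate a simple two-stage wear-rate law over cutting time (illustrative constants)."""
          vb = 0.02                                       # mm, initial break-in wear of the edge
          for _ in np.arange(0.0, cutting_time_min, dt):
              rate = 1e-4 * cutting_speed / 200.0         # steady abrasive/adhesive wear, mm/min
              rate += 5e-3 * max(vb - 0.25, 0.0)          # accelerated wear once VB grows large
              vb += rate * dt
          return vb

      for t in (5, 10, 20, 40):
          print(f"t = {t:3d} min  VB = {flank_wear(t):.3f} mm")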

  15. The Parallel System for Integrating Impact Models and Sectors (pSIMS)

    Science.gov (United States)

    Elliott, Joshua; Kelly, David; Chryssanthacopoulos, James; Glotter, Michael; Jhunjhnuwala, Kanika; Best, Neil; Wilde, Michael; Foster, Ian

    2014-01-01

    We present a framework for massively parallel climate impact simulations: the parallel System for Integrating Impact Models and Sectors (pSIMS). This framework comprises a) tools for ingesting and converting large amounts of data to a versatile datatype based on a common geospatial grid; b) tools for translating this datatype into custom formats for site-based models; c) a scalable parallel framework for performing large ensemble simulations, using any one of a number of different impacts models, on clusters, supercomputers, distributed grids, or clouds; d) tools and data standards for reformatting outputs to common datatypes for analysis and visualization; and e) methodologies for aggregating these datatypes to arbitrary spatial scales such as administrative and environmental demarcations. By automating many time-consuming and error-prone aspects of large-scale climate impacts studies, pSIMS accelerates computational research, encourages model intercomparison, and enhances reproducibility of simulation results. We present the pSIMS design and use example assessments to demonstrate its multi-model, multi-scale, and multi-sector versatility.
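
    For illustration, the aggregation step described in (e) can be sketched in a few lines. This is not pSIMS code; the yield field, cell areas and region masks below are hypothetical NumPy arrays standing in for gridded model output and administrative demarcations.

    # Minimal sketch (not pSIMS itself): area-weighted aggregation of a gridded
    # simulation output onto arbitrary regions, the kind of post-processing step
    # the framework automates. All inputs are hypothetical.
    import numpy as np

    def aggregate_to_regions(values, cell_area, region_masks):
        """Area-weighted mean of a 2-D field for each named region mask."""
        out = {}
        for name, mask in region_masks.items():
            w = cell_area * mask          # weight = cell area inside the region
            out[name] = float((values * w).sum() / w.sum())
        return out

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        yields = rng.uniform(1.0, 5.0, size=(10, 10))   # e.g. t/ha per grid cell
        areas = np.full((10, 10), 100.0)                 # km^2 per grid cell
        masks = {"district_A": np.zeros((10, 10)), "district_B": np.zeros((10, 10))}
        masks["district_A"][:5, :] = 1.0
        masks["district_B"][5:, :] = 1.0
        print(aggregate_to_regions(yields, areas, masks))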

  16. Integration between a sales support system and a simulation tool

    OpenAIRE

    Wahlström, Ola

    2005-01-01

    InstantPlanner is a sales support system for the material handling industry, visualizing and calculating designs faster and more correctly than other tools on the market. AutoMod is a world-leading simulation tool used in the material handling industry to optimize and calculate appropriate configuration designs. Both applications are favorable in their own areas and together provide a great platform for integration, with the properties of fast designing, correct product calculations, great simulation capabi...

  17. Integration of eHealth Tools in the Process of Workplace Health Promotion: Proposal for Design and Implementation

    Science.gov (United States)

    2018-01-01

    Background Electronic health (eHealth) and mobile health (mHealth) tools can support and improve the whole process of workplace health promotion (WHP) projects. However, several challenges and opportunities have to be considered while integrating these tools in WHP projects. Currently, a large number of eHealth tools are developed for changing health behavior, but these tools can support the whole WHP process, including group administration, information flow, assessment, intervention development process, or evaluation. Objective To support a successful implementation of eHealth tools across the whole WHP process, we introduce a concept of WHP (life cycle model of WHP) with 7 steps and present critical and success factors for the implementation of eHealth tools in each step. Methods We developed a life cycle model of WHP based on the World Health Organization (WHO) model of healthy workplace continual improvement process. We suggest adaptations to the WHO model to demonstrate the large number of possibilities to implement eHealth tools in WHP as well as possible critical points in the implementation process. Results eHealth tools can enhance the efficiency of WHP in each of the 7 steps of the presented life cycle model of WHP. Specifically, eHealth tools can provide support by offering easier administration, providing an information and communication platform, supporting assessments, presenting and discussing assessment results in a dashboard, and offering interventions to change individual health behavior. Important success factors include the possibility to give automatic feedback about health parameters, create incentive systems, or bring together a large number of health experts in one place. Critical factors such as data security, anonymity, or lack of management involvement have to be addressed carefully to prevent nonparticipation and dropouts. Conclusions Using eHealth tools can support WHP, but clear regulations for the usage and implementation of these tools at the

  18. Implementing Case Tools in the Intelligent Telecommunication Systems

    Directory of Open Access Journals (Sweden)

    Bahador Ghahramani

    2003-02-01

    Full Text Available This paper discusses an intelligent and Internet-based Telecommunication System Specification Model (TSSM) using Computer Aided Systems Engineering tools (CASE tools). TSSM implements CASE tools to mechanize its lifecycle development, maintenance, and integration process. This model is developed to improve system analysts' (SA) efforts in their design and development of major software and hardware initiatives. The model also improves SA effectiveness by guiding them through the system's Lifecycle Development Process (LDP). The CASE tools are used to support, integrate, and monitor all LDP functions of the system.

  19. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    Energy Technology Data Exchange (ETDEWEB)

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.; Dewers, Thomas A.; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Wang, Yifeng; Schultz, Peter Andrew

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  20. Upgrade and integration of the configuration and monitoring tools for the ATLAS Online farm

    CERN Document Server

    Ballestrero, S; The ATLAS collaboration; Darlea, G L; Dumitru, I; Scannicchio, DA; Twomey, M S; Valsan, M L; Zaytsev, A

    2012-01-01

    The ATLAS Online farm is a non-homogeneous cluster of nearly 3000 PCs which run the data acquisition, trigger and control of the ATLAS detector. The systems are configured and monitored by a combination of open-source tools, such as Quattor and Nagios, and tools developed in-house, such as ConfDB. We report on the ongoing introduction of new provisioning and configuration tools, Puppet and ConfDB v2, which are more flexible and allow automation for previously uncovered needs, and on the upgrade and integration of the monitoring and alerting tools, including the interfacing of these with the TDAQ Shifter Assistant software and their integration with configuration tools. We discuss the selection of the tools and the assessment of their functionality and performance, and how they enabled the introduction of virtualization for selected services.

  1. Upgrade and integration of the configuration and monitoring tools for the ATLAS Online farm

    International Nuclear Information System (INIS)

    Ballestrero, S; Darlea, G–L; Twomey, M S; Brasolin, F; Dumitru, I; Valsan, M L; Scannicchio, D A; Zaytsev, A

    2012-01-01

    The ATLAS Online farm is a non-homogeneous cluster of nearly 3000 systems which run the data acquisition, trigger and control of the ATLAS detector. The systems are configured and monitored by a combination of open-source tools, such as Quattor and Nagios, and tools developed in-house, such as ConfDB. We report on the ongoing introduction of new provisioning and configuration tools, Puppet and ConfDB v2, which are more flexible and allow automation for previously uncovered needs, and on the upgrade and integration of the monitoring and alerting tools, including the interfacing of these with the TDAQ Shifter Assistant software and their integration with configuration tools. We discuss the selection of the tools and the assessment of their functionality and performance, and how they enabled the introduction of virtualization for selected services.

  2. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  3. Teaching Students How to Integrate and Assess Social Networking Tools in Marketing Communications

    Science.gov (United States)

    Schlee, Regina Pefanis; Harich, Katrin R.

    2013-01-01

    This research is based on two studies that focus on teaching students how to integrate and assess social networking tools in marketing communications. Study 1 examines how students in marketing classes utilize social networking tools and explores their attitudes regarding the use of such tools for marketing communications. Study 2 focuses on an…

  4. Continued development of modeling tools and theory for RF heating

    International Nuclear Information System (INIS)

    1998-01-01

    Mission Research Corporation (MRC) is pleased to present the Department of Energy (DOE) with its renewal proposal to the Continued Development of Modeling Tools and Theory for RF Heating program. The objective of the program is to continue and extend the earlier work done by the proposed principal investigator in the field of modeling radio frequency (RF) heating experiments in the large tokamak fusion experiments, particularly the Tokamak Fusion Test Reactor (TFTR) device located at Princeton Plasma Physics Laboratory (PPPL). An integral part of this work is the investigation and, in some cases, resolution of theoretical issues which pertain to accurate modeling. MRC is nearing the successful completion of the specified tasks of the Continued Development of Modeling Tools and Theory for RF Heating project. The following tasks are either completed or nearing completion. (1) Anisotropic temperature and rotation upgrades; (2) Modeling for relativistic ECRH; (3) Further documentation of SHOOT and SPRUCE. As a result of the progress achieved under this project, MRC has been urged to continue this effort. Specifically, during the performance of this project two topics were identified by PPPL personnel as new applications of the existing RF modeling tools. These two topics concern (a) future fast-wave current drive experiments on the large tokamaks including TFTR and (b) the interpretation of existing and future RF probe data from TFTR. To address each of these topics requires some modification or enhancement of the existing modeling tools, and the first topic requires resolution of certain theoretical issues to produce self-consistent results. This work falls within the scope of the original project and is more suited to the project's renewal than to the initiation of a new project.

  5. Integrated Baseline System (IBS) Version 2.0: Models guide

    Energy Technology Data Exchange (ETDEWEB)

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Models Guide summarizes the IBS use of several computer models for predicting the results of emergency situations. These include models for predicting dispersion/doses of airborne contaminants, traffic evacuation, explosion effects, heat radiation from a fire, and siren sound transmission. The guide references additional technical documentation on the models when such documentation is available from other sources. The audience for this manual is chiefly emergency management planners and analysts, but also data managers and system managers.

  6. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    Science.gov (United States)

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.

  7. Geoinformation Systems as a Tool of the Integrated Tourist Spaces Management

    Directory of Open Access Journals (Sweden)

    Kolesnikovich Victor

    2014-09-01

    Full Text Available Introduction. Tourist activity management currently needs special conditions to be created for the development of integrated management tools based on a common information and analytical base. Material and methods. The architecture and content of the geoinformation and hybrid information systems are oriented towards the use of Integrated Tourist Spaces Management (ITSM), which places specific demands on the features of the management model. The authors developed the concept of tourist space, and the information and analytical system is used to create an information model of that space. The information support of the ITSM system is a kind of hybrid system: an expert system built on the basis of a GIS. Results and conclusions. The GIS provides the collection, storage, analysis and graphic visualization of spatial data and of the related information on the objects represented in the expert system. The proposed approach leads to the formation of an information system and analytical support not only for human decision-making, but also for the creation of new tourist products based on increasingly differentiated client requirements or on the price-to-quality ratio (from the point of view of satisfying those requirements).

  8. Six sigma tools in integrating internal operations of a retail pharmacy: a case study.

    Science.gov (United States)

    Kumar, Sameer; Kwong, Anthony M

    2011-01-01

    This study was initiated to integrate information and enterprise-wide healthcare delivery system issues specifically within an inpatient retail pharmacy operation in a U.S. community hospital. Six Sigma tools were used to examine the effects on an inpatient retail pharmacy service process. The tools used include service blueprints, a cause-and-effect diagram, gap analysis derived from customer and employee surveys, and mistake proofing; these were applied in various business situations and the results were analyzed to identify and propose process improvements and integration. The research indicates that the Six Sigma tools in this discussion are very applicable and quite effective in helping to streamline and integrate the pharmacy process flow. Additionally, gap analysis derived from two different surveys was used to estimate the primary areas of focus to increase customer and employee satisfaction. The results of this analysis were useful in initiating discussions of how to effectively narrow these service gaps. This retail pharmaceutical service study serves as a framework for the process that should occur for successful process improvement tool evaluation and implementation. Pharmaceutical service operations in the U.S. that use this integration framework must tailor it to their individual situations to maximize their chances for success.

  9. Integrated environmental decision support tool based on GIS technology

    International Nuclear Information System (INIS)

    Doctor, P.G.; O'Neil, T.K.; Sackschewsky, M.R.; Becker, J.M.; Rykiel, E.J.; Walters, T.B.; Brandt, C.A.; Hall, J.A.

    1995-01-01

    Environmental restoration and management decisions facing the US Department of Energy require balancing trade-offs between diverse land uses and impacts over multiple spatial and temporal scales. Many types of environmental data have been collected for the Hanford Site and the Columbia River in Washington State over the past fifty years. Pacific Northwest National Laboratory (PNNL) is integrating these data into a Geographic Information System (GIS) based computer decision support tool. This tool provides a comprehensive and concise description of the current environmental landscape that can be used to evaluate the ecological and monetary trade-offs between future land use, restoration and remediation options before action is taken. Ecological impacts evaluated include effects to individual species of concern and habitat loss and fragmentation. Monetary impacts include those associated with habitat mitigation. The tool is organized as both a browsing tool for educational purposes, and as a framework that leads a project manager through the steps needed to be in compliance with environmental requirements

  10. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    Science.gov (United States)

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.
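
    For illustration, the over-representation analysis mentioned above typically reduces to a hypergeometric test of the overlap between a network module and an annotated gene set. The sketch below is not MONGKIE code (the tool itself is written in Java); the gene identifiers are hypothetical.

    # Minimal sketch (not MONGKIE code): the hypergeometric over-representation test
    # commonly used to ask whether a network module is enriched for genes from an
    # annotated pathway. Gene identifiers below are hypothetical.
    from scipy.stats import hypergeom

    def overrepresentation_p(module_genes, pathway_genes, background_genes):
        """P-value that the module/pathway overlap is at least as large as observed,
        given the background gene universe."""
        universe = set(background_genes)
        module = set(module_genes) & universe
        pathway = set(pathway_genes) & universe
        overlap = len(module & pathway)
        N = len(universe)          # universe size
        K = len(pathway)           # "successes" in the universe
        n = len(module)            # draws (module size)
        return hypergeom.sf(overlap - 1, N, K, n)

    if __name__ == "__main__":
        background = [f"g{i}" for i in range(2000)]
        pathway = background[:50]
        module = background[:10] + background[100:120]
        print(f"enrichment p-value: {overrepresentation_p(module, pathway, background):.2e}")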

  11. Modelling future impacts of air pollution using the multi-scale UK Integrated Assessment Model (UKIAM).

    Science.gov (United States)

    Oxley, Tim; Dore, Anthony J; ApSimon, Helen; Hall, Jane; Kryza, Maciej

    2013-11-01

    Integrated assessment modelling has evolved to support policy development in relation to air pollutants and greenhouse gases by providing integrated simulation tools able to produce quick and realistic representations of emission scenarios and their environmental impacts without the need to re-run complex atmospheric dispersion models. The UK Integrated Assessment Model (UKIAM) has been developed to investigate strategies for reducing UK emissions by bringing together information on projected UK emissions of SO2, NOx, NH3, PM10 and PM2.5, atmospheric dispersion, criteria for protection of ecosystems, urban air quality and human health, and data on potential abatement measures to reduce emissions, which may subsequently be linked to associated analyses of costs and benefits. We describe the multi-scale model structure ranging from continental to roadside, UK emission sources, atmospheric dispersion of emissions, implementation of abatement measures, integration with European-scale modelling, and environmental impacts. The model generates outputs from a national perspective which are used to evaluate alternative strategies in relation to emissions, deposition patterns, air quality metrics and ecosystem critical load exceedance. We present a selection of scenarios in relation to the 2020 Business-As-Usual projections and identify potential further reductions beyond those currently being planned. © 2013.

  12. epsilon : A tool to find a canonical basis of master integrals

    Science.gov (United States)

    Prausa, Mario

    2017-10-01

    In 2013, Henn proposed a special basis for a certain class of master integrals, which are expressible in terms of iterated integrals. In this basis, the master integrals obey a differential equation, where the right hand side is proportional to ɛ in d = 4 - 2 ɛ space-time dimensions. An algorithmic approach to find such a basis was found by Lee. We present the tool epsilon, an efficient implementation of Lee's algorithm based on the Fermat computer algebra system as computational back end.
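
    For concreteness, the canonical ("epsilon") form referred to above can be written out in generic notation (not copied from the paper), with f the vector of master integrals, x a kinematic variable and A(x) a matrix of rational functions:

      \[
        \frac{\partial}{\partial x}\, \vec{f}(x,\epsilon) \;=\; \epsilon\, A(x)\, \vec{f}(x,\epsilon),
        \qquad d = 4 - 2\epsilon ,
      \]

    so that the solution can be built order by order in \(\epsilon\) as iterated integrals over the entries of \(A(x)\); finding a transformation to such a basis is exactly what the epsilon tool does via Lee's algorithm.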

  13. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  14. Simulation Tools and Techniques for Analyzing the Impacts of Photovoltaic System Integration

    Science.gov (United States)

    Hariri, Ali

    Solar photovoltaic (PV) energy integration in distribution networks is one of the fastest growing sectors of distributed energy integration. The growth in solar PV integration is incentivized by various clean power policies, global interest in solar energy, and reduction in manufacturing and installation costs of solar energy systems. The increase in solar PV integration has raised a number of concerns regarding the potential impacts that might arise as a result of high PV penetration. Some impacts have already been recorded in networks with high PV penetration such as in China, Germany, and USA (Hawaii and California). Therefore, network planning is becoming more intricate as new technologies are integrated into the existing electric grid. The integrated new technologies pose certain compatibility concerns regarding the existing electric grid infrastructure. Therefore, PV integration impact studies are becoming more essential in order to have a better understanding of how to advance the solar PV integration efforts without introducing adverse impacts into the network. PV impact studies are important for understanding the nature of the new introduced phenomena. Understanding the nature of the potential impacts is a key factor for mitigating and accommodating for said impacts. Traditionally, electric power utilities relied on phasor-based power flow simulations for planning their electric networks. However, the conventional, commercially available, phasor-based simulation tools do not provide proper visibility across a wide spectrum of electric phenomena. Moreover, different types of simulation approaches are suitable for specific types of studies. For instance, power flow software cannot be used for studying time varying phenomena. At the same time, it is not practical to use electromagnetic transient (EMT) tools to perform power flow solutions. Therefore, some electric phenomena caused by the variability of PV generation are not visible using the conventional

  15. Non-communicable diseases and HIV care and treatment: models of integrated service delivery.

    Science.gov (United States)

    Duffy, Malia; Ojikutu, Bisola; Andrian, Soa; Sohng, Elaine; Minior, Thomas; Hirschhorn, Lisa R

    2017-08-01

    Non-communicable diseases (NCD) are a growing cause of morbidity in low-income countries including in people living with human immunodeficiency virus (HIV). Integration of NCD and HIV services can build upon experience with chronic care models from HIV programmes. We describe models of NCD and HIV integration, challenges and lessons learned. A literature review of published articles on integrated NCD and HIV programs in low-income countries and key informant interviews were conducted with leaders of identified integrated NCD and HIV programs. Information was synthesised to identify models of NCD and HIV service delivery integration. Three models of integration were identified as follows: NCD services integrated into centres originally providing HIV care; HIV care integrated into primary health care (PHC) already offering NCD services; and simultaneous introduction of integrated HIV and NCD services. Major challenges identified included NCD supply chain, human resources, referral systems, patient education, stigma, patient records and monitoring and evaluation. The range of HIV and NCD services varied widely within and across models. Regardless of model of integration, leveraging experience from HIV care models and adapting existing systems and tools is a feasible method to provide efficient care and treatment for the growing numbers of patients with NCDs. Operational research should be conducted to further study how successful models of HIV and NCD integration can be expanded in scope and scaled-up by managers and policymakers seeking to address all the chronic care needs of their patients. © 2017 John Wiley & Sons Ltd.

  16. Integration of facility modeling capabilities for nuclear nonproliferation analysis

    International Nuclear Information System (INIS)

    Burr, Tom; Gorensek, M.B.; Krebs, John; Kress, Reid L.; Lamberti, Vincent; Schoenwald, David; Ward, Richard C.

    2012-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  17. Integration of Linear Dynamic Emission and Climate Models with Air Traffic Simulations

    Science.gov (United States)

    Sridhar, Banavar; Ng, Hok K.; Chen, Neil Y.

    2012-01-01

    Future air traffic management systems are required to balance the conflicting objectives of maximizing safety and efficiency of traffic flows while minimizing the climate impact of aviation emissions and contrails. Integrating emission and climate models together with air traffic simulations improves the understanding of the complex interaction between the physical climate system, carbon and other greenhouse gas emissions and aviation activity. This paper integrates a national-level air traffic simulation and optimization capability with simple climate models and carbon cycle models, and climate metrics to assess the impact of aviation on climate. The capability can be used to make trade-offs between extra fuel cost and reduction in global surface temperature change. The parameters in the simulation can be used to evaluate the effect of various uncertainties in emission models and contrails and the impact of different decision horizons. Alternatively, the optimization results from the simulation can be used as inputs to other tools that monetize global climate impacts like the FAA's Aviation Environmental Portfolio Management Tool for Impacts.
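
    For illustration, one common ingredient of such "simple climate models" is a linear impulse-response of global surface temperature to an emissions time series. The sketch below is not the cited capability, and the response amplitudes and time constants are hypothetical placeholders.

    # Minimal sketch (not the cited capability): a linear impulse-response "simple
    # climate model" that converts an annual emissions series into a global surface
    # temperature change. Amplitudes and time constants are hypothetical.
    import math

    def temperature_change(annual_emissions, amplitudes=(0.2e-3, 0.6e-3), taus=(8.0, 100.0)):
        """Delta-T [K] per year from emissions [arbitrary units/yr] as a sum of
        exponential impulse-response terms."""
        n = len(annual_emissions)
        dT = [0.0] * n
        for year in range(n):
            for past in range(year + 1):
                age = year - past
                for a, tau in zip(amplitudes, taus):
                    dT[year] += annual_emissions[past] * a * math.exp(-age / tau)
        return dT

    if __name__ == "__main__":
        emissions = [1.0] * 30            # constant emissions for 30 years
        print(f"Delta-T after 30 years: {temperature_change(emissions)[-1]:.4f} K")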

  18. Integrated management tool for controls software problems, requests and project tasking at SLAC

    International Nuclear Information System (INIS)

    Rogind, D.; Allen, W.; Colocho, W.; DeContreras, G.; Gordon, J.; Pandey, P.; Shoaee, H.

    2012-01-01

    The Accelerator Directorate (AD) Instrumentation and Controls (ICD) Software (SW) Department at SLAC, with its service center model, continuously receives engineering requests to design, build and support controls for accelerator systems lab-wide. Each customer request can vary in complexity from a small software engineering change to a major enhancement. SLAC's Accelerator Improvement Projects (AIPs), along with DOE Construction projects, also contribute heavily to the work load. The various customer requests and projects, paired with the ongoing operational maintenance and problem reports, place a demand on the department that consistently exceeds the capacity of available resources. A centralized repository - comprised of all requests, project tasks, and problems - available to physicists, operators, managers, and engineers alike, is essential to capture, communicate, prioritize, assign, schedule, track, and finally, commission all work components. The Software Department has recently integrated request / project tasking into SLAC's custom online problem tracking 'Comprehensive Accelerator Tool for Enhancing Reliability' (CATER) tool. This paper discusses the newly implemented software request management tool - the workload it helps to track, its structure, features, reports, work-flow and its many usages. (authors)

  19. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 1: HARP introduction and user's guide

    Science.gov (United States)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested, within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.

  20. Integrated Surface/subsurface flow modeling in PFLOTRAN

    Energy Technology Data Exchange (ETDEWEB)

    Painter, Scott L [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-10-01

    Understanding soil water, groundwater, and shallow surface water dynamics as an integrated hydrological system is critical for understanding the Earth’s critical zone, the thin outer layer at our planet’s surface where vegetation, soil, rock, and gases interact to regulate the environment. Computational tools that take this view of soil moisture and shallow surface flows as a single integrated system are typically referred to as integrated surface/subsurface hydrology models. We extend the open-source, highly parallel, subsurface flow and reactive transport simulator PFLOTRAN to accommodate surface flows. In contrast to most previous implementations, we do not represent a distinct surface system. Instead, the vertical gradient in hydraulic head at the land surface is neglected, which allows the surface flow system to be eliminated and incorporated directly into the subsurface system. This tight coupling approach leads to a robust capability and also greatly simplifies implementation in existing subsurface simulators such as PFLOTRAN. Successful comparisons to independent numerical solutions build confidence in the approximation and implementation. Example simulations of the Walker Branch and East Fork Poplar Creek watersheds near Oak Ridge, Tennessee demonstrate the robustness of the approach in geometrically complex applications. The lack of a robust integrated surface/subsurface hydrology capability had been a barrier to PFLOTRAN’s use in critical zone studies. This work addresses that capability gap, thus enabling PFLOTRAN as a community platform for building integrated models of the critical zone.

  1. West-Life, Tools for Integrative Structural Biology

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Structural biology is the part of molecular biology focused on determining the structure of macromolecules inside living cells and cell membranes. As macromolecules determine most of the functions of cells, this structural knowledge is very useful for further research in metabolism and physiology, through to applications in pharmacology, etc. As macromolecules are too small to be observed directly with a light microscope, other methods are used to determine their structure, including nuclear magnetic resonance (NMR), X-ray crystallography, cryo-electron microscopy and others. Each method has its advantages and disadvantages in terms of availability, sample preparation and resolution. The West-Life project has the ambition to facilitate an integrative approach using the multiple techniques mentioned above. As there are already a lot of software tools to process the data produced by these techniques, the challenge is to integrate them in a way that they can be used by experts in one technique who are not experts in the others. One product ...

  2. Modeling energy-economy interactions using integrated models

    International Nuclear Information System (INIS)

    Uyterlinde, M.A.

    1994-06-01

    Integrated models are defined as economic energy models that consist of several submodels, either coupled by an interface module, or embedded in one large model. These models can be used for energy policy analysis. Using integrated models yields the following benefits. They provide a framework in which energy-economy interactions can be better analyzed than in stand-alone models. Integrated models can represent both energy sector technological details, as well as the behaviour of the market and the role of prices. Furthermore, the combination of modeling methodologies in one model can compensate weaknesses of one approach with strengths of another. These advantages motivated this survey of the class of integrated models. The purpose of this literature survey therefore was to collect and to present information on integrated models. To carry out this task, several goals were identified. The first goal was to give an overview of what is reported on these models in general. The second one was to find and describe examples of such models. Other goals were to find out what kinds of models were used as component models, and to examine the linkage methodology. Solution methods and their convergence properties were also a subject of interest. The report has the following structure. In chapter 2, a 'conceptual framework' is given. In chapter 3 a number of integrated models is described. In a table, a complete overview is presented of all described models. Finally, in chapter 4, the report is summarized, and conclusions are drawn regarding the advantages and drawbacks of integrated models. 8 figs., 29 refs

  3. Modelling and monitoring of integrated urban wastewater systems: review on status and perspectives

    DEFF Research Database (Denmark)

    Benedetti, L.; Langeveld, J.; Comeau, A.

    2013-01-01

    been investigated and several new or improved systems analysis methods have become available. New/improved software tools coupled with the current high computational capacity have enabled the application of integrated modelling to several practical cases, and advancements in monitoring water quantity...... and quality have been substantial and now allow the collecting of data in sufficient quality and quantity to permit using integrated models for real-time applications too. Further developments are warranted in the field of data quality assurance and efficient maintenance....

  4. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if it is suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and design...

  5. Development of the ECLSS Sizing Analysis Tool and ARS Mass Balance Model Using Microsoft Excel

    Science.gov (United States)

    McGlothlin, E. P.; Yeh, H. Y.; Lin, C. H.

    1999-01-01

    The development of a Microsoft Excel-compatible Environmental Control and Life Support System (ECLSS) sizing analysis "tool" for conceptual design of Mars human exploration missions makes it possible for a user to choose a certain technology in the corresponding subsystem. This tool estimates the mass, volume, and power requirements of every technology in a subsystem and the system as a whole. Furthermore, to verify that a design sized by the ECLSS Sizing Tool meets the mission requirements and integrates properly, mass balance models that solve for component throughputs of such ECLSS systems as the Water Recovery System (WRS) and Air Revitalization System (ARS) must be developed. The ARS Mass Balance Model will be discussed in this paper.
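
    For illustration, the kind of mass balance model described above, solving for component throughputs so that a design "closes", can be posed as a small linear system. The sketch below is not the ARS Mass Balance Model: the components, split fractions and crew loads are hypothetical.

    # Minimal sketch (not the ARS Mass Balance Model): a steady-state mass balance
    # posed as a small linear system and solved for component throughputs.
    # All components, split fractions and crew loads are hypothetical.
    import numpy as np

    # Unknowns (kg/day): CO2 removed, CO2 sent to reduction, O2 generated.
    # Hypothetical balances:
    #   1) CO2 removed           = CO2 produced by crew (4 crew x 1.0 kg/day each)
    #   2) CO2 sent to reduction = 0.9 * CO2 removed (10% assumed vented)
    #   3) O2 generated          = O2 consumed by crew (4 crew x 0.84 kg/day each)
    A = np.array([
        [1.0,  0.0, 0.0],
        [-0.9, 1.0, 0.0],
        [0.0,  0.0, 1.0],
    ])
    b = np.array([4 * 1.0, 0.0, 4 * 0.84])

    co2_removed, co2_reduced, o2_generated = np.linalg.solve(A, b)
    print(f"CO2 removed:  {co2_removed:.2f} kg/day")
    print(f"CO2 reduced:  {co2_reduced:.2f} kg/day")
    print(f"O2 generated: {o2_generated:.2f} kg/day")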

  6. LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory

    Science.gov (United States)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.

  7. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  8. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bonvini, Marco [Whisker Labs, Oakland, CA (United States); Piette, Mary Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Page, Janie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lin, Guanjing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hu, R. Lilly [Univ. of California, Berkeley, CA (United States)

    2017-08-11

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
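
    For illustration, the hybrid idea can be reduced to a residual check: an engineering performance curve predicts expected equipment power, and a fault is flagged when the measurement departs from the prediction. The sketch below is not the tool described above; the curve coefficients, data and threshold are hypothetical.

    # Minimal sketch (not the tool described above): hybrid FDD in which an assumed
    # efficiency curve predicts chiller power from load and condenser water
    # temperature, and a fault is flagged on a large positive residual.

    def expected_power_kw(load_tons, cond_water_temp_c, a=0.55, b=0.012):
        """Assumed efficiency curve: kW/ton rises with condenser water temperature."""
        kw_per_ton = a + b * (cond_water_temp_c - 29.0)
        return load_tons * kw_per_ton

    def detect_fault(samples, threshold_frac=0.15):
        """samples: list of (load_tons, cond_water_temp_c, measured_kw)."""
        flags = []
        for load, temp, measured in samples:
            predicted = expected_power_kw(load, temp)
            residual = (measured - predicted) / predicted
            flags.append(residual > threshold_frac)
        return flags

    if __name__ == "__main__":
        data = [(300, 29.0, 170.0), (300, 29.0, 210.0)]   # second sample is degraded
        print(detect_fault(data))                          # -> [False, True]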

  9. Integrated Reporting as a Tool for Communicating with Stakeholders - Advantages and Disadvantages

    Science.gov (United States)

    Matuszyk, Iwona; Rymkiewicz, Bartosz

    2018-03-01

    Financial and non-financial reporting has, from the beginning of its existence, been the primary source of communication between the company and a wide range of stakeholders. Over the decades it has adapted to the needs of a rapidly changing business and social environment. Currently, the final link in the evolution of organizational reporting, integrated reporting, assumes the integration and mutual connectivity of financial and non-financial data. The main interest in the concept of integrated reporting comes from the value it contributes to the organization. Undoubtedly, the concept of integrated reporting is a milestone in the evolution of organizational reporting. It is, however, important to consider whether it adequately addresses the information needs of a wide range of stakeholders, and whether it is a universal tool for communication between the company and its stakeholders. The aim of the paper is to discuss the advantages and disadvantages of the concept of integrated reporting as a tool for communication with stakeholders and to indicate further directions for its development. The article uses research methods such as literature analysis, content analysis of corporate publications and comparative analysis.

  10. Application of an Integrated Assessment Model to the Kevin Dome site, Montana

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Minh [Univ. of Wyoming, Laramie, WY (United States); Zhang, Ye [Univ. of Wyoming, Laramie, WY (United States); Carey, James William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Stauffer, Philip H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-30

    The objectives of the Integrated Assessment Model are to enable the Fault Swarm algorithm in the National Risk Assessment Partnership, ensure faults are working in the NRAP-IAM tool, calculate hypothetical fault leakage in NRAP-IAM, and compare leakage rates to Eclipse simulations.

  11. Modeling for Integrated Science Management and Resilient Systems Development

    Science.gov (United States)

    Shelhamer, M.; Mindock, J.; Lumpkins, S.

    2014-01-01

    Many physiological, environmental, and operational risks exist for crewmembers during spaceflight. An understanding of these risks from an integrated perspective is required to provide effective and efficient mitigations during future exploration missions that typically have stringent limitations on resources available, such as mass, power, and crew time. The Human Research Program (HRP) is in the early stages of developing collaborative modeling approaches for the purposes of managing its science portfolio in an integrated manner to support cross-disciplinary risk mitigation strategies and to enable resilient human and engineered systems in the spaceflight environment. In this talk, we will share ideas being explored from fields such as network science, complexity theory, and system-of-systems modeling. Initial work on tools to support these explorations will be discussed briefly, along with ideas for future efforts.

  12. DR-Integrator: a new analytic tool for integrating DNA copy number and gene expression data.

    Science.gov (United States)

    Salari, Keyan; Tibshirani, Robert; Pollack, Jonathan R

    2010-02-01

    DNA copy number alterations (CNA) frequently underlie gene expression changes by increasing or decreasing gene dosage. However, only a subset of genes with altered dosage exhibit concordant changes in gene expression. This subset is likely to be enriched for oncogenes and tumor suppressor genes, and can be identified by integrating these two layers of genome-scale data. We introduce DNA/RNA-Integrator (DR-Integrator), a statistical software tool to perform integrative analyses on paired DNA copy number and gene expression data. DR-Integrator identifies genes with significant correlations between DNA copy number and gene expression, and implements a supervised analysis that captures genes with significant alterations in both DNA copy number and gene expression between two sample classes. DR-Integrator is freely available for non-commercial use from the Pollack Lab at http://pollacklab.stanford.edu/ and can be downloaded as a plug-in application to Microsoft Excel and as a package for the R statistical computing environment. The R package is available under the name 'DRI' at http://cran.r-project.org/. An example analysis using DR-Integrator is included as supplemental material. Supplementary data are available at Bioinformatics online.
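
    For illustration, the core of the correlation analysis described above is a per-gene association test between matched copy number and expression profiles. The sketch below is not DR-Integrator itself (which ships as an Excel plug-in and R package); the toy data are hypothetical.

    # Minimal sketch (not DR-Integrator itself): per-gene Pearson correlation between
    # matched DNA copy number and gene expression profiles. Toy data are hypothetical.
    import numpy as np
    from scipy import stats

    def copy_number_expression_correlation(copy_number, expression, gene_ids):
        """Both matrices are genes x samples with matched rows and columns.
        Returns (gene, r, p) tuples sorted by correlation strength."""
        results = []
        for i, gene in enumerate(gene_ids):
            r, p = stats.pearsonr(copy_number[i], expression[i])
            results.append((gene, r, p))
        return sorted(results, key=lambda t: abs(t[1]), reverse=True)

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        cn = rng.normal(0, 0.5, size=(3, 40))                 # log2 copy-number ratios
        expr = 0.8 * cn + rng.normal(0, 0.3, size=(3, 40))    # dosage-driven expression
        expr[2] = rng.normal(0, 0.5, size=40)                 # one gene with no dosage effect
        for gene, r, p in copy_number_expression_correlation(cn, expr, ["GENE_A", "GENE_B", "GENE_C"]):
            print(f"{gene}: r={r:+.2f}, p={p:.1e}")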

  13. Using registries to integrate bioinformatics tools and services into workbench environments

    DEFF Research Database (Denmark)

    Ménager, Hervé; Kalaš, Matúš; Rapacki, Kristoffer

    2016-01-01

    The diversity and complexity of bioinformatics resources presents significant challenges to their localisation, deployment and use, creating a need for reliable systems that address these issues. Meanwhile, users demand increasingly usable and integrated ways to access and analyse data, especially......, a software component that will ease the integration of bioinformatics resources in a workbench environment, using their description provided by the existing ELIXIR Tools and Data Services Registry....

  14. SEPHYDRO: An Integrated Multi-Filter Web-Based Tool for Baseflow Separation

    Science.gov (United States)

    Serban, D.; MacQuarrie, K. T. B.; Popa, A.

    2017-12-01

    Knowledge of baseflow contributions to streamflow is important for understanding watershed scale hydrology, including groundwater-surface water interactions, impact of geology and landforms on baseflow, estimation of groundwater recharge rates, etc. Baseflow (or hydrograph) separation methods can be used as supporting tools in many areas of environmental research, such as the assessment of the impact of agricultural practices, urbanization and climate change on surface water and groundwater. Over the past few decades various digital filtering and graphically-based methods have been developed in an attempt to improve the assessment of the dynamics of the various sources of streamflow (e.g. groundwater, surface runoff, subsurface flow); however, these methods are not available under an integrated platform and, individually, often require significant effort for implementation. Here we introduce SEPHYDRO, an open access, customizable web-based tool, which integrates 11 algorithms allowing for separation of streamflow hydrographs. The streamlined interface incorporates a reference guide as well as additional information that allows users to import their own data, customize the algorithms, and compare, visualise and export results. The tool includes one-, two- and three-parameter digital filters as well as graphical separation methods and has been successfully applied in Atlantic Canada, in studies dealing with nutrient loading to fresh water and coastal water ecosystems. Future developments include integration of additional separation algorithms as well as incorporation of geochemical separation methods. SEPHYDRO has been developed through a collaborative research effort between the Canadian Rivers Institute, University of New Brunswick (Fredericton, New Brunswick, Canada), Agriculture and Agri-Food Canada and Environment and Climate Change Canada and is currently available at http://canadianriversinstitute.com/tool/
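
    For illustration, one of the classic one-parameter recursive digital filters referred to above is the Lyne-Hollick filter. The sketch below is not SEPHYDRO's implementation; the hydrograph values are hypothetical and the initial quickflow is assumed to be zero.

    # Minimal sketch (not SEPHYDRO's implementation): the one-parameter Lyne-Hollick
    # recursive digital filter, separating a streamflow series into quickflow and
    # baseflow. Initial quickflow is assumed zero; data are hypothetical.
    def lyne_hollick_baseflow(streamflow, alpha=0.925):
        """streamflow: list of discharge values; returns the baseflow series."""
        quickflow = [0.0]
        baseflow = [streamflow[0]]
        for t in range(1, len(streamflow)):
            qf = alpha * quickflow[-1] + 0.5 * (1.0 + alpha) * (streamflow[t] - streamflow[t - 1])
            qf = min(max(qf, 0.0), streamflow[t])   # quickflow constrained to [0, total flow]
            quickflow.append(qf)
            baseflow.append(streamflow[t] - qf)
        return baseflow

    if __name__ == "__main__":
        q = [5, 5, 20, 45, 30, 18, 12, 9, 7, 6, 5, 5]   # hypothetical storm hydrograph
        print([round(b, 1) for b in lyne_hollick_baseflow(q)])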

  15. An ensemble model of QSAR tools for regulatory risk assessment.

    Science.gov (United States)

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

    Quantitative structure activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model allows for varying a cut-off parameter that allows for a selection in the desirable trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) to predict carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). Leave-one-out cross validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8 % and 80.4 %, and balanced accuracy: 80.6 % and 80.8 %) and highest inter-rater agreement [kappa ( κ ): 0
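
    For illustration, the general idea of a consensus built from several tools' binary calls, with a tunable cut-off that trades sensitivity against specificity, can be sketched as a naive-Bayes-style combination. This is not the published model; the tool names, per-tool sensitivities/specificities, prior and cut-off below are all hypothetical.

    # Minimal sketch (not the published model): a naive-Bayes-style consensus over
    # binary QSAR tool predictions, with an adjustable decision cut-off.
    TOOL_PERFORMANCE = {          # hypothetical (sensitivity, specificity) per tool
        "tool_A": (0.80, 0.70),
        "tool_B": (0.70, 0.85),
        "tool_C": (0.65, 0.75),
    }

    def consensus_probability(predictions, prior=0.5):
        """predictions: dict tool -> 1 (positive call) or 0 (negative call).
        Returns the posterior probability that the chemical is a true positive."""
        p_pos, p_neg = prior, 1.0 - prior
        for tool, call in predictions.items():
            sens, spec = TOOL_PERFORMANCE[tool]
            if call == 1:
                p_pos *= sens            # P(positive call | true positive)
                p_neg *= 1.0 - spec      # P(positive call | true negative)
            else:
                p_pos *= 1.0 - sens
                p_neg *= spec
        return p_pos / (p_pos + p_neg)

    def classify(predictions, cutoff=0.3):
        """A lower cut-off favours sensitivity (fewer false negatives)."""
        return consensus_probability(predictions) >= cutoff

    if __name__ == "__main__":
        calls = {"tool_A": 1, "tool_B": 0, "tool_C": 1}
        print(f"posterior = {consensus_probability(calls):.2f}, flagged = {classify(calls)}")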

  16. Toward a synthetic economic systems modeling tool for sustainable exploitation of ecosystems.

    Science.gov (United States)

    Richardson, Colin; Courvisanos, Jerry; Crawford, John W

    2011-02-01

    Environmental resources that underpin the basic human needs of water, energy, and food are predicted to become in such short supply by 2050 that global security and the well-being of millions will be under threat. These natural commodities have been allowed to reach crisis levels of supply because of a failure of economic systems to sustain them. This is largely because there have been no means of integrating their exploitation into any economic model that effectively addresses ecological systemic failures in a way that provides an integrated ecological-economic tool that can monitor and evaluate market and policy targets. We review the reasons for this and recent attempts to address the problem while identifying outstanding issues. The key elements of a policy-oriented economic model that integrates ecosystem processes are described and form the basis of a proposed new synthesis approach. The approach is illustrated by an indicative case study that develops a simple model for rainfed and irrigated food production in the Murray-Darling basin of southeastern Australia. © 2011 New York Academy of Sciences.

  17. Digital Aquifer - Integrating modeling, technical, software and policy aspects to develop a groundwater management tool

    Science.gov (United States)

    Tirupathi, S.; McKenna, S. A.; Fleming, K.; Wambua, M.; Waweru, P.; Ondula, E.

    2016-12-01

    Groundwater management has traditionally been viewed as a study of long-term policy measures to ensure that the water resource is sustainable. IBM Research, in association with the World Bank, extended this traditional analysis to include real-time groundwater management by building a context-aware water-rights management and permitting system. As part of this effort, one of the primary objectives was to develop a groundwater flow model that can help the policy makers with a visual overview of the current groundwater distribution. In addition, the system helps the policy makers simulate a range of scenarios and check the sustainability of the groundwater resource in a given region. The system also enables a license provider to check the effect of the introduction of a new well on the existing wells in the domain as well as the groundwater resource in general. This process simplifies how an engineer will determine if a new well should be approved. Distance to the nearest well neighbors and the maximum decreases in water levels of nearby wells are continually assessed and presented as evidence for an engineer to make the final judgment on approving the permit. The system also facilitates updated insights on the amount of groundwater left in an area and provides advice on how water fees should be structured to balance conservation and economic development goals. In this talk, we will discuss the concept of Digital Aquifer, the challenges in integrating modeling, technical and software aspects to develop a management system that helps policy makers and license providers with a robust decision making tool. We will concentrate on the groundwater model developed using the analytic element method that plays a very important role in the decision making aspects. Finally, the efficiency of this system and methodology is shown through a case study in Laguna Province, Philippines, which was done in collaboration with the National Water Resource Board, Philippines and World
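
    The analytic element model itself is not given in the record; the following is a minimal sketch of the kind of screening described (additional drawdown at existing wells caused by a proposed new well), here using the Theis solution with purely illustrative parameters rather than the system's actual method.

        import math
        from scipy.special import exp1   # exponential integral E1, the Theis well function W(u)

        def theis_drawdown(Q, T, S, r, t):
            """Drawdown (m) at radius r (m) after pumping Q (m3/s) for time t (s)."""
            u = (r ** 2) * S / (4.0 * T * t)
            return Q / (4.0 * math.pi * T) * exp1(u)

        # Proposed well and existing wells (coordinates in metres); all values illustrative.
        new_well = (0.0, 0.0)
        existing = {"W1": (250.0, 100.0), "W2": (-400.0, 300.0)}
        Q, T, S, t = 0.02, 5e-3, 1e-4, 30 * 86400.0      # 30 days of pumping

        for name, (x, y) in existing.items():
            r = math.hypot(x - new_well[0], y - new_well[1])
            print(name, round(theis_drawdown(Q, T, S, r, t), 3), "m additional drawdown")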

  18. Integrating FMEA in a Model-Driven Methodology

    Science.gov (United States)

    Scippacercola, Fabio; Pietrantuono, Roberto; Russo, Stefano; Esper, Alexandre; Silva, Nuno

    2016-08-01

    Failure Mode and Effects Analysis (FMEA) is a well-known technique for evaluating the effects of potential failures of components of a system. FMEA demands engineering methods and tools able to support the time-consuming tasks of the analyst. We propose to make FMEA part of the design of a critical system by integrating it into a model-driven methodology. We show how to conduct the analysis of failure modes, propagation and effects from SysML design models by means of custom diagrams, which we name FMEA Diagrams. They offer an additional view of the system, tailored to FMEA goals. The enriched model can then be exploited to automatically generate the FMEA worksheet and to conduct qualitative and quantitative analyses. We present a case study from a real-world project.
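
    The paper's FMEA Diagrams and SysML tooling are not shown here; as a rough illustration of the kind of artefact generated, the sketch below builds FMEA worksheet rows (with risk priority numbers) from a simple, hypothetical component/failure-mode structure.

        from dataclasses import dataclass

        @dataclass
        class FailureMode:
            component: str
            mode: str
            effect: str
            severity: int     # 1-10
            occurrence: int   # 1-10
            detection: int    # 1-10

            @property
            def rpn(self) -> int:
                return self.severity * self.occurrence * self.detection   # risk priority number

        modes = [   # hypothetical entries, not taken from the cited case study
            FailureMode("Pressure sensor", "stuck-at value", "loss of closed-loop control", 8, 3, 4),
            FailureMode("CAN bus", "message loss", "delayed actuator command", 6, 4, 3),
        ]

        for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
            print(f"{fm.component:16} {fm.mode:16} {fm.effect:32} RPN={fm.rpn}")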

  19. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  20. Integration of distributed system simulation tools for a holistic approach to integrated building and system design

    NARCIS (Netherlands)

    Radosevic, M.; Hensen, J.L.M.; Wijsman, A.J.T.M.; Hensen, J.L.M.; Lain, M.

    2004-01-01

    Advanced architectural developments require an integrated approach to design where simulation tools available today deal only with a small subset of the overall problem. The aim of this study is to enable run-time exchange of necessary data at a suitable frequency between different simulation

  1. Eco-hydrological process simulations within an integrated surface water-groundwater model

    DEFF Research Database (Denmark)

    Butts, Michael; Loinaz, Maria Christina; Bauer-Gottwein, Peter

    2014-01-01

    Integrated water resources management requires tools that can quantify changes in groundwater, surface water, water quality and ecosystem health, as a result of changes in catchment management. To address these requirements we have developed an integrated eco-hydrological modelling framework...... that allows hydrologists and ecologists to represent the complex and dynamic interactions occurring between surface water, ground water, water quality and freshwater ecosystems within a catchment. We demonstrate here the practical application of this tool to two case studies where the interaction of surface...... water and ground water are important for the ecosystem. In the first, simulations are performed to understand the importance of surface water-groundwater interactions for a restored riparian wetland on the Odense River in Denmark as part of a larger investigation of water quality and nitrate retention...

  2. Metadata and Tools for Integration and Preservation of Cultural Heritage 3D Information

    Directory of Open Access Journals (Sweden)

    Achille Felicetti

    2011-12-01

    Full Text Available In this paper we investigate many of the various storage, portability and interoperability issues arising among archaeologists and cultural heritage people when dealing with 3D technologies. On the one side, the available digital repositories often seem unable to guarantee affordable features for the management of 3D models and their metadata; on the other side, most of the available data formats for 3D encoding seem unsatisfactory for the portability that 3D information requires across different systems nowadays. We propose a set of possible solutions to show how integration can be achieved through the use of well-known and widely accepted standards for data encoding and data storage. Using a set of 3D models acquired during various archaeological campaigns and a number of open source tools, we have implemented a straightforward encoding process to generate meaningful semantic data and metadata. We will also present the interoperability process carried out to integrate the encoded 3D models and the geographic features produced by the archaeologists. Finally we will report the preliminary (rather encouraging) development of a semantically enabled and persistent digital repository, where 3D models (but also any kind of digital data and metadata) can easily be stored, retrieved and shared with the content of other digital archives.

  3. Tools of integration of innovation-oriented machine-building enterprises in industrial park environment

    Directory of Open Access Journals (Sweden)

    К.О. Boiarynova

    2017-08-01

    Full Text Available The research is devoted to developing tools for integrating innovation-oriented mechanical engineering enterprises, as functional economic systems capable of supporting the development of resident enterprises through their own development, into the industrial park environment. The article analyzes the opportunities for the development of mechanical engineering enterprises. The proposed structure of the mechanism for integrating mechanical engineering enterprises as functional economic systems into the industrial park environment is based on: (1) the development of programs for the participation of mechanical engineering enterprises in the industrial park as an innovation-oriented partner, which provides for the development of the enterprise itself and of the other residents; (2) the provision of high-tech equipment to resident enterprises of industrial parks; and (3) the creation of subsidiary spin-out enterprises of large mechanical engineering enterprises for high-tech production in the industrial park. The author proposes a road map that sets out the procedures for the integration and functioning of the investigated enterprises, through interaction both within the ecosystem of the industrial park and in the broader ecosystem in which they operate, and the tools for ensuring economic functionality through economic and organizational measures at the preventive, partner and resident phases of integration. These tools allow innovation-oriented mechanical engineering enterprises to integrate into territorial structures such as industrial parks, which in turn allows them to fulfil their purpose in developing the real sector of the economy.

  4. Models of Russia's Participation in Regional Economic Integration

    Directory of Open Access Journals (Sweden)

    Darya I. Ushkalova

    2014-01-01

    Full Text Available The article analyses models and mechanisms of Russia's participation in integration processes in the post-Soviet space in recent years. It examines the Customs Union / Common Economic Space / Eurasian Economic Union integration model and particular mechanisms of its realization. It also examines key challenges to the further development of integration within the Eurasian Economic Union, including the exhaustion of short-term and medium-term integration effects against a background of a low level of economic cooperation and the lack of an effective mechanism for interest coordination and decision-making, such as qualified-majority voting. It concludes that the deterioration of mutual trade dynamics in the Customs Union is determined by fundamental factors, above all the exhaustion of medium-term integration effects, which expand mutual trade immediately after the creation of a customs union but do not change its qualitative characteristics in the long-term outlook. The author shows the absence of significant long-term integration effects, which would be based on an increase in domestic market capacity due to a modification of the economic structure. It is found that such long-term integration effects can appear only in the context of the coalescence of national economies at the micro level, based on the development of a system of links between enterprises, including intra-sectoral industrial cooperation. The article also analyses the results of Russia's strategy of interaction with states beyond the Eurasian Economic Union based on the open regionalism concept. The paper presents recommendations on improving the tools of integration within and outside the Eurasian Economic Union. In particular, the creation of a system of decentralized organizations is proposed for the implementation of specific cooperation projects in selected areas, taking into account the multiplier effect of such "point-aimed" action.

  5. Development of integrated systems dynamics models for the sustainability assessment of nuclear energy

    International Nuclear Information System (INIS)

    Van Den Durpel, Luc; Yacout, Abdellatif; Wade, Dave

    2005-01-01

    Nuclear energy is increasingly perceived as an attractive mature energy generation technology that can deliver an answer to the increasing worldwide energy demand while respecting environmental concerns as well as contributing to a reduced dependence on fossil fuel. Advancing nuclear energy deployment demands an assessment of nuclear energy with respect to all sustainability dimensions, allowing full stakeholder involvement in deciding on the role of nuclear energy as part of a sustainable energy generation mix in the future. Integrated system dynamics models of nuclear energy systems are interesting tools for such assessment studies, allowing material flow accounting and analyses of environmental impact, economic competitiveness and socio-political aspects to be performed for time-evolving nuclear energy systems. No single tool today is capable of covering all the dimensions of such an integrated assessment, while various developments are ongoing in different places around the world to make such tools available in the near future. Argonne National Laboratory has been engaged in such tool development since the year 2000 and has developed various tools, among which the DANESS code is described in more detail in this paper. (author)

  6. Processing: A Python Framework for the Seamless Integration of Geoprocessing Tools in QGIS

    Directory of Open Access Journals (Sweden)

    Anita Graser

    2015-10-01

    Full Text Available Processing is an object-oriented Python framework for the popular open source Geographic Information System QGIS, which provides a seamless integration of geoprocessing tools from a variety of different software libraries. In this paper, we present the development history, software architecture and features of the Processing framework, which make it a versatile tool for the development of geoprocessing algorithms and workflows, as well as an efficient integration platform for algorithms from different sources. Using real-world application examples, we furthermore illustrate how the Processing architecture enables typical geoprocessing use cases in research and development, such as automating and documenting workflows, combining algorithms from different software libraries, as well as developing and integrating custom algorithms. Finally, we discuss how Processing can facilitate reproducible research and provide an outlook towards future development goals.
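
    As an indication of how Processing exposes geoprocessing algorithms to scripts, the snippet below runs a standard buffer algorithm from the QGIS Python console. It assumes a running QGIS environment; the layer path is hypothetical and the algorithm id and parameter subset are the commonly documented ones, which should be checked against the installed QGIS version.

        # Run inside the QGIS Python console (the processing plugin must be loaded).
        import processing

        result = processing.run(
            "native:buffer",
            {
                "INPUT": "/data/wells.shp",        # hypothetical input layer
                "DISTANCE": 100.0,
                "SEGMENTS": 8,
                "DISSOLVE": False,
                "OUTPUT": "memory:buffered_wells", # keep the result as an in-memory layer
            },
        )
        buffered = result["OUTPUT"]
        print(buffered.featureCount())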

  7. An evaluation of BPMN modeling tools

    NARCIS (Netherlands)

    Yan, Z.; Reijers, H.A.; Dijkman, R.M.; Mendling, J.; Weidlich, M.

    2010-01-01

    Various BPMN modeling tools are available and it is close to impossible to understand their functional differences without simply trying them out. This paper presents an evaluation framework and the outcomes of its application to a set of five BPMN modeling tools. We report on various

  8. Electricity market models and RES integration: The Greek case

    International Nuclear Information System (INIS)

    Simoglou, Christos K.; Biskas, Pandelis N.; Vagropoulos, Stylianos I.; Bakirtzis, Anastasios G.

    2014-01-01

    This paper presents an extensive analysis of the Greek electricity market for the next 7-year period (2014–2020) based on an hour-by-hour simulation considering five different RES technologies, namely wind, PV, small hydro, biomass and CHP with emphasis on PV integration. The impact of RES penetration on the electricity market operation is evaluated under two different models regarding the organization of the Greek wholesale day-ahead electricity market: a mandatory power pool for year 2014 (current market design) and a power exchange for the period 2015–2020 (Target Model). An integrated software tool is used for the simulation of the current and the future day-ahead market clearing algorithm of the Greek wholesale electricity market. Simulation results indicate the impact of the anticipated large-scale RES integration, in conjunction with each market model, on specific indicators of the Greek electricity market in the long-term. - Highlights: • Analysis of the Greek electricity market for the next 7-year period (2014–2020) based on hour-by-hour simulation. • Five different RES technologies are considered with emphasis on PV integration. • A power pool (for 2014) and a power exchange (for 2015–2020) are considered. • Various market indicators are used for the analysis of the impact of the RES integration on the Greek electricity market. • Two alternative tariff schemes for the compensation of the new ground-mounted PV units from 2015 onwards are investigated

  9. Newly developed integrated model to reduce risks in the electricity market

    International Nuclear Information System (INIS)

    Mo, Birger

    2001-01-01

    A new model which integrates hydro-scheduling and financial hedging has been developed in cooperation with Norsk Hydro. We believe the new tool will be useful for owners of hydropower plants that want to reduce risks in the power market. The model development started in 1997 and was financed by Norsk Hydro. As of 1998, the main financial contributor has been the Research Council of Norway through a project in the Strategic Institute Programme. (author)

  10. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report. Version 1.0

    Science.gov (United States)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; Kramer White, Julie; Labbe, Steve G.; Rotter, Hank A.

    2005-01-01

    In the spring of 2005, the NASA Engineering and Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and other tools are empirically-derived. Each tool was created for a specific use and timeframe, including certification, real-time pre-launch assessments, and real-time on-orbit assessments. The tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC teams conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  11. Evaluating stormwater micropollutant control strategies by the application of an integrated model

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Sharma, Anitha Kumari; Ledin, Anna

    2011-01-01

    and enhancement of existing treatment) for reducing heavy metals (copper, zinc) and organic MP (fluoranthene). The runoff quality model showed high uncertainty, with prediction bounds strongly affected by the exceptionally high measured concentrations. The model quantified the greater benefits of the source......The estimation of micropollutant (MP) fluxes in stormwater systems is a fundamental task to enable the elaboration of strategies to reduce stormwater MP discharge to natural waters. Dynamic models can represent important tools which can integrate the limited data provided by monitoring campaigns....... This study presents an application of an integrated dynamic model to estimate MP fluxes in stormwater systems in combination with stormwater quality measurements. MP sources were identified by using GIS land usage data. Runoff quality was simulated by using a conceptual accumulation/washoff model...
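
    The integrated model's own parameterisation is not given in the record; the sketch below only illustrates a generic exponential build-up / first-order wash-off scheme of the kind conceptual runoff quality models often use, with purely illustrative rate constants.

        def simulate(rainfall, b_max=10.0, k_build=0.2, k_wash=0.5, dt=1.0):
            """rainfall: mm per time step; returns washed-off load per step (arbitrary mass units)."""
            buildup, loads = 0.0, []
            for i in rainfall:
                # dry-weather accumulation towards an asymptotic maximum
                buildup += k_build * (b_max - buildup) * dt
                # first-order wash-off proportional to rainfall intensity and available mass
                washed = min(buildup, k_wash * i * buildup * dt)
                buildup -= washed
                loads.append(washed)
            return loads

        print(simulate([0, 0, 0, 5.0, 12.0, 3.0, 0, 0]))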

  12. Contribution to the study of conformal theories and integrable models

    International Nuclear Information System (INIS)

    Sochen, N.

    1992-05-01

    The purpose of this thesis is the study of 2-D physics. The main tool is conformal field theory with Kac-Moody and W algebras. This theory describes the 2-D models that have translation, rotation and dilatation symmetries at their critical point. Extended conformal theories describe models that have a larger symmetry than conformal symmetry. After a review of conformal theory methods, the author carries out a detailed study of the form of singular vectors in the sl(2) affine algebra. With this important form, correlation functions can be calculated. The classical W algebra is studied and the relations between the classical and quantum W algebras are specified. The bosonization method is presented and the sl(2)/sl(2) topological model is studied. The bosonization of the partition functions of different models is described. A programme for the classification of rational theories is described, linking rational conformal theories and integrable spin models, and interesting relations between the Boltzmann weights of different models have been found. With these relations, the integrability of models is proved by a direct calculation of their Boltzmann weights.

  13. Modelling tools for managing Induced RiverBank Filtration MAR schemes

    Science.gov (United States)

    De Filippis, Giovanna; Barbagli, Alessio; Marchina, Chiara; Borsi, Iacopo; Mazzanti, Giorgio; Nardi, Marco; Vienken, Thomas; Bonari, Enrico; Rossetto, Rudy

    2017-04-01

    Induced RiverBank Filtration (IRBF) is a widely used technique in Managed Aquifer Recharge (MAR) schemes, when aquifers are hydraulically connected with surface water bodies, with proven positive effects on quality and quantity of groundwater. IRBF allows abstraction of a large volume of water while avoiding large decreases in groundwater heads. Moreover, thanks to the filtration process through the soil, the concentration of chemical species in surface water can be reduced, so that the filtered water becomes an excellent resource for the production of drinking water. Within the FP7 MARSOL project (demonstrating Managed Aquifer Recharge as a SOLution to water scarcity and drought; http://www.marsol.eu/), the Sant'Alessio IRBF (Lucca, Italy) was used to demonstrate the feasibility and technical and economic benefits of managing IRBF schemes (Rossetto et al., 2015a). The Sant'Alessio IRBF along the Serchio river allows an overall abstraction of about 0.5 m3/s, providing drinking water for 300,000 people of coastal Tuscany (mainly the towns of Lucca, Pisa and Livorno). The supplied water is made available by enhancing river bank infiltration into a high-yield (10^-2 m2/s transmissivity) sandy-gravelly aquifer by raising the river head and using ten vertical wells along the river embankment. A Decision Support System, consisting of measurements from an advanced monitoring network connected to modelling tools, was set up to manage the IRBF. The modelling system is based on spatially distributed and physically based coupled ground-/surface-water flow and solute transport models integrated in the FREEWAT platform (developed within the H2020 FREEWAT project - FREE and Open Source Software Tools for WATer Resource Management; Rossetto et al., 2015b), an open source and public domain GIS-integrated modelling environment for the simulation of the hydrological cycle. The platform aims at improving water resource management by simplifying the application of EU water-related Directives and at

  14. Clarity versus complexity: land-use modeling as a practical tool for decision-makers

    Science.gov (United States)

    Sohl, Terry L.; Claggett, Peter

    2013-01-01

    The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches and the issues caused by the complicated nature of models, and provides suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the link between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.

  15. Integrated Tools for Future Distributed Engine Control Technologies

    Science.gov (United States)

    Culley, Dennis; Thomas, Randy; Saus, Joseph

    2013-01-01

    Turbine engines are highly complex mechanical systems that are becoming increasingly dependent on control technologies to achieve system performance and safety metrics. However, the contribution of controls to these measurable system objectives is difficult to quantify due to a lack of tools capable of informing the decision makers. This shortcoming hinders technology insertion in the engine design process. NASA Glenn Research Center is developing a Hardware-in-the-Loop (HIL) platform and analysis tool set that will serve as a focal point for new control technologies, especially those related to the hardware development and integration of distributed engine control. The HIL platform is intended to enable rapid and detailed evaluation of new engine control applications, from conceptual design through hardware development, in order to quantify their impact on engine systems. This paper discusses the complex interactions of the control system, within the context of the larger engine system, and how new control technologies are changing that paradigm. The conceptual design of the new HIL platform is then described as a primary tool to address those interactions and how it will help feed the insertion of new technologies into future engine systems.

  16. Alien wavelength modeling tool and field trial

    DEFF Research Database (Denmark)

    Sambo, N.; Sgambelluri, A.; Secondini, M.

    2015-01-01

    A modeling tool is presented for pre-FEC BER estimation of PM-QPSK alien wavelength signals. A field trial is demonstrated and used as validation of the tool's correctness. A very close correspondence between the performance of the field trial and the one predicted by the modeling tool has been...

  17. Integrated Space Asset Management Database and Modeling

    Science.gov (United States)

    MacLeod, Todd; Gagliano, Larry; Percy, Thomas; Mason, Shane

    2015-01-01

    Effective Space Asset Management is one key to addressing the ever-growing issue of space congestion. It is imperative that agencies around the world have access to data regarding the numerous active assets and pieces of space junk currently tracked in orbit around the Earth. At the center of this issue is the effective management of data of many types related to orbiting objects. As the population of tracked objects grows, so too should the data management structure used to catalog technical specifications, orbital information, and metadata related to those populations. Marshall Space Flight Center's Space Asset Management Database (SAM-D) was implemented in order to effectively catalog a broad set of data related to known objects in space by ingesting information from a variety of databases and processing those data into useful technical information. Using the universal NORAD number as a unique identifier, the SAM-D processes two-line element data into orbital characteristics and cross-references this technical data with metadata related to functional status, country of ownership, and application category. The SAM-D began as an Excel spreadsheet and was later upgraded to an Access database. While SAM-D performs its task very well, it is limited by its current platform and is not available outside of the local user base. Further, while modeling and simulation can be powerful tools to exploit the information contained in SAM-D, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. This paper provides a summary of SAM-D development efforts to date and outlines a proposed data management infrastructure that extends SAM-D to support the larger data sets to be generated. A service-oriented architecture model using an information sharing platform named SIMON will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interface for
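
    The SAM-D processing pipeline is not public in this record; as a minimal illustration of turning a two-line element field into orbital characteristics, the sketch below derives semi-major axis, mean altitude and period from the mean-motion entry alone, using an illustrative low-Earth-orbit value rather than catalogue data.

        import math

        MU_EARTH = 398600.4418      # km^3/s^2
        R_EARTH = 6378.137          # km, equatorial radius

        def orbit_from_mean_motion(mean_motion_rev_per_day):
            n = mean_motion_rev_per_day * 2.0 * math.pi / 86400.0    # rad/s
            a = (MU_EARTH / n ** 2) ** (1.0 / 3.0)                   # semi-major axis, km
            period_min = 2.0 * math.pi / n / 60.0
            return {"semi_major_axis_km": a,
                    "mean_altitude_km": a - R_EARTH,
                    "period_min": period_min}

        print(orbit_from_mean_motion(15.5))   # typical LEO object: roughly 400 km, ~93 min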

  18. Visual-Haptic Integration: Cue Weights are Varied Appropriately, to Account for Changes in Haptic Reliability Introduced by Using a Tool

    Directory of Open Access Journals (Sweden)

    Chie Takahashi

    2011-10-01

    Full Text Available Tools such as pliers systematically change the relationship between an object's size and the hand opening required to grasp it. Previous work suggests the brain takes this into account, integrating visual and haptic size information that refers to the same object, independent of the similarity of the ‘raw’ visual and haptic signals (Takahashi et al., VSS 2009). Variations in tool geometry also affect the reliability (precision of haptic size estimates, however, because they alter the change in hand opening caused by a given change in object size. Here, we examine whether the brain appropriately adjusts the weights given to visual and haptic size signals when tool geometry changes. We first estimated each cue's reliability by measuring size-discrimination thresholds in vision-alone and haptics-alone conditions. We varied haptic reliability using tools with different object-size:hand-opening ratios (1:1, 0.7:1, and 1.4:1). We then measured the weights given to vision and haptics with each tool, using a cue-conflict paradigm. The weight given to haptics varied with tool type in a manner that was well predicted by the single-cue reliabilities (MLE model; Ernst and Banks, 2002). This suggests that the process of visual-haptic integration appropriately accounts for variations in haptic reliability introduced by different tool geometries.
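
    The reliability-weighted (maximum-likelihood) combination rule referred to here can be written compactly; the sketch below uses illustrative noise values, not the study's data, to show how a tool that amplifies hand opening relative to object size lowers haptic noise expressed in object units and so raises the haptic weight.

        def mle_weights(sigma_v, sigma_h):
            """Return (visual weight, haptic weight) from single-cue noise SDs."""
            rv, rh = 1.0 / sigma_v ** 2, 1.0 / sigma_h ** 2    # reliabilities
            return rv / (rv + rh), rh / (rv + rh)

        def combined_estimate(size_v, size_h, sigma_v, sigma_h):
            wv, wh = mle_weights(sigma_v, sigma_h)
            sigma_vh = (sigma_v ** -2 + sigma_h ** -2) ** -0.5  # predicted combined SD
            return wv * size_v + wh * size_h, sigma_vh

        # With the 0.7:1 tool, a given object-size change produces a larger change in hand
        # opening, so haptic noise in object units shrinks (here by 0.7) and its weight rises.
        for sigma_h in (4.0, 4.0 * 0.7):
            print(combined_estimate(50.0, 52.0, sigma_v=3.0, sigma_h=sigma_h))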

  19. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    Energy Technology Data Exchange (ETDEWEB)

    Cetiner, Mustafa Sacit; Flanagan, George F. [ORNL]; Poore III, Willis P. [ORNL]; Muhlheim, Michael David [ORNL]

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  20. A tool for efficient, model-independent management optimization under uncertainty

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Barlow, Paul M.; Welter, Dave E.

    2018-01-01

    To fill a need for risk-based environmental management optimization, we have developed PESTPP-OPT, a model-independent tool for resource management optimization under uncertainty. PESTPP-OPT solves a sequential linear programming (SLP) problem and also implements (optional) efficient, “on-the-fly” (without user intervention) first-order, second-moment (FOSM) uncertainty techniques to estimate model-derived constraint uncertainty. Combined with a user-specified risk value, the constraint uncertainty estimates are used to form chance-constraints for the SLP solution process, so that any optimal solution includes contributions from model input and observation uncertainty. In this way, a “single answer” that includes uncertainty is yielded from the modeling analysis. PESTPP-OPT uses the familiar PEST/PEST++ model interface protocols, which makes it widely applicable to many modeling analyses. The use of PESTPP-OPT is demonstrated with a synthetic, integrated surface-water/groundwater model. The function and implications of chance constraints for this synthetic model are discussed.
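
    PESTPP-OPT itself works through the PEST/PEST++ interface files and is model-independent; the toy sketch below is only meant to illustrate the chance-constraint idea it implements, i.e., tightening a model-derived constraint bound by a risk-dependent multiple of the FOSM standard deviation before solving the linear program. All numbers are illustrative.

        from scipy.optimize import linprog
        from scipy.stats import norm

        # maximise total pumping q1 + q2 (linprog minimises, hence the sign flip)
        c = [-1.0, -1.0]

        # Simulated constraint: drawdown = 0.8*q1 + 0.5*q2 must stay below 10.0 m.
        sigma_fosm = 1.2        # FOSM-estimated SD of the simulated drawdown (m)
        risk = 0.05             # accept a 5 % chance of violating the constraint
        bound = 10.0 - norm.ppf(1.0 - risk) * sigma_fosm    # tightened (chance) bound

        res = linprog(c, A_ub=[[0.8, 0.5]], b_ub=[bound], bounds=[(0, 8), (0, 8)])
        print(res.x, "total pumping:", -res.fun)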

  1. The Webinar Integration Tool: A Framework for Promoting Active Learning in Blended Environments

    Science.gov (United States)

    Lieser, Ping; Taf, Steven D.; Murphy-Hagan, Anne

    2018-01-01

    This paper describes a three-stage process of developing a webinar integration tool to enhance the interaction of teaching and learning in blended environments. In the context of medical education, we emphasize three factors of effective webinar integration in blended learning: fostering better solutions for faculty and students to interact…

  2. Process-Based Quality (PBQ) Tools Development

    Energy Technology Data Exchange (ETDEWEB)

    Cummins, J.L.

    2001-12-03

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts.

  3. Modeling and Tool Wear in Routing of CFRP

    International Nuclear Information System (INIS)

    Iliescu, D.; Fernandez, A.; Gutierrez-Orrantia, M. E.; Lopez de Lacalle, L. N.; Girot, F.

    2011-01-01

    This paper presents the prediction and evaluation of feed force in routing of carbon composite material. In order to extend tool life and improve quality of the machined surface, a better understanding of uncoated and coated tool behaviors is required. This work describes (1) the optimization of the geometry of multiple teeth tools minimizing the tool wear and the feed force, (2) the optimization of tool coating and (3) the development of a phenomenological model between the feed force, the routing parameters and the tool wear. The experimental results indicate that the feed rate, the cutting speed and the tool wear are the most significant factors affecting the feed force. In the case of multiple teeth tools, a particular geometry with 14 teeth right helix right cut and 11 teeth left helix right cut gives the best results. A thick AlTiN coating or a diamond coating can dramatically improve the tool life while minimizing the axial force, roughness and delamination. A wear model has then been developed based on an abrasive behavior of the tool. The model links the feed force to the tool geometry parameters (tool diameter), to the process parameters (feed rate, cutting speed and depth of cut) and to the wear. The model presented has been verified by experimental tests.
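
    The paper's actual model form and measurements are not reproduced in the record; the sketch below assumes, purely for illustration, a power-law feed-force model with a linear wear term and fits it to synthetic data by nonlinear least squares.

        import numpy as np
        from scipy.optimize import curve_fit

        def feed_force(X, k, a, b, c, w):
            f, vc, ap, VB = X     # feed (mm/rev), speed (m/min), depth (mm), flank wear (mm)
            return k * f ** a * vc ** b * ap ** c * (1.0 + w * VB)

        # synthetic observations (feed, speed, depth, wear) generated from the assumed form
        X = np.array([[0.05, 100, 1.0, 0.00], [0.10, 100, 1.0, 0.10],
                      [0.05, 200, 1.5, 0.05], [0.10, 200, 1.5, 0.20],
                      [0.15, 150, 2.0, 0.25], [0.15, 250, 2.0, 0.30]]).T
        true = (500.0, 0.8, 0.1, 0.9, 1.5)
        F = feed_force(X, *true)

        popt, _ = curve_fit(feed_force, X, F, p0=[300, 0.5, 0.2, 0.5, 1.0], maxfev=20000)
        print(np.round(popt, 3))   # should recover values close to the "true" parameters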

  4. Diverse methods for integrable models

    NARCIS (Netherlands)

    Fehér, G.

    2017-01-01

    This thesis is centered around three topics, sharing integrability as a common theme. This thesis explores different methods in the field of integrable models. The first two chapters are about integrable lattice models in statistical physics. The last chapter describes an integrable quantum chain.

  5. System dynamics models as decision-making tools in agritourism

    Directory of Open Access Journals (Sweden)

    Jere Jakulin Tadeja

    2016-12-01

    Full Text Available Agritourism, as a type of niche tourism, is a complex and softly defined phenomenon. The demand for fast and integrated decisions regarding agritourism and its interconnections with the environment, the economy (investments), traffic and social factors (tourists) is urgent. Many different methodologies and methods address softly structured questions and dilemmas with global and local properties. Here we present methods of systems thinking and system dynamics, which were first applied in the educational and training area in the form of different computer simulations and later as tools for decision-making and organisational re-engineering. We develop system dynamics models in order to demonstrate the accuracy of the methodology. These models are essentially simple and serve only to describe the basic mutual influences among variables. We pay attention to the methodology for determining model parameter values and to the so-called mental model. The latter is the basis of the causal connections among model variables. Finally, we establish a connection between qualitative and quantitative models within the framework of system dynamics.
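
    The authors' agritourism models are not reproduced here; the sketch below is only a generic system-dynamics-style stock-flow simulation (Euler integration) with an illustrative reinforcing/balancing feedback between tourist numbers and destination attractiveness.

        def simulate(years=20, dt=0.25):
            tourists, attractiveness = 1000.0, 1.0    # stocks: visitors/yr, index in [0, 1]
            history = []
            for step in range(int(years / dt)):
                arrivals = 400.0 * attractiveness     # reinforcing: attractiveness draws visitors
                departures = 0.3 * tourists
                degradation = 2e-5 * tourists         # balancing: crowding erodes attractiveness
                recovery = 0.1 * (1.0 - attractiveness)
                tourists += (arrivals - departures) * dt
                attractiveness = min(1.0, max(0.0, attractiveness + (recovery - degradation) * dt))
                history.append((round(step * dt, 2), round(tourists), round(attractiveness, 3)))
            return history

        for t, n, a in simulate()[::8]:               # print every two simulated years
            print(t, n, a)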

  6. Organization, maturation and plasticity of multisensory integration: Insights from computational modelling studies

    Directory of Open Access Journals (Sweden)

    Cristiano eCuppini

    2011-05-01

    Full Text Available In this paper, we present two neural network models - devoted to two specific and widely investigated aspects of multisensory integration - in order to demonstrate the potential of computational models to provide insight into the neural mechanisms underlying organization, development and plasticity of multisensory integration in the brain. The first model considers visual-auditory interaction in a midbrain structure named the Superior Colliculus (SC). The model is able to reproduce and explain the main physiological features of multisensory integration in SC neurons and to describe how SC integrative capability - not present at birth - develops gradually during postnatal life depending on sensory experience with cross-modal stimuli. The second model tackles the problem of how tactile stimuli on a body part and visual (or auditory) stimuli close to the same body part are integrated in multimodal parietal neurons to form the perception of peripersonal (i.e., near) space. The model investigates how the extension of peripersonal space - where multimodal integration occurs - may be modified by experience such as use of a tool to interact with the far space. The utility of the modelling approach relies on several aspects: (i) The two models, although devoted to different problems and simulating different brain regions, share some common mechanisms (lateral inhibition and excitation, non-linear neuron characteristics, recurrent connections, competition, Hebbian rules of potentiation and depression) that may govern more generally the fusion of senses in the brain, and the learning and plasticity of multisensory integration. (ii) The models may help interpretation of behavioural and psychophysical responses in terms of neural activity and synaptic connections. (iii) The models can make testable predictions that can help guide future experiments in order to validate, reject, or modify the main assumptions.

  7. Achieving sustainable ground-water management by using GIS-integrated simulation tools: the EU H2020 FREEWAT platform

    Science.gov (United States)

    Rossetto, Rudy; De Filippis, Giovanna; Borsi, Iacopo; Foglia, Laura; Toegl, Anja; Cannata, Massimiliano; Neumann, Jakob; Vazquez-Sune, Enric; Criollo, Rotman

    2017-04-01

    In order to achieve sustainable and participatory ground-water management, innovative software built on the integration of numerical models within GIS software is a perfect candidate to provide a full characterization of quantitative and qualitative aspects of ground- and surface-water resources while maintaining the temporal and spatial dimensions. The EU H2020 FREEWAT project (FREE and open source software tools for WATer resource management; Rossetto et al., 2015) aims at simplifying the application of EU water-related Directives through an open-source and public-domain, GIS-integrated simulation platform for planning and management of ground- and surface-water resources. The FREEWAT platform allows the whole hydrological cycle to be simulated, coupling the power of GIS geo-processing and post-processing tools in spatial data analysis with that of process-based simulation models. This results in a modeling environment where large spatial datasets can be stored, managed and visualized and where several simulation codes (mainly belonging to the USGS MODFLOW family) are integrated to simulate multiple hydrological, hydrochemical or economic processes. So far, the FREEWAT platform is a large plugin for the QGIS GIS desktop software and it integrates the following capabilities: • the AkvaGIS module produces plots and statistics for the analysis and interpretation of hydrochemical and hydrogeological data; • the Observation Analysis Tool facilitates the import, analysis and visualization of time-series data and the use of these data to support model construction and calibration; • groundwater flow in the saturated and unsaturated zones may be simulated using MODFLOW-2005 (Harbaugh, 2005); • multi-species advective-dispersive transport in the saturated zone can be simulated using MT3DMS (Zheng & Wang, 1999); viscosity- and density-dependent flows can further be simulated through SEAWAT (Langevin et al., 2007); • sustainable
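
    FREEWAT drives these codes from inside QGIS; purely to illustrate the underlying simulation engine (MODFLOW-2005), the sketch below sets up a minimal model with the separate flopy scripting package, which is assumed to be installed and is unrelated to the FREEWAT interface. Grid, properties and the executable name are illustrative.

        import numpy as np
        import flopy

        ibound = np.ones((1, 20, 20), dtype=int)
        ibound[:, :, 0] = -1                       # constant-head cells along the western edge

        mf = flopy.modflow.Modflow("sketch", exe_name="mf2005")
        flopy.modflow.ModflowDis(mf, nlay=1, nrow=20, ncol=20,
                                 delr=100.0, delc=100.0, top=10.0, botm=-40.0)
        flopy.modflow.ModflowBas(mf, ibound=ibound, strt=10.0)
        flopy.modflow.ModflowLpf(mf, hk=10.0)      # horizontal hydraulic conductivity (m/d)
        flopy.modflow.ModflowWel(mf, stress_period_data={0: [[0, 10, 10, -500.0]]})  # pumping well
        flopy.modflow.ModflowPcg(mf)
        flopy.modflow.ModflowOc(mf)
        mf.write_input()        # mf.run_model() additionally requires the mf2005 executable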

  8. The Virtual Watershed Observatory: Cyberinfrastructure for Model-Data Integration and Access

    Science.gov (United States)

    Duffy, C.; Leonard, L. N.; Giles, L.; Bhatt, G.; Yu, X.

    2011-12-01

    The Virtual Watershed Observatory (VWO) is a concept where scientists, water managers, educators and the general public can create a virtual observatory from integrated hydrologic model results, national databases and historical or real-time observations via web services. In this paper, we propose a prototype for automated and virtualized web services software using national data products for climate reanalysis, soils, geology, terrain and land cover. The VWO has the broad purpose of making accessible water resource simulations, real-time data assimilation, calibration and archival at the scale of HUC 12 watersheds (Hydrologic Unit Code) anywhere in the continental US. Our prototype for model-data integration focuses on creating tools for fast data storage from selected national databases, as well as the computational resources necessary for a dynamic, distributed watershed simulation. The paper will describe cyberinfrastructure tools and workflow that attempt to resolve the problem of model-data accessibility and scalability such that individuals, research teams, managers and educators can create a VWO in a desired context. Examples are given for the NSF-funded Shale Hills Critical Zone Observatory and the European Critical Zone Observatories within the SoilTrEC project. In the future, implementation of VWO services will benefit from the development of a cloud cyberinfrastructure as the prototype evolves to data- and model-intensive computation for continental-scale water resource predictions.

  9. Integrated modelling of nitrate loads to coastal waters and land rent applied to catchment scale water management

    DEFF Research Database (Denmark)

    Jacosen, T.; Refsgaard, A.; Jacobsen, Brian H.

    The EU WFD requires an integrated approach to river basin management in order to meet environmental and ecological objectives. This paper presents concepts and a full-scale application of an integrated modelling framework. The Ringkoebing Fjord basin is characterized by intensive agricultural production, and leakage of nitrate constitutes a major pollution problem with respect to groundwater aquifers (drinking water), fresh surface water systems (water quality of lakes) and coastal receiving waters (eutrophication). The case study presented illustrates an advanced modelling approach applied ... in comprehensive, integrated modelling tools.

  10. Integrated modelling of nitrate loads to coastal waters and land rent applied to catchment-scale water management

    DEFF Research Database (Denmark)

    Refsgaard, A.; Jacobsen, T.; Jacobsen, Brian H.

    2007-01-01

    The EU Water Framework Directive (WFD) requires an integrated approach to river basin management in order to meet environmental and ecological objectives. This paper presents concepts and a full-scale application of an integrated modelling framework. The Ringkoebing Fjord basin is characterized by intensive agricultural production, and leakage of nitrate constitutes a major pollution problem with respect to groundwater aquifers (drinking water), fresh surface water systems (water quality of lakes) and coastal receiving waters (eutrophication). The case study presented illustrates an advanced modelling approach ... and the potential and limitations of comprehensive, integrated modelling tools.

  11. Impact of electronic medical record integration of a handoff tool on sign-out in a newborn intensive care unit

    Science.gov (United States)

    Palma, JP; Sharek, PJ; Longhurst, CA

    2016-01-01

    Objective To evaluate the impact of integrating a handoff tool into the electronic medical record (EMR) on sign-out accuracy, satisfaction and workflow in a neonatal intensive care unit (NICU). Study Design Prospective surveys of neonatal care providers in an academic children’s hospital 1 month before and 6 months following EMR integration of a standalone Microsoft Access neonatal handoff tool. Result Providers perceived sign-out information to be somewhat or very accurate at a rate of 78% with the standalone handoff tool and 91% with the EMR-integrated tool (P < 0.01). Before integration of neonatal sign-out into the EMR, 35% of providers were satisfied with the process of updating sign-out information and 71% were satisfied with the printed sign-out document; following EMR integration, 92% of providers were satisfied with the process of updating sign-out information (P < 0.01) and 98% were satisfied with the printed sign-out document (P < 0.01). Neonatal care providers reported spending a median of 11 to 15 min/day updating the standalone sign-out and 16 to 20 min/day updating the EMR-integrated sign-out (P = 0.026). The median percentage of total sign-out preparation time dedicated to transcribing information from the EMR was 25 to 49% before and <25% after EMR integration of the handoff tool (P < 0.01). Conclusion Integration of a NICU-specific handoff tool into an EMR resulted in improvements in perceived sign-out accuracy, provider satisfaction and at least one aspect of workflow. PMID:21273990

  12. NARAC Dispersion Model Product Integration With RadResponder

    Energy Technology Data Exchange (ETDEWEB)

    Aluzzi, Fernando [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2015-09-30

    Work on enhanced cooperation and interoperability of Nuclear Incident Response Teams (NIRT) is a joint effort between DHS/FEMA, DOE/NNSA and EPA. One such effort was the integration between the RadResponder Network, a resource sponsored by FEMA for the management of radiological data during an emergency, and the National Atmospheric Advisory Center (NARAC), a DOE/NNSA modeling resource whose predictions are used to aid radiological emergency preparedness and response. Working together under a FEMA-sponsored project these two radiological response assets developed a capability to read and display plume model prediction results from the NARAC computer system in the RadResponder software tool. As a result of this effort, RadResponder users have been provided with NARAC modeling predictions of contamination areas, radiological dose levels, and protective action areas (e.g., areas warranting worker protection or sheltering/evacuation) to help guide protective action decisions and field monitoring surveys, and gain key situation awareness following a radiological/nuclear accident or incident (e.g., nuclear power plant accident, radiological dispersal device incident, or improvised nuclear detonation incident). This document describes the details of this integration effort.

  13. MODexplorer: an integrated tool for exploring protein sequence, structure and function relationships.

    KAUST Repository

    Kosinski, Jan; Barbato, Alessandro; Tramontano, Anna

    2013-01-01

    SUMMARY: MODexplorer is an integrated tool aimed at exploring the sequence, structural and functional diversity in protein families useful in homology modeling and in analyzing protein families in general. It takes as input either the sequence or the structure of a protein and provides alignments with its homologs along with a variety of structural and functional annotations through an interactive interface. The annotations include sequence conservation, similarity scores, ligand-, DNA- and RNA-binding sites, secondary structure, disorder, crystallographic structure resolution and quality scores of models implied by the alignments to the homologs of known structure. MODexplorer can be used to analyze sequence and structural conservation among the structures of similar proteins, to find structures of homologs solved in different conformational state or with different ligands and to transfer functional annotations. Furthermore, if the structure of the query is not known, MODexplorer can be used to select the modeling templates taking all this information into account and to build a comparative model. AVAILABILITY AND IMPLEMENTATION: Freely available on the web at http://modorama.biocomputing.it/modexplorer. Website implemented in HTML and JavaScript with all major browsers supported. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.

  14. MODexplorer: an integrated tool for exploring protein sequence, structure and function relationships.

    KAUST Repository

    Kosinski, Jan

    2013-02-08

    SUMMARY: MODexplorer is an integrated tool aimed at exploring the sequence, structural and functional diversity in protein families useful in homology modeling and in analyzing protein families in general. It takes as input either the sequence or the structure of a protein and provides alignments with its homologs along with a variety of structural and functional annotations through an interactive interface. The annotations include sequence conservation, similarity scores, ligand-, DNA- and RNA-binding sites, secondary structure, disorder, crystallographic structure resolution and quality scores of models implied by the alignments to the homologs of known structure. MODexplorer can be used to analyze sequence and structural conservation among the structures of similar proteins, to find structures of homologs solved in different conformational state or with different ligands and to transfer functional annotations. Furthermore, if the structure of the query is not known, MODexplorer can be used to select the modeling templates taking all this information into account and to build a comparative model. AVAILABILITY AND IMPLEMENTATION: Freely available on the web at http://modorama.biocomputing.it/modexplorer. Website implemented in HTML and JavaScript with all major browsers supported. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.

  15. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, H; Tan, J; Kavanaugh, J; Dolly, S; Gay, H; Thorstad, W; Anastasio, M; Altman, M; Mutic, S; Li, H [Washington University School of Medicine, Saint Louis, MO (United States)

    2014-06-15

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time consuming. A new integrated software tool using supervised pattern contour recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using the object-oriented programming language C# and application programming interfaces, e.g. the Visualization Toolkit (VTK). The C# language served as the tool design basis. The Accord.Net scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while the VTK was used to build and render 3-D mesh models from critical RT structures in real-time and 360° visualization. Principal component analysis (PCA) was used so that the system could self-update the geometry variations of normal structures, based on physician-approved RT contours as a training dataset. The in-house supervised PCA-based contour recognition method was used for automatically evaluating contour normality/abnormality. The function for reporting the contour evaluation results was implemented by using C# and Windows Form Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported the 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding
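
    The in-house supervised PCA method and C# implementation are not reproduced in the record; the sketch below only illustrates the underlying idea with a toy featurisation: fit PCA to approved contours (each reduced to a fixed-length feature vector) and flag new contours whose reconstruction error is unusually large.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)

        def contour_features(radii):
            """Toy featurisation: radial distances sampled at fixed angles."""
            return np.asarray(radii, dtype=float)

        # training set: approved contours, roughly circular with small variation
        train = np.array([contour_features(30 + 2 * rng.standard_normal(36)) for _ in range(40)])
        pca = PCA(n_components=5).fit(train)

        def abnormality_score(features):
            recon = pca.inverse_transform(pca.transform(features.reshape(1, -1)))
            return float(np.linalg.norm(features - recon))

        normal = contour_features(30 + 2 * rng.standard_normal(36))
        abnormal = contour_features(np.r_[30 + 2 * rng.standard_normal(18),
                                          45 + 2 * rng.standard_normal(18)])
        print(abnormality_score(normal), abnormality_score(abnormal))   # second score is larger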

  16. INTEGRATING CORPUS-BASED RESOURCES AND NATURAL LANGUAGE PROCESSING TOOLS INTO CALL

    Directory of Open Access Journals (Sweden)

    Pascual Cantos Gomez

    2002-06-01

    Full Text Available This paper aims at presenting a survey of computational linguistic tools presently available but whose potential has been neither fully considered nor exploited to the full in modern CALL. It starts with a discussion on the rationale of DDL to language learning, presenting typical DDL activities, DDL software and potential extensions of non-typical DDL software (electronic dictionaries and electronic dictionary facilities) to DDL. An extended section is devoted to describing NLP technology and how it can be integrated into CALL, within already existing software or as stand-alone resources. A range of NLP tools is presented (MT programs, taggers, lemmatizers, parsers and speech technologies) with special emphasis on tagged concordancing. The paper finishes with a number of reflections and ideas on how language technologies can be used efficiently within the language learning context and how extensive exploration and integration of these technologies might change and extend both modern CALL and the present language learning paradigm.

  17. Integrating declarative knowledge programming styles and tools for building expert systems

    Energy Technology Data Exchange (ETDEWEB)

    Barbuceanu, M; Trausan-Matu, S; Molnar, B

    1987-01-01

    The XRL system reported in this paper is an integrated knowledge programming environment whose major research theme is the investigation of declarative knowledge programming styles and features, and of the way they can be effectively integrated and used to support AI programming. This investigation is carried out in the context of the structured-object representation paradigm, which provides the glue keeping the XRL components together. The paper describes several declarative programming styles and associated support tools available in XRL. These include an instantiation system supporting a generalized view of the ubiquitous frame instantiation process; a description-based programming system, providing a novel declarative programming style that embeds a mathematically oriented description language in the structured-object environment, together with a transformational interpreter for using it; a semantics-oriented programming framework, which offers a specific semantic-construct-based approach supporting maintenance and evolution; and a self-description and self-generation tool, which applies the latter approach to XRL itself. 29 refs., 16 figs.

  18. Developing Integrated Care: Towards a development model for integrated care

    NARCIS (Netherlands)

    M.M.N. Minkman (Mirella)

    2012-01-01

    textabstractThe thesis adresses the phenomenon of integrated care. The implementation of integrated care for patients with a stroke or dementia is studied. Because a generic quality management model for integrated care is lacking, the study works towards building a development model for integrated

  19. IIASA's climate-vegetation-biogeochemical cycle module as a part of an integrated model for climate change

    International Nuclear Information System (INIS)

    Ganopolski, A.V.; Jonas, M.; Krabec, J.; Olendrzynski, K.; Petoukhov, V.K.; Venevsky, S.V.

    1994-01-01

    The main objective of this study is the development of a hierarchy of coupled climate-biosphere models with a full description of the global biogeochemical cycles. These models are planned for use as the core of a set of integrated models of climate change and they will incorporate the main elements of the Earth system (atmosphere, hydrosphere, pedosphere and biosphere), linked with each other (and eventually with the anthroposphere) through the fluxes of heat, momentum and water and through the global biogeochemical cycles of carbon and nitrogen. This set of integrated models can be considered to fill the gap between highly simplified integrated models of climate change and very sophisticated and computationally expensive coupled models developed on the basis of general circulation models (GCMs). It is anticipated that this range of integrated models will be an effective tool for investigating the broad spectrum of problems connected with the coexistence of human society and the biosphere

  20. An integrated risk assessment tool for team-based periodontal disease management.

    Science.gov (United States)

    Thyvalikakath, Thankam P; Padman, Rema; Gupta, Sugandh

    2013-01-01

    Mounting evidence suggests a potential association of periodontal disease with systemic diseases such as diabetes, cardiovascular disease, cancer and stroke. The objective of this study is to develop an integrated risk assessment tool that displays a patient's risk for periodontal disease in the context of their systemic disease, social habits and oral health. Such a tool would be used not only by dental professionals but also by care providers who participate in team-based care for chronic disease management. Displaying the relationships between risk factors and their influence on the patient's general health could be a powerful educational and disease management tool for patients and clinicians. It may also improve the coordination of care provided by the provider-members of a chronic care team.

  1. Boolean Dynamic Modeling Approaches to Study Plant Gene Regulatory Networks: Integration, Validation, and Prediction.

    Science.gov (United States)

    Velderraín, José Dávila; Martínez-García, Juan Carlos; Álvarez-Buylla, Elena R

    2017-01-01

    Mathematical models based on dynamical systems theory are well-suited tools for the integration of available molecular experimental data into coherent frameworks in order to propose hypotheses about the cooperative regulatory mechanisms driving developmental processes. Computational analysis of the proposed models using well-established methods enables testing the hypotheses by contrasting predictions with observations. Within such framework, Boolean gene regulatory network dynamical models have been extensively used in modeling plant development. Boolean models are simple and intuitively appealing, ideal tools for collaborative efforts between theorists and experimentalists. In this chapter we present protocols used in our group for the study of diverse plant developmental processes. We focus on conceptual clarity and practical implementation, providing directions to the corresponding technical literature.
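    To make the Boolean formalism concrete, the sketch below implements a toy three-gene synchronous Boolean network and enumerates its fixed-point attractors by exhaustive search. The gene names, update rules and the resulting bistable behaviour are invented for illustration and are not taken from any of the plant models discussed in the chapter.

    ```python
    # Minimal sketch of a synchronous Boolean gene regulatory network with a
    # toy three-gene wiring; rules and gene names are illustrative only.
    from itertools import product

    RULES = {
        "A": lambda s: s["A"] or s["B"],        # A is self-sustaining or induced by B
        "B": lambda s: s["A"] and not s["C"],   # B needs A and is repressed by C
        "C": lambda s: not s["A"],              # C is repressed by A
    }

    def step(state):
        """Synchronous update: every gene is updated from the previous state."""
        return {gene: bool(rule(state)) for gene, rule in RULES.items()}

    def attractors():
        """Enumerate fixed points by exhaustive search over all 2^n states."""
        genes = list(RULES)
        fixed = []
        for bits in product([False, True], repeat=len(genes)):
            state = dict(zip(genes, bits))
            if step(state) == state:
                fixed.append(state)
        return fixed

    # The toy network is bistable: two steady gene-expression configurations.
    print(attractors())
    ```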

  2. Examining the Technology Integration Planning Cycle Model of Professional Development to Support Teachers' Instructional Practices

    Science.gov (United States)

    Hutchison, Amy C.; Woodward, Lindsay

    2018-01-01

    Background: Presently, models of professional development aimed at supporting teachers' technology integration efforts are often short and decontextualized. With many schools across the country utilizing standards that require students to engage with digital tools, a situative model that supports building teachers' knowledge within their…

  3. Comparison of two different modelling tools

    DEFF Research Database (Denmark)

    Brix, Wiebke; Elmegaard, Brian

    2009-01-01

    In this paper a test case is solved using two different modelling tools, Engineering Equation Solver (EES) and WinDali, in order to compare the tools. The system of equations solved is a static model of an evaporator used for refrigeration. The evaporator consists of two parallel channels......, and it is investigated how a non-uniform airflow influences the refrigerant mass flow rate distribution and the total cooling capacity of the heat exchanger. It is shown that the cooling capacity decreases significantly with increasing maldistribution of the airflow. Comparing the two simulation tools it is found...

  4. Indicators and measurement tools for health system integration: a knowledge synthesis protocol.

    Science.gov (United States)

    Oelke, Nelly D; Suter, Esther; da Silva Lima, Maria Alice Dias; Van Vliet-Brown, Cheryl

    2015-07-29

    Health system integration is a key component of health system reform with the goal of improving outcomes for patients, providers, and the health system. Although health systems continue to strive for better integration, current delivery of health services continues to be fragmented. A key gap in the literature is the lack of information on what successful integration looks like and how to measure achievement towards an integrated system. This multi-site study protocol builds on a prior knowledge synthesis completed by two of the primary investigators which identified 10 key principles that collectively support health system integration. The aim is to answer two research questions: What are appropriate indicators for each of the 10 key integration principles developed in our previous knowledge synthesis and what measurement tools are used to measure these indicators? To enhance generalizability of the findings, a partnership between Canada and Brazil was created as health system integration is a priority in both countries and they share similar contexts. This knowledge synthesis will follow an iterative scoping review process with emerging information from knowledge-user engagement leading to the refinement of research questions and study selection. This paper describes the methods for each phase of the study. Research questions were developed with stakeholder input. Indicator identification and prioritization will utilize a modified Delphi method and patient/user focus groups. Based on priority indicators, a search of the literature will be completed and studies screened for inclusion. Quality appraisal of relevant studies will be completed prior to data extraction. Results will be used to develop recommendations and key messages to be presented through integrated and end-of-grant knowledge translation strategies with researchers and knowledge-users from the three jurisdictions. This project will directly benefit policy and decision-makers by providing an easy

  5. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  6. Lotus Base: An integrated information portal for the model legume Lotus japonicus.

    Science.gov (United States)

    Mun, Terry; Bachmann, Asger; Gupta, Vikas; Stougaard, Jens; Andersen, Stig U

    2016-12-23

    Lotus japonicus is a well-characterized model legume widely used in the study of plant-microbe interactions. However, datasets from various Lotus studies are poorly integrated and lack interoperability. We recognize the need for a comprehensive repository that allows comprehensive and dynamic exploration of Lotus genomic and transcriptomic data. Equally important are user-friendly in-browser tools designed for data visualization and interpretation. Here, we present Lotus Base, which opens to the research community a large, established LORE1 insertion mutant population containing in excess of 120,000 lines, and serves the end-user tightly integrated data from Lotus, such as the reference genome, annotated proteins, and expression profiling data. We report the integration of expression data from the L. japonicus gene expression atlas project, and the development of tools to cluster and export such data, allowing users to construct, visualize, and annotate co-expression gene networks. Lotus Base takes advantage of modern advances in browser technology to deliver powerful data interpretation for biologists. Its modular construction and publicly available application programming interface enable developers to tap into the wealth of integrated Lotus data. Lotus Base is freely accessible at: https://lotus.au.dk.

  7. Non-integrated electricity suppliers: the failure of an organisational model

    International Nuclear Information System (INIS)

    Boroumand, R.H.

    2009-01-01

    In the reference model of market liberalization, the reference business model is the pure electricity retailer. But bankruptcy, merger or vertical integration are indicative of the failure of this organizational model and its incapacity to manage efficiently the combination of sourcing and market risks in a setting of fierce price competition. Because of the structural dimension of electricity's volume risk, a supplier's level of risk exposure is unknown ex ante and will only be revealed ex post when consumption is known. Sourcing and selling portfolios of hedging contracts are incomplete risk management tools. Consequently, physical hedging is an essential complement to portfolios of contracts to overcome the pure supplier's curse. (author)

  8. Progress in integrated energy-economy-environment model system development

    International Nuclear Information System (INIS)

    Yasukawa, Shigeru; Mankin, Shuichi; Sato, Osamu; Tadokoro, Yoshihiro; Nakano, Yasuyuki; Nagano, Takao

    1987-11-01

    The Integrated Energy-Economy-Environment Model System has been developed to provide analytical tools for system analysis and technology assessment in the field of nuclear research and development. This model system consists of the following four model groups. The first model block installs 5 models and can serve to analyze and generate long-term scenarios of economy-energy-environment evolution. The second model block installs 2 models and can serve to analyze structural transition phenomena in energy-economy-environment interactions. The third model block installs 2 models and can handle the power reactor installation strategy problem and long-term fuel cycle analysis. The fourth model block installs 5 models and codes and can treat cost-benefit-risk analyses and assessments. This report describes mainly the progress and the outlines of application of the model system in the years since the first report on the research and development of the model system (JAERI-M 84 - 139). (author)

  9. Visual-haptic integration with pliers and tongs: signal ‘weights’ take account of changes in haptic sensitivity caused by different tools

    Directory of Open Access Journals (Sweden)

    Chie eTakahashi

    2014-02-01

    Full Text Available When we hold an object while looking at it, estimates from visual and haptic cues to size are combined in a statistically optimal fashion, whereby the 'weight' given to each signal reflects their relative reliabilities. This allows object properties to be estimated more precisely than would otherwise be possible. Tools such as pliers and tongs systematically perturb the mapping between object size and the hand opening. This could complicate visual-haptic integration because it may alter the reliability of the haptic signal, thereby disrupting the determination of appropriate signal weights. To investigate this we first measured the reliability of haptic size estimates made with virtual pliers-like tools (created using a stereoscopic display and force-feedback robots) with different 'gains' between hand opening and object size. Haptic reliability in tool use was straightforwardly determined by a combination of sensitivity to changes in hand opening and the effects of tool geometry. The precise pattern of sensitivity to hand opening, which violated Weber's law, meant that haptic reliability changed with tool gain. We then examined whether the visuo-motor system accounts for these reliability changes. We measured the weight given to visual and haptic stimuli when both were available, again with different tool gains, by measuring the perceived size of stimuli in which visual and haptic sizes were varied independently. The weight given to each sensory cue changed with tool gain in a manner that closely resembled the predictions of optimal sensory integration. The results are consistent with the idea that different tool geometries are modelled by the brain, allowing it to calculate not only the distal properties of objects felt with tools, but also the certainty with which those properties are known. These findings highlight the flexibility of human sensory integration and tool-use, and potentially provide an approach for optimising the…
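    The reliability-weighted combination rule referred to above can be written in a few lines: the sketch below combines a visual and a haptic size estimate with inverse-variance weights and shows how a tool gain that rescales the hand opening changes the haptic weight. All numerical values, and the assumption that haptic noise is roughly constant in hand-opening space, are illustrative and are not taken from the experiment.

    ```python
    # Sketch of statistically optimal (reliability-weighted) visual-haptic size
    # integration; numbers are illustrative assumptions.
    import numpy as np

    def integrate(visual_size, haptic_size, sigma_v, sigma_h):
        """Combine two size estimates; weights are inverse variances (reliabilities)."""
        w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_h**2)
        w_h = 1 - w_v
        combined = w_v * visual_size + w_h * haptic_size
        # The combined variance is smaller than either single-cue variance.
        sigma_c = np.sqrt(1 / (1 / sigma_v**2 + 1 / sigma_h**2))
        return combined, w_v, sigma_c

    # A tool gain of 2 doubles the hand opening per unit object size; if haptic
    # noise is roughly constant in hand-opening space, haptic reliability in
    # object-size space improves, shifting weight toward the haptic estimate.
    gain = 2.0
    sigma_hand = 4.0                      # haptic noise in hand-opening units
    size, w_v, sigma = integrate(visual_size=50.0, haptic_size=54.0,
                                 sigma_v=3.0, sigma_h=sigma_hand / gain)
    print(f"combined size {size:.1f} mm, visual weight {w_v:.2f}")
    ```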

  10. Integrating Social Networking Tools into ESL Writing Classroom: Strengths and Weaknesses

    Science.gov (United States)

    Yunus, Melor Md; Salehi, Hadi; Chenzi, Chen

    2012-01-01

    With the rapid development of world and technology, English learning has become more important. Teachers frequently use teacher-centered pedagogy that leads to lack of interaction with students. This paper aims to investigate the advantages and disadvantages of integrating social networking tools into ESL writing classroom and discuss the ways to…

  11. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for a random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
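    As a small stand-in for one of the evaluations listed above, the sketch below computes a Bode frequency response for an arbitrary second-order plant using open tools (NumPy/SciPy) rather than MATRIXx; the plant and the reported summary statistic are illustrative only.

    ```python
    # Illustrative Bode frequency response computation; not the MATRIXx package.
    import numpy as np
    from scipy import signal

    # G(s) = 1 / (s^2 + 0.4 s + 1): a lightly damped second-order plant.
    plant = signal.TransferFunction([1.0], [1.0, 0.4, 1.0])
    w, mag_db, phase_deg = signal.bode(plant, w=np.logspace(-1, 1, 200))

    # Summarize the response: peak magnitude and the frequency where it occurs.
    peak = mag_db.max()
    w_peak = w[mag_db.argmax()]
    print(f"resonant peak {peak:.1f} dB at {w_peak:.2f} rad/s")
    ```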

  12. SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING AND RISK ASSESSMENT (SLIDE PRESENTATION)

    Science.gov (United States)

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  13. Transparent Model Transformation: Turning Your Favourite Model Editor into a Transformation Tool

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald; Strüber, Daniel

    2015-01-01

    Current model transformation languages are supported by dedicated editors, often closely coupled to a single execution engine. We introduce Transparent Model Transformation, a paradigm enabling modelers to specify transformations using a familiar tool: their model editor. We also present VMTL, th...... model transformation tool sharing the model editor’s benefits, transparently....

  14. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
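    The calibration idea behind AnalyzeHOLE can be caricatured in a few lines: adjust per-interval hydraulic conductivities so that a forward model of cumulative borehole flow matches the measured flow log in a least-squares sense. The sketch below uses a trivial proportional-inflow forward model and SciPy's least_squares in place of the MODFLOW/PEST machinery; interval thicknesses, flows and the forward model itself are illustrative assumptions.

    ```python
    # Greatly simplified stand-in for flow-log calibration: estimate per-interval
    # hydraulic conductivities from a cumulative flow log by least squares.
    import numpy as np
    from scipy.optimize import least_squares

    thickness = np.array([5.0, 10.0, 5.0, 20.0])     # m, per screened interval
    measured_flow = np.array([0.8, 2.6, 3.0, 4.0])   # L/s, cumulative up the hole

    def forward(log_k):
        """Cumulative inflow assuming inflow proportional to K * thickness."""
        k = np.exp(log_k)
        inflow = k * thickness
        q_total = measured_flow[-1]
        return q_total * np.cumsum(inflow) / inflow.sum()

    def residuals(log_k):
        return forward(log_k) - measured_flow

    fit = least_squares(residuals, x0=np.zeros(len(thickness)))
    # Only relative K values are identifiable here; an absolute scale would need
    # an independent constraint such as a single-well transmissivity estimate.
    print("estimated K per interval (relative):", np.round(np.exp(fit.x), 3))
    ```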

  15. Process-Based Quality (PBQ) Tools Development; TOPICAL

    International Nuclear Information System (INIS)

    Cummins, J.L.

    2001-01-01

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts

  16. Improved efficiency in clinical workflow of reporting measured oncology lesions via PACS-integrated lesion tracking tool.

    Science.gov (United States)

    Sevenster, Merlijn; Travis, Adam R; Ganesh, Rajiv K; Liu, Peng; Kose, Ursula; Peters, Joost; Chang, Paul J

    2015-03-01

    OBJECTIVE. Imaging provides evidence for the response to oncology treatment by the serial measurement of reference lesions. Unfortunately, the identification, comparison, measurement, and documentation of several reference lesions can be an inefficient process. We tested the hypothesis that optimized workflow orchestration and tight integration of a lesion tracking tool into the PACS and speech recognition system can result in improvements in oncologic lesion measurement efficiency. SUBJECTS AND METHODS. A lesion management tool tightly integrated into the PACS workflow was developed. We evaluated the effect of the use of the tool on measurement reporting time by means of a prospective time-motion study on 86 body CT examinations with 241 measurable oncologic lesions with four radiologists. RESULTS. Aggregated measurement reporting time per lesion was 11.64 seconds in standard workflow, 16.67 seconds if readers had to register measurements de novo, and 6.36 seconds for each subsequent follow-up study. Differences were statistically significant (p < …). CONCLUSION. Measurement reporting was more efficient with the workflow-integrated lesion management tool, especially for patients with multiple follow-up examinations, reversing the one-time efficiency penalty at baseline registration.

  17. TREXMO: A Translation Tool to Support the Use of Regulatory Occupational Exposure Models.

    Science.gov (United States)

    Savic, Nenad; Racordon, Dimitri; Buchs, Didier; Gasic, Bojan; Vernez, David

    2016-10-01

    Occupational exposure models vary significantly in their complexity, purpose, and the level of expertise required from the user. Different parameters in the same model may lead to different exposure estimates for the same exposure situation. This paper presents a tool developed to deal with this concern: TREXMO, or TRanslation of EXposure MOdels. TREXMO integrates six commonly used occupational exposure models, namely ART v.1.5, STOFFENMANAGER(®) v.5.1, ECETOC TRA v.3, MEASE v.1.02.01, EMKG-EXPO-TOOL, and EASE v.2.0. By enabling a semi-automatic translation between the parameters of these six models, TREXMO facilitates their simultaneous use. For a given exposure situation, defined by a set of parameters in one of the models, TREXMO provides the user with the most appropriate parameters to use in the other exposure models. Results showed that, once an exposure situation and parameters were set in ART, TREXMO reduced the number of possible outcomes in the other models by 1-4 orders of magnitude. The tool should reduce uncertain entry or selection of parameters in the six models, improve between-user reliability, and reduce the time required for running several models for a given exposure situation. In addition to these advantages, registrants of chemicals and authorities should benefit from more reliable exposure estimates for the risk characterization of dangerous chemicals under the Registration, Evaluation, Authorisation and restriction of CHemicals (REACH) regulation. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  18. Mechanisms for integration of information models across related domains

    Science.gov (United States)

    Atkinson, Rob

    2010-05-01

    It is well recognised that there are opportunities and challenges in cross-disciplinary data integration. A significant barrier, however, is creating a conceptual model of the combined domains and the area of integration. For example, a groundwater domain application may require information from several related domains: geology, hydrology, water policy, etc. Each domain may have its own data holdings and conceptual models, but these will share various common concepts (e.g. the concept of an aquifer). These areas of semantic overlap present significant challenges: firstly to choose a single representation (model) of a concept that appears in multiple disparate models, then to harmonise the other models with that single representation. In addition, models may exist at different levels of abstraction depending on how closely aligned they are with a particular implementation. This makes it hard for modellers in one domain to introduce elements from another domain without either introducing a specific style of implementation or, conversely, dealing with a set of abstract patterns that are hard to integrate with existing implementations. Models are easier to integrate if they are broken down into small units, with common concepts implemented using common models from well-known and predictably managed shared libraries. This vision, however, requires development of a set of mechanisms (tools and procedures) for implementing and exploiting libraries of model components. These mechanisms need to handle publication, discovery, subscription, versioning and implementation of models in different forms. In this presentation a coherent suite of such mechanisms is proposed, using a scenario based on re-use of geosciences models. This approach forms the basis of a comprehensive strategy to empower domain modellers to create more interoperable systems. The strategy addresses a range of concerns and practice, and includes methodologies, an accessible toolkit, improvements to available

  19. An integrated audio-visual impact tool for wind turbine installations

    International Nuclear Information System (INIS)

    Lymberopoulos, N.; Belessis, M.; Wood, M.; Voutsinas, S.

    1996-01-01

    An integrated software tool was developed for the design of wind parks that takes into account their visual and audio impact. The application is built on a powerful hardware platform and is fully operated through a graphical user interface. The topography, the wind turbines and the daylight conditions are realised digitally. The wind park can be animated in real time and the user can take virtual walks in it while the set-up of the park can be altered interactively. In parallel, the wind speed levels on the terrain, the emitted noise intensity, the annual energy output and the cash flow can be estimated at any stage of the session and prompt the user for rearrangements. The tool has been used to visually simulate existing wind parks in St. Breok, UK and on Andros Island, Greece. The results lead to the conclusion that such a tool can assist the public acceptance and licensing procedures for wind parks. (author)

  20. EVALUATION OF THE GRAI INTEGRATED METHODOLOGY AND THE IMAGIM SUPPORTWARE

    Directory of Open Access Journals (Sweden)

    J.M.C. Reid

    2012-01-01

    Full Text Available This paper describes the GRAI Integrated Methodology and identifies the need for computer tools to support enterprise modelling, design and integration. The IMAGIM tool is then evaluated in terms of its ability to support the GRAI Integrated Methodology. The GRAI Integrated Methodology is an enterprise integration methodology developed to support the design of CIM systems. It consists of the GRAI model and a structured approach. The latest addition to the methodology is the IMAGIM software tool, developed by the GRAI research group for the specific purpose of supporting the methodology.

  1. Opportunities for Integrated Landscape Planning – the Broker, the Arena, the Tool

    Directory of Open Access Journals (Sweden)

    Julia Carlsson

    2017-12-01

    Full Text Available As an integrated social and ecological system, the forest landscape includes multiple values. The need for a landscape approach in land use planning is being increasingly advocated in research, policy and practice. This paper explores how institutional conditions in the forest policy and management sector can be developed to meet demands for a multifunctional landscape perspective. Departing from obstacles recognised in the collaborative planning literature, we build an analytical framework which is operationalised in a Swedish context at municipal level. Our case illustrating this is Vilhelmina Model Forest, where actual barriers and opportunities for a multiple-value landscape approach are identified through 32 semi-structured interviews displaying stakeholders' views on forest values, ownership rights and willingness to consider multiple values, forest policy and management premises, and collaboration. As an opportunity to overcome the barriers, we suggest and discuss three key components by which an integrated landscape planning approach could be realized in forest management planning: the need for a landscape coordinator (broker), the need for a collaborative forum (arena), and the development of the existing forest management plan into an advanced multifunctional landscape plan (tool).

  2. Analyzing Unsaturated Flow Patterns in Fractured Rock Using an Integrated Modeling Approach

    International Nuclear Information System (INIS)

    Y.S. Wu; G. Lu; K. Zhang; L. Pan; G.S. Bodvarsson

    2006-01-01

    Characterizing percolation patterns in unsaturated fractured rock has posed a greater challenge to modeling investigations than comparable saturated zone studies, because of the heterogeneous nature of unsaturated media and the great number of variables impacting unsaturated flow. This paper presents an integrated modeling methodology for quantitatively characterizing percolation patterns in the unsaturated zone of Yucca Mountain, Nevada, a proposed underground repository site for storing high-level radioactive waste. The modeling approach integrates a wide variety of moisture, pneumatic, thermal, and isotopic geochemical field data into a comprehensive three-dimensional numerical model for modeling analyses. It takes into account the coupled processes of fluid and heat flow and chemical isotopic transport in Yucca Mountain's highly heterogeneous, unsaturated fractured tuffs. Modeling results are examined against different types of field-measured data and then used to evaluate different hydrogeological conceptualizations and their results of flow patterns in the unsaturated zone. In particular, this model provides a much clearer understanding of percolation patterns and flow behavior through the unsaturated zone, both crucial issues in assessing repository performance. The integrated approach for quantifying Yucca Mountain's flow system is demonstrated to provide a practical modeling tool for characterizing flow and transport processes in complex subsurface systems

  3. Integration agent-based models and GIS as a virtual urban dynamic laboratory

    Science.gov (United States)

    Chen, Peng; Liu, Miaolong

    2007-06-01

    Based on the agent-based model and a spatial data model, a tight-coupling method for integrating GIS and agent-based models (ABM) is discussed in this paper. The use of object orientation for both spatial data and spatial process models facilitates their integration, which allows exploration and explanation of spatial-temporal phenomena such as urban dynamics. In order to better understand how tight coupling might proceed, and to evaluate the possible functional and efficiency gains from such a coupling, the agent-based model and the spatial data model are discussed, followed by the relationships governing the interaction between spatial data models and agent-based process models. After that, a realistic crowd flow simulation experiment is presented. Using tools provided by general GIS systems and a few specific programming languages, a new software system integrating GIS and MAS as a virtual laboratory for simulating pedestrian flows in a crowd activity centre has been developed. Under the environment supported by this software system, as an applicable case, the dynamic evolution of pedestrian flows (the dispersal of spectators) in a crowd activity centre, the Shanghai Stadium, has been simulated successfully. At the end of the paper, some new research problems are pointed out for future work.
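    A toy version of the kind of crowd dispersal simulation described above is sketched below: agents on a grid move one cell per step toward the nearest exit until all have left. The geometry, movement rule and exit locations are invented for illustration and do not reproduce the Shanghai Stadium model.

    ```python
    # Toy agent-based crowd dispersal: spectators on a grid walk toward exits.
    import random

    WIDTH, HEIGHT = 20, 20
    EXITS = [(0, 10), (19, 10)]

    def step_agent(pos):
        x, y = pos
        # Head for the nearest exit by Manhattan distance.
        ex, ey = min(EXITS, key=lambda e: abs(e[0] - x) + abs(e[1] - y))
        dx = (ex > x) - (ex < x)
        dy = (ey > y) - (ey < y)
        # Move along one axis per step, chosen at random, toward the exit.
        return (x + dx, y) if random.random() < 0.5 else (x, y + dy)

    agents = [(random.randrange(WIDTH), random.randrange(HEIGHT)) for _ in range(200)]
    for t in range(200):
        # Agents standing on an exit cell have left; the rest take one step.
        agents = [step_agent(a) for a in agents if a not in EXITS]
        if not agents:
            print(f"all spectators reached an exit after {t + 1} steps")
            break
    ```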

  4. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    Science.gov (United States)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for their fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation are performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners etc) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and
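    The node/link/institution/engine structure described above can be illustrated with a few generic classes. The sketch below is not the actual Pynsim API: all class names, methods and the allocation rule are assumptions made purely for illustration.

    ```python
    # Generic sketch of a node/institution/engine simulation structure.
    class Node:
        def __init__(self, name, storage=0.0):
            self.name, self.storage = name, storage

        def step(self, t):
            pass  # autonomous per-node behaviour would go here

    class Institution:
        """A grouping of nodes over which a shared decision can be taken."""
        def __init__(self, name, members):
            self.name, self.members = name, members

    class AllocationEngine:
        """An engine acts on (a subset of) the network each timestep."""
        def run(self, institution, t):
            supply = 100.0  # illustrative fixed supply per step
            share = supply / len(institution.members)
            for node in institution.members:
                node.storage += share

    nodes = [Node("city"), Node("farm"), Node("industry")]
    utility = Institution("water_utility", nodes)
    engine = AllocationEngine()

    for t in range(3):                 # simple simulation loop
        for node in nodes:
            node.step(t)
        engine.run(utility, t)

    print({n.name: n.storage for n in nodes})
    ```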

  5. Data, models, and views: towards integration of diverse numerical model components and data sets for scientific and public dissemination

    Science.gov (United States)

    Hofmeister, Richard; Lemmen, Carsten; Nasermoaddeli, Hassan; Klingbeil, Knut; Wirtz, Kai

    2015-04-01

    Data and models for describing coastal systems span a diversity of disciplines, communities, ecosystems, regions and techniques. Previous attempts at unifying data exchange, coupling interfaces, or metadata information have not been successful. We introduce the new Modular System for Shelves and Coasts (MOSSCO, http://www.mossco.de), a novel coupling framework that enables the integration of a diverse array of models and data from different disciplines relating to coastal research. In the MOSSCO concept, the integrating framework imposes very few restrictions on contributed data or models; in fact, no distinction is made between data and models. The few requirements are: (1) coupleability in principle, i.e. access to I/O and timing information in submodels, which has recently been referred to as the Basic Model Interface (BMI); (2) open source/open data access and licencing; and (3) communication of metadata, such as spatiotemporal information, naming conventions, and physical units. These requirements suffice to integrate different models and data sets into the MOSSCO infrastructure and subsequently build a modular integrated modeling tool that can span a diversity of processes and domains. We demonstrate how diverse coastal system constituents were integrated into this modular framework and how we deal with the diverging development of constituent data sets and models at external institutions. Finally, we show results from simulations with the fully coupled system using OGC WebServices in the WiMo geoportal (http://kofserver3.hzg.de/wimo), from where stakeholders can view the simulation results for further dissemination.
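    The coupleability requirement mentioned above amounts to exposing timing and I/O in a uniform way. The sketch below shows a minimal component with BMI-style methods and a trivial coupler that exchanges one variable between two components; the method set is simplified and the physics is a placeholder, so this is an illustration of the idea rather than the MOSSCO or BMI implementation.

    ```python
    # Minimal sketch of a BMI-style coupleable component: timing + I/O access.
    class SimpleComponent:
        def __init__(self, dt, initial_value=0.0):
            self._t, self._dt = 0.0, dt
            self._value = initial_value

        # --- timing information ---------------------------------------------
        def get_current_time(self):
            return self._t

        def get_time_step(self):
            return self._dt

        def update(self):
            self._value += 0.1 * self._dt   # stand-in for the real physics
            self._t += self._dt

        # --- I/O access -------------------------------------------------------
        def get_value(self, name):
            return self._value

        def set_value(self, name, value):
            self._value = value

    # A trivial coupler: advance two components and exchange one variable.
    a = SimpleComponent(dt=1.0)
    b = SimpleComponent(dt=1.0, initial_value=5.0)
    while a.get_current_time() < 3.0:
        a.update()
        b.update()
        b.set_value("forcing", a.get_value("state"))
    print(a.get_value("state"), b.get_value("state"))
    ```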

  6. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Directory of Open Access Journals (Sweden)

    Page Michel

    2009-12-01

    Full Text Available Abstract Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks.

  7. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user/developer to identify the processing field at the top of the sequence and to send to the computing module only the data relevant to the requested result; the remaining data is irrelevant and only slows down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and cost. To do so, the processing sequence must be reviewed and several modeling tools added. Existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases computing time and cost. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  8. Integrated Baseline System (IBS) Version 1.03: Models guide

    Energy Technology Data Exchange (ETDEWEB)

    1993-01-01

    The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the models guide for the IBS and explains how to use the emergency-related computer models. It provides information for the experienced system user and is the primary reference for the computer modeling software supplied with the system. It is designed for emergency managers and planners, and others familiar with the concepts of computer modeling. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available. The IBS manuals reference such documentation where necessary.

  9. Implementing Case Tools in the Intelligent Telecommunication Systems

    OpenAIRE

    Bahador Ghahramani; Azad Azadmanesh

    2003-01-01

    This paper discusses an intelligent, Internet-based Telecommunication System Specification Model (TSSM) using Computer-Aided Systems Engineering (CASE) tools. TSSM implements CASE tools to mechanize its lifecycle development, maintenance and integration process. The model is developed to improve system analysts' (SA) efforts in their design and development of major software and hardware initiatives. It also improves SA effectiveness by guiding them through the system's...

  10. The Role of Integrated Modelling and Assessment for Decision-Making: Lessons from Water Allocation Issues in Australia

    Science.gov (United States)

    Jakeman, A. J.; Guillaume, J. H. A.; El Sawah, S.; Hamilton, S.

    2014-12-01

    Integrated modelling and assessment (IMA) is best regarded as a process that can support environmental decision-making when issues are strongly contested and uncertainties pervasive. To be most useful, the process must be multi-dimensional and phased. Principally, it must be tailored to the problem context to encompass diverse issues of concern, management settings and stakeholders. This in turn requires the integration of multiple processes and components of natural and human systems and their corresponding spatial and temporal scales. Modellers therefore need to be able to integrate multiple disciplines, methods, models, tools and data, and many sources and types of uncertainty. These dimensions are incorporated into iteration between the various phases of the IMA process, including scoping, problem framing and formulation, assessing options and communicating findings. Two case studies in Australia are employed to share the lessons of how integration can be achieved in these IMA phases using a mix of stakeholder participation processes and modelling tools. One case study aims to improve the relevance of modelling by incorporating stakeholders' views of irrigated viticulture and water management decision making. It used a novel methodology with the acronym ICTAM, consisting of Interviews to elicit mental models, Cognitive maps to represent and analyse individual and group mental models, Time-sequence diagrams to chronologically structure the decision-making process, an All-encompassing conceptual model, and computational Models of stakeholder decision making. The second case uses a hydro-economic river network model to examine basin-wide impacts of water allocation cuts and adoption of farm innovations. The knowledge exchange approach used in each case was designed to integrate data and knowledge bearing in mind the contextual dimensions of the problem at hand, and the specific contributions that environmental modelling was thought to be able to make.

  11. Visualization of RNA structure models within the Integrative Genomics Viewer.

    Science.gov (United States)

    Busan, Steven; Weeks, Kevin M

    2017-07-01

    Analyses of the interrelationships between RNA structure and function are increasingly important components of genomic studies. The SHAPE-MaP strategy enables accurate RNA structure probing and realistic structure modeling of kilobase-length noncoding RNAs and mRNAs. Existing tools for visualizing RNA structure models are not suitable for efficient analysis of long, structurally heterogeneous RNAs. In addition, structure models are often advantageously interpreted in the context of other experimental data and gene annotation information, for which few tools currently exist. We have developed a module within the widely used and well supported open-source Integrative Genomics Viewer (IGV) that allows visualization of SHAPE and other chemical probing data, including raw reactivities, data-driven structural entropies, and data-constrained base-pair secondary structure models, in context with linear genomic data tracks. We illustrate the usefulness of visualizing RNA structure in the IGV by exploring structure models for a large viral RNA genome, comparing bacterial mRNA structure in cells with its structure under cell- and protein-free conditions, and comparing a noncoding RNA structure modeled using SHAPE data with a base-pairing model inferred through sequence covariation analysis. © 2017 Busan and Weeks; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  12. Combining the Generic Entity-Attribute-Value Model and Terminological Models into a Common Ontology to Enable Data Integration and Decision Support.

    Science.gov (United States)

    Bouaud, Jacques; Guézennec, Gilles; Séroussi, Brigitte

    2018-01-01

    The integration of clinical information models and termino-ontological models into a unique ontological framework is highly desirable for it facilitates data integration and management using the same formal mechanisms for both data concepts and information model components. This is particularly true for knowledge-based decision support tools that aim to take advantage of all facets of semantic web technologies in merging ontological reasoning, concept classification, and rule-based inferences. We present an ontology template that combines generic data model components with (parts of) existing termino-ontological resources. The approach is developed for the guideline-based decision support module on breast cancer management within the DESIREE European project. The approach is based on the entity-attribute-value model and could be extended to other domains.
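    The generic entity-attribute-value pattern referred to above stores every fact as an (entity, attribute, value) triple, with attributes drawn from a terminology. The sketch below shows the idea in a few lines; the concept codes and values are invented for illustration and are not taken from the DESIREE project.

    ```python
    # Minimal sketch of the entity-attribute-value (EAV) pattern with attributes
    # bound to (hypothetical) terminology codes.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Observation:
        entity: str       # e.g. a patient or case identifier
        attribute: str    # a concept code from a termino-ontological resource
        value: object     # coded value, quantity, or free text

    record = [
        Observation("patient-001", "SCT:0000001", "left breast"),   # invented codes
        Observation("patient-001", "LOINC:0000-2", 22.0),
        Observation("patient-001", "SCT:0000002", "negative"),
    ]

    # Because every fact is an (entity, attribute, value) triple, new attributes
    # can be added without changing the schema, and the same structure can be
    # mapped onto ontology individuals for rule-based decision support.
    status = [o.value for o in record if o.attribute == "SCT:0000002"]
    print(status)
    ```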

  13. Integrated Personal Health Records: Transformative Tools for Consumer-Centric Care

    Directory of Open Access Journals (Sweden)

    Raymond Brian

    2008-10-01

    Full Text Available Abstract Background Integrated personal health records (PHRs) offer significant potential to stimulate transformational changes in health care delivery and self-care by patients. In 2006, an invitational roundtable sponsored by Kaiser Permanente Institute, the American Medical Informatics Association, and the Agency for Healthcare Research and Quality was held to identify the transformative potential of PHRs, as well as barriers to realizing this potential and a framework for action to move them closer to the health care mainstream. This paper highlights and builds on the insights shared during the roundtable. Discussion While there is a spectrum of dominant PHR models (standalone, tethered, integrated), the authors state that only the integrated model has true transformative potential to strengthen consumers' ability to manage their own health care. Integrated PHRs improve the quality, completeness, depth, and accessibility of health information provided by patients; enable facile communication between patients and providers; provide access to health knowledge for patients; ensure portability of medical records and other personal health information; and incorporate auto-population of content. Numerous factors impede widespread adoption of integrated PHRs: obstacles in the health care system/culture; issues of consumer confidence and trust; lack of technical standards for interoperability; lack of HIT infrastructure; the digital divide; uncertain value realization/ROI; and uncertain market demand. Recent efforts have led to progress on standards for integrated PHRs, and government agencies and private companies are offering different models to consumers, but substantial obstacles remain to be addressed. Immediate steps to advance integrated PHRs should include sharing existing knowledge and expanding knowledge about them, building on existing efforts, and continuing dialogue among public and private sector stakeholders. Summary Integrated PHRs

  14. The Innsbruck/ESO sky models and telluric correction tools*

    Directory of Open Access Journals (Sweden)

    Kimeswenger S.

    2015-01-01

    While ground-based astronomical observatories only have to correct for the line-of-sight integral of these effects, Čerenkov telescopes use the atmosphere as the primary detector. The measured radiation originates at lower altitudes and does not pass through the entire atmosphere. Thus, a decent knowledge of the profile of the atmosphere at any time is required. The latter cannot be achieved by photometric measurements of stellar sources. We show here the capabilities of our sky background model and data reduction tools for ground-based optical/infrared telescopes. Furthermore, we discuss the feasibility of monitoring the atmosphere above any observing site, and thus the possible application of the method to Čerenkov telescopes.

  15. Supercritical kinetic analysis in simplified system of fuel debris using integral kinetic model

    International Nuclear Information System (INIS)

    Tuya, Delgersaikhan; Obara, Toru

    2016-01-01

    Highlights: • Kinetic analysis in simplified weakly coupled fuel debris system was performed. • The integral kinetic model was used to simulate criticality accidents. • The fission power and released energy during simulated accident were obtained. • Coupling between debris regions and its effect on the fission power was obtained. - Abstract: Preliminary prompt supercritical kinetic analyses in a simplified coupled system of fuel debris designed to roughly resemble a melted core of a nuclear reactor were performed using an integral kinetic model. The integral kinetic model, which can describe region- and time-dependent fission rate in a coupled system of arbitrary geometry, was used because the fuel debris system is weakly coupled in terms of neutronics. The results revealed some important characteristics of coupled systems, such as the coupling between debris regions and the effect of the coupling on the fission rate and released energy in each debris region during the simulated criticality accident. In brief, this study showed that the integral kinetic model can be applied to supercritical kinetic analysis in fuel debris systems and also that it can be a useful tool for investigating the effect of the coupling on consequences of a supercritical accident.

  16. Integrated modeling approach for optimal management of water, energy and food security nexus

    Science.gov (United States)

    Zhang, Xiaodong; Vesselinov, Velimir V.

    2017-03-01

    Water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will impact WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the trade-offs among various sectors and generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply-demand, and food production, as well as mitigation of environmental impacts. WEFO is demonstrated on a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. The obtained results demonstrate how these types of analyses can help decision-makers and stakeholders make cost-effective decisions for optimal WEF management.
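    The kind of trade-off analysis attributed to WEFO above can be caricatured as a small optimization problem. The sketch below chooses a least-cost electricity mix subject to illustrative water-use and emission limits using SciPy's linear programming routine; all coefficients and constraints are invented for illustration and bear no relation to WEFO's actual formulation.

    ```python
    # Toy water-energy trade-off as a linear program (illustrative numbers only).
    from scipy.optimize import linprog

    # Decision variables: x0 = gas-fired generation, x1 = solar generation (GWh).
    cost = [50.0, 70.0]                     # $/GWh production cost

    # Inequality constraints (A_ub @ x <= b_ub):
    #   water use: 2.0 and 0.1 ML per GWh must not exceed 150 ML
    #   emissions: 0.5 and 0.0 ktCO2 per GWh must not exceed 30 ktCO2
    A_ub = [[2.0, 0.1], [0.5, 0.0]]
    b_ub = [150.0, 30.0]

    # Demand (A_eq @ x == b_eq): total generation must equal 100 GWh.
    A_eq = [[1.0, 1.0]]
    b_eq = [100.0]

    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None), (0, None)])
    print(res.x, res.fun)   # optimal mix and total cost
    ```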

  17. Qualitative Analysis of Integration Adapter Modeling

    OpenAIRE

    Ritter, Daniel; Holzleitner, Manuel

    2015-01-01

    Integration Adapters are a fundamental part of an integration system, since they provide (business) applications access to its messaging channel. However, their modeling and configuration remain under-represented. In previous work, the integration control and data flow syntax and semantics have been expressed in the Business Process Model and Notation (BPMN) as a semantic model for message-based integration, while adapter and the related quality of service modeling were left for further studi...

  18. An EFQM excellence model for integrated healthcare governance.

    Science.gov (United States)

    Favaretti, Carlo; De Pieri, Paolo; Torri, Emanuele; Guarrera, Giovanni; Fontana, Fabrizio; Debiasi, Franco; Flor, Luciano

    2015-01-01

    The purpose of this paper is to account for a ten-year experience with the European Foundation for Quality Management (EFQM) Excellence Model implemented in the Trento Healthcare Trust. Since 2000, the EFQM Excellence Model provided an overarching framework to streamline business process governance, to support and improve its enablers and results. From 2000 to 2009, staff performed four internal (self) and four external EFQM-based assessments that provided guidance for an integrated management system. Over the years, key controls and assurances improved service quality through business planning, learning and practice cycles. Rising assessment ratings and improving results characterized the journey. The average self-assessment score (on a 1,000 points scale) was 290 in 2001, which increased to 610 in 2008. Since 2006, the Trust has been Recognized for Excellence (four stars). The organization improved significantly on customer satisfaction, people results and key service delivery and outcomes. The EFQM Model can act as an effective tool to meet governance demands and promote system-level results. The approach to integrated governance discussed here may support similar change processes in comparable organizations. The paper describes a unique experience when implementing EFQM within a large Italian healthcare system, which had a broader reach and lasted longer than any experience in Italian healthcare.

  19. A parameter optimization tool for evaluating the physical consistency of the plot-scale water budget of the integrated eco-hydrological model GEOtop in complex terrain

    Science.gov (United States)

    Bertoldi, Giacomo; Cordano, Emanuele; Brenner, Johannes; Senoner, Samuel; Della Chiesa, Stefano; Niedrist, Georg

    2017-04-01

    In mountain regions, the plot- and catchment-scale water and energy budgets are controlled by a complex interplay of abiotic (i.e. topography, geology, climate) and biotic (i.e. vegetation, land management) controlling factors. When integrated, physically based eco-hydrological models are used in mountain areas, there is a large number of parameters, topographic settings and boundary conditions that need to be chosen. However, data on soil and land-cover properties are relatively scarce and do not reflect the strong variability at the local scale. For this reason, tools for uncertainty quantification and optimal parameter identification are essential not only to improve model performance, but also to identify the most relevant parameters to be measured in the field and to evaluate the impact of different assumptions for topographic and boundary conditions (surface, lateral and subsurface water and energy fluxes), which are usually unknown. In this contribution, we present the results of a sensitivity analysis exercise for a set of 20 experimental stations located in the Italian Alps, representative of different conditions in terms of topography (elevation, slope, aspect), land use (pastures, meadows, and apple orchards), soil type and groundwater influence. Besides micrometeorological parameters, each station provides soil water content at different depths, and three stations (one for each land cover) also provide eddy covariance fluxes. The aims of this work are: (I) to present an approach for improving the calibration of plot-scale soil moisture and evapotranspiration (ET); (II) to identify the most sensitive parameters and the relevant factors controlling temporal and spatial differences among sites; (III) to identify possible model structural deficiencies or uncertainties in boundary conditions. Simulations have been performed with the GEOtop 2.0 model, which is a physically based, fully distributed integrated eco-hydrological model that has been specifically designed for mountain
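
    As a sketch of the kind of objective function used when calibrating plot-scale soil moisture against station observations, the following compares simulated and observed series with an RMSE score and scans a coarse grid of two hypothetical soil parameters. The toy bucket model and parameter names are assumptions standing in for a GEOtop run, not GEOtop internals.

```python
import numpy as np

def rmse(sim, obs):
    """Root-mean-square error between simulated and observed series."""
    return float(np.sqrt(np.mean((np.asarray(sim) - np.asarray(obs)) ** 2)))

def toy_soil_moisture(porosity, decay, rain):
    """Very simplified soil-moisture bucket model (illustrative stand-in)."""
    theta, out = 0.2, []
    for r in rain:
        theta = min(porosity, theta + r) * (1.0 - decay)
        out.append(theta)
    return np.array(out)

rng = np.random.default_rng(0)
rain = rng.uniform(0, 0.05, 120)
# synthetic "observations" generated with known parameters plus noise
observed = toy_soil_moisture(0.45, 0.02, rain) + rng.normal(0, 0.005, 120)

# coarse grid search over the two hypothetical parameters
best = min(
    ((rmse(toy_soil_moisture(p, d, rain), observed), p, d)
     for p in np.linspace(0.3, 0.6, 7)
     for d in np.linspace(0.005, 0.05, 10)),
)
print(f"best RMSE={best[0]:.4f} at porosity={best[1]:.2f}, decay={best[2]:.3f}")
```

    In a real exercise the grid search would be replaced by a sampling-based sensitivity and calibration method, and the objective could combine soil moisture at several depths with eddy covariance ET.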

  20. The integrated model of innovative processes management in foreign countries

    Directory of Open Access Journals (Sweden)

    M. T. Kurametova

    2017-01-01

    Full Text Available The formation of an innovative economy must correspond to the promising areas of development of scientific, technical and social progress. To ensure sustainable innovative development of the national economy, it is necessary not only to develop our own tools and mechanisms that are characteristic of the domestic management model, but also to make rational use of foreign experience in this field. Analysis of international experience in the use of various tools, mechanisms and management structures for the creation of high-tech and knowledge-based enterprises showed that the integrated nature of innovative development and modernization of the economy is the most sound methodological approach for a phased, systemic transition to new technological structures; when developing tools and mechanisms for innovative development of the economy, one should take into account the actual state of the material and technical base, the existing industrial structure of production, and the real possibilities of using different types of resources. The greatest innovation activity is shown by those countries in which the national integrated system effectively provides favorable conditions for the development and introduction of innovations in various spheres of life. International experience in the use of these forms of governance can be considered a mobile system of relations with the real sector of the economy. The article presents the experience of foreign countries and examples of adapting integrated models of innovative process management for Kazakhstan, in order to create high-tech enterprises whose innovative products can be competitive in the world market. The author highlights the role of JSC “Kazakhtelecom”, which has the status of a national operator and provides a wide range of public services, including long-distance and international telecommunication services for telecommunication networks in general

  1. Planning for Integrating Teaching Technologies

    Directory of Open Access Journals (Sweden)

    Mandie Aaron

    2004-06-01

    Full Text Available Teaching technologies offer pedagogical advantages which vary with specific contexts. Successfully integrating them hinges on clearly identifying pedagogical goals, then planning for the many decisions that technological change demands. In examining different ways of organizing this process, we have applied planning tools from other domains - Fault Tree Analysis and Capability Maturity Modeling - at the school and college levels. In another approach, we have examined attempts to broadly model the integration process at the university level. Our studies demonstrate that the use of a variety of tools and techniques can render the integration of teaching technologies more systematic.

  2. INTEGRATED SPEED ESTIMATION MODEL FOR MULTILANE EXPRESSWAYS

    Science.gov (United States)

    Hong, Sungjoon; Oguchi, Takashi

    In this paper, an integrated speed-estimation model is developed based on empirical analyses for the basic sections of intercity multilane expressways under the uncongested condition. This model enables a speed estimation for each lane at any site under arbitrary highway-alignment, traffic (traffic flow and truck percentage), and rainfall conditions. By combining this model and a lane-use model which estimates traffic distribution on the lanes by each vehicle type, it is also possible to estimate an average speed across all the lanes of one direction from a traffic demand by vehicle type under specific highway-alignment and rainfall conditions. This model is expected to be a tool for the evaluation of traffic performance for expressways when the performance measure is travel speed, which is necessary for Performance-Oriented Highway Planning and Design. Regarding the highway-alignment condition, two new estimators, called effective horizontal curvature and effective vertical grade, are proposed in this paper which take into account the influence of upstream and downstream alignment conditions. They are applied to the speed-estimation model, and this shows increased accuracy of the estimation.
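
    The combination of a per-lane speed model with a lane-use model amounts to a flow-weighted average across lanes. The sketch below uses an illustrative linear lane-speed function of effective curvature, effective grade, lane flow, truck percentage and rainfall; the functional form, units and coefficients are assumptions for illustration, not the paper's calibrated model.

```python
def lane_speed(curvature, grade, flow, truck_pct, rain_mm):
    """Illustrative linear lane-speed model [km/h]; coefficients are made up."""
    return (95.0
            - 1500.0 * curvature      # effective horizontal curvature [1/m]
            - 1.2 * grade             # effective vertical grade [%]
            - 0.010 * flow            # lane flow [veh/h]
            - 0.15 * truck_pct        # truck percentage [%]
            - 0.30 * rain_mm)         # rainfall intensity [mm/h]

# lane-use model output: share of directional demand assigned to each lane
lane_shares = [0.45, 0.35, 0.20]
lane_flows = [1800 * s for s in lane_shares]    # total demand 1800 veh/h

speeds = [lane_speed(0.002, 2.0, q, 10.0, 0.0) for q in lane_flows]
avg_speed = sum(s * v for s, v in zip(lane_shares, speeds))
print([round(v, 1) for v in speeds], round(avg_speed, 1))
```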

  3. Decision-Making for Supply Chain Integration Supply Chain Integration

    CERN Document Server

    Lettice, Fiona; Durowoju, Olatunde

    2012-01-01

    Effective supply chain integration, and the tight co-ordination it creates, is an essential pre-requisite for successful supply chain management. Decision-Making for Supply Chain Integration is a practical reference on recent research in the area of supply chain integration focusing on distributed decision-making problems. Recent applications of various decision-making tools for integrating supply chains are covered, including chapters focusing on: •Supplier selection, pricing strategy and inventory decisions in multi-level supply chains, •RFID-enabled distributed decision-making, •Operational risk issues and time-critical decision-making for sensitive logistics nodes, •Modelling end-to-end processes to improve supply chain integration, and •Integrated systems to improve service delivery and optimize resource use. Decision-Making for Supply Chain Integration provides an insight into the tools and methodologies of this field with support from real-life case studies demonstrating successful application ...

  4. CMS Partial Releases Model, Tools, and Applications. Online and Framework-Light Releases

    CERN Document Server

    Jones, Christopher D; Meschi, Emilio; Shahzad Muzaffar; Andreas Pfeiffer; Ratnikova, Natalia; Sexton-Kennedy, Elizabeth

    2009-01-01

    The CMS Software project CMSSW embraces more than a thousand packages organized in subsystems for analysis, event display, reconstruction, simulation, detector description, data formats, framework, utilities and tools. The release integration process is highly automated by using tools developed or adopted by CMS. Packaging in rpm format is a built-in step in the software build process. For several well-defined applications it is highly desirable to have only a subset of the CMSSW full package bundle. For example, High Level Trigger algorithms that run on the Online farm, and need to be rebuilt in a special way, require no simulation, event display, or analysis packages. Physics analysis applications in Root environment require only a few core libraries and the description of CMS specific data formats. We present a model of CMS Partial Releases, used for preparation of the customized CMS software builds, including description of the tools used, the implementation, and how we deal with technical challenges, suc...

  5. Integrability of the Rabi Model

    International Nuclear Information System (INIS)

    Braak, D.

    2011-01-01

    The Rabi model is a paradigm for interacting quantum systems. It couples a bosonic mode to the smallest possible quantum model, a two-level system. I present the analytical solution which allows us to consider the question of integrability for quantum systems that do not possess a classical limit. A criterion for quantum integrability is proposed which shows that the Rabi model is integrable due to the presence of a discrete symmetry. Moreover, I introduce a generalization with no symmetries; the generalized Rabi model is the first example of a nonintegrable but exactly solvable system.

  6. A review on the integration of artificial intelligence into coastal modeling.

    Science.gov (United States)

    Chau, Kwokwing

    2006-07-01

    With the development of computing technology, mechanistic models are often employed to simulate processes in coastal environments. However, these predictive tools are inevitably highly specialized, involving certain assumptions and/or limitations, and can be manipulated only by experienced engineers who have a thorough understanding of the underlying theories. This results in significant constraints on their manipulation as well as large gaps in understanding and expectations between the developers and practitioners of a model. The recent advancements in artificial intelligence (AI) technologies are making it possible to integrate machine learning capabilities into numerical modeling systems in order to bridge the gaps and lessen the demands on human experts. The objective of this paper is to review the state-of-the-art in the integration of different AI technologies into coastal modeling. The algorithms and methods studied include knowledge-based systems, genetic algorithms, artificial neural networks, and fuzzy inference systems. More focus is given to knowledge-based systems, which have apparent advantages over the others in allowing more transparent transfers of knowledge in the use of models and in furnishing the intelligent manipulation of calibration parameters. Of course, the other AI methods also have their individual contributions towards accurate and reliable predictions of coastal processes. The integrated model might be very powerful, since the advantages of each technique can be combined.

  7. First Steps Towards AN Integrated Citygml-Based 3d Model of Vienna

    Science.gov (United States)

    Agugiaro, G.

    2016-06-01

    This paper presents and discusses the results regarding the initial steps (selection, analysis, preparation and eventual integration of a number of datasets) for the creation of an integrated, semantic, three-dimensional, CityGML-based virtual model of the city of Vienna. CityGML is an international standard conceived specifically as an information and data model for semantic city models at urban and territorial scale. It is being adopted by more and more cities all over the world. The work described in this paper is embedded within the European Marie-Curie ITN project "Ci-nergy, Smart cities with sustainable energy systems", which aims, among other things, at developing urban decision-making and operational optimisation software tools to minimise non-renewable energy use in cities. Given the scope and scale of the project, it is therefore vital to set up a common, unique and spatio-semantically coherent urban model to be used as an information hub for all applications being developed. This paper reports on the experiences made so far: it describes the test area and the available data sources, shows and exemplifies the data integration issues, and presents the strategies developed to solve them in order to obtain the integrated 3D city model. The first results, as well as some comments about their quality and limitations, are presented, together with a discussion of the next steps and some planned improvements.

  8. The Integration of Digital Tools during Strategic and Interactive Writing Instruction

    Science.gov (United States)

    Kilpatrick, Jennifer Renée; Saulsburry, Rachel; Dostal, Hannah M.; Wolbers, Kimberly A.; Graham, Steve

    2014-01-01

    The purpose of this chapter is to gain insight from the ways a group of elementary teachers of the deaf and hard of hearing chose to integrate digital tools into evidence-based writing instruction and the ways these technologies were used to support student learning. After professional development that exposed these teachers to twelve new digital…

  9. Methodological Bases for Describing Risks of the Enterprise Business Model in Integrated Reporting

    Directory of Open Access Journals (Sweden)

    Nesterenko Oksana O.

    2017-12-01

    Full Text Available The aim of the article is to substantiate the methodological bases for describing the business and accounting risks of an enterprise business model in integrated reporting for their timely detection and assessment, and to develop methods for their leveling or minimization and possible prevention. It is proposed to consider risks in the process of forming integrated reporting from two sides: first, risks that arise in the business model of an organization and should be disclosed in its integrated report; second, accounting risks of integrated reporting, which should be taken into account by members of the cross-sectoral working group and management personnel in the process of forming and promulgating integrated reporting. To develop an adequate accounting and analytical tool for disclosing information about the risks of the business model and integrated reporting, and for their leveling or minimization, the article carries out a terminological analysis of the essence of entrepreneurial and accounting risks. Entrepreneurial risk is defined as an objective-subjective economic category that characterizes the probability of negative or positive consequences of economic-social-ecological activity within the framework of the business model of an enterprise under uncertainty. Accounting risk is suggested to be understood as the probability of unfavorable consequences resulting from organizational and methodological errors in the integrated accounting system, which pose a threat to the quality, accuracy and reliability of the reporting information on economic, social and environmental activities in integrated reporting, as well as a threat of inappropriate decision-making by stakeholders based on the integrated report. For the timely identification of business risks and maximum leveling of the influence of accounting risks on the process of formation and publication of integrated reporting, the study identifies the place of entrepreneurial and accounting risks in

  10. Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools

    Science.gov (United States)

    Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.

    2011-12-01

    Currently available, and soon-to-be available, sophisticated 3D models of particle acceleration and transport in solar flares require a new level of user-friendly visualization and analysis tools allowing quick and easy adjustment of the model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc). We report the current state of the art of these tools, which are in development and have already proved highly efficient for direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. Various tools provided allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating in user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/nonthermal particle distribution models. By default, the application integrates IDL-callable DLL and shared libraries containing fast GS emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows interchanging these default libraries with any user-defined IDL or external callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the tools' capacity and generality, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by the NORH and RHESSI instruments. We discuss further anticipated developments of the tools needed to accommodate

  11. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    Science.gov (United States)

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  12. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    Full Text Available The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  13. BEopt-CA (Ex): A Tool for Optimal Integration of EE, DR and PV in Existing California Homes

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, Craig [National Renewable Energy Lab. (NREL), Golden, CO (United States); Horowitz, Scott [National Renewable Energy Lab. (NREL), Golden, CO (United States); Maguire, Jeff [National Renewable Energy Lab. (NREL), Golden, CO (United States); Velasco, Paulo Tabrares [National Renewable Energy Lab. (NREL), Golden, CO (United States); Springer, David [Davis Energy Group, Davis, CA (United States); Coates, Peter [Davis Energy Group, Davis, CA (United States); Bell, Christy [Davis Energy Group, Davis, CA (United States); Price, Snuller [Energy & Environmental Economics, San Francisco, CA (United States); Sreedharan, Priya [Energy & Environmental Economics, San Francisco, CA (United States); Pickrell, Katie [Energy & Environmental Economics, San Francisco, CA (United States)

    2014-04-01

    This project targeted the development of a software tool, BEopt-CA (Ex) (Building Energy Optimization Tool for California Existing Homes), that aims to facilitate balanced integration of energy efficiency (EE), demand response (DR), and photovoltaics (PV) in the residential retrofit market. The intent is to provide utility program managers and contractors in the EE/DR/PV marketplace with a means of balancing the integration of EE, DR, and PV

  14. COMSY- A Software Tool For Aging And Plant Life Management With An Integrated Documentation Tool

    International Nuclear Information System (INIS)

    Baier, Roman; Zander, Andre

    2008-01-01

    For aging and plant life management, the integrity of mechanical components and structures is one of the key objectives. In order to ensure this integrity it is essential to implement comprehensive aging management. This should be applied to all safety-relevant mechanical systems or components, civil structures, electrical systems, as well as instrumentation and control (I and C). The following aspects should be covered: - Identification and assessment of relevant degradation mechanisms; - Verification and evaluation of the quality status of all safety-relevant systems, structures and components (SSCs); - Verification and modernization of I and C and electrical systems; - Reliable and up-to-date documentation. To support this, AREVA NP GmbH has developed the computer program COMSY, which draws on more than 30 years of experience resulting from research activities and operational experience. The program provides the option to perform a plant-wide screening for identifying system areas which are sensitive to specific degradation mechanisms. Another objective is the administration and evaluation of NDE measurements from different techniques. An integrated documentation tool makes document management and maintenance fast, reliable and independent of individual staff. (authors)

  15. OMNIITOX - operational life-cycle impact assessment models and information tools for practitioners

    DEFF Research Database (Denmark)

    Molander, S; Lidholm, Peter; Schowanek, Diederik

    2004-01-01

    of the characterisation model(s) and limited input data on chemical properties, which has often resulted in the omission of toxicants from the LCIA, or at best a focus on well-characterised chemicals. The project addresses both problems and integrates models, as well as data, in an information system – the OMNIITOX IS. There is also a need for clarification of the relations between the (environmental) risk assessments of toxicants and LCIA, in addition to investigating the feasibility of introducing LCA into European chemicals legislation, tasks that were also addressed in the project. This article is the preamble to a set of articles describing initial results from an on-going European Commission funded, 5th Framework project called OMNIITOX, Operational Models aNd Information tools for Industrial applications of eco/TOXicological impact assessments. The different parts

  16. The Venetian Ghetto: Semantic Modelling for an Integrated Analysis

    Directory of Open Access Journals (Sweden)

    Alessandra Ferrighi

    2017-12-01

    Full Text Available In the digital era, historians are embracing information technology as a research tool. New technologies offer investigation and interpretation, synthesis and communication tools that are more effective than the more traditional study methods, as they guarantee a multidisciplinary approach and the integration of analyses. Among the available technologies, the best suited to the study of urban phenomena are databases (DB), Geographic Information Systems (GIS), Building Information Modelling (BIM) and multimedia tools (video, apps) for the dissemination of results. The case study described here concerns the analysis of the part of Venice that changed its appearance from 1516 onwards, with the creation of the Jewish Ghetto. This was an event that would have repercussions throughout Europe, changing the course of history. Our research confirms that the exclusive use of one of the systems mentioned above (DB, GIS, BIM) makes it possible to manage the complexity of the subject matter only partially. Consequently, it became necessary to analyse the possible interactions between such tools, so as to create a link between an alphanumeric DB and a geographical DB. The use of GIS and BIM alone, which provide 4D time management of objects, turned out to be able to manage information and geometry in an effective and scalable way, providing a starting point for the in-depth mapping of the historical analysis. Software products for digital modelling have changed in nature over time, going from simple viewing tools to simulation tools. The reconstruction of the time phases of the three Ghettos (Nuovo, Vecchio, and Nuovissimo) and their visualisation through digital narratives of the history of that specific area of the city, for instance through videos, is making it possible for an increasing number of scholars and the general public to access the results of the study.

  17. A note on domains of discourse. Logical know-how for integrated environmental modelling

    Energy Technology Data Exchange (ETDEWEB)

    Gerstengarbe, F.W. (ed.); Jaeger, C.C.

    2003-10-01

    Building computer models means implementing a mathematical structure on a piece of hardware in such a way that insights about some other phenomenon can be gained, remembered and communicated. For meaningful computer modelling, the phenomenon to be modelled must be described in a logically coherent way. This can be quite difficult, especially when a combination of highly heterogeneous scientific disciplines is needed, as is often the case in environmental research. The paper shows how the notion of a domain of discourse as developed by logicians can be used to map out the cognitive landscape of integrated modelling. This landscape is not a fixed universe, but a multiverse resonating with an evolving pluralism of domains of discourse. Integrated modelling involves a never-ending activity of translation between such domains, an activity that often goes hand in hand with major efforts to overcome conceptual confusions within given domains. For these purposes, a careful use of mathematics, including tools of formal logic presented in the paper, can be helpful. The concept of vulnerability as currently used in global change research is discussed as an example of the challenges to be met in integrated environmental modelling. (orig.)

  18. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 4: HARP Output (HARPO) graphics display user's guide

    Science.gov (United States)

    Sproles, Darrell W.; Bavuso, Salvatore J.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical postprocessor program HARPO (HARP Output). HARPO reads ASCII files generated by HARP. It provides an interactive plotting capability that can be used to display alternate model data for trade-off analyses. File data can also be imported to other commercial software programs.

  19. Multi -omics and metabolic modelling pipelines: challenges and tools for systems microbiology.

    Science.gov (United States)

    Fondi, Marco; Liò, Pietro

    2015-02-01

    Integrated -omics approaches are quickly spreading across microbiology research labs, leading to (i) the possibility of detecting previously hidden features of microbial cells like multi-scale spatial organization and (ii) tracing molecular components across multiple cellular functional states. This promises to reduce the knowledge gap between genotype and phenotype and poses new challenges for computational microbiologists. We underline how the capability to unravel the complexity of microbial life will strongly depend on the integration of the huge and diverse amount of information that can be derived today from -omics experiments. In this work, we present opportunities and challenges of multi -omics data integration in current systems biology pipelines. We here discuss which layers of biological information are important for biotechnological and clinical purposes, with a special focus on bacterial metabolism and modelling procedures. A general review of the most recent computational tools for performing large-scale datasets integration is also presented, together with a possible framework to guide the design of systems biology experiments by microbiologists. Copyright © 2015. Published by Elsevier GmbH.

  20. The Cryosphere Model Comparison Tool (CmCt): Ice Sheet Model Validation and Comparison Tool for Greenland and Antarctica

    Science.gov (United States)

    Simon, E.; Nowicki, S.; Neumann, T.; Tyahla, L.; Saba, J. L.; Guerber, J. R.; Bonin, J. A.; DiMarzio, J. P.

    2017-12-01

    The Cryosphere model Comparison tool (CmCt) is a web-based ice sheet model validation tool being developed by NASA to facilitate direct comparison between observational data and various ice sheet models. The CmCt allows the user to take advantage of several decades worth of observations from Greenland and Antarctica. Currently, the CmCt can be used to compare ice sheet models provided by the user with remotely sensed satellite data from ICESat (Ice, Cloud, and land Elevation Satellite) laser altimetry, the GRACE (Gravity Recovery and Climate Experiment) satellite, and radar altimetry (ERS-1, ERS-2, and Envisat). One or more models can be uploaded through the CmCt website and compared with observational data, or compared to each other or to other models. The CmCt calculates statistics on the differences between the model and observations, and other quantitative and qualitative metrics, which can be used to evaluate the different model simulations against the observations. The qualitative metrics consist of a range of visual outputs and the quantitative metrics consist of several whole-ice-sheet scalar values that can be used to assign an overall score to a particular simulation. The comparison results from CmCt are useful in quantifying improvements within a specific model (or within a class of models) as a result of differences in model dynamics (e.g., shallow vs. higher-order dynamics approximations), model physics (e.g., representations of ice sheet rheological or basal processes), or model resolution (mesh resolution and/or changes in the spatial resolution of input datasets). The framework and metrics could also be used as a model-to-model intercomparison tool, simply by swapping in outputs from another model in place of the observational datasets. Future versions of the tool will include comparisons with other datasets that are of interest to the modeling community, such as ice velocity, ice thickness, and surface mass balance.
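
    The core of such a comparison is computing whole-ice-sheet scalar statistics on model-minus-observation differences. A minimal sketch of that step (bias, RMSE and correlation over valid grid cells) is shown below; the array names and toy data are hypothetical, and no CmCt-specific file formats are assumed.

```python
import numpy as np

def comparison_metrics(model_elev, obs_elev):
    """Whole-ice-sheet scalar metrics on model-minus-observation differences."""
    model = np.asarray(model_elev, dtype=float)
    obs = np.asarray(obs_elev, dtype=float)
    valid = np.isfinite(model) & np.isfinite(obs)   # ignore missing cells
    diff = model[valid] - obs[valid]
    return {
        "bias_m": float(diff.mean()),
        "rmse_m": float(np.sqrt((diff ** 2).mean())),
        "corr": float(np.corrcoef(model[valid], obs[valid])[0, 1]),
        "n_cells": int(valid.sum()),
    }

# toy 2-D elevation grids standing in for a model run and an altimetry product
rng = np.random.default_rng(1)
obs = 2000 + 50 * rng.random((100, 100))
mod = obs + rng.normal(2.0, 5.0, obs.shape)         # model with a +2 m bias
print(comparison_metrics(mod, obs))
```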

  1. ACE-it: a tool for genome-wide integration of gene dosage and RNA expression data

    NARCIS (Netherlands)

    van Wieringen, W.N.; Belien, J.A.M.; Vosse, S.; Achame, E.M.; Ylstra, B.

    2006-01-01

    Summary: We describe a tool called ACE-it (Array CGH Expression integration tool). ACE-it links the chromosomal position of the gene dosage measured by array CGH to the genes measured by the expression array. ACE-it uses this link to statistically test whether gene dosage affects RNA expression.
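
    Once copy-number and expression probes are linked by chromosomal position, the statistical step reduces to testing, gene by gene, whether dosage tracks expression across samples. The sketch below illustrates that idea with a Pearson correlation test on toy matched matrices; it is an illustration of the statistical step only, not ACE-it's actual procedure.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n_genes, n_samples = 5, 40

# toy matched matrices: array-CGH gene dosage and RNA expression, genes x samples
dosage = rng.normal(2.0, 0.4, (n_genes, n_samples))
expression = 1.5 * dosage + rng.normal(0.0, 1.0, (n_genes, n_samples))

for g in range(n_genes):
    r, p = pearsonr(dosage[g], expression[g])   # does dosage track expression?
    print(f"gene {g}: r={r:.2f}, p={p:.3g}")
```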

  2. DIDEM - An integrated model for comparative health damage costs calculation of air pollution

    Science.gov (United States)

    Ravina, Marco; Panepinto, Deborah; Zanetti, Maria Chiara

    2018-01-01

    Air pollution represents a continuous hazard to human health. Administrations, companies and the population need efficient indicators of the possible effects of a change in decision, strategy or habit. The monetary quantification of the health effects of air pollution through the definition of external costs is increasingly recognized as a useful indicator to support decisions and information at all levels. The development of modelling tools for the calculation of external costs can support analysts in producing consistent and comparable assessments. In this paper, the DIATI Dispersion and Externalities Model (DIDEM) is presented. The DIDEM model calculates the delta-external costs of air pollution by comparing two alternative emission scenarios. This tool integrates CALPUFF's advanced dispersion modelling with the latest WHO recommendations on concentration-response functions. The model is based on the impact pathway method. It was designed to work with a fine spatial resolution and a local or national geographic scope. The modular structure allows users to input their own data sets. The DIDEM model was tested on a real case study, a comparative analysis of the district heating system in Turin, Italy. Additional advantages and drawbacks of the tool are discussed in the paper. A comparison with other existing models worldwide is also reported.
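
    The impact pathway method combines a concentration change from dispersion modelling with a concentration-response function, the exposed population and a monetary unit value. A stripped-down sketch of that chain for a single pollutant and a few receptor cells is given below; all coefficient values are placeholders for illustration, not WHO-recommended numbers or DIDEM outputs.

```python
def delta_external_cost(delta_conc, population, crf_slope, baseline_rate, unit_cost):
    """Impact-pathway style estimate for one receptor area.

    delta_conc    : change in annual mean concentration between scenarios [ug/m3]
    population    : exposed inhabitants in the receptor cell
    crf_slope     : relative increase in incidence per ug/m3 (concentration-response)
    baseline_rate : baseline incidence of the health endpoint [cases/person/yr]
    unit_cost     : monetary value per case [EUR]
    """
    extra_cases = delta_conc * crf_slope * baseline_rate * population
    return extra_cases * unit_cost

# hypothetical receptor cells: (delta PM2.5 between scenarios, population)
cells = [(0.8, 12000), (0.3, 54000), (-0.1, 8000)]
total = sum(delta_external_cost(dc, pop, 0.006, 0.01, 60000) for dc, pop in cells)
print(f"delta external cost ~ {total:,.0f} EUR/yr")
```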

  3. Developing a tool for mapping adult mental health care provision in Europe: the REMAST research protocol and its contribution to better integrated care

    Directory of Open Access Journals (Sweden)

    Luis Salvador-Carulla

    2015-12-01

    Full Text Available Introduction: Mental health care is a critical area in which to better understand integrated care and to pilot the different components of the integrated care model. However, there is an urgent need for better tools to compare and understand the context of integrated mental health care in Europe. Method: The REMAST tool (REFINEMENT MApping Services Tool) combines a series of standardised health service research instruments and geographical information systems (GIS) to develop local atlases of mental health care from the perspective of horizontal and vertical integrated care. It contains five main sections: (a) Population Data; (b) the Verona Socio-economic Status (SES) Index; (c) the Mental Health System Checklist; (d) the Mental Health Services Inventory using the DESDE-LTC instrument; and (e) Geographical Data. Expected results: The REMAST tool facilitates context analysis in mental health by providing the comparative rates of mental health service provision according to the availability of main types of care; care placement capacity; workforce capacity; and geographical accessibility to services in the local areas in eight study areas in Austria, England, Finland, France, Italy, Norway, Romania and Spain. Discussion: The outcomes of this project will facilitate cooperative work and knowledge transfer on mental health care among the different agencies involved in mental health planning and provision. This project would improve the information available to users and society on the resources for mental health care and support system thinking at the local level by the different stakeholders. The techniques used in this project and the knowledge generated could eventually be transferred to the mapping of other fields of integrated care.

  4. Prediction of irradiation damage effects by multi-scale modelling: EURATOM 6th Framework integrated project PERFECT

    International Nuclear Information System (INIS)

    Massoud, J.P.; Bugat, St.; Marini, B.; Lidbury, D.; Van Dyck, St.; Debarberis, L.

    2008-01-01

    Full text of publication follows. In nuclear PWRs, materials undergo degradation due to severe irradiation conditions that may limit their operational life. Utilities operating these reactors must quantify the aging and the potential degradation of reactor pressure vessels and also of internal structures to ensure safe and reliable plant operation. The EURATOM 6th Framework Integrated Project PERFECT (Prediction of Irradiation Damage Effects in Reactor Components) addresses irradiation damage in RPV materials and components by multi-scale modelling. This state-of-the-art approach offers potential advantages over the conventional empirical methods used in current practice of nuclear plant lifetime management. Launched in January 2004, this 48-month project focuses on two main components of nuclear power plants which are subject to irradiation damage: the ferritic steel reactor pressure vessel and the austenitic steel internals. The project is also an opportunity to integrate the fragmented research and experience that currently exists within Europe in the field of numerical simulation of radiation damage and to create links with international organisations involved in similar projects throughout the world. Continuous progress in the physical understanding of the phenomena involved in irradiation damage and continuous progress in computer science make possible the development of multi-scale numerical tools able to simulate the effects of irradiation on material microstructure. The consequences of irradiation on the mechanical and corrosion properties of materials are also tentatively modelled using such multi-scale modelling. This requires developing different mechanistic models at different levels of physics and engineering and extending the state of knowledge in several scientific fields, and the links between these different kinds of models are particularly delicate to deal with and need specific work. Practically, the main objective of PERFECT is to build

  5. Developing Flexible, Integrated Hydrologic Modeling Systems for Multiscale Analysis in the Midwest and Great Lakes Region

    Science.gov (United States)

    Hamlet, A. F.; Chiu, C. M.; Sharma, A.; Byun, K.; Hanson, Z.

    2016-12-01

    Physically based hydrologic models of surface water and groundwater resources that can be flexibly and efficiently applied to support water resources policy, planning, and management decisions at a wide range of spatial and temporal scales are greatly needed in the Midwest, where stakeholder access to such tools is currently a fundamental barrier to basic climate change assessment and adaptation efforts, and also to the co-production of useful products to support detailed decision making. Building on earlier pilot studies in the Pacific Northwest Region, we are currently assembling a suite of end-to-end tools and resources to support various kinds of water resources planning and management applications across the region. One of the key aspects of these integrated tools is that the user community can access gridded products at any point along the end-to-end chain of models, looking backwards in time about 100 years (1915-2015), and forwards in time about 85 years using CMIP5 climate model projections. The integrated model is composed of historical and projected future meteorological data based on station observations and statistically and dynamically downscaled climate model output, respectively. These gridded meteorological data sets serve as forcing data for the macro-scale VIC hydrologic model implemented over the Midwest at 1/16 degree resolution. High-resolution climate model (4 km WRF) output provides inputs for the analyses of urban impacts, hydrologic extremes, agricultural impacts, and impacts on the Great Lakes. Groundwater recharge estimated by the surface water model provides input data for fine-scale and macro-scale groundwater models needed for specific applications. To highlight the multi-scale use of the integrated models in support of co-production of scientific information for decision making, we briefly describe three current case studies addressing different spatial scales of analysis: 1) Effects of climate change on the water balance of the Great Lakes, 2) Future

  6. Developing integrated parametric planning models for budgeting and managing complex projects

    Science.gov (United States)

    Etnyre, Vance A.; Black, Ken U.

    1988-01-01

    The applicability of integrated parametric models for the budgeting and management of complex projects is investigated. Methods for building a very flexible, interactive prototype of a project planning system, and software resources available for this purpose, are discussed and evaluated. The prototype is required to be sensitive to changing objectives, changing target dates, changing cost relationships, and changing budget constraints. To achieve the integration of costs with project and task durations, parametric cost functions are defined by a process of trapezoidal segmentation, where the total cost of the project is the sum of the various project cost segments, and each project cost segment is the integral of a linearly segmented cost loading function over a specific interval. The cost can thus be expressed algebraically. The prototype was designed using Lotus 1-2-3 as the primary software tool. This prototype implements a methodology for interactive project scheduling that provides a model of a system that meets most of the goals for the first phase of the study and some of the goals for the second phase.
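
    Because each cost segment is the integral of a linearly segmented cost-loading function over an interval, the total project cost reduces to a sum of trapezoid areas. A minimal sketch of that computation, with hypothetical breakpoints and rates, is shown below.

```python
def segment_cost(t0, t1, rate0, rate1):
    """Integral of a linear cost-loading function (cost/time) over [t0, t1]:
    the area of a trapezoid."""
    return 0.5 * (rate0 + rate1) * (t1 - t0)

# hypothetical breakpoints: (time in months, loading rate in k$/month)
breakpoints = [(0, 0.0), (2, 40.0), (6, 90.0), (9, 90.0), (12, 10.0)]

total = sum(
    segment_cost(t0, t1, r0, r1)
    for (t0, r0), (t1, r1) in zip(breakpoints, breakpoints[1:])
)
print(f"total project cost ~ {total:.0f} k$")
```

    Shifting a breakpoint or changing a loading rate immediately changes the algebraic total, which is what makes the parametric form convenient for interactive re-planning under changing dates and budgets.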

  7. Topological quantum theories and integrable models

    International Nuclear Information System (INIS)

    Keski-Vakkuri, E.; Niemi, A.J.; Semenoff, G.; Tirkkonen, O.

    1991-01-01

    The path-integral generalization of the Duistermaat-Heckman integration formula is investigated for integrable models. It is shown that for models with periodic classical trajectories the path integral reduces to a form similar to the finite-dimensional Duistermaat-Heckman integration formula. This provides a relation between exactness of the stationary-phase approximation and Morse theory. It is also argued that certain integrable models can be related to topological quantum theories. Finally, it is found that in general the stationary-phase approximation presumes that the initial and final configurations are in different polarizations. This is exemplified by the quantization of the SU(2) coadjoint orbit

  8. ARC integration into the NEAMS Workbench

    Energy Technology Data Exchange (ETDEWEB)

    Stauff, N. [Argonne National Lab. (ANL), Argonne, IL (United States); Gaughan, N. [Argonne National Lab. (ANL), Argonne, IL (United States); Kim, T. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-01-01

    One of the objectives of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Integration Product Line (IPL) is to facilitate the deployment of the high-fidelity codes developed within the program. The Workbench initiative was launched in FY-2017 by the IPL to facilitate the transition from conventional tools to high fidelity tools. The Workbench provides a common user interface for model creation, real-time validation, execution, output processing, and visualization for integrated codes.

  9. Integrated modeling and characterization of local crack chemistry

    International Nuclear Information System (INIS)

    Savchik, J.A.; Burke, M.S.

    1996-01-01

    The MULTEQ computer program has become an industry-wide tool which can be used to calculate the chemical composition in a flow-occluded region as the solution within it concentrates due to a local boiling process. These results can be used to assess corrosion concerns in plant equipment such as steam generators. Corrosion modeling attempts to quantify corrosion assessments by accounting for the mass transport processes involved in the corrosion mechanism. MULTEQ has played an ever increasing role in defining the local chemistry for such corrosion models. This paper outlines how the integration of corrosion modeling with the analysis of corrosion films and deposits can lead to the development of a useful modeling tool in which MULTEQ is interactively linked to a diffusion and migration transport process. This would provide a capability to make detailed inferences about the local crack chemistry based on analyses of the local corrosion films and deposits inside a crack and thus provide guidance for chemical fixes to avoid cracking. This methodology is demonstrated for a simple example of a cracked tube. The application points out the utility of coupling MULTEQ with a mass transport process and the feasibility of an option in a future version of MULTEQ that would permit relating film and deposit analyses to the local chemical environment. This would increase the amount of information obtained from removed-tube analyses and laboratory testing that can contribute to an overall program for mitigating tubing and crevice corrosion
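
    The proposed coupling links a chemistry solver to a diffusion and migration transport process along the crack. As a purely illustrative stand-in for the transport half of such a coupling (not MULTEQ or its actual interface), the sketch below advances a 1-D explicit finite-difference diffusion profile of a dissolved species along a crack; a chemistry solver could then be called on the profile at each step. All dimensions and coefficients are hypothetical.

```python
import numpy as np

def diffuse_crack_profile(c, D, dx, dt, c_mouth):
    """One explicit finite-difference diffusion step along a 1-D crack.
    c: concentration profile, crack mouth at index 0, crack tip at index -1."""
    assert D * dt / dx**2 <= 0.5, "explicit scheme stability limit"
    new = c.copy()
    new[1:-1] = c[1:-1] + D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    new[0] = c_mouth          # fixed bulk chemistry at the crack mouth
    new[-1] = new[-2]         # zero-flux condition at the crack tip
    return new

# hypothetical crack: 1 mm deep, 50 nodes, solute diffusing in from the bulk
n, depth = 50, 1.0e-3
dx, D, dt = depth / (n - 1), 1.0e-9, 0.1    # m, m2/s, s
profile = np.zeros(n)
for _ in range(20000):                       # ~33 minutes of simulated time
    profile = diffuse_crack_profile(profile, D, dx, dt, c_mouth=0.01)
print(f"tip concentration ~ {profile[-1]:.4f} mol/kg")
```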

  10. Integrated modeling and characterization of local crack chemistry

    International Nuclear Information System (INIS)

    Savchik, J.A.; Burke, M.S.

    1995-01-01

    The MULTEQ computer program has become an industry-wide tool which can be used to calculate the chemical composition in a flow-occluded region as the solution within it concentrates due to a local boiling process. These results can be used to assess corrosion concerns in plant equipment such as steam generators. Corrosion modeling attempts to quantify corrosion assessments by accounting for the mass transport processes involved in the corrosion mechanism. MULTEQ has played an ever increasing role in defining the local chemistry for such corrosion models. This paper outlines how the integration of corrosion modeling with the analysis of corrosion films and deposits can lead to the development of a useful modeling tool in which MULTEQ is interactively linked to a diffusion and migration transport process. This would provide a capability to make detailed inferences about the local crack chemistry based on analyses of the local corrosion films and deposits inside a crack and thus provide guidance for chemical fixes to avoid cracking. This methodology is demonstrated for a simple example of a cracked tube. The application points out the utility of coupling MULTEQ with a mass transport process and the feasibility of an option in a future version of MULTEQ that would permit relating film and deposit analyses to the local chemical environment. This would increase the amount of information obtained from removed-tube analyses and laboratory testing that can contribute to an overall program for mitigating tubing and crevice corrosion

  11. Designing tools for oil exploration using nuclear modeling

    Science.gov (United States)

    Mauborgne, Marie-Laure; Allioli, Françoise; Manclossi, Mauro; Nicoletti, Luisa; Stoller, Chris; Evans, Mike

    2017-09-01

    When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  12. Business and technology integrated model

    OpenAIRE

    Noce, Irapuan; Carvalho, João Álvaro

    2011-01-01

    There is a growing interest in business modeling and architecture in the areas of management and information systems. One of the issues in the area is the lack of integration between the modeling techniques that are employed to support business development and those used for technology modeling. This paper proposes a modeling approach that is capable of integrating the modeling of the business and of the technology. By depicting the business model, the organization structure and the technolog...

  13. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    Science.gov (United States)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software, integrated with relevant analysis parameters specific to SCaN assets and SCaN-supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees, and 2) provides an all-in-one package for various analysis capabilities that normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK(Registered Trademark) contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
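
    One of the listed improvements concerns Brent's method of root finding, which is a natural fit for contact window prediction: a pass begins and ends where the elevation angle crosses the visibility threshold. The sketch below illustrates that generic idea with a made-up elevation profile and scipy's Brent root finder; it is not SCENIC/ITACA code and the profile is an assumption.

```python
import math
from scipy.optimize import brentq

def elevation_deg(t):
    """Made-up elevation angle of a spacecraft versus time [s] for one pass."""
    return 25.0 * math.sin(math.pi * t / 600.0) - 5.0

# elevation is -5 deg at t=0 and t=600 s and peaks at 20 deg near t=300 s,
# so it crosses 0 deg once on each side of the peak
rise = brentq(elevation_deg, 0.0, 300.0)     # Brent's method root find
set_ = brentq(elevation_deg, 300.0, 600.0)
print(f"contact window: {rise:.1f} s to {set_:.1f} s "
      f"(duration {set_ - rise:.1f} s)")
```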

  14. Shape: A 3D Modeling Tool for Astrophysics.

    Science.gov (United States)

    Steffen, Wolfgang; Koning, Nicholas; Wenger, Stephan; Morisset, Christophe; Magnor, Marcus

    2011-04-01

    We present a flexible interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data type, and amount that is required for a reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model prediction with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been successfully used in a number of astrophysical research projects.

  15. Integrated Model to Assess Cloud Deployment Effectiveness When Developing an IT-strategy

    Science.gov (United States)

    Razumnikov, S.; Prankevich, D.

    2016-04-01

    Developing an IT-strategy for cloud deployment is a complex issue, since even the stage of its formation necessitates revealing which applications will best meet the requirements of a company's business strategy, evaluating the reliability and safety of cloud providers, and analyzing staff satisfaction. A system of criteria, as well as an integrated model to assess cloud deployment effectiveness, is offered. The model makes it possible to identify which applications already at the disposal of a company, as well as new tools to be deployed, are reliable and safe enough for implementation in the cloud environment. Data on the practical use of the procedure to assess cloud deployment effectiveness by a provider of telecommunication services are presented. The model was used to calculate the values of the integral indexes of the services being assessed; then those meeting the criteria and matching the company's business strategy were selected.
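
    An integral index over a system of criteria is commonly realized as a normalized weighted sum compared against a threshold. The sketch below illustrates that generic pattern with hypothetical criteria, weights and candidate services; the actual criteria system and scoring in the paper may differ.

```python
def integral_index(scores, weights):
    """Weighted sum of normalized criterion scores (each in [0, 1])."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * scores[c] for c in weights)

weights = {"business_fit": 0.35, "reliability": 0.25,
           "security": 0.25, "staff_satisfaction": 0.15}

candidates = {
    "CRM":        {"business_fit": 0.9, "reliability": 0.8, "security": 0.7, "staff_satisfaction": 0.8},
    "Billing":    {"business_fit": 0.6, "reliability": 0.9, "security": 0.9, "staff_satisfaction": 0.5},
    "Legacy ERP": {"business_fit": 0.4, "reliability": 0.5, "security": 0.6, "staff_satisfaction": 0.4},
}

threshold = 0.7   # hypothetical cut-off for recommending cloud deployment
for name, scores in candidates.items():
    idx = integral_index(scores, weights)
    verdict = "deploy to cloud" if idx >= threshold else "keep on-premises"
    print(f"{name:10s} index={idx:.2f} -> {verdict}")
```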

  16. atBioNet– an integrated network analysis tool for genomics and biomarker discovery

    Directory of Open Access Journals (Sweden)

    Ding Yijun

    2012-07-01

    Full Text Available Abstract Background: Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. Results: atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied proteins/genes list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but this information can also be used to hypothesize novel biomarkers for future analysis. Conclusion: atBioNet is a free web-based network analysis tool that provides a systematic insight into proteins/genes interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools

  17. atBioNet--an integrated network analysis tool for genomics and biomarker discovery.

    Science.gov (United States)

    Ding, Yijun; Chen, Minjun; Liu, Zhichao; Ding, Don; Ye, Yanbin; Zhang, Min; Kelly, Reagan; Guo, Li; Su, Zhenqiang; Harris, Stephen C; Qian, Feng; Ge, Weigong; Fang, Hong; Xu, Xiaowei; Tong, Weida

    2012-07-20

    Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format, with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied protein/gene list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but that this information can also be used to hypothesize novel biomarkers for future analysis. atBioNet is a free web-based network analysis tool that provides systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and for biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools/ucm285284.htm.
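
    The module-finding step can be illustrated with a simplified structural-clustering sketch on a toy PPI network. It mimics the core idea of SCAN (structural similarity of node neighborhoods) but is not atBioNet's implementation; the gene names, edges, and epsilon/mu parameters are made up for the example.

        # Simplified structural clustering on a toy protein-protein interaction
        # network, in the spirit of SCAN. Illustrative only.
        import math
        import networkx as nx

        def structural_similarity(g, u, v):
            """Cosine similarity of the closed neighborhoods of u and v."""
            nu = set(g[u]) | {u}
            nv = set(g[v]) | {v}
            return len(nu & nv) / math.sqrt(len(nu) * len(nv))

        def scan_like_modules(g, eps=0.6, mu=2):
            """Group nodes into modules via edges with similarity >= eps, keeping
            only nodes that have at least mu such similar neighbors."""
            sim_graph = nx.Graph()
            sim_graph.add_nodes_from(g)
            for u, v in g.edges():
                if structural_similarity(g, u, v) >= eps:
                    sim_graph.add_edge(u, v)
            cores = {n for n in sim_graph if sim_graph.degree(n) >= mu}
            return [c for c in nx.connected_components(sim_graph.subgraph(cores)) if len(c) > 1]

        ppi = nx.Graph([("TP53", "MDM2"), ("TP53", "ATM"), ("MDM2", "ATM"),
                        ("BRCA1", "BARD1"), ("BRCA1", "RAD51"), ("BARD1", "RAD51"),
                        ("ATM", "BRCA1")])
        print(scan_like_modules(ppi))   # two modules: the TP53 and BRCA1 triangles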

  18. Data Integration Tool: Permafrost Data Debugging

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Pulsifer, P. L.; Strawhacker, C.; Yarmey, L.; Basak, R.

    2017-12-01

    We developed a Data Integration Tool (DIT) to significantly reduce the manual processing time needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the Global Terrestrial Network-Permafrost (GTN-P). The United States National Science Foundation funded this project through the National Snow and Ice Data Center (NSIDC) with the GTN-P to improve permafrost data access and discovery. We leverage these data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets (https://github.com/PermaData/DIT). Each widget performs a specific operation, such as reading, multiplying by a constant, sorting, plotting, and writing data. DIT allows the user to select and order the widgets as desired to meet their specific needs, to interact incrementally with and evolve the widget workflows, and to save those workflows for reproducibility. Taking ideas from visual programming found in the art and design domain, debugging and iterative design principles from software engineering, and the scientific data processing and analysis power of Fortran and Python, it was written for interactive, iterative data manipulation, quality control, processing, and analysis of inconsistent data in an easily installable application. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, to nearly translate three more datasets (270 sites), and is scheduled to translate 10 more datasets (~1000 sites) from the legacy inactive site data holdings of the Frozen Ground Data Center (FGDC). Iterative development has provided the permafrost and wider scientific community with an extendable tool designed specifically for the iterative process of translating unruly data.
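
    A widget-style pipeline of this kind can be sketched as a list of small functions applied in user-chosen order. The widget names, CSV columns, and file name below are illustrative assumptions, not DIT's actual API.

        # Widget-style pipeline sketch: each "widget" is a small function that
        # takes a list of row dictionaries and returns a transformed list.
        import csv

        def read_csv(path):
            with open(path, newline="") as fh:
                return list(csv.DictReader(fh))

        def scale_column(column, factor):
            def widget(rows):
                for row in rows:
                    row[column] = str(float(row[column]) * factor)
                return rows
            return widget

        def sort_by(column):
            return lambda rows: sorted(rows, key=lambda r: float(r[column]))

        def run_pipeline(rows, widgets):
            for widget in widgets:        # widgets run in the order the user chose
                rows = widget(rows)
            return rows

        # Example workflow: rescale a temperature column, then sort by depth
        # before writing the records out for GTN-P ingest.
        pipeline = [scale_column("soil_temp", 0.1), sort_by("depth_m")]
        # rows = run_pipeline(read_csv("legacy_site.csv"), pipeline)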

  19. INTEGRATED CORPORATE STRATEGY MODEL

    Directory of Open Access Journals (Sweden)

    CATALINA SORIANA SITNIKOV

    2014-02-01

    Corporations are at present operating in demanding and highly uncertain times, facing a mixture of increased macroeconomic pressure, competitive and capital market risks, and, in many cases, the prospect of significant technological and regulatory change. In such demanding and uncertain times, corporations must pay particular attention to corporate strategy. Corporate strategy must now be perceived and used as a function of various fields and actors, and as a highly interactive system. For a corporation's strategy to become a competitive advantage, it must be understood and integrated into a holistic model that ensures sustainable progress of the corporation's activities under optimum conditions of profitability. The model proposed in this paper aims to integrate two strategic models, Hoshin Kanri and the Integrated Strategy Model, and to consolidate them with the principles of sound corporate governance set out by the OECD.

  20. Effect of cutting fluids and cutting conditions on surface integrity and tool wear in turning of Inconel 713C

    Science.gov (United States)

    Hikiji, R.

    2018-01-01

    The trend toward downsizing of engines is increasing the number of turbochargers across Europe. In a turbocharger, the exhaust gas temperature is so high that parts are made of the nickel-base superalloy Inconel 713C, used for its high-temperature strength. External turning of Inconel 713C parts used in actual automotive components was carried out. The effect of the cutting fluids and cutting conditions on surface integrity and tool wear was investigated, considering the global environment and cost performance. Within the range of cutting conditions used in this study, a small depth of cut gave good surface integrity and tool life. With a large corner radius, however, tool wear increased as the cutting length increased, and at large cutting lengths both surface integrity and tool life deteriorated. As for the cutting fluids, the synthetic type performed better than the conventional emulsion in terms of surface integrity and tool life. It was also clear that a large corner radius improved surface roughness and tool life, but increased the size error and related issues when machining a workpiece held in a cantilever style.

  1. Exploration Medical System Trade Study Tools Overview

    Science.gov (United States)

    Mindock, J.; Myers, J.; Latorella, K.; Cerro, J.; Hanson, A.; Hailey, M.; Middour, C.

    2018-01-01

    ExMC is creating an ecosystem of tools to enable well-informed medical system trade studies. The suite of tools addresses important system implementation aspects of the space medical capabilities trade space and is being built using knowledge from the medical community regarding the unique aspects of space flight. Two integrating models, a systems engineering model and a medical risk analysis model, tie the tools together to produce an integrated assessment of the medical system and its ability to achieve medical system target requirements. This presentation will provide an overview of the various tools that are part of the tool ecosystem. Initially, the presentation will focus on the tools that supply the foundational information to the ecosystem. Specifically, the talk will describe how information about how medicine will be practiced is captured and categorized for efficient utilization in the tool suite. For example, the talk will cover capturing which conditions will be planned for in-mission treatment, planned medical activities (e.g., periodic physical exams), required medical capabilities (e.g., provide imaging), and options to implement the capabilities (e.g., an ultrasound device). Database storage and configuration management will also be discussed. The presentation will include an overview of how these information tools will be tied to parameters in a Systems Modeling Language (SysML) model, allowing traceability to system behavioral, structural, and requirements content. The discussion will also describe an HRP-led enhanced risk assessment model developed to provide quantitative insight into each capability's contribution to mission success. Key outputs from these various tools, to be shared with the space medical and exploration mission development communities, will be assessments of how well medical system implementation options satisfy requirements and of per-capability contributions toward achieving requirements.

  2. Modelling raw water quality: development of a drinking water management tool.

    Science.gov (United States)

    Kübeck, Ch; van Berk, W; Bergmann, A

    2009-01-01

    Ensuring future drinking water supply requires rigorous management of groundwater resources. However, recent practices of economic resource control often do not take into account the hydrogeochemical and geohydraulic behaviour of the groundwater system. To analyse the available quantity and quality of future raw water, effective resource management requires a full understanding of the hydrogeochemical and geohydraulic processes within the aquifer. For example, knowing how raw water quality develops over time helps in working out water treatment strategies as well as in planning financial resources. On the other hand, the effectiveness of planned measures to reduce the infiltration of harmful substances such as nitrate can be checked and optimized using hydrogeochemical modelling. Thus, within the framework of the InnoNet program funded by the Federal Ministry of Economics and Technology, a network of research institutes and water suppliers is working in close cooperation to develop a planning and management tool oriented particularly toward water management problems. The tool involves an innovative material flux model that calculates the hydrogeochemical processes under consideration of the dynamics of agricultural land use. The graphical data evaluation integrated in the program is aligned with the needs of water suppliers.

  3. An Integrative Review of Pediatric Fall Risk Assessment Tools.

    Science.gov (United States)

    DiGerolamo, Kimberly; Davis, Katherine Finn

    Patient fall prevention begins with accurate risk assessment. However, sustained improvements in prevention and quality of care require the use of validated fall risk assessment tools (FRATs). The goal of FRATs is to identify patients at highest risk. Pediatric tools are often adapted from adult FRATs. Though the factors associated with pediatric falls in the hospital setting are similar to those in adults, such as mobility, medication use, and cognitive impairment, adult FRATs and the factors associated with them do not adequately assess risk in children. Articles were limited to English language, ages 0-21 years, and publication dates 2006-2015. The search yielded 22 articles. Ten were excluded because the population was primarily adult or there was no discussion of a FRAT. Critical appraisal and findings were synthesized using the Johns Hopkins Nursing evidence appraisal system. Twelve articles relevant to fall prevention in the pediatric hospital setting that discussed fall risk assessment and use of a FRAT were reviewed. Comparison between FRATs, and assessment of their accuracy, is challenging when different classifications, definitions, risk stratifications, and inclusion criteria are used. Though several pediatric FRATs have been published in the literature, none have been found to be reliable and valid across institutions and diverse populations. This integrative review highlights the importance of choosing a FRAT based on an institution's identified risk factors, of validating the tool for one's own patient population, and of using the tool in conjunction with nursing clinical judgment to guide interventions.

  4. Sensitivity of an Integrated Mesoscale Atmosphere and Agriculture Land Modeling System (WRF/CMAQ-EPIC) to MODIS Vegetation and Lightning Assimilation

    Science.gov (United States)

    Ran, L.; Cooter, E. J.; Gilliam, R. C.; Foroutan, H.; Kang, D.; Appel, W.; Wong, D. C.; Pleim, J. E.; Benson, V.; Pouliot, G.

    2017-12-01

    The combined meteorology and air quality modeling system composed of the Weather Research and Forecast (WRF) model and Community Multiscale Air Quality (CMAQ) model is an important decision support tool that is used in research and regulatory decisions related to emissions, meteorology, climate, and chemical transport. The Environmental Policy Integrated Climate (EPIC) is a cropping model which has long been used in a range of applications related to soil erosion, crop productivity, climate change, and water quality around the world. We have integrated WRF/CMAQ with EPIC using the Fertilizer Emission Scenario Tool for CMAQ (FEST-C) to estimate daily soil N information with fertilization for CMAQ bi-directional ammonia flux modeling. Driven by the weather and N deposition from WRF/CMAQ, FEST-C EPIC simulations are conducted on 22 different agricultural production systems ranging from managed grass lands (e.g. hay and alfalfa) to crop lands (e.g. corn grain and soybean) with rainfed and irrigated information across any defined conterminous United States (U.S.) CMAQ domain and grid resolution. In recent years, this integrated system has been enhanced and applied in many different air quality and ecosystem assessment projects related to land-water-atmosphere interactions. These enhancements have advanced this system to become a valuable tool for integrated assessments of air, land and water quality in light of social drivers and human and ecological outcomes. This presentation will focus on evaluating the sensitivity of precipitation and N deposition in the integrated system to MODIS vegetation input and lightning assimilation and their impacts on agricultural production and fertilization. We will describe the integrated modeling system and evaluate simulated precipitation and N deposition along with other weather information (e.g. temperature, humidity) for 2011 over the conterminous U.S. at 12 km grids from a coupled WRF/CMAQ with MODIS and lightning assimilation

  5. Subtask 2.4 - Integration and Synthesis in Climate Change Predictive Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Jaroslav Solc

    2009-06-01

    The Energy & Environmental Research Center (EERC) completed a brief evaluation of the existing status of predictive modeling to assess options for integrating our previous paleohydrologic reconstructions and synthesizing them with current global climate scenarios. The results of our research indicate that the short-term data series available from modern instrumental records are not sufficient to reconstruct past hydrologic events or predict future ones. By contrast, reconstruction of paleoclimate phenomena provided credible information on past climate cycles and confirmed that their integration in the context of regional climate history is possible. As with ice cores and other paleo-proxies, the acquired data represent an objective, credible tool for model calibration and for validation of currently observed trends. It remains a subject of future research whether further refinement of our results and their synthesis with regional and global climate observations could contribute to the improvement and credibility of climate predictions on regional and global scales.

  6. Electronic Dictionary as a Tool for Integration of Additional Learning Content

    Directory of Open Access Journals (Sweden)

    Stefka Kovacheva

    2015-12-01

    This article discusses an electronic dictionary as an element of the "Bulgarian cultural and historical heritage under the protection of UNESCO" database developed at IMI (BAS), which will be used to integrate additional learning content. The electronic dictionary is described as an easily accessible reference resource offering information on the form, meaning, usage and origin of words connected to the cultural and historical heritage sites in Bulgaria protected by UNESCO. The dictionary targets 9–11-year-old students in Bulgarian schools who study the subjects "Man and Society" in 4th grade and "History and Civilization" in 5th grade.

  7. Sobol Sensitivity Analysis: A Tool to Guide the Development and Evaluation of Systems Pharmacology Models

    Science.gov (United States)

    Trame, MN; Lesko, LJ

    2015-01-01

    A systems pharmacology model typically integrates pharmacokinetic, biochemical network, and systems biology concepts into a unifying approach. It typically consists of a large number of parameters and reaction species that are interlinked based upon the underlying (patho)physiology and the mechanism of drug action. The more complex these models are, the greater the challenge of reliably identifying and estimating respective model parameters. Global sensitivity analysis provides an innovative tool that can meet this challenge. CPT Pharmacometrics Syst. Pharmacol. (2015) 4, 69–79; doi:10.1002/psp4.6; published online 25 February 2015 PMID:27548289
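
    Variance-based (Sobol) global sensitivity indices of the sort discussed here can be estimated with standard sampling schemes. The sketch below uses the SALib Python package on a toy two-parameter model purely as an illustration, not as the workflow of the cited paper; the parameter names, bounds, and model are assumptions.

        # Illustrative Sobol sensitivity analysis with SALib on a toy model.
        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        problem = {
            "num_vars": 2,
            "names": ["k_el", "k_abs"],            # hypothetical rate constants
            "bounds": [[0.1, 0.4], [0.5, 2.0]],
        }

        def toy_model(params):
            k_el, k_abs = params[:, 0], params[:, 1]
            # toy output depending nonlinearly on both parameters
            return k_abs / (k_abs - k_el) * (np.exp(-k_el) - np.exp(-k_abs))

        X = saltelli.sample(problem, 1024)         # N * (2D + 2) parameter sets
        Y = toy_model(X)
        Si = sobol.analyze(problem, Y)
        print("First-order indices:", Si["S1"])
        print("Total-order indices:", Si["ST"])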

  8. Multi-objective reverse logistics model for integrated computer waste management.

    Science.gov (United States)

    Ahluwalia, Poonam Khanijo; Nema, Arvind K

    2006-12-01

    This study aimed to address the issues involved in the planning and design of a computer waste management system in an integrated manner. A decision-support tool is presented for selecting an optimum configuration of computer waste management facilities (segregation, storage, treatment/processing, reuse/recycle and disposal) and allocation of waste to these facilities. The model is based on an integer linear programming method with the objectives of minimizing environmental risk as well as cost. The issue of uncertainty in the estimated waste quantities from multiple sources is addressed using the Monte Carlo simulation technique. An illustrated example of computer waste management in Delhi, India is presented to demonstrate the usefulness of the proposed model and to study tradeoffs between cost and risk. The results of the example problem show that it is possible to reduce the environmental risk significantly by a marginal increase in the available cost. The proposed model can serve as a powerful tool to address the environmental problems associated with exponentially growing quantities of computer waste which are presently being managed using rudimentary methods of reuse, recovery and disposal by various small-scale vendors.
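
    The flavor of the underlying optimization can be shown with a small allocation sketch solved as a linear program with PuLP. The sources, facilities, costs, and capacities are made-up numbers, and only cost is minimized here rather than the paper's joint cost-and-risk objectives.

        # Toy waste-allocation program in the spirit of the paper's model, using PuLP.
        from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value

        sources = {"district_A": 40, "district_B": 25}       # tonnes of e-waste (assumed)
        facilities = {"recycler_1": 50, "recycler_2": 30}     # capacities in tonnes (assumed)
        cost = {("district_A", "recycler_1"): 12, ("district_A", "recycler_2"): 18,
                ("district_B", "recycler_1"): 20, ("district_B", "recycler_2"): 9}

        prob = LpProblem("computer_waste_allocation", LpMinimize)
        x = {(s, f): LpVariable(f"x_{s}_{f}", lowBound=0) for s in sources for f in facilities}

        prob += lpSum(cost[s, f] * x[s, f] for s in sources for f in facilities)
        for s, waste in sources.items():                      # all waste must be routed
            prob += lpSum(x[s, f] for f in facilities) == waste
        for f, cap in facilities.items():                     # capacity limits
            prob += lpSum(x[s, f] for s in sources) <= cap

        prob.solve()
        for (s, f), var in x.items():
            print(s, "->", f, value(var))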

  9. Forward modelling of multi-component induction logging tools in layered anisotropic dipping formations

    International Nuclear Information System (INIS)

    Gao, Jie; Xu, Chenhao; Xiao, Jiaqi

    2013-01-01

    Multi-component induction logging provides great assistance in the exploration of thinly laminated reservoirs. The 1D parametric inversion following an adaptive borehole correction is the key step in the data processing of multi-component induction logging responses. To make the inversion process reasonably fast, an efficient forward modelling method is necessary. In this paper, a modelling method has been developed to simulate multi-component induction tools in deviated wells drilled in layered anisotropic formations. With the introduction of generalized reflection coefficients, analytic expressions for the magnetic field in the form of a Sommerfeld integral were derived. Fast numerical computation of the integral is accomplished using the fast Fourier–Hankel transform and fast Hankel transform methods. The latter is so time-efficient that it is suitable for real-time multi-parameter inversion. Some simulated results are presented and they are in excellent agreement with the finite difference method code's solution. (paper)
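
    A Sommerfeld-type integral of the kind mentioned here can be checked numerically against a closed form. The kernel below is the classical Lipschitz integral, chosen purely as an illustration rather than the magnetic-field kernel of the logging tool, and the fast Hankel transform machinery is omitted.

        # Numerical evaluation of a Sommerfeld-type integral
        #   I(r, z) = \int_0^inf k * exp(-k z) * J0(k r) dk,
        # which has the closed form z / (r^2 + z^2)^(3/2).
        import numpy as np
        from scipy.integrate import quad
        from scipy.special import j0

        def sommerfeld_integral(r, z):
            integrand = lambda k: k * np.exp(-k * z) * j0(k * r)
            val, _ = quad(integrand, 0.0, np.inf, limit=200)
            return val

        r, z = 0.1, 0.5                      # toy radial offset and depth in metres
        numeric = sommerfeld_integral(r, z)
        closed_form = z / (r**2 + z**2) ** 1.5
        print(numeric, closed_form)          # should agree to quadrature accuracy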

  10. Application of a faith-based integration tool to assess mental and physical health interventions.

    Science.gov (United States)

    Saunders, Donna M; Leak, Jean; Carver, Monique E; Smith, Selina A

    2017-01-01

    To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed.

  11. Designing tools for oil exploration using nuclear modeling

    Directory of Open Access Journals (Sweden)

    Mauborgne Marie-Laure

    2017-01-01

    When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  12. MatchingTools: A Python library for symbolic effective field theory calculations

    Science.gov (United States)

    Criado, Juan C.

    2018-06-01

    MatchingTools is a Python library for doing symbolic calculations in effective field theory. It provides the tools to construct general models by defining their field content and their interaction Lagrangian. Once a model is given, the heavy particles can be integrated out at the tree level to obtain an effective Lagrangian in which only the light particles appear. After integration, some of the terms of the resulting Lagrangian might not be independent. MatchingTools contains functions for transforming these terms to rewrite them in terms of any chosen set of operators.
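
    The tree-level integrating-out step that such a library automates can be reproduced by hand for a toy Lagrangian. The sketch below uses SymPy to solve the heavy field's equation of motion and substitute it back; it is a conceptual illustration under simplifying assumptions, not MatchingTools' actual API.

        # Tree-level integrating-out of a heavy real scalar S coupled to a light
        # scalar phi:  L_heavy = -1/2 M^2 S^2 - g S phi^2  (the kinetic term of S
        # is dropped, which is the leading order in 1/M^2). Illustration only.
        import sympy as sp

        S, phi, M, g = sp.symbols("S phi M g", real=True)

        L_heavy = -sp.Rational(1, 2) * M**2 * S**2 - g * S * phi**2

        # Equation of motion for the heavy field (algebraic at this order): dL/dS = 0.
        eom = sp.diff(L_heavy, S)
        S_solution = sp.solve(eom, S)[0]

        # Substitute back to obtain the effective operator generated for the light field.
        L_eff = sp.simplify(L_heavy.subs(S, S_solution))
        print(S_solution)        # -g*phi**2/M**2
        print(L_eff)             # g**2*phi**4/(2*M**2)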

  13. An artificial intelligence tool for complex age-depth models

    Science.gov (United States)

    Bradley, E.; Anderson, K. A.; de Vesine, L. R.; Lai, V.; Thomas, M.; Nelson, T. H.; Weiss, I.; White, J. W. C.

    2017-12-01

    CSciBox is an integrated software system for age modeling of paleoenvironmental records. It incorporates an array of data-processing and visualization facilities, ranging from 14C calibrations to sophisticated interpolation tools. Using CSciBox's GUI, a scientist can build custom analysis pipelines by composing these built-in components or adding new ones. Alternatively, she can employ CSciBox's automated reasoning engine, Hobbes, which uses AI techniques to perform an in-depth, autonomous exploration of the space of possible age-depth models and presents the results—both the models and the reasoning that was used in constructing and evaluating them—to the user for her inspection. Hobbes accomplishes this using a rulebase that captures the knowledge of expert geoscientists, which was collected over the course of more than 100 hours of interviews. It works by using these rules to generate arguments for and against different age-depth model choices for a given core. Given a marine-sediment record containing uncalibrated 14C dates, for instance, Hobbes tries CALIB-style calibrations using a choice of IntCal curves, with reservoir age correction values chosen from the 14CHRONO database using the lat/long information provided with the core, and finally composes the resulting age points into a full age model using different interpolation methods. It evaluates each model—e.g., looking for outliers or reversals—and uses that information to guide the next steps of its exploration, and presents the results to the user in human-readable form. The most powerful of CSciBox's built-in interpolation methods is BACON, a Bayesian sedimentation-rate algorithm—a powerful but complex tool that can be difficult to use. Hobbes adjusts BACON's many parameters autonomously to match the age model to the expectations of expert geoscientists, as captured in its rulebase. It then checks the model against the data and iteratively re-calculates until it is a good fit to the data.
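
    The most basic age-depth modelling step that such a pipeline automates, interpolating between dated horizons after checking for reversals, can be sketched in a few lines. The depths and calibrated ages below are invented, and no 14C calibration or Bayesian (BACON-style) modelling is attempted.

        # Minimal age-depth model by linear interpolation between dated horizons.
        import numpy as np

        depth_dated = np.array([10.0, 55.0, 120.0, 240.0])      # cm below core top (invented)
        age_dated = np.array([450.0, 1800.0, 4100.0, 9200.0])   # cal yr BP (invented)

        # A sanity check of the kind an expert rulebase would apply: ages must
        # increase with depth (no reversals) before a model is accepted.
        assert np.all(np.diff(age_dated) > 0), "age reversal detected"

        depth_query = np.arange(10.0, 241.0, 10.0)
        age_model = np.interp(depth_query, depth_dated, age_dated)
        for d, a in zip(depth_query, age_model):
            print(f"{d:6.1f} cm  ->  {a:8.1f} cal yr BP")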

  14. OISI dynamic end-to-end modeling tool

    Science.gov (United States)

    Kersten, Michael; Weidler, Alexander; Wilhelm, Rainer; Johann, Ulrich A.; Szerdahelyi, Laszlo

    2000-07-01

    The OISI Dynamic end-to-end modeling tool is tailored to end-to-end modeling and dynamic simulation of Earth- and space-based actively controlled optical instruments such as optical stellar interferometers. 'End-to-end modeling' is meant to denote the feature that the overall model comprises, besides optical sub-models, also structural, sensor, actuator, controller and disturbance sub-models influencing the optical transmission, so that the system-level instrument performance due to disturbances and active optics can be simulated. This tool has been developed to support performance analysis and prediction as well as control loop design and fine-tuning for OISI, Germany's preparatory program for optical/infrared spaceborne interferometry initiated in 1994 by Dornier Satellitensysteme GmbH in Friedrichshafen.
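
    The end-to-end idea, a disturbance propagating through to an optical error that a controller and actuator then suppress, can be caricatured with a one-degree-of-freedom loop. The gains, sample rate, and disturbance below are invented and bear no relation to OISI's actual sub-models.

        # Minimal end-to-end flavour: a sinusoidal disturbance perturbs an optical
        # path-length error, a sensor samples it, and an integral controller drives
        # an actuator to cancel it. All numbers are illustrative.
        import numpy as np

        dt, n_steps = 1e-3, 5000               # 1 kHz control loop, 5 s of simulation
        ki = 50.0                              # integral gain (assumed)
        disturbance = lambda t: 20e-9 * np.sin(2 * np.pi * 3.0 * t)   # 20 nm at 3 Hz

        actuator, integrator = 0.0, 0.0
        residuals = []
        for k in range(n_steps):
            t = k * dt
            opd_error = disturbance(t) - actuator   # error seen by the sensor
            integrator += opd_error * dt
            actuator = ki * integrator              # actuator command from the controller
            residuals.append(opd_error)

        print("rms residual OPD [nm]:", 1e9 * np.sqrt(np.mean(np.square(residuals))))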

  15. The inherent dangers of using computable general equilibrium models as a single integrated modelling framework for sustainability impact assessment. A critical note on Boehringer and Loeschel (2006)

    International Nuclear Information System (INIS)

    Scrieciu, S. Serban

    2007-01-01

    The search for methods of assessment that best evaluate and integrate the trade-offs and interactions between the economic, environmental and social components of development has been receiving a new impetus due to the requirement that sustainability concerns be incorporated into the policy formulation process. A paper forthcoming in Ecological Economics (Boehringer, C., Loeschel, A., in press. Computable general equilibrium models for sustainability impact assessment: status quo and prospects, Ecological Economics.) claims that Computable General Equilibrium (CGE) models may potentially represent the much needed 'back-bone' tool to carry out reliable integrated quantitative Sustainability Impact Assessments (SIAs). While acknowledging the usefulness of CGE models for some dimensions of SIA, this commentary questions the legitimacy of employing this particular economic modelling tool as a single integrating modelling framework for a comprehensive evaluation of the multi-dimensional, dynamic and complex interactions between policy and sustainability. It discusses several inherent dangers associated with the advocated prospects for the CGE modelling approach to contribute to comprehensive and reliable sustainability impact assessments. The paper warns that this reductionist viewpoint may seriously infringe upon the basic values underpinning the SIA process, namely a transparent, heterogeneous, balanced, inter-disciplinary, consultative and participatory take to policy evaluation and building of the evidence-base. (author)

  16. The Global Nuclear Futures Model: A Dynamic Simulation Tool for Energy Strategies

    International Nuclear Information System (INIS)

    Bixler, N.E.

    2002-01-01

    The Global Nuclear Futures Model (GNFM) is a dynamic simulation tool that provides an integrated framework to model key aspects of nuclear energy, nuclear materials storage and disposition, global nuclear materials management, and nuclear proliferation risk. It links nuclear energy and other energy shares dynamically to greenhouse gas emissions and twelve other measures of environmental impact. It presents historical data from 1990 to 2000 and extrapolates energy demand through the year 2050. More specifically, it contains separate modules for energy, the nuclear fuel cycle front end, the nuclear fuel cycle back end, defense nuclear materials, environmental impacts, and measures of the potential for nuclear proliferation. It is globally integrated but also breaks out five regions of the world so that environmental impacts and nuclear proliferation concerns can be evaluated on a regional basis. The five regions are the United States of America (USA), the People's Republic of China (China), the former Soviet Union (FSU), the OECD nations excluding the USA, and the rest of the world (ROW). (author)

  17. Following a drop of water from the cloud, throughout the sewer system, into the receiving water - Model predictive control of integrated sewer-wastewater treatment systems

    DEFF Research Database (Denmark)

    Mikkelsen, Peter Steen; Vezzaro, Luca; Sharma, Anitha Kumari

    This article presents selected examples of model-based prediction and control of integrated sewer-wastewater treatment systems, developed within the framework of the Storm- and Wastewater Informatics project (SWI), by exploiting all the available on-line information (e.g. radar based rainfall... of pollutants discharged from treatment plants, etc.). The tools developed in the SWI project include (but are not limited to): (i) rainfall nowcasting based on radar measurements, (ii) probabilistic flow forecasting based on data assimilation and stochastic models, (iii) prediction and optimization of wet-weather performance of wastewater treatment plants, and (iv) integrated control of the different elements of the integrated wastewater systems. Full-scale testing of these tools in different catchments located in Denmark ensures that the developed tools can represent an important step forward for on-line operation...

  18. Scheduling Model for Renewable Energy Sources Integration in an Insular Power System

    Directory of Open Access Journals (Sweden)

    Gerardo J. Osório

    2018-01-01

    Insular power systems represent an asset and an excellent starting point for the development and analysis of innovative tools and technologies. The integration of renewable energy resources that has taken place in several islands in the south of Europe, particularly in Portugal, has brought more uncertainty to production management. In this work, an innovative scheduling model is proposed, which considers the integration of wind and solar resources in an insular power system in Portugal, with a strong conventional generation basis. This study aims to show the benefits of increasing the integration of renewable energy resources in this insular power system, and the objectives are related to minimizing the time for which conventional generation is in operation, maximizing profits, reducing production costs, and consequently, reducing greenhouse gas emissions.

  19. Developing an Integrated Model Framework for the Assessment of Sustainable Agricultural Residue Removal Limits for Bioenergy Systems

    Energy Technology Data Exchange (ETDEWEB)

    David Muth, Jr.; Jared Abodeely; Richard Nelson; Douglas McCorkle; Joshua Koch; Kenneth Bryden

    2011-08-01

    Agricultural residues have significant potential as a feedstock for bioenergy production, but removing these residues can have negative impacts on soil health. Models and datasets that can support decisions about sustainable agricultural residue removal are available; however, no tools currently exist capable of simultaneously addressing all environmental factors that can limit availability of residue. The VE-Suite model integration framework has been used to couple a set of environmental process models to support agricultural residue removal decisions. The RUSLE2, WEPS, and Soil Conditioning Index models have been integrated. A disparate set of databases providing the soils, climate, and management practice data required to run these models have also been integrated. The integrated system has been demonstrated for two example cases. First, an assessment using high spatial fidelity crop yield data has been run for a single farm. This analysis shows the significant variance in sustainably accessible residue across a single farm and crop year. A second example is an aggregate assessment of agricultural residues available in the state of Iowa. This implementation of the integrated systems model demonstrates the capability to run a vast range of scenarios required to represent a large geographic region.

  20. Modelling Spark Integration in Science Classroom

    Directory of Open Access Journals (Sweden)

    Marie Paz E. Morales

    2014-02-01

    The study critically explored how a PASCO-designed technology (SPARK Science Learning System) is meaningfully integrated into the teaching of selected topics in Earth and Environmental Science. It focuses on modelling the effectiveness of using the SPARK Learning System as a primary tool for learning science that leads to student learning and achievement. The gathered data and observations, together with the correlation between the technology's ability to develop high intrinsic motivation and student achievement, were used to design a framework for meaningfully integrating the SPARK Science Learning System in teaching Earth and Environmental Science. The research instruments used in this study were adapted from standardized questionnaires available in the literature. An achievement test and an evaluation form were developed and validated for the purpose of deducing the data needed for the study. Interviews were conducted to delve into the deeper thoughts and emotions of the respondents, and data from the interviews served to validate all numerical data culled from this study. Cross-case analysis of the data was done to reveal recurring themes, problems, and benefits derived by the students in using the SPARK Science Learning System, to further establish its effectiveness in the curriculum as a forerunner to the shift towards 21st century learning.

  1. Integrated Modeling of Complex Optomechanical Systems

    Science.gov (United States)

    Andersen, Torben; Enmark, Anita

    2011-09-01

    Mathematical modeling and performance simulation are playing an increasing role in large, high-technology projects. There are two reasons; first, projects are now larger than they were before, and the high cost calls for detailed performance prediction before construction. Second, in particular for space-related designs, it is often difficult to test systems under realistic conditions beforehand, and mathematical modeling is then needed to verify in advance that a system will work as planned. Computers have become much more powerful, permitting calculations that were not possible before. At the same time mathematical tools have been further developed and found acceptance in the community. Particular progress has been made in the fields of structural mechanics, optics and control engineering, where new methods have gained importance over the last few decades. Also, methods for combining optical, structural and control system models into global models have found widespread use. Such combined models are usually called integrated models and were the subject of this symposium. The objective was to bring together people working in the fields of ground-based optical telescopes, ground-based radio telescopes, and space telescopes. We succeeded in doing so and had 39 interesting presentations and many fruitful discussions during coffee and lunch breaks and social arrangements. We are grateful that so many top ranked specialists found their way to Kiruna and we believe that these proceedings will prove valuable during much future work.

  2. Integrating the simulation of domestic water demand behaviour to an urban water model using agent based modelling

    Science.gov (United States)

    Koutiva, Ifigeneia; Makropoulos, Christos

    2015-04-01

    The urban water system's sustainable evolution requires tools that can analyse and simulate the complete cycle, including both the physical and the cultural environment. One of the main challenges, in this regard, is the design and development of tools that are able to simulate society's water demand behaviour and the way policy measures affect it. The effects of these policy measures are a function of personal opinions that subsequently lead to the formation of people's attitudes, and these attitudes eventually form behaviours. This work presents the design of an ABM tool for addressing the social dimension of the urban water system. The created tool, called the Urban Water Agents' Behaviour (UWAB) model, was implemented using the NetLogo agent programming language. The main aim of the UWAB model is to capture the effects of policies and environmental pressures on the water conservation behaviour of urban households. The model consists of agents representing urban households that are linked to each other, creating a social network that influences the water conservation behaviour of its members. Household agents are influenced as well by policies and environmental pressures, such as drought. The UWAB model simulates behaviour resulting in the evolution of water conservation within an urban population. The final outcome of the model is the evolution of the distribution of different conservation levels (no, low, high) across the selected urban population. In addition, UWAB is implemented in combination with an existing urban water management simulation tool, the Urban Water Optioneering Tool (UWOT), in order to create a modelling platform aiming to facilitate an adaptive approach to water resources management. For the purposes of this proposed modelling platform, UWOT is used in a twofold manner: (1) to simulate domestic water demand evolution and (2) to simulate the response of the water system to the domestic water demand evolution. The main advantage of the UWAB - UWOT model
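
    The agent-based mechanism described here, households adopting conservation behaviour under social influence and external pressure, can be caricatured with a small Python sketch. UWAB itself is written in NetLogo; the network, update rules, and parameters below are invented for illustration.

        # Toy agent-based sketch of household water-conservation adoption on a
        # social network. Illustrative only, not the UWAB model.
        import random
        import networkx as nx

        random.seed(1)
        n_households, steps = 200, 30
        social_net = nx.watts_strogatz_graph(n_households, k=6, p=0.1)

        # Conservation level per household: 0 = none, 1 = low, 2 = high
        level = {h: 0 for h in social_net}
        drought_pressure = 0.05      # per-step chance a household reacts to drought (assumed)
        social_weight = 0.3          # influence of neighbours' behaviour (assumed)

        for _ in range(steps):
            new_level = dict(level)
            for h in social_net:
                neighbours = list(social_net[h])
                peer_avg = sum(level[n] for n in neighbours) / len(neighbours)
                # move one level up if peers conserve noticeably more, or
                # occasionally in response to drought pressure
                if level[h] < 2 and (peer_avg - level[h] > social_weight
                                     or random.random() < drought_pressure):
                    new_level[h] = level[h] + 1
            level = new_level

        counts = [sum(1 for v in level.values() if v == c) for c in (0, 1, 2)]
        print("households at conservation level (none, low, high):", counts)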

  3. Using Modeling Tools to Better Understand Permafrost Hydrology

    Directory of Open Access Journals (Sweden)

    Clément Fabre

    2017-06-01

    Modification of the hydrological cycle and, subsequently, of other global cycles is expected in Arctic watersheds owing to global change. Future climate scenarios imply widespread permafrost degradation caused by an increase in air temperature, and the expected effect on permafrost hydrology is immense. This study aims at analyzing and quantifying the daily water transfer in the largest Arctic river system, the Yenisei River in central Siberia, Russia, partially underlain by permafrost. The semi-distributed SWAT (Soil and Water Assessment Tool) hydrological model has been calibrated and validated at a daily time step in historical discharge simulations for the 2003–2014 period. The model parameters have been adjusted to embrace the hydrological features of permafrost. SWAT is shown to be capable of estimating water fluxes at a daily time step, especially during unfrozen periods, once specific climatic and soil conditions adapted to a permafrost watershed are considered. The model simulates an average annual contribution to runoff of 263 millimeters per year (mm yr−1), distributed as 152 mm yr−1 (58%) of surface runoff, 103 mm yr−1 (39%) of lateral flow and 8 mm yr−1 (3%) of return flow from the aquifer. These results are integrated over a reduced basin area downstream from large dams and are closer to observations than previous modeling exercises.

  4. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 3: HARP Graphics Oriented (GO) input user's guide

    Science.gov (United States)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Mittal, Nitin; Koppen, Sandra Howell

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical preprocessor Graphics Oriented (GO) program. GO is a graphical user interface for the HARP engine that enables the drawing of reliability/availability models on a monitor. A mouse is used to select fault tree gates or Markov graphical symbols from a menu for drawing.

  5. Use of CAPE-OPEN standards in the interoperability between modelling tools (MoT) and process simulators (ProSim)

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Gani, Rafiqul; Déchelotte, Stéphane

    2008-01-01

    Computer-aided design, analysis and/or operation of chemical products and the processes that manufacture them require a number of computational tools. As these tools may come from different sources and disciplines, an important issue is how they can be used simultaneously and efficiently for the design... computational tools according to problem-specific work-flows/data-flows. The reliability of the integration of different tools is illustrated through two case studies. In case study 1, the tools Simulis® Thermodynamics (PME) and ICAS-MoT (PMC) are combined for the calculation of thermodynamic properties through... SimPlus-ICAS-MoT-COFE interoperability is also carried out successfully to prove the interoperability of the different computational entities. Furthermore, the introduction of the multiscale modelling concept and its application through the CAPE-OPEN standards is highlighted.

  6. Analysis of metabolomic data: tools, current strategies and future challenges for omics data integration.

    Science.gov (United States)

    Cambiaghi, Alice; Ferrario, Manuela; Masseroli, Marco

    2017-05-01

    Metabolomics is a rapidly growing field consisting of the analysis of a large number of metabolites at a system scale. The two major goals of metabolomics are the identification of the metabolites characterizing each organism state and the measurement of their dynamics under different situations (e.g. pathological conditions, environmental factors). Knowledge about metabolites is crucial for the understanding of most cellular phenomena, but this information alone is not sufficient to gain a comprehensive view of all the biological processes involved. Integrated approaches combining metabolomics with transcriptomics and proteomics are thus required to obtain much deeper insights than any of these techniques alone. Although this information is available, multilevel integration of different 'omics' data is still a challenge. The handling, processing, analysis and integration of these data require specialized mathematical, statistical and bioinformatics tools, and several technical problems hampering rapid progress in the field exist. Here, we review four of the several tools available for metabolomic data analysis and integration with other 'omics' data (MetaCore™, MetaboAnalyst, InCroMAP and 3Omics), selected on the basis of their number of users or provided features, highlighting their strong and weak aspects; a number of related issues affecting data analysis and integration are also identified and discussed. Overall, we provide an objective description of how some of the main currently available software packages work, which may help the experimental practitioner in the choice of a robust pipeline for metabolomic data analysis and integration.

  7. Model Checking Markov Chains: Techniques and Tools

    NARCIS (Netherlands)

    Zapreev, I.S.

    2008-01-01

    This dissertation deals with four important aspects of model checking Markov chains: the development of efficient model-checking tools, the improvement of model-checking algorithms, the efficiency of the state-space reduction techniques, and the development of simulation-based model-checking
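
    One of the basic computations behind probabilistic model checking, the probability of reaching a target state within a bounded number of steps in a discrete-time Markov chain, can be sketched with plain matrix-vector products. The three-state chain below is a made-up example unrelated to the dissertation's case studies.

        # Bounded reachability for a discrete-time Markov chain: probability of
        # reaching the target state within n steps, computed by making the target
        # absorbing and propagating the initial distribution.
        import numpy as np

        # states: 0 = working, 1 = degraded, 2 = failed (target)
        P = np.array([[0.90, 0.08, 0.02],
                      [0.10, 0.80, 0.10],
                      [0.00, 0.00, 1.00]])     # target made absorbing

        initial = np.array([1.0, 0.0, 0.0])
        n_steps = 10

        dist = initial.copy()
        for _ in range(n_steps):
            dist = dist @ P

        print(f"P(reach 'failed' within {n_steps} steps) = {dist[2]:.4f}")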

  8. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  9. Application of the GEM Inventory Data Capture Tools for Dynamic Vulnerability Assessment and Recovery Modelling

    Science.gov (United States)

    Verrucci, Enrica; Bevington, John; Vicini, Alessandro

    2014-05-01

    A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010-13 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate data derived from remotely-sensed imagery and statistically-sampled in-situ field data of buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment, and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes" - statistically-inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publicly available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project. These are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of 1) dynamic vulnerability assessment (pre-event), and 2) recovery monitoring and evaluation (post-event) are discussed, along with strategies for using the IDC Tools for these purposes. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Originally designed for seismic risk assessment, it is clear the IDCT tools have relevance for multi-hazard risk assessment. When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated from the tools can reveal spatio-temporal patterns in the quality of recovery activities and resilience trends can be

  10. Integrated Inflammatory Stress (ITIS) Model

    DEFF Research Database (Denmark)

    Bangsgaard, Elisabeth O.; Hjorth, Poul G.; Olufsen, Mette S.

    2017-01-01

    maintains a long-term level of the stress hormone cortisol, which is also anti-inflammatory. A new integrated model of the interaction between these two subsystems of the inflammatory system is proposed and coined the integrated inflammatory stress (ITIS) model. The coupling mechanisms describing.... A constant activation results in elevated levels of the variables in the model, while a prolonged change of the oscillations in ACTH and cortisol concentrations is the most pronounced result of different LPS doses predicted by the model.
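
    The kind of coupling described here, an inflammatory signal stimulating cortisol while cortisol suppresses inflammation, can be illustrated with a toy pair of ODEs. The equations, parameters, and doses below are invented for illustration and are not the ITIS model.

        # Toy coupled ODE sketch of inflammation-cortisol feedback: an LPS-driven
        # inflammatory signal stimulates cortisol, and cortisol suppresses
        # inflammation. Illustrative equations and parameters only.
        from scipy.integrate import solve_ivp

        def rhs(t, y, lps_dose):
            inflammation, cortisol = y
            d_inflammation = lps_dose + 2.0 * inflammation / (1.0 + cortisol) - 1.5 * inflammation
            d_cortisol = 0.8 * inflammation - 0.5 * cortisol
            return [d_inflammation, d_cortisol]

        for dose in (0.1, 1.0):
            sol = solve_ivp(rhs, t_span=(0.0, 24.0), y0=[0.1, 1.0], args=(dose,))
            inflammation_end, cortisol_end = sol.y[:, -1]
            print(f"LPS dose {dose}: inflammation={inflammation_end:.2f}, cortisol={cortisol_end:.2f}")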

  11. Non-integrable quantum field theories as perturbations of certain integrable models

    International Nuclear Information System (INIS)

    Delfino, G.; Simonetti, P.

    1996-03-01

    We approach the study of non-integrable models of two-dimensional quantum field theory as perturbations of the integrable ones. By exploiting the knowledge of the exact S-matrix and form factors of the integrable field theories we obtain the first order corrections to the mass ratios, the vacuum energy density and the S-matrix of the non-integrable theories. As interesting applications of the formalism, we study the scaling region of the Ising model in an external magnetic field at T ∼ T_c and the scaling region around the minimal model M_{2,τ}. For these models, a remarkable agreement is observed between the theoretical predictions and the data extracted by a numerical diagonalization of their Hamiltonian. (author). 41 refs, 9 figs, 1 tab

  12. INSIGHT: an integrated scoping analysis tool for in-core fuel management of PWR

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Noda, Hidefumi; Ito, Nobuaki; Maruyama, Taiji.

    1997-01-01

    An integrated software tool for scoping analysis of in-core fuel management, INSIGHT, has been developed to automate the scoping analysis and to improve the fuel cycle cost using advanced optimization techniques. INSIGHT is an interactive software tool executed on UNIX-based workstations equipped with an X-window system. INSIGHT incorporates the GALLOP loading pattern (LP) optimization module that utilizes hybrid genetic algorithms, the PATMAKER interactive LP design module, the MCA multicycle analysis module, an integrated database, and other utilities. Two benchmark problems were analyzed to confirm the key capabilities of INSIGHT: LP optimization and multicycle analysis. The first was a single-cycle LP optimization problem that included various constraints. The second was a multicycle LP optimization problem that included the assembly burnup limitation at rod cluster control (RCC) positions. The results for these problems showed the feasibility of INSIGHT for practical scoping analysis, whose work consists almost entirely of LP generation and multicycle analysis. (author)
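
    The loading-pattern search that hybrid genetic algorithms perform can be caricatured by a tiny permutation GA. The one-dimensional "core", assembly values, and flatness objective below are made up and are far simpler than GALLOP or a real PWR core.

        # Tiny genetic algorithm over loading-pattern permutations: arrange fuel
        # assemblies (with made-up reactivity values) so that neighbouring values
        # are as even as possible, a crude stand-in for power peaking.
        import random

        random.seed(0)
        assemblies = [1.30, 1.25, 1.18, 1.10, 1.05, 0.98, 0.92, 0.85]   # toy values

        def peaking(pattern):
            """Crude objective: maximum difference between adjacent assemblies."""
            return max(abs(a - b) for a, b in zip(pattern, pattern[1:]))

        def mutate(pattern):
            child = pattern[:]
            i, j = random.sample(range(len(child)), 2)   # swap two positions
            child[i], child[j] = child[j], child[i]
            return child

        population = [random.sample(assemblies, len(assemblies)) for _ in range(30)]
        for generation in range(200):
            population.sort(key=peaking)
            survivors = population[:10]                  # keep the best patterns
            population = survivors + [mutate(random.choice(survivors)) for _ in range(20)]

        best = min(population, key=peaking)
        print("best pattern:", best, "peaking:", round(peaking(best), 3))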

  13. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...

  14. A Conceptual Framework for Integration of Evidence-Based Design with Lighting Simulation Tools

    Directory of Open Access Journals (Sweden)

    Anahita Davoodi

    2017-09-01

    The use of lighting simulation tools has been growing over the past years, which has improved lighting analysis. While computer simulations have proven to be a viable tool for analyzing lighting in physical environments, they have difficulty in assessing the effects of light on occupants' perception. Evidence-based design (EBD) is a design method that is gaining traction in building design due to its strength in providing means to assess the effects of built environments on humans. The aim of this study was to develop a conceptual framework for integrating EBD with lighting simulation tools. Based on a literature review, it was investigated how EBD and lighting simulation can be combined to provide a holistic lighting performance evaluation method. The results show that they can mutually benefit from each other. EBD makes it possible to evaluate and/or improve performance metrics by utilizing user feedback. On the other hand, performance metrics can be used for a better description of evidence and to analyze the effects of lighting in more detail. The results also show that EBD can be used to evaluate light simulations to better understand when and how they should be performed. A framework is presented for the integration of lighting simulation and EBD.

  15. Smart systems integration and simulation

    CERN Document Server

    Poncino, Massimo; Pravadelli, Graziano

    2016-01-01

    This book presents new methods and tools for the integration and simulation of smart devices. The design approach described in this book explicitly accounts for integration of Smart Systems components and subsystems as a specific constraint. It includes methodologies and EDA tools to enable multi-disciplinary and multi-scale modeling and design, simulation of multi-domain systems, subsystems and components at all levels of abstraction, system integration and exploration for optimization of functional and non-functional metrics. By covering theoretical and practical aspects of smart device design, this book targets people who are working and studying on hardware/software modelling, component integration and simulation under different positions (system integrators, designers, developers, researchers, teachers, students etc.). In particular, it is a good introduction for people who are interested in managing heterogeneous components in an efficient and effective way across different domains and different abstraction l...

  16. Synthesizing models useful for ecohydrology and ecohydraulic approaches: An emphasis on integrating models to address complex research questions

    Science.gov (United States)

    Brewer, Shannon K.; Worthington, Thomas; Mollenhauer, Robert; Stewart, David; McManamay, Ryan; Guertault, Lucie; Moore, Desiree

    2018-01-01

    Ecohydrology combines empiricism, data analytics, and the integration of models to characterize linkages between ecological and hydrological processes. A challenge for practitioners is determining which models best generalize heterogeneity in hydrological behaviour, including water fluxes across spatial and temporal scales, while integrating environmental and socio-economic activities to determine best watershed management practices and data requirements. We conducted a literature review and synthesis of hydrologic, hydraulic, water quality, and ecological models designed for solving interdisciplinary questions. We reviewed 1,275 papers and identified 178 models that have the capacity to answer an array of research questions about ecohydrology or ecohydraulics. Of these models, 43 were commonly applied due to their versatility, accessibility, user-friendliness, and excellent user support. Forty-one of the 43 reviewed models were linked to at least one other model, especially: Water Quality Analysis Simulation Program (linked to 21 other models), Soil and Water Assessment Tool (19), and Hydrologic Engineering Center's River Analysis System (15). However, model integration was still relatively infrequent. There was substantial variation in model applications, possibly an artefact of the regional focus of research questions, simplicity of use, quality of user-support efforts, or a limited understanding of model applicability. Simply increasing the interoperability of model platforms, transforming models into user-friendly forms, increasing user support, defining the reliability and risk associated with model results, and increasing awareness of model applicability may promote increased use of models across subdisciplines. Nonetheless, the current availability of models allows an array of interdisciplinary questions to be addressed, and model choice relates to several factors including research objective, model complexity, ability to link to other models, and

  17. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    Science.gov (United States)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium and high-resolution imagery with the aim to extract vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extension of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. Therefore, it is of significant importance to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms is a combination of low-level image processing and geospatial information handling tools along with high-level workflows. In particular, two main products are released under the GPL license: the source code, oriented to developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance are guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: In the absence of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models like the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools References [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30th January 2014
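
    As a concrete illustration of the kind of vulnerability proxy described above, the sketch below computes a coarse built-up density grid from a classified raster using GDAL and NumPy. It is a minimal example under assumed inputs (the file name, class code and block size are hypothetical), not code from the SENSUM or RASOR packages.

      # Illustrative sketch only: a built-up density proxy from a classified raster,
      # in the spirit of the vulnerability indicators described above. The file name,
      # class code and aggregation block size are hypothetical assumptions.
      import numpy as np
      from osgeo import gdal

      BUILT_UP_CLASS = 1      # assumed code for "built-up" pixels
      BLOCK = 100             # aggregation window in pixels (assumed)

      def built_up_density(classified_raster_path):
          """Return a coarse grid with the fraction of built-up pixels per block."""
          ds = gdal.Open(classified_raster_path)
          band = ds.GetRasterBand(1).ReadAsArray()
          mask = (band == BUILT_UP_CLASS).astype(float)
          rows = mask.shape[0] // BLOCK
          cols = mask.shape[1] // BLOCK
          density = np.zeros((rows, cols))
          for i in range(rows):
              for j in range(cols):
                  block = mask[i * BLOCK:(i + 1) * BLOCK, j * BLOCK:(j + 1) * BLOCK]
                  density[i, j] = block.mean()   # 0..1 built-up fraction per block
          return density

      # density = built_up_density("landcover.tif")  # hypothetical input file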

  18. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools to explore their functionalities and ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and subsequently presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms from either an IT or a marketing branch. The paper contributes to highlighting the support for management that the web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  19. Modeling the cosmic-ray-induced soft-error rate in integrated circuits: An overview

    International Nuclear Information System (INIS)

    Srinivasan, G.R.

    1996-01-01

    This paper is an overview of the concepts and methodologies used to predict soft-error rates (SER) due to cosmic and high-energy particle radiation in integrated circuit chips. The paper emphasizes the need for the SER simulation using the actual chip circuit model which includes device, process, and technology parameters as opposed to using either the discrete device simulation or generic circuit simulation that is commonly employed in SER modeling. Concepts such as funneling, event-by-event simulation, nuclear history files, critical charge, and charge sharing are examined. Also discussed are the relative importance of elastic and inelastic nuclear collisions, rare event statistics, and device vs. circuit simulations. The semi-empirical methodologies used in the aerospace community to arrive at SERs [also referred to as single-event upset (SEU) rates] in integrated circuit chips are reviewed. This paper is one of four in this special issue relating to SER modeling. Together, they provide a comprehensive account of this modeling effort, which has resulted in a unique modeling tool called the Soft-Error Monte Carlo Model, or SEMM
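
    The abstract does not reproduce the SEMM equations. Purely as an illustration of the semi-empirical style of SER estimate it mentions, the sketch below evaluates a commonly cited scaling of the form SER ~ flux * sensitive_area * exp(-Qcrit/Qs). All numerical values are placeholders, not results from the paper.

      # Illustrative sketch only: a simple empirical soft-error-rate scaling of the
      # form SER ~ flux * sensitive_area * exp(-Qcrit / Qs). This is not the SEMM
      # tool described above; all numerical values are placeholder assumptions.
      import math

      def ser_per_bit(flux_cm2_h, area_cm2, q_crit_fC, q_s_fC):
          """Upsets per bit-hour for a single storage node."""
          return flux_cm2_h * area_cm2 * math.exp(-q_crit_fC / q_s_fC)

      flux = 14.0        # particles/cm^2/h (placeholder flux value)
      area = 1e-8        # cm^2 sensitive (charge-collection) area per node, assumed
      q_crit = 15.0      # critical charge in fC, assumed
      q_s = 5.0          # charge-collection efficiency parameter in fC, assumed

      rate = ser_per_bit(flux, area, q_crit, q_s)
      print(f"{rate:.3e} upsets per bit-hour (illustrative placeholder values)")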

  20. Integrating microbial diversity in soil carbon dynamic models parameters

    Science.gov (United States)

    Louis, Benjamin; Menasseri-Aubry, Safya; Leterme, Philippe; Maron, Pierre-Alain; Viaud, Valérie

    2015-04-01

    sampling time in order to follow the dynamics of residue and soil organic matter mineralization. Diversity, structure and composition of microbial communities were characterized before incubation. The dynamics of carbon fluxes, observed through CO2 emissions, were modelled with a simple model. Using statistical tools, relations between the parameters of the model and microbial diversity indexes and/or pedological characteristics were developed and integrated into the model. First results show that global diversity has an impact on the model's parameters. Moreover, larger fungal diversity seems to lead to larger parameters representing decomposition rates and/or carbon use efficiencies than bacterial diversity. Classically, pedological factors such as soil pH and texture must also be taken into account.
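
    A minimal sketch of the sort of simple carbon-flux model referred to above is given below, assuming a single first-order residue pool with a decomposition rate and a carbon use efficiency as the kind of parameters that might be related to microbial diversity. It is not the authors' model and all values are illustrative.

      # Minimal sketch, not the authors' model: first-order decomposition of a crop
      # residue pool, with the decomposition rate k and the carbon use efficiency
      # (cue) as the parameters that could be related to microbial diversity.
      import numpy as np

      def co2_flux(t_days, c0_residue, k_per_day, cue):
          """Cumulative CO2-C respired from a single residue pool (first-order decay)."""
          decomposed = c0_residue * (1.0 - np.exp(-k_per_day * t_days))
          return (1.0 - cue) * decomposed   # carbon not assimilated is respired as CO2

      t = np.linspace(0, 120, 121)          # incubation time, days
      # assumed units: mg C per kg soil; all parameter values are illustrative
      print(co2_flux(t, c0_residue=1000.0, k_per_day=0.02, cue=0.4)[-1])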

  1. A comparison of tools for modeling freshwater ecosystem services.

    Science.gov (United States)

    Vigerstol, Kari L; Aukema, Juliann E

    2011-10-01

    Interest in ecosystem services has grown tremendously among a wide range of sectors, including government agencies, NGOs and the business community. Ecosystem services entailing freshwater (e.g. flood control, the provision of hydropower, and water supply), as well as carbon storage and sequestration, have received the greatest attention in both scientific and on-the-ground applications. Given the newness of the field and the variety of tools for predicting water-based services, it is difficult to know which tools to use for different questions. There are two types of freshwater-related tools--traditional hydrologic tools and newer ecosystem services tools. Here we review two of the most prominent tools of each type and their possible applications. In particular, we compare the data requirements, ease of use, questions addressed, and interpretability of results among the models. We discuss the strengths, challenges and most appropriate applications of the different models. Traditional hydrological tools provide more detail whereas ecosystem services tools tend to be more accessible to non-experts and can provide a good general picture of these ecosystem services. We also suggest gaps in the modeling toolbox that would provide the greatest advances by improving existing tools. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Integrating Wikis as Educational Tools for the Development of a Community of Inquiry

    Science.gov (United States)

    Eteokleous, Nikleia; Ktoridou, Despo; Orphanou, Maria

    2014-01-01

    This article describes a study that attempted to evaluate the integration of wikis as an educational tool in successfully achieving the learning objectives of a fifth-grade linguistics and literature course. A mixed-method approach was employed--data were collected via questionnaires, reflective journals, observations, and interviews. The results…

  3. Development of a physically-based planar inductors VHDL-AMS model for integrated power converter design

    Science.gov (United States)

    Ammouri, Aymen; Ben Salah, Walid; Khachroumi, Sofiane; Ben Salah, Tarek; Kourda, Ferid; Morel, Hervé

    2014-05-01

    Design of integrated power converters needs prototype-less approaches. Specific simulations are required for investigation and validation purposes. Simulation relies on active and passive device models. Models of planar devices, for instance, are still not available in power simulator tools. There is, thus, a specific limitation during the simulation process of integrated power systems. The paper focuses on the development of a physically-based planar inductor model and its validation inside a power converter during transient switching. The planar inductor remains a complex device to model, particularly when the skin, proximity and parasitic capacitance effects are taken into account. A heterogeneous simulation scheme, including circuit and device models, is successfully implemented in the VHDL-AMS language and simulated in the Simplorer platform. The mixed simulation results have been favorably tested and compared with practical measurements. It is found that the multi-domain simulation results and measurement data are in close agreement.
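
    The abstract does not give the model equations. As a rough, stand-alone illustration of a physically-based closed-form estimate for a square planar spiral (not the VHDL-AMS model above, and ignoring the skin, proximity and parasitic-capacitance effects that it does capture), the sketch below uses the modified Wheeler expression; the geometry values are assumed.

      # Illustration only: inductance of a square planar spiral estimated with the
      # modified Wheeler expression, L = K1 * mu0 * n^2 * d_avg / (1 + K2 * rho),
      # with K1 = 2.34 and K2 = 2.75 often quoted for square geometries.
      import math

      MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

      def square_spiral_inductance(n_turns, d_out_m, d_in_m, k1=2.34, k2=2.75):
          d_avg = 0.5 * (d_out_m + d_in_m)
          rho = (d_out_m - d_in_m) / (d_out_m + d_in_m)   # fill ratio
          return k1 * MU0 * n_turns**2 * d_avg / (1.0 + k2 * rho)

      # Hypothetical geometry: 10 turns, 20 mm outer and 8 mm inner side length
      print(f"{square_spiral_inductance(10, 20e-3, 8e-3) * 1e6:.2f} uH")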

  4. DigitalHuman (DH): An Integrative Mathematical Model of Human Physiology

    Science.gov (United States)

    Hester, Robert L.; Summers, Richard L.; lIescu, Radu; Esters, Joyee; Coleman, Thomas G.

    2010-01-01

    Mathematical models and simulation are important tools in discovering the key causal relationships governing physiological processes and improving medical intervention when physiological complexity is a central issue. We have developed a model of integrative human physiology called DigitalHuman (DH) consisting of ~5,000 variables modeling human physiology, describing cardiovascular, renal, respiratory, endocrine, neural and metabolic physiology. Users can view time-dependent solutions and interactively introduce perturbations by altering numerical parameters to investigate new hypotheses. The variables, parameters and quantitative relationships as well as all other model details are described in XML text files. All aspects of the model, including the mathematical equations describing the physiological processes, are written in open-source, text-readable XML files. Model structure is based upon empirical data of physiological responses documented within the peer-reviewed literature. The model can be used to understand proposed physiological mechanisms and physiological interactions that may not otherwise be intuitively evident. Some of the current uses of this model include analyses of renal control of blood pressure, the central role of the liver in creating and maintaining insulin resistance, and the mechanisms causing orthostatic hypotension in astronauts. Additionally, the open-source aspect of the modeling environment allows any investigator to add detailed descriptions of human physiology to test new concepts. The model accurately predicts both qualitative and, more importantly, quantitative changes in clinically and experimentally observed responses. DigitalHuman provides scientists a modeling environment to understand the complex interactions of integrative physiology. This research was supported by NIH HL 51971, NSF EPSCoR, and NASA.

  5. An Integrated Tool for Low Thrust Optimal Control Orbit Transfers in Interplanetary Trajectories

    Science.gov (United States)

    Dargent, T.; Martinot, V.

    In recent years significant progress has been made in optimal control orbit transfers using low thrust electrical propulsion for interplanetary missions. The system objective is always the same: decrease the transfer duration and increase the useful satellite mass. The optimum control strategy to achieve the minimum time to orbit or the minimum fuel consumption requires the use of sophisticated mathematical tools, most of the time dedicated to a specific mission and therefore hardly reusable. To improve this situation and enable Alcatel Space to perform rather quick trajectory design as requested by mission analysis, we have developed a software tool, T-3D, dedicated to optimal control orbit transfers which integrates various initial and terminal rendezvous conditions - e.g. fixed arrival time for planet encounter - and engine thrust profiles - e.g. thrust law variation with respect to the distance to the Sun. This single and quite versatile tool allows analyses such as minimum-consumption orbit insertions around a planet from a hyperbolic trajectory, interplanetary orbit transfers, low thrust minimum time multiple revolution orbit transfers, etc. From a mathematical point of view, the software relies on the minimum principle formulation to find the necessary conditions of optimality. The satellite dynamics is a two-body model and relies on an equinoctial formulation of the Gauss equations. This choice has been made for numerical reasons and to solve the two-point boundary value problem more quickly. To handle the classical problem of co-state variable initialization, problems simpler than the actual one can be solved straightforwardly by the tool, and the resulting co-state values are kept as a first guess for the more complex problem. Finally, a synthesis of the test cases is presented to illustrate the capabilities of the tool, mixing examples of interplanetary missions, orbit insertions, and multiple-revolution orbit transfers
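
    The equinoctial formulation mentioned above avoids the singularities of the classical elements for near-circular and near-equatorial orbits. The sketch below shows the standard classical-to-equinoctial conversion; it is a textbook relation, not the T-3D implementation, and the example orbit values are illustrative.

      # Minimal sketch of the classical-to-equinoctial element conversion used by
      # equinoctial formulations of the Gauss equations (standard relations, not
      # the T-3D code): p = a(1-e^2), f = e*cos(w+RAAN), g = e*sin(w+RAAN),
      # h = tan(i/2)*cos(RAAN), k = tan(i/2)*sin(RAAN), L = RAAN + w + true anomaly.
      import math

      def classical_to_equinoctial(a, e, i, raan, argp, nu):
          """All angles in radians; returns (p, f, g, h, k, L)."""
          p = a * (1.0 - e**2)
          f = e * math.cos(argp + raan)
          g = e * math.sin(argp + raan)
          h = math.tan(i / 2.0) * math.cos(raan)
          k = math.tan(i / 2.0) * math.sin(raan)
          L = raan + argp + nu           # true longitude
          return p, f, g, h, k, L

      # Example: GTO-like orbit (values illustrative only)
      print(classical_to_equinoctial(24505e3, 0.725, math.radians(7.0),
                                     math.radians(40.0), math.radians(180.0),
                                     math.radians(0.0)))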

  6. An efficient tool for metabolic pathway construction and gene integration for Aspergillus niger.

    Science.gov (United States)

    Sarkari, Parveen; Marx, Hans; Blumhoff, Marzena L; Mattanovich, Diethard; Sauer, Michael; Steiger, Matthias G

    2017-12-01

    Metabolic engineering requires functional genetic tools for easy and quick generation of multiple pathway variants. A genetic engineering toolbox for A. niger is presented, which facilitates the generation of strains carrying heterologous expression cassettes at a defined genetic locus. The system is compatible with Golden Gate cloning, which facilitates the DNA construction process and provides high design flexibility. The integration process is mediated by a CRISPR/Cas9 strategy involving the cutting of both the genetic integration locus (pyrG) as well as the integrating plasmid. Only a transient expression of Cas9 is necessary and the carrying plasmid is readily lost using a size-reduced AMA1 variant. A high integration efficiency into the fungal genome of up to 100% can be achieved, thus reducing the screening process significantly. The feasibility of the approach was demonstrated by the integration of an expression cassette enabling the production of aconitic acid in A. niger. Copyright © 2017 Elsevier Ltd. All rights reserved.
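
    As a generic illustration of the kind of sequence scan that precedes CRISPR/Cas9 target selection (not the published toolbox or its actual protospacers), the sketch below finds 20-nt protospacers followed by an NGG PAM on the forward strand of a made-up sequence.

      # Generic illustration, not the published toolbox: scan a DNA sequence for
      # SpCas9 target sites (20-nt protospacer followed by an NGG PAM) on the
      # forward strand. The example sequence is made up.
      import re

      def find_cas9_sites(seq):
          seq = seq.upper()
          sites = []
          # lookahead so that overlapping candidates are all reported
          for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq):
              sites.append((m.start(), m.group(1), m.group(2)))  # (position, protospacer, PAM)
          return sites

      example = "ATGCGTACCGTTAGCTAGCTAGGCTTACGGATCCGTACGTAGCTAGGTACGATCGG"
      for pos, proto, pam in find_cas9_sites(example):
          print(pos, proto, pam)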

  7. Green Infrastructure Models and Tools

    Science.gov (United States)

    The objective of this project is to modify and refine existing models and develop new tools to support decision making for the complete green infrastructure (GI) project lifecycle, including the planning and implementation of stormwater control in urban and agricultural settings,...

  8. Improving transcriptome construction in non-model organisms: integrating manual and automated gene definition in Emiliania huxleyi.

    OpenAIRE

    Feldmesser, Ester; Rosenwasser, Shilo; Vardi, Assaf; Ben-Dor, Shifra

    2014-01-01

    Background The advent of Next Generation Sequencing technologies and corresponding bioinformatics tools allows the definition of transcriptomes in non-model organisms. Non-model organisms are of great ecological and biotechnological significance, and consequently the understanding of their unique metabolic pathways is essential. Several methods that integrate de novo assembly with genome-based assembly have been proposed. Yet, there are many open challenges in defining genes, particularly whe...

  9. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of various components like heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction of the plant considering the constraints on other parameters. The analysis results so obtained give a clear idea of the parameter values to decide on before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
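
    Aspen HYSYS itself is a commercial simulator, so no HYSYS code is shown here. As a hedged back-of-the-envelope cross-check of the ideal Linde-Hampson liquid yield, y = (h1 - h2) / (h1 - hf), the sketch below evaluates the expression for nitrogen with CoolProp; the state points (300 K suction, 200 bar after isothermal compression) are assumed for illustration.

      # Back-of-the-envelope check, not an Aspen HYSYS model: ideal Linde-Hampson
      # liquid yield y = (h1 - h2) / (h1 - hf), evaluated for nitrogen with CoolProp.
      from CoolProp.CoolProp import PropsSI

      T_amb = 300.0        # K, compressor/aftercooler temperature (assumed)
      P_low = 1.0e5        # Pa, suction pressure
      P_high = 200.0e5     # Pa, assumed cycle high pressure

      h1 = PropsSI("H", "T", T_amb, "P", P_low, "Nitrogen")    # low-pressure gas at ambient T
      h2 = PropsSI("H", "T", T_amb, "P", P_high, "Nitrogen")   # high-pressure gas at ambient T
      hf = PropsSI("H", "P", P_low, "Q", 0, "Nitrogen")        # saturated liquid at P_low

      y = (h1 - h2) / (h1 - hf)
      print(f"Ideal liquid yield: {y:.3f} kg liquid per kg of gas compressed")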

  10. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of various components like heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction of the plant considering the constraints on other parameters. The analysis results so obtained give a clear idea of the parameter values to decide on before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant

  11. Wilmar Planning Tool, VBA documentation

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Helge V.

    2006-01-15

    This is a documentation of the VBA (Visual Basic for Applications) in the Wilmar Planning Tool. VBA is used in the Wilmar User Shell (an Excel workbook) and in the three Access databases that hold input, scenario and output data. The Wilmar Planning Tool is developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (contract ENK5-CT-2002-00663). The User Shell controls the operation of the Wilmar Planning Tool. In the User Shell various control parameters are set, and then a macro in the Input Database is run that writes input files for the Joint Market Model and the Long Term Model. Afterwards these models can be started from the User Shell. Finally, the User Shell can start a macro in the Output Database that imports the output files from the models. (LN)

  12. Wilmar Planning Tool, VBA documentation

    International Nuclear Information System (INIS)

    Larsen, Helge V.

    2006-01-01

    This is a documentation of the VBA (Visual Basic for Applications) in the Wilmar Planning Tool. VBA is used in the Wilmar User Shell (an Excel workbook) and in the three Access databases that hold input, scenario and output data. The Wilmar Planning Tool is developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (contract ENK5-CT-2002-00663). The User Shell controls the operation of the Wilmar Planning Tool. In the User Shell various control parameters are set, and then a macro in the Input Database is run that writes input files for the Joint Market Model and the Long Term Model. Afterwards these models can be started from the User Shell. Finally, the User Shell can start a macro in the Output Database that imports the output files from the models. (LN)

  13. Documentation of databases in the Wilmar Planning tool

    International Nuclear Information System (INIS)

    Kiviluioma, J.; Meimbom, P.

    2006-01-01

    The Wilmar Planning tool consists of a number of databases and models as shown in Figure 1. This report documents the design of the following subparts of the Wilmar Planning tool: 1. The Scenario database holding the scenario trees generated from the Scenario Tree Creation model. 2. The Input database holding input data to the Joint Market Model and the Long Term Model apart from the scenario trees. 3. The Output database containing the results of a Joint Market Model run. The Wilmar Planning Tool is developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (contract ENK5-CT-2002-00663). (LN)

  14. Enabling model customization and integration

    Science.gov (United States)

    Park, Minho; Fishwick, Paul A.

    2003-09-01

    Until fairly recently, the ideas of dynamic model content and presentation were treated synonymously. For example, if one were to take a data flow network, which captures the dynamics of a target system in terms of the flow of data through nodal operators, then one would often standardize on rectangles and arrows for the model display. The increasing web emphasis on XML, however, suggests that the network model can have its content specified in an XML language, and then the model can be represented in a number of ways depending on the chosen style. We have developed a formal method, based on styles, that permits a model to be specified in XML and presented in 1D (text), 2D, and 3D. This method allows customization and personalization to exert their benefits beyond e-commerce, in the area of model structures used in computer simulation. This customization leads naturally to solving the bigger problem of model integration - the act of taking models of a scene and integrating them with that scene so that there is only one unified modeling interface. This work focuses mostly on customization, but we address the integration issue in the future work section.
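
    The sketch below illustrates the content/presentation separation the paper describes: model content held in XML, and one possible "style" rendering it as 1-D text. The element and attribute names are made up for the example and are not the authors' schema.

      # Hedged sketch of the content/presentation split described above: the model
      # content lives in XML, and a "style" function renders it (here as 1-D text).
      # Element and attribute names are invented, not the authors' actual schema.
      import xml.etree.ElementTree as ET

      MODEL_XML = """
      <dataflow>
        <node id="src"  op="generate"/>
        <node id="proc" op="transform"/>
        <node id="sink" op="collect"/>
        <arc from="src" to="proc"/>
        <arc from="proc" to="sink"/>
      </dataflow>
      """

      def render_1d(xml_text):
          """One possible presentation style: plain text, one arc per line."""
          root = ET.fromstring(xml_text)
          ops = {n.get("id"): n.get("op") for n in root.findall("node")}
          lines = [f"{a.get('from')}({ops[a.get('from')]}) --> {a.get('to')}({ops[a.get('to')]})"
                   for a in root.findall("arc")]
          return "\n".join(lines)

      print(render_1d(MODEL_XML))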

  15. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing processes from the viewpoint of combined materials and process modelling are presented: solidification of thin walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...

  16. Computer-based tools for decision support at the Hanford Site

    International Nuclear Information System (INIS)

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools needed to provide technical information, and management support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure to support models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that there is much technical information available, but it is not reaching the decision-makers in a form to be used. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the "glue" or connections to tie the components together to answer decision-makers' questions is largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission

  17. Computer-based tools for decision support at the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools needed to provide technical information, and management support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure to support models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that there is much technical information available, but it is not reaching the decision-makers in a form to be used. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the "glue" or connections to tie the components together to answer decision-makers' questions is largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission.

  18. Computer-based tools for decision support at the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools needed to provide technical information, and management support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure to support models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that there is much technical information available, but it is not reaching the decision-makers in a form to be used. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the "glue" or connections to tie the components together to answer decision-makers' questions is largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission.

  19. The european Trans-Tools transport model

    NARCIS (Netherlands)

    Rooijen, T. van; Burgess, A.

    2008-01-01

    The paper presents the use of ArcGIS in the Transtools Transport Model, TRANS-TOOLS, created by an international consortium for the European Commission. The model describes passenger as well as freight transport in Europe with all medium- and long-distance modes (cars, vans, trucks, train, inland

  20. Stochastic tools in turbulence

    CERN Document Server

    Lumey, John L

    2012-01-01

    Stochastic Tools in Turbulence discusses the available mathematical tools to describe stochastic vector fields to solve problems related to these fields. The book deals with the needs of turbulence in relation to stochastic vector fields, particularly, on three-dimensional aspects, linear problems, and stochastic model building. The text describes probability distributions and densities, including Lebesgue integration, conditional probabilities, conditional expectations, statistical independence, lack of correlation. The book also explains the significance of the moments, the properties of the

  1. Advancement of the methodology for automated integration of external hazards into level 1 PSA modeling. Technical report; Weiterentwicklung der Methodik zur automatisierten Integration uebergreifender Einwirkungen in PSA-Modelle der Stufe 1. Technischer Fachbericht

    Energy Technology Data Exchange (ETDEWEB)

    Berner, Nadine; Herb, Joachim

    2017-03-15

    In the course of the research and development project RS1539 funded by the German Federal Ministry for Economics and Energy (BMWi), the methodology for the automated integration of hazards in Level 1 PSA models has been enhanced. Thereby, the analysis tool pyRiskRobot provides the methodological framework for mapping a generic spectrum of internal and external hazards onto complex PSA plant models. The reimplementation of the software tool in the programming language Python extends the applicability and facilitates the handling of pyRiskRobot in comparison to the previous Ruby-based version RiskRobot. Moreover, the development of functions to perform the topological modelling of fault trees and the probabilistic specification of modified fault tree elements has been continued. Due to the reimplementation and further developments, the tool makes it possible to systematically generate fault trees of varying complexity, to flexibly integrate fault trees into existing PSA models and to automatically duplicate interconnected topologies. Thus, pyRiskRobot allows the efficient and traceable realization of hazard-specific, usually laborious modifications of PSA models. In addition, pyRiskRobot has been extended to serve as a functional interface between the data compilations comprising the potential influences of hazards on PSA-relevant components and the database of a PSA plant model. Based on this conceptual design, additional analyses of the data can be carried out prior to the integration within the PSA model topology. The reimplemented functionalities of pyRiskRobot have been validated against the previous version RiskRobot using reference applications such as the modelling of an internal fire scenario. The existing method collection for the automated modification of fault tree topologies has been extended based on the requirements of further applications, among them the modelling of an external flooding scenario. The deduced hazard-specific modelling approaches
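
    Purely as a schematic of the kind of automated fault tree modification described (not pyRiskRobot and not a real PSA data model), the sketch below inserts a hazard-induced basic event under every gate listed in a hazard impact table.

      # Schematic illustration only (not pyRiskRobot and not an actual PSA data
      # model): a hazard-specific basic event is inserted automatically under every
      # gate whose component appears in a hazard impact table.
      def add_hazard_events(fault_tree, hazard_impacts, hazard="FIRE"):
          """fault_tree: {gate_id: {"type": "OR"/"AND", "inputs": [...]}};
          hazard_impacts: {gate_id: failure_probability}."""
          for gate_id, prob in hazard_impacts.items():
              if gate_id in fault_tree:
                  event = {"id": f"{gate_id}-{hazard}", "prob": prob}
                  fault_tree[gate_id]["inputs"].append(event)
          return fault_tree

      tree = {
          "PUMP-A-FAILS": {"type": "OR", "inputs": [{"id": "PUMP-A-MECH", "prob": 1e-3}]},
          "VALVE-B-FAILS": {"type": "OR", "inputs": [{"id": "VALVE-B-STUCK", "prob": 5e-4}]},
      }
      impacts = {"PUMP-A-FAILS": 2e-2}   # hypothetical fire-induced failure probability
      print(add_hazard_events(tree, impacts))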

  2. MEETING IN CHICAGO: SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING, AND ENVIRONMENTAL RISK ASSESSMENT

    Science.gov (United States)

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  3. MEETING IN CZECH REPUBLIC: SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING, AND RISK ASSESSMENT

    Science.gov (United States)

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  4. An empirical modeling tool and glass property database in development of US-DOE radioactive waste glasses

    International Nuclear Information System (INIS)

    Muller, I.; Gan, H.

    1997-01-01

    An integrated glass database has been developed at the Vitreous State Laboratory of Catholic University of America. The major objective of this tool was to support glass formulation using the MAWS approach (Minimum Additives Waste Stabilization). An empirical modeling capability, based on the properties of over 1000 glasses in the database, was also developed to help formulate glasses from waste streams under multiple user-imposed constraints. The use of this modeling capability, the performance of resulting models in predicting properties of waste glasses, and the correlation of simple structural theories to glass properties are the subjects of this paper. (authors)
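
    As a generic illustration of the empirical modelling capability mentioned above (not the VSL models themselves), the sketch below fits a first-order composition-property model, property ~ sum_i b_i * x_i, by least squares on synthetic compositions.

      # Generic sketch, not the VSL models: fit a first-order composition-property
      # model (property ~ sum_i b_i * x_i) by least squares. The "glasses" here are
      # synthetic random compositions, purely for illustration.
      import numpy as np

      rng = np.random.default_rng(0)
      n_glasses, n_oxides = 200, 6
      X = rng.dirichlet(np.ones(n_oxides), size=n_glasses)      # mass fractions, rows sum to 1
      true_coeff = np.array([2.1, 2.5, 2.3, 3.0, 2.7, 2.2])     # partial property values (made up)
      y = X @ true_coeff + rng.normal(0, 0.02, n_glasses)       # "measured" property + noise

      b, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("fitted coefficients:", np.round(b, 2))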

  5. An integrated Biophysical CGE model to provide Sustainable Development Goal insights

    Science.gov (United States)

    Sanchez, Marko; Cicowiez, Martin; Howells, Mark; Zepeda, Eduardo

    2016-04-01

    Future projected changes in the energy system will inevitably result in changes to the level of appropriation of environmental resources, particularly land and water, and this will have wider implications for environmental sustainability and may affect other sectors of the economy. An integrated climate, land, energy and water (CLEW) systems analysis will provide useful insights, particularly with regard to environmental sustainability. However, it will require adequate integration with other tools to detect economic impacts and broaden the scope for policy analysis. A computable general equilibrium (CGE) model is a well-suited tool to channel impacts, as detected in a CLEW analysis, onto all sectors of the economy, and to evaluate trade-offs and synergies, including those of possible policy responses. This paper will show an application of such integration in a single-country CGE model with the following key characteristics. Climate is partly exogenous (as proxied by temperature and rainfall) and partly endogenous (as proxied by emissions generated by different sectors) and has an impact on endogenous variables such as land productivity and labor productivity. Land is a factor of production used in agricultural and forestry activities, which can be of various types if land use alternatives (e.g., deforestation) are to be considered. Energy is an input to the production process of all economic sectors and a consumption good for households. Because it is possible to allow for substitution among different energy sources (e.g. renewable vs non-renewable) in the generation of electricity, the production process of energy products can consider the use of natural resources such as oil and water. Water, data permitting, can be considered as an input into the production process of agricultural sectors, which is particularly relevant in the case of irrigation. It can also be considered as a determinant of total factor productivity in hydro-power generation. The integration of a CLEW

  6. Modeling Tools for Drilling, Reservoir Navigation, and Formation Evaluation

    Directory of Open Access Journals (Sweden)

    Sushant Dutta

    2012-06-01

    Full Text Available The oil and gas industry routinely uses borehole tools for measuring or logging rock and fluid properties of geologic formations to locate hydrocarbons and maximize their production. Pore fluids in formations of interest are usually hydrocarbons or water. Resistivity logging is based on the fact that oil and gas have a substantially higher resistivity than water. The first resistivity log was acquired in 1927, and resistivity logging is still the foremost measurement used for drilling and evaluation. However, the acquisition and interpretation of resistivity logging data has grown in complexity over the years. Resistivity logging tools operate in a wide range of frequencies (from DC to GHz) and encounter extremely high (several orders of magnitude) conductivity contrast between the metal mandrel of the tool and the geologic formation. Typical challenges include arbitrary angles of tool inclination, full tensor electric and magnetic field measurements, and interpretation of complicated anisotropic formation properties. These challenges combine to form some of the most intractable computational electromagnetic problems in the world. Reliable, fast, and convenient numerical modeling of logging tool responses is critical for tool design, sensor optimization, virtual prototyping, and log data inversion. This spectrum of applications necessitates both depth and breadth of modeling software—from blazing fast one-dimensional (1-D) modeling codes to advanced three-dimensional (3-D) modeling software, and from in-house developed codes to commercial modeling packages. In this paper, with the help of several examples, we demonstrate our approach for using different modeling software to address different drilling and evaluation applications. In one example, fast 1-D modeling provides proactive geosteering information from a deep-reading azimuthal propagation resistivity measurement. In the second example, a 3-D model with multiple vertical resistive fractures

  7. Application of a Groundwater Modeling Tool for Managing Hydrologically Connected Area in State of Nebraska, US

    Science.gov (United States)

    Li, R.; Flyr, B.; Bradley, J.; Pun, M.; Schneider, J.; Wietjes, J.; Chinta, S.

    2014-12-01

    Determination of the nature and degree of hydrologic connection between groundwater and surface water resources is of paramount importance to integrated water management within the State of Nebraska for understanding the impact of water uses on available supplies, such as the depletion of streams and aquifers caused by groundwater pumping. The ability to quantify the effects of surface water-groundwater connection and interaction is regarded as one of the most important steps towards effectively managing water resources in Nebraska and provides the basis for designating management areas. Designation of management areas allows the state and other management entities to focus various efforts and resources towards those projects that have the greatest impact on water users. The Nebraska Department of Natural Resources (NDNR) developed a groundwater modeling tool, Cycle Well Analysis, to determine the areas defined to have a high degree of connectivity between groundwater and surface water (in accordance with state regulations). This tool features two graphical user interfaces that make the analysis fully compatible with most MODFLOW-based numerical groundwater models currently utilized by NDNR. Case studies showed that the tool, in combination with Geographic Information Systems (GIS), can be used to quantify the degree of stream depletion and delineate the boundary of hydrologically connected areas within different political boundaries and subbasins in Nebraska. This approach may be applied to other regions with a similar background and need for integrated water management.
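
    The NDNR tool itself is MODFLOW-based, but the flavour of a stream depletion estimate can be conveyed with the classic Glover-Balmer analytical solution, q/Q = erfc(sqrt(d^2 S / (4 T t))), sketched below with assumed aquifer parameters.

      # Analytical flavour only (the NDNR tool is MODFLOW-based): the classic
      # Glover-Balmer estimate of the fraction of pumping taken from a nearby stream,
      # q/Q = erfc( sqrt( d^2 * S / (4 * T * t) ) ). Parameter values are assumed.
      import math

      def stream_depletion_fraction(d_m, S, T_m2_per_d, t_days):
          return math.erfc(math.sqrt(d_m**2 * S / (4.0 * T_m2_per_d * t_days)))

      d = 500.0      # distance from well to stream, m (assumed)
      S = 0.15       # specific yield (assumed)
      T = 1500.0     # transmissivity, m^2/day (assumed)
      for t in (30, 365, 3650):
          print(f"t = {t:5d} d: {stream_depletion_fraction(d, S, T, t):.2f} of pumping from stream")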

  8. Health literacy and public health: A systematic review and integration of definitions and models

    LENUS (Irish Health Repository)

    Sorensen, Kristine

    2012-01-25

    Abstract Background Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits the possibilities for measurement and comparison. The aim of the study is to review definitions and models on health literacy to develop an integrated definition and conceptual model capturing the most comprehensive evidence-based dimensions of health literacy. Methods A systematic literature review was performed to identify definitions and conceptual frameworks of health literacy. A content analysis of the definitions and conceptual frameworks was carried out to identify the central dimensions of health literacy and develop an integrated model. Results The review resulted in 17 definitions of health literacy and 12 conceptual models. Based on the content analysis, an integrative conceptual model was developed containing 12 dimensions referring to the knowledge, motivation and competencies of accessing, understanding, appraising and applying health-related information within the healthcare, disease prevention and health promotion setting, respectively. Conclusions Based upon this review, a model is proposed integrating medical and public health views of health literacy. The model can serve as a basis for developing health literacy enhancing interventions and provide a conceptual basis for the development and validation of measurement tools, capturing the different dimensions of health literacy within the healthcare, disease prevention and health promotion settings.

  9. Open source integrated modeling environment Delta Shell

    Science.gov (United States)

    Donchyts, G.; Baart, F.; Jagers, B.; van Putten, H.

    2012-04-01

    In the last decade, integrated modelling has become a very popular topic in environmental modelling, since it helps solve problems that are difficult to address with a single model. However, managing the complexity of integrated models and minimizing the time required for their setup remain challenging tasks. The integrated modelling environment Delta Shell simplifies these tasks. The software components of Delta Shell are easy to reuse separately from each other as well as part of an integrated environment that can run in command-line or graphical user interface mode. Most components of Delta Shell are developed using the C# programming language and include libraries used to define, save and visualize various scientific data structures as well as coupled model configurations. Here we present two examples showing how Delta Shell simplifies the process of setting up integrated models from the end-user and developer perspectives. The first example shows the coupling of a rainfall-runoff model, a river flow model and a run-time control model. The second example shows how a coastal morphological database integrates with the coastal morphological model (XBeach) and a custom nourishment designer. Delta Shell is also available as open-source software released under the LGPL license and accessible via http://oss.deltares.nl.
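
    The first example above couples a rainfall-runoff model, a river flow model and a run-time control model. Delta Shell itself is a C#/.NET environment, so the sketch below is only a language-agnostic illustration of that coupling pattern, written in Python with made-up model equations.

      # Conceptual sketch only (Delta Shell itself is a C#/.NET environment): a
      # rainfall-runoff model feeding a river routing model every time step, plus a
      # trivial run-time control rule, to illustrate the coupling pattern described.
      def runoff_model(rain_mm, storage_mm, k=0.1):
          """Linear-reservoir runoff: a fraction k of storage leaves per step."""
          storage_mm += rain_mm
          q = k * storage_mm
          return q, storage_mm - q

      def river_model(inflow, river_q, alpha=0.3):
          """Very simple routing: river discharge relaxes towards the inflow."""
          return river_q + alpha * (inflow - river_q)

      rainfall = [0, 5, 20, 10, 0, 0, 2, 0]   # mm per step, made-up series
      storage, river_q, gate_open = 10.0, 1.0, True
      for rain in rainfall:
          q_in, storage = runoff_model(rain, storage)
          river_q = river_model(q_in, river_q)
          gate_open = river_q < 3.0            # run-time control rule (illustrative)
          print(f"inflow {q_in:5.2f}  river {river_q:5.2f}  gate open: {gate_open}")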

  10. Building a bridge into the future: dynamic connectionist modeling as an integrative tool for research on intertemporal choice.

    Science.gov (United States)

    Scherbaum, Stefan; Dshemuchadse, Maja; Goschke, Thomas

    2012-01-01

    Temporal discounting denotes the fact that individuals prefer smaller rewards delivered sooner over larger rewards delivered later, often to a higher extent than suggested by normative economical theories. In this article, we identify three lines of research studying this phenomenon which aim (i) to describe temporal discounting mathematically, (ii) to explain observed choice behavior psychologically, and (iii) to predict the influence of specific factors on intertemporal decisions. We then opt for an approach integrating postulated mechanisms and empirical findings from these three lines of research. Our approach focuses on the dynamical properties of decision processes and is based on computational modeling. We present a dynamic connectionist model of intertemporal choice focusing on the role of self-control and time framing as two central factors determining choice behavior. Results of our simulations indicate that the two influences interact with each other, and we present experimental data supporting this prediction. We conclude that computational modeling of the decision process dynamics can advance the integration of different strands of research in intertemporal choice.
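
    For line of research (i) above, the most common descriptive form is hyperbolic discounting, V = A / (1 + k*D). The sketch below applies it to a smaller-sooner versus larger-later choice; the discount rate k is an assumed illustrative value, not an estimate from the study.

      # Minimal sketch of the descriptive level (i): hyperbolic discounting,
      # V = A / (1 + k*D), applied to a smaller-sooner vs. larger-later choice.
      # The discount rate k is an assumed illustrative value, not a fitted estimate.
      def discounted_value(amount, delay_days, k=0.02):
          return amount / (1.0 + k * delay_days)

      smaller_sooner = discounted_value(50.0, delay_days=0)
      larger_later = discounted_value(80.0, delay_days=60)
      choice = "smaller-sooner" if smaller_sooner > larger_later else "larger-later"
      print(f"SS = {smaller_sooner:.1f}, LL = {larger_later:.1f} -> choose {choice}")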

  11. Building a bridge into the future: Dynamic connectionist modeling as an integrative tool for research on intertemporal choice

    Directory of Open Access Journals (Sweden)

    Stefan eScherbaum

    2012-11-01

    Full Text Available Temporal discounting denotes the fact that individuals prefer smaller rewards delivered sooner over larger rewards delivered later, often to a higher extent than suggested by normative economical theories. In this article, we identify three lines of research studying this phenomenon which aim (i) to describe temporal discounting mathematically, (ii) to explain observed choice behavior psychologically, and (iii) to predict the influence of specific factors on intertemporal decisions. We then opt for an approach integrating postulated mechanisms and empirical findings from these three lines of research. Our approach focuses on the dynamical properties of decision processes and is based on computational modeling. We present a dynamic connectionist model of intertemporal choice focusing on the role of self-control and time framing as two central factors determining choice behavior. Results of our simulations indicate that the two influences interact with each other, and we present experimental data supporting this prediction. We conclude that computational modeling of the decision process dynamics can advance the integration of different strands of research in intertemporal choice.

  12. Stakeholder views of management and decision support tools to integrate climate change into Great Lakes Lake Whitefish management

    Science.gov (United States)

    Lynch, Abigail J.; Taylor, William W.; McCright, Aaron M.

    2016-01-01

    Decision support tools can aid decision making by systematically incorporating information, accounting for uncertainties, and facilitating evaluation between alternatives. Without user buy-in, however, decision support tools can fail to influence decision-making processes. We surveyed fishery researchers, managers, and fishers affiliated with the Lake Whitefish Coregonus clupeaformis fishery in the 1836 Treaty Waters of Lakes Huron, Michigan, and Superior to assess opinions of current and future management needs to identify barriers to, and opportunities for, developing a decision support tool based on Lake Whitefish recruitment projections with climate change. Approximately 64% of 39 respondents were satisfied with current management, and nearly 85% agreed that science was well integrated into management programs. Though decision support tools can facilitate science integration into management, respondents suggest that they face significant implementation barriers, including lack of political will to change management and perceived uncertainty in decision support outputs. Recommendations from this survey can inform development of decision support tools for fishery management in the Great Lakes and other regions.

  13. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    Science.gov (United States)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    substances, helping in the management of the crisis, in the distribution of response resources, or in prioritizing specific areas. They can also be used for detection of pollution sources. However, the resources involved, and the scientific and technological levels needed in the manipulation of numerical models, have both limited the interoperability between operational models, monitoring tools and decision-support software tools. The increasing predictive capacity for metocean conditions and for the fate and behaviour of pollutants spilt at sea or in coastal zones, and the presence of monitoring tools like vessel traffic control systems, can both provide safer support for decision-making in emergency or planning issues associated with pollution risk management, especially if used in an integrated way. Following this approach, and taking advantage of an integrated framework developed in the ARCOPOL (www.arcopol.eu) and EASYCO (www.project-easy.info) projects, three innovative model-supported software tools were developed and applied in the Atlantic Area and/or the Portuguese coast. Two of these tools are used for spill model simulations - a web-based interface (EASYCO web bidirectional tool) and an advanced desktop application (MOHID Desktop Spill Simulator) - both of them allowing the end user to control the model simulations. Parameters such as date and time of the event, location and oil spill volume are provided by the users; these interactive tools also integrate the best available metocean forecasts (waves, meteorological, hydrodynamic) from different institutions in the Atlantic Area. Metocean data are continuously gathered from remote THREDDS data servers (using OPeNDAP) or FTP sites, and then automatically interpolated and pre-processed to be available for the simulators. The simulation tools developed can also import initial data and export results from/to remote servers, using OGC WFS services. Simulations are provided to the end user in a matter of seconds, and thus can be very
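
    As a much-simplified illustration of the Lagrangian drift computation that spill simulators of this kind perform (not the MOHID implementation), the sketch below advects particles with the surface current plus roughly 3% of the wind and a random-walk diffusion term; all coefficients are assumptions.

      # Simplified sketch of a Lagrangian spill drift step (not the MOHID code):
      # particles move with the surface current plus ~3% of the wind, plus a random
      # walk standing in for turbulent diffusion. All coefficients are assumptions.
      import random

      WIND_DRIFT = 0.03          # commonly used wind drift factor (assumed here)
      DIFF_M2_S = 1.0            # horizontal diffusion coefficient, m^2/s (assumed)

      def advect(particles, u_cur, v_cur, u_wind, v_wind, dt_s):
          out = []
          for x, y in particles:
              u = u_cur + WIND_DRIFT * u_wind
              v = v_cur + WIND_DRIFT * v_wind
              step = (2.0 * DIFF_M2_S * dt_s) ** 0.5
              out.append((x + u * dt_s + random.gauss(0, step),
                          y + v * dt_s + random.gauss(0, step)))
          return out

      spill = [(0.0, 0.0)] * 100                      # 100 particles at the release point
      for _ in range(24):                             # 24 hourly steps
          spill = advect(spill, 0.2, 0.05, 8.0, 2.0, 3600.0)   # currents and wind, m/s (made up)
      print(spill[0])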

  14. Integrated modelling of ecosystem services and energy systems research

    Science.gov (United States)

    Agarwala, Matthew; Lovett, Andrew; Bateman, Ian; Day, Brett; Agnolucci, Paolo; Ziv, Guy

    2016-04-01

    The UK Government is formally committed to reducing carbon emissions and protecting and improving natural capital and the environment. However, actually delivering on these objectives requires an integrated approach to addressing two parallel challenges: de-carbonising future energy system pathways, and safeguarding natural capital to ensure the continued flow of ecosystem services. Although both emphasise benefiting from natural resources, efforts to connect natural capital and energy systems research have been limited, meaning opportunities to improve management of natural resources and meet society's energy needs could be missed. The ecosystem services paradigm provides a consistent conceptual framework that applies in multiple disciplines across the natural and economic sciences, and facilitates collaboration between them. At the forefront of the field, integrated ecosystem service - economy models have guided public- and private-sector decision making at all levels. Models vary in sophistication from simple spreadsheet tools to complex software packages integrating biophysical, GIS and economic models, and draw upon many fields, including ecology, hydrology, geography, systems theory, economics and the social sciences. They also differ in their ability to value changes in natural capital and ecosystem services at various spatial and temporal scales. Despite these differences, current models share a common feature: their treatment of energy systems is superficial at best. In contrast, energy systems research has no widely adopted, unifying conceptual framework that organises thinking about key system components and interactions. Instead, the literature is organised around modelling approaches, including life cycle analyses, econometric investigations, linear programming and computable general equilibrium models. However, some consistencies do emerge. First, energy system models often contain a linear set of steps, from exploration to resource supply, fuel processing, conversion

  15. Tool-Body Assimilation Model Based on Body Babbling and Neurodynamical System

    Directory of Open Access Journals (Sweden)

    Kuniyuki Takahashi

    2015-01-01

    Full Text Available We propose a new method of tool use based on a tool-body assimilation model that uses body babbling and a neurodynamical system to enable robots to use tools. Almost all existing studies of robot tool use require predetermined motions and tool features; the motion patterns are limited and the robots cannot use novel tools. Other studies fully search all available parameters for novel tools, but this leads to massive amounts of calculation. To solve these problems, we took the following approach: we used a humanoid robot model to generate random motions based on human body babbling. These rich motion experiences were used to train recurrent and deep neural networks for modeling a body image. Tool features were self-organized in parametric bias, modulating the body image according to the tool in use. Finally, we designed a neural network for the robot to generate motion only from the target image. Experiments were conducted with multiple tools for manipulating a cylindrical target object. The results show that the tool-body assimilation model is capable of motion generation.

  16. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper

    2003-01-01

    Experimental modelling is an important tool for study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied...

  17. miRQuest: integration of tools on a Web server for microRNA research.

    Science.gov (United States)

    Aguiar, R R; Ambrosio, L A; Sepúlveda-Hermosilla, G; Maracaja-Coutinho, V; Paschoal, A R

    2016-03-28

    This report describes miRQuest - a novel middleware available on a Web server that allows the end user to do miRNA research in a user-friendly way. It is known that there are many prediction tools for microRNA (miRNA) identification that use different programming languages and methods to realize this task. It is difficult to understand each tool and apply it to the diverse datasets and organisms available for miRNA analysis. miRQuest can easily be used by biologists and researchers with limited experience in bioinformatics. We built it using a middleware architecture on a Web platform for miRNA research that performs two main functions: i) integration of different miRNA prediction tools for miRNA identification in a user-friendly environment; and ii) comparison of these prediction tools. In both cases, the user provides sequences (in FASTA format) as an input set for the analysis and comparisons. All the tools were selected on the basis of a survey of the literature on the available tools for miRNA prediction. As results, three different use cases of the tool are also described, one of which is miRNA identification analysis in 30 different species. Finally, miRQuest seems to be a novel and useful tool, and it is freely available for both benchmarking and miRNA identification at http://mirquest.integrativebioinformatics.me/.

  18. Solid waste integrated cost analysis model: 1991 project year report

    Energy Technology Data Exchange (ETDEWEB)

    1991-01-01

    The purpose of the City of Houston's 1991 Solid Waste Integrated Cost Analysis Model (SWICAM) project was to continue the development of a computerized cost analysis model. This model is to provide solid waste managers with a tool to evaluate the dollar cost of real or hypothetical solid waste management choices. Those choices have become complicated by the implementation of Subtitle D of the Resource Conservation and Recovery Act (RCRA) and the EPA's integrated approach to managing municipal solid waste; that is, minimize generation, maximize recycling, reduce volume (incinerate), and then bury (landfill) only the remainder. Implementation of an integrated solid waste management system involving all or some of the options of recycling, waste to energy, composting, and landfilling is extremely complicated. Factors such as hauling distances, markets and prices for recyclables, and the costs and benefits of transfer stations and material recovery facilities must all be considered. A jurisdiction must determine the cost impacts of implementing a number of possibilities for managing, handling, processing, and disposing of waste. SWICAM employs a single Lotus 1-2-3 spreadsheet to enable a jurisdiction to predict or assess the costs of its waste management system. It allows the user to select his or her own process flow for waste material and to manipulate the model to include as few or as many options as desired. The model will calculate the estimated cost for the choices selected. The user can then change the model to include or exclude waste stream components until the mix of choices suits the user. Graphs can be produced as a visual communication aid in presenting the results of the cost analysis. SWICAM also allows future cost projections to be made.
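
    As a hedged illustration of the kind of calculation such a spreadsheet model performs (a minimal sketch, not the actual SWICAM workbook; all tonnages and unit costs below are invented), the following Python snippet totals the cost of a user-selected process flow for the waste stream.

        # Minimal sketch of an integrated solid-waste cost model (hypothetical figures,
        # not the SWICAM spreadsheet). Each option routes a share of the waste stream
        # to a treatment step with its own unit cost per ton.

        waste_stream_tons = 100_000  # total annual tonnage handled by the jurisdiction

        # Fraction of the stream routed to each option and an assumed unit cost ($/ton).
        process_flow = {
            "recycling":       {"fraction": 0.20, "cost_per_ton": 45.0},
            "waste_to_energy": {"fraction": 0.30, "cost_per_ton": 65.0},
            "composting":      {"fraction": 0.10, "cost_per_ton": 30.0},
            "landfill":        {"fraction": 0.40, "cost_per_ton": 25.0},
        }

        def total_system_cost(tons, flow):
            """Return the total annual cost and a per-option breakdown."""
            assert abs(sum(opt["fraction"] for opt in flow.values()) - 1.0) < 1e-9
            breakdown = {name: tons * opt["fraction"] * opt["cost_per_ton"]
                         for name, opt in flow.items()}
            return sum(breakdown.values()), breakdown

        total, by_option = total_system_cost(waste_stream_tons, process_flow)
        print(f"Total annual cost: ${total:,.0f}")
        for name, cost in by_option.items():
            print(f"  {name}: ${cost:,.0f}")

    Changing the fractions or dropping an option and re-running reproduces, in miniature, the include/exclude workflow described above.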

  19. Storm Water Management Model Climate Adjustment Tool (SWMM-CAT)

    Science.gov (United States)

    The US EPA’s newest tool, the Stormwater Management Model (SWMM) – Climate Adjustment Tool (CAT) is meant to help municipal stormwater utilities better address potential climate change impacts affecting their operations. SWMM, first released in 1971, models hydrology and hydrauli...

  20. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability

    Science.gov (United States)

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping

    2015-01-01

    Background: Biospecimens are essential resources for advancing basic and translational research. However, little data are available regarding the costs associated with operating a biobank, and few resources exist to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. Methods: To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. Results: A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. Conclusion: A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data were then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, allocate costs for biospecimens based on the percentage of cost recovered, and perform project-specific cost analyses and financial forecasting. PMID:26697911

  1. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability.

    Science.gov (United States)

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping; Moore, Helen M

    2015-12-01

    Biospecimens are essential resources for advancing basic and translational research. However, little data are available regarding the costs associated with operating a biobank, and few resources exist to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data were then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, allocate costs for biospecimens based on the percentage of cost recovered, and perform project-specific cost analyses and financial forecasting.
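
    As a hedged sketch of the cost-recovery arithmetic behind such a tool (not the BEMT itself; its seven-step process and survey database are only available through the web tool, and all figures below are invented), the following Python snippet derives a per-specimen fee from assumed direct and indirect costs and a target percentage of cost recovered.

        # Hypothetical figures only; the BEMT's actual cost categories and workflow
        # live in the web tool. This snippet illustrates cost-recovery arithmetic.

        direct_costs = {                 # annual costs tied directly to specimen handling ($)
            "consumables": 40_000,
            "staff_time": 120_000,
            "shipping": 15_000,
        }
        indirect_costs = {               # annual overheads allocated to the biobank ($)
            "facility": 60_000,
            "equipment_depreciation": 25_000,
            "administration": 30_000,
        }

        specimens_distributed_per_year = 5_000
        target_cost_recovery = 0.60      # recover 60% of total cost through fees

        total_cost = sum(direct_costs.values()) + sum(indirect_costs.values())
        fee_per_specimen = total_cost * target_cost_recovery / specimens_distributed_per_year

        print(f"Total annual cost: ${total_cost:,.0f}")
        print(f"Fee per specimen at {target_cost_recovery:.0%} recovery: ${fee_per_specimen:,.2f}")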

  2. Algal Functional Annotation Tool: a web-based analysis suite to functionally interpret large gene lists using integrated annotation and expression data

    Directory of Open Access Journals (Sweden)

    Merchant Sabeeha S

    2011-07-01

    Full Text Available Abstract Background Progress in genome sequencing is proceeding at an exponential pace, and several new algal genomes are becoming available every year. One of the challenges facing the community is the association of protein sequences encoded in the genomes with biological function. While most genome assembly projects generate annotations for predicted protein sequences, they are usually limited and integrate functional terms from a limited number of databases. Another challenge is the use of annotations to interpret large lists of 'interesting' genes generated by genome-scale datasets. Previously, these gene lists had to be analyzed across several independent biological databases, often on a gene-by-gene basis. In contrast, several annotation databases, such as DAVID, integrate data from multiple functional databases and reveal underlying biological themes of large gene lists. While several such databases have been constructed for animals, none is currently available for the study of algae. Due to renewed interest in algae as potential sources of biofuels and the emergence of multiple algal genome sequences, a significant need has arisen for such a database to process the growing compendiums of algal genomic data. Description The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii, and in the future will include additional genomes. The site allows users to interpret large gene lists by identifying associated functional terms, and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally-related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of
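
    Enrichment searches of this kind typically rest on a hypergeometric (or Fisher's exact) test of whether an annotation term is over-represented in a gene list relative to the genome background. The sketch below shows that calculation in Python with SciPy; the gene counts are invented for illustration and do not come from the tool.

        from scipy.stats import hypergeom

        # Hypothetical counts for one annotation term (e.g., a pathway):
        N = 17_000   # annotated genes in the genome (background)
        K = 120      # background genes carrying this term
        n = 300      # genes in the user's "interesting" list
        k = 12       # list genes carrying this term

        # P(X >= k) under sampling without replacement: the chance of seeing at
        # least this many term members in the list by luck alone.
        p_enrichment = hypergeom.sf(k - 1, N, K, n)
        fold_enrichment = (k / n) / (K / N)

        print(f"fold enrichment = {fold_enrichment:.2f}, p = {p_enrichment:.3g}")
        # In practice every term in every database is tested and the p-values are
        # corrected for multiple testing (e.g., Benjamini-Hochberg).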

  3. Integrated catchment modelling within a strategic planning and decision making process: Werra case study

    Science.gov (United States)

    Dietrich, Jörg; Funke, Markus

    Integrated water resources management (IWRM) redefines conventional water management approaches through a closer cross-linkage between environment and society. The role of public participation and socio-economic considerations becomes more important within the planning and decision making process. In this paper we address aspects of the integration of catchment models into such a process taking the implementation of the European Water Framework Directive (WFD) as an example. Within a case study situated in the Werra river basin (Central Germany), a systems analytic decision process model was developed. This model uses the semantics of the Unified Modeling Language (UML) activity model. As an example application, the catchment model SWAT and the water quality model RWQM1 were applied to simulate the effect of phosphorus emissions from non-point and point sources on water quality. The decision process model was able to guide the participants of the case study through the interdisciplinary planning and negotiation of actions. Further improvements of the integration framework include tools for quantitative uncertainty analyses, which are crucial for real life application of models within an IWRM decision making toolbox. For the case study, the multi-criteria assessment of actions indicates that the polluter pays principle can be met at larger scales (sub-catchment or river basin) without significantly compromising cost efficiency for the local situation.

  4. Wilmar Planning Tool, user guide

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Helge V.

    2006-01-15

    This is a short user guide to the Wilmar Planning Tool developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (Contract No. ENK5-CT-2002-00663). A User Shell implemented in an Excel workbook controls the Wilmar Planning Tool. All data are contained in Access databases that communicate with various sub-models through text files that are exported from or imported to the databases. In the User Shell various scenario variables and control parameters are set, and export of model data from the input database, activation of the models, as well as import of model results to the output database are triggered from the shell. (au)

  5. Wilmar Planning Tool, user guide

    International Nuclear Information System (INIS)

    Larsen, Helge V.

    2006-01-01

    This is a short user guide to the Wilmar Planning Tool developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (Contract No. ENK5-CT-2002-00663). A User Shell implemented in an Excel workbook controls the Wilmar Planning Tool. All data are contained in Access databases that communicate with various sub-models through text files that are exported from or imported to the databases. In the User Shell various scenario variables and control parameters are set, and export of model data from the input database, activation of the models, as well as import of model results to the output database are triggered from the shell. (au)
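
    A hedged illustration of the file-based coupling described above (a database exporting scenario data to a text file for a sub-model, then importing the results back) is sketched below in Python with SQLite and CSV; the table, column and file names are invented and do not correspond to the actual Wilmar databases.

        import csv
        import sqlite3

        # Hypothetical stand-ins for the input/output databases and exchange files.
        con = sqlite3.connect("wilmar_demo.db")
        con.execute("CREATE TABLE IF NOT EXISTS wind_scenario (hour INTEGER, mw REAL)")
        con.executemany("INSERT INTO wind_scenario VALUES (?, ?)",
                        [(h, 100.0 + 5.0 * h) for h in range(24)])

        # Export scenario data to a text file read by a sub-model.
        with open("wind_scenario.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["hour", "mw"])
            writer.writerows(con.execute("SELECT hour, mw FROM wind_scenario"))

        # ... the sub-model would run here and write its results to dispatch_result.csv ...

        # Import the sub-model results into the output database, if they exist.
        con.execute("CREATE TABLE IF NOT EXISTS dispatch_result (hour INTEGER, cost REAL)")
        try:
            with open("dispatch_result.csv") as f:
                rows = [(int(r["hour"]), float(r["cost"])) for r in csv.DictReader(f)]
            con.executemany("INSERT INTO dispatch_result VALUES (?, ?)", rows)
        except FileNotFoundError:
            print("Sub-model results not available yet.")
        con.commit()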

  6. An integrated Modelling framework to monitor and predict trends of agricultural management (iMSoil)

    Science.gov (United States)

    Keller, Armin; Della Peruta, Raneiro; Schaepman, Michael; Gomez, Marta; Mann, Stefan; Schulin, Rainer

    2014-05-01

    Agricultural systems lie at the interface between natural ecosystems and the anthroposphere. Various drivers induce pressures on agricultural systems, leading to changes in farming practice. The limitation of available land and socio-economic drivers are likely to result in further intensification of agricultural land management, with implications for fertilization practices, soil and pest management, as well as crop and livestock production. In order to steer this development in desired directions, tools are required by which the effects of these pressures on agricultural management, and the resulting impacts on soil functioning, can be detected as early as possible, future scenarios predicted, and suitable management options and policies defined. In this context, the use of integrated models can play a major role in providing long-term predictions of soil quality and assessing the sustainability of agricultural soil management. Significant progress has been made in this field over the last decades. Some of these integrated modelling frameworks include biophysical parameters, but the inherent characteristics and detailed processes of the soil system have often been greatly simplified. The development of such tools has also been hampered in the past by a lack of spatially explicit soil and land management information at the regional scale. The iMSoil project, funded by the Swiss National Science Foundation in the national research programme NRP68 "soil as a resource" (www.nrp68.ch), aims at developing and implementing an integrated modeling framework (IMF) which can overcome the limitations mentioned above by combining socio-economic, agricultural land management, and biophysical models, in order to predict the long-term impacts of different socio-economic scenarios on soil quality. In our presentation we briefly outline the approach, which is based on an interdisciplinary modular framework that builds on already existing monitoring tools and model components that are

  7. A model library for dynamic transport and fate of micropollutants in integrated urban wastewater and stormwater systems

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Benedetti, Lorenzo; Gevaert, Veerle

    2014-01-01

    The increasing efforts in reducing the emission of micropollutants (MP) into the natural aquatic environment require the development of modelling tools to support the decision making process. This article presents a library of dynamic modelling tools for estimating MP fluxes within the Integrated Urban Wastewater and Stormwater system (IUWS – including drainage network, stormwater treatment units, wastewater treatment plants, sludge treatment, and the receiving water body). The models are developed by considering the high temporal variability of the processes taking place in the IUWS, providing a basis... by using substance inherent properties, following an approach commonly used in large-scale MP multimedia fate and transport models. The chosen level of complexity ensures a low data requirement and minimizes the need for field measurements. Next to a synthesis of model applications, a didactic example...

  8. Integrated reservoir characterization: Improvement in heterogeneities stochastic modelling by integration of additional external constraints

    Energy Technology Data Exchange (ETDEWEB)

    Doligez, B.; Eschard, R. [Institut Francais du Petrole, Rueil Malmaison (France); Geffroy, F. [Centre de Geostatistique, Fontainebleau (France)] [and others

    1997-08-01

    The classical approach to constructing reservoir models is to start with a fine-scale geological model which is informed with petrophysical properties. Scaling-up techniques then make it possible to obtain a reservoir model which is compatible with the fluid flow simulators. Geostatistical modelling techniques are widely used to build the geological models before scaling-up. These methods provide equiprobable images of the area under investigation, which honor the well data and whose variability is the same as the variability computed from the data. At an appraisal phase, when few data are available, or when the wells are insufficient to describe all the heterogeneities and the behavior of the field, additional constraints are needed to obtain a more realistic geological model. For example, seismic data or stratigraphic models can provide average reservoir information with excellent areal coverage, but with poor vertical resolution. New advances in modelling techniques now make it possible to integrate this type of additional external information in order to constrain the simulations. In particular, 2D or 3D seismic-derived information grids, or sand-shale ratio maps coming from stratigraphic models, can be used as external drifts to compute the geological image of the reservoir at the fine scale. Examples are presented to illustrate the use of these new tools, their impact on the final reservoir model, and their sensitivity to some key parameters.
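
    A hedged sketch of the "external drift" idea follows: the dense seismic-derived attribute supplies the spatial trend, and only the residuals at the wells are interpolated. This is a simplified stand-in in Python (linear trend plus inverse-distance residual interpolation), not the geostatistical simulation algorithms referred to above, and all well and seismic values are synthetic.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical well observations (x, y, porosity) and a dense seismic attribute grid.
        wells_xy = rng.uniform(0, 10, size=(8, 2))
        seismic_at_wells = 0.5 * wells_xy[:, 0] + rng.normal(0, 0.2, 8)   # attribute sampled at wells
        porosity_at_wells = 0.1 + 0.02 * seismic_at_wells + rng.normal(0, 0.005, 8)

        gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
        seismic_grid = 0.5 * gx + rng.normal(0, 0.2, gx.shape)            # dense attribute map

        # 1) Fit the trend: porosity as a linear function of the external drift (seismic).
        a, b = np.polyfit(seismic_at_wells, porosity_at_wells, 1)
        trend_grid = a * seismic_grid + b

        # 2) Interpolate the well residuals with inverse-distance weighting (kriging stand-in).
        residuals = porosity_at_wells - (a * seismic_at_wells + b)
        dx = gx[..., None] - wells_xy[:, 0]
        dy = gy[..., None] - wells_xy[:, 1]
        w = 1.0 / np.maximum(np.hypot(dx, dy), 1e-6) ** 2
        residual_grid = (w * residuals).sum(axis=-1) / w.sum(axis=-1)

        porosity_grid = trend_grid + residual_grid
        print("estimated porosity range:", porosity_grid.min(), porosity_grid.max())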

  9. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  10. Spatial Modeling Tools for Cell Biology

    National Research Council Canada - National Science Library

    Przekwas, Andrzej; Friend, Tom; Teixeira, Rodrigo; Chen, Z. J; Wilkerson, Patrick

    2006-01-01

    .... Scientific potentials and military relevance of computational biology and bioinformatics have inspired DARPA/IPTO's visionary BioSPICE project to develop computational framework and modeling tools for cell biology...

  11. Potentials for the use of tool-integrated in-line data acquisition systems in press shops

    Science.gov (United States)

    Maier, S.; Schmerbeck, T.; Liebig, A.; Kautz, T.; Volk, W.

    2017-09-01

    Robust in-line data acquisition systems are required for the realization of process monitoring and control systems in press shops. A promising approach is the integration of sensors in the following press tools, where they can be easily integrated and maintained and where the robustness necessary for the rough press environment is achieved. Such concepts have already been investigated for the measurement of geometrical accuracy as well as of the material flow in inner part areas. They enable the quality of each produced part to be monitored. An important success factor is a practical approach to using this new process information in press shops. This work presents various applications of these measuring concepts, based on real car body components of the BMW Group. For example, the procedure of retroactive error analysis is explained for a side frame. It also shows how this data acquisition can be used for the optimization of drawing tools in tool shops. With the skid line, there is a continuous value that can be monitored from planning to serial production.

  12. Integration of ROOT Notebooks as an ATLAS analysis web-based tool in outreach and public data release

    CERN Document Server

    Sanchez, Arturo; The ATLAS collaboration

    2016-01-01

    The integration of the ROOT data analysis framework with the Jupyter Notebook technology presents considerable potential for enhancing and expanding educational and training programs: starting from university students in their early years, moving on to new ATLAS PhD students and postdoctoral researchers, and reaching senior analysers and professors who want to renew their contact with data analysis or to bring a friendly yet very powerful open-source tool into the classroom. Such tools have already been tested in several environments, and a fully web-based integration together with Open Access Data repositories makes it possible to go a step further in ATLAS's efforts to integrate several CERN projects in the field of education and training, developing new computing solutions along the way.
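
    As a hedged sketch of what such a notebook-based exercise might look like, the PyROOT snippet below opens a ROOT file, reads a tree and draws a histogram inline; the file URL and the tree/branch names are placeholders, not a specific ATLAS open-data release.

        # Run inside a Jupyter notebook with ROOT's Python bindings installed.
        import ROOT

        # %jsroot on   # (notebook magic) enables interactive JavaScript plots, if desired

        # Placeholder path: substitute a real open-data ROOT file and tree name.
        f = ROOT.TFile.Open("http://example.org/open-data/sample.root")
        tree = f.Get("mini")

        canvas = ROOT.TCanvas("c", "Lepton multiplicity", 800, 600)
        hist = ROOT.TH1F("h_nlep", "Number of leptons;N_{lep};Events", 6, -0.5, 5.5)

        # Fill the named histogram directly from the tree branch.
        tree.Draw("lep_n >> h_nlep")
        hist.Draw()
        canvas.Draw()   # in a notebook, the canvas renders inline below the cell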

  13. Integrated Debugging of Modelica Models

    Directory of Open Access Journals (Sweden)

    Adrian Pop

    2014-04-01

    Full Text Available The high abstraction level of equation-based object-oriented (EOO languages such as Modelica has the drawback that programming and modeling errors are often hard to find. In this paper we present integrated static and dynamic debugging methods for Modelica models and a debugger prototype that addresses several of those problems. The goal is an integrated debugging framework that combines classical debugging techniques with special techniques for equation-based languages partly based on graph visualization and interaction. To our knowledge, this is the first Modelica debugger that supports both equation-based transformational and algorithmic code debugging in an integrated fashion.

  14. Application of the NCSA Habanero tool for collaboration on structural integrity assessments

    International Nuclear Information System (INIS)

    Bass, B.R.; Kruse, K.; Dodds, R.H. Jr.; Malik, S.N.M.

    1998-11-01

    The Habanero software was developed by the National Center for Supercomputing Applications at the University of Illinois, Urbana-Champaign, as a framework for the collaborative sharing of Java applications. The Habanero tool distributes the interactions of single-user software to a multiuser collaborative environment. An investigation was conducted to evaluate the capabilities of the Habanero tool in providing an Internet-based collaborative framework for researchers located at different sites and operating on different workstations. These collaborative sessions focused on the sharing of test data and analysis results from materials engineering areas (i.e., fracture mechanics and structural integrity evaluations) related to reactor pressure vessel safety research sponsored by the US Nuclear Regulatory Commission. This report defines collaborative-system requirements for engineering applications and provides an overview of collaborative systems within the project. The installation, application, and detailed evaluation of the performance of the Habanero collaborative tool are compared to those of another commercially available collaborative product. Recommendations are given for future work in collaborative communications.

  15. Developing health science students into integrated health professionals: a practical tool for learning

    Directory of Open Access Journals (Sweden)

    Duncan Madeleine

    2007-11-01

    Full Text Available Abstract Background An integrated sense of professionalism enables health professionals to draw on relevant knowledge in context and to apply a set of professional responsibilities and ethical principles in the midst of changing work environments [1,2]. Inculcating professionalism is therefore a critical goal of health professional education. Two multi-professional courses for first year Health Science students at the University of Cape Town, South Africa aim to lay the foundation for becoming an integrated health professional [3]. In these courses a diagram depicting the domains of the integrated health professional is used to focus the content of small group experiential exercises towards an appreciation of professionalism. The diagram serves as an organising framework for conceptualising an emerging professional identity and for directing learning towards the domains of 'self as professional' [4,5]. Objective This paper describes how a diagrammatic representation of the core elements of an integrated health professional is used as a template for framing course content and for organising student learning. Based on the assumption that all health care professionals should be knowledgeable, empathic and reflective, the diagram provides students and educators with a visual tool for investigating the subjective and objective dimensions of professionalism. The use of the diagram as an integrating point of reference for individual and small group learning is described and substantiated with relevant literature. Conclusion The authors have applied the diagram with positive impact for the past six years, with students and educators reporting that "it just makes sense". The article includes plans for formal evaluation. Evaluation to date is based on preliminary, informal feedback on the value of the diagram as a tool for capturing the domains of professionalism at an early stage in the undergraduate education of health professional students.

  16. Challenges in integrative approaches to modelling the marine ecosystems of the North Atlantic: Physics to fish and coasts to ocean

    Science.gov (United States)

    Holt, Jason; Icarus Allen, J.; Anderson, Thomas R.; Brewin, Robert; Butenschön, Momme; Harle, James; Huse, Geir; Lehodey, Patrick; Lindemann, Christian; Memery, Laurent; Salihoglu, Baris; Senina, Inna; Yool, Andrew

    2014-12-01

    It has long been recognised that there are strong interactions and feedbacks between climate, upper ocean biogeochemistry and marine food webs, and also that food web structure and phytoplankton community distribution are important determinants of variability in carbon production and export from the euphotic zone. Numerical models provide a vital tool to explore these interactions, given their capability to investigate multiple connected components of the system and the sensitivity to multiple drivers, including potential future conditions. A major driver for ecosystem model development is the demand for quantitative tools to support ecosystem-based management initiatives. The purpose of this paper is to review approaches to the modelling of marine ecosystems with a focus on the North Atlantic Ocean and its adjacent shelf seas, and to highlight the challenges they face and suggest ways forward. We consider the state of the art in simulating oceans and shelf sea physics, planktonic and higher trophic level ecosystems, and look towards building an integrative approach with these existing tools. We note how the different approaches have evolved historically and that many of the previous obstacles to harmonisation may no longer be present. We illustrate this with examples from the on-going and planned modelling effort in the Integrative Modelling Work Package of the EURO-BASIN programme.

  17. Enhanced model for integrated simulation of an entrained bed gasifier implemented as Aspen Hysys extension

    Energy Technology Data Exchange (ETDEWEB)

    Perez-Fortes, M; Bojarski, A; Ferrer-Nadal, S; Kopanos, G; Mitta, N; Pinilla, C A; Nougues, J M; Velo, E; Puigjaner, L [Universitat Politecnica de Catalunya, Barcelona (Spain). Dept. of Chemical Engineering-CEPIMA

    2007-07-01

    In this work an enhanced mathematical model of an entrained bed gasifier has been developed for improved synthesis gas production. The gasification model considers five stages: pyrolysis, volatiles combustion, char combustion, gasification and a final gas equilibrium zone. Mathematical simulations are carried out to help identify feasible operating conditions of the process and to achieve improved process performance. Visual Basic (VB) is tested as a modelling tool by using the Aspen Hysys Extension (AHE) interface standards. This standard provides a suitable environment for the purpose, since it allows the creation of completely custom modules which are easy to plug in and use, thus facilitating the handling of complex models ready to interact with commercial simulation platforms. In this work, the integration of the different models is accomplished in Aspen Hysys (AH), which provides the basic connectivity between model components and the thermodynamic framework needed. The integrated modules simulation environment platform uses data from ELCOGAS for validation purposes, with excellent preliminary results. 9 refs., 2 figs.

  18. PumpKin: A tool to find principal pathways in plasma chemical models

    Science.gov (United States)

    Markosyan, A. H.; Luque, A.; Gordillo-Vázquez, F. J.; Ebert, U.

    2014-10-01

    PumpKin is a software package to find all principal pathways, i.e. the dominant reaction sequences, in chemical reaction systems. Although many tools are available to numerically integrate arbitrarily complex chemical reaction systems, few tools exist to analyze the results and interpret them in relatively simple terms. In particular, due to the large disparity in the lifetimes of the interacting components, it is often useful to group reactions into pathways that recycle the fastest species. This allows a researcher to focus on the slow chemical dynamics, eliminating the shortest timescales. Based on the algorithm described by Lehmann (2004), PumpKin automates the process of finding such pathways, allowing the user to analyze complex kinetics and to understand the consumption and production of a certain species of interest. We designed PumpKin with an emphasis on plasma chemical systems, but it can also be applied to atmospheric modeling and to industrial applications such as plasma medicine and plasma-assisted combustion.
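
    The sketch below illustrates, in Python, the first half of what the abstract describes: numerically integrating a small reaction system and comparing time-integrated reaction rates, which is the raw material a pathway-analysis tool such as PumpKin groups into pathways. The three-reaction mechanism and its rate constants are invented for illustration.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Toy mechanism (hypothetical rate constants, arbitrary units):
        #   R1: A -> B   k1 = 1e3   (fast production of the short-lived species B)
        #   R2: B -> A   k2 = 9e2   (fast recycling back to A)
        #   R3: B -> C   k3 = 1.0   (slow net conversion to the product C)
        k1, k2, k3 = 1e3, 9e2, 1.0

        def rhs(t, y):
            A, B, C = y
            r1, r2, r3 = k1 * A, k2 * B, k3 * B
            return [-r1 + r2, r1 - r2 - r3, r3]

        sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0, 0.0], method="LSODA", dense_output=True)

        # Time-integrated rates show that R1/R2 mostly recycle B, while the slow
        # dynamics of interest is the net pathway A -> B -> C carried by R3.
        t = np.linspace(0, 5, 2001)
        A, B, C = sol.sol(t)
        for name, rate in (("R1", k1 * A), ("R2", k2 * B), ("R3", k3 * B)):
            print(name, "integrated rate =", np.trapz(rate, t))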

  19. Collaborative Digital Games as Mediation Tool to Foster Intercultural Integration in Primary Dutch Schools

    NARCIS (Netherlands)

    A. Paz Alencar (Amanda); T. de la Hera Conde-Pumpido (Teresa)

    2015-01-01

    In the Netherlands, the growing presence of immigrant children in schools has fueled scholarly interest in and concern for examining the process of integration in school environments. The use of digital games has been found to be an effective tool to reinforce teaching/learning practices.

  20. Visible Earthquakes: a web-based tool for visualizing and modeling InSAR earthquake data

    Science.gov (United States)

    Funning, G. J.; Cockett, R.

    2012-12-01

    InSAR (Interferometric Synthetic Aperture Radar) is a technique for measuring the deformation of the ground using satellite radar data. One of the principal applications of this method is in the study of earthquakes; in the past 20 years over 70 earthquakes have been studied in this way, and forthcoming satellite missions promise to enable the routine and timely study of events in the future. Despite the utility of the technique and its widespread adoption by the research community, InSAR does not feature in the teaching curricula of most university geoscience departments. This is, we believe, due to a lack of accessibility to software and data. Existing tools for the visualization and modeling of interferograms are often research-oriented, command line-based and/or prohibitively expensive. Here we present a new web-based interactive tool for comparing real InSAR data with simple elastic models. The overall design of this tool was focused on ease of access and use. This tool should allow interested nonspecialists to gain a feel for the use of such data and greatly facilitate integration of InSAR into upper division geoscience courses, giving students practice in comparing actual data to modeled results. The tool, provisionally named 'Visible Earthquakes', uses web-based technologies to instantly render the displacement field that would be observable using InSAR for a given fault location, geometry, orientation, and slip. The user can adjust these 'source parameters' using a simple, clickable interface, and see how these affect the resulting model interferogram. By visually matching the model interferogram to a real earthquake interferogram (processed separately and included in the web tool) a user can produce their own estimates of the earthquake's source parameters. Once satisfied with the fit of their models, users can submit their results and see how they compare with the distribution of all other contributed earthquake models, as well as the mean and median
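
    As a hedged illustration of the forward-modelling step such a tool performs, the Python sketch below computes surface displacements for a simple point (Mogi) source and projects them onto a satellite line of sight to produce wrapped fringes. The real tool models a rectangular fault dislocation; a Mogi source is used here only because it fits in a few lines, and all parameter values are invented.

        import numpy as np

        # Grid of ground points (km) around a hypothetical source.
        x, y = np.meshgrid(np.linspace(-20, 20, 400), np.linspace(-20, 20, 400))

        # Mogi point-source parameters (illustrative values).
        depth = 5.0          # km
        dV = 1e-3            # volume change, km^3
        nu = 0.25            # Poisson's ratio

        r = np.hypot(x, y)
        R3 = (r**2 + depth**2) ** 1.5
        uz = (1 - nu) * dV * depth / (np.pi * R3)    # vertical displacement (km)
        ur = (1 - nu) * dV * r / (np.pi * R3)        # radial displacement (km)
        ux = ur * np.divide(x, r, out=np.zeros_like(x), where=r > 0)
        uy = ur * np.divide(y, r, out=np.zeros_like(y), where=r > 0)

        # Project onto a satellite line of sight (unit vector from ground towards satellite).
        look = np.array([0.38, -0.08, 0.92])
        los = (ux * look[0] + uy * look[1] + uz * look[2]) * 1e6   # mm

        # Wrap into interferometric fringes (one fringe per half wavelength, C-band ~28 mm).
        half_wavelength_mm = 28.0
        fringes = np.mod(los, half_wavelength_mm)
        print("max LOS displacement (mm):", los.max())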

  1. Grid Integration of Wind Farms

    Science.gov (United States)

    Giæver Tande, John Olav

    2003-07-01

    This article gives an overview of grid integration of wind farms with respect to impact on voltage quality and power system stability. The recommended procedure for assessing the impact of wind turbines on voltage quality in distribution grids is presented. The procedure uses the power quality characteristic data of wind turbines to determine the impact on slow voltage variations, flicker, voltage dips and harmonics. The detailed assessment allows for substantially more wind power in distribution grids compared with previously used rule-of-thumb guidelines. Power system stability is a concern in conjunction with large wind farms or very weak grids. Assessment requires the use of power system simulation tools, and wind farm models for inclusion in such tools are presently being developed. A fixed-speed wind turbine model is described. The model may be considered a good starting point for development of more advanced models, hereunder the concept of variable-speed wind turbines with a doubly fed induction generator is briefly explained. The use of dynamic wind farm models as part of power system simulation tools allows for detailed studies and development of innovative grid integration techniques. It is demonstrated that the use of reactive compensation may relax the short-term voltage stability limit and allow integration of significantly more wind power, and that application of automatic generation control technology may be an efficient means to circumvent thermal transmission capacity constraints. The continuous development of analysis tools and technology for cost-effective and secure grid integration is an important aid to ensure the increasing use of wind energy. A key factor for success, however, is the communication of results and gained experience, and in this regard it is hoped that this article may contribute.
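
    A hedged sketch of the kind of voltage-quality screening the recommended procedure involves is given below; it applies IEC 61400-21-style ratios of turbine rated power to grid short-circuit power for relative voltage change and flicker, with all characteristic coefficients and grid data invented for illustration.

        # Illustrative screening of wind-turbine impact on voltage quality at the
        # point of common coupling (PCC). Coefficients k_u and c are of the kind
        # reported on wind-turbine power-quality data sheets; the numbers here are
        # placeholders, not measured data.

        S_n = 2.0e6        # turbine rated apparent power [VA]
        S_k = 60.0e6       # grid short-circuit power at the PCC [VA]
        n_turbines = 10

        k_u = 1.1          # voltage-change factor at the relevant grid impedance angle
        c_flicker = 6.0    # flicker coefficient for continuous operation

        # Relative voltage change caused by a single switching operation (percent).
        d_switch = 100.0 * k_u * S_n / S_k

        # Short-term flicker from continuous operation; contributions from several
        # turbines are commonly combined quadratically.
        P_st_single = c_flicker * S_n / S_k
        P_st_farm = (n_turbines * P_st_single**2) ** 0.5

        print(f"voltage change per switching operation: {d_switch:.2f} %")
        print(f"flicker P_st, whole farm: {P_st_farm:.3f}")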

  2. 33 CFR 385.33 - Revisions to models and analytical tools.

    Science.gov (United States)

    2010-07-01

    ... on a case-by-case basis what documentation is appropriate for revisions to models and analytic tools... analytical tools. 385.33 Section 385.33 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying...

  3. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Directory of Open Access Journals (Sweden)

    Matthew B Biggs

    Full Text Available Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  4. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Science.gov (United States)

    Biggs, Matthew B; Papin, Jason A

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.
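
    A hedged, minimal illustration of the hybrid idea (agent-based rules coupled to a constraint-based growth calculation) is sketched below in Python, with SciPy's linear-programming solver standing in for a genome-scale FBA model; the two-reaction "metabolism" and the update rules are invented and are not the MatNet tool or the published P. aeruginosa model.

        import numpy as np
        from scipy.optimize import linprog

        def growth_rate(oxygen, nitrate):
            """Tiny stand-in for flux balance analysis: maximize growth subject to
            uptake bounds on two electron acceptors (invented stoichiometry)."""
            # Variables: [aerobic_flux, anaerobic_flux]; growth = 1.0*aerobic + 0.6*anaerobic.
            res = linprog(c=[-1.0, -0.6],                      # maximize => minimize negative
                          bounds=[(0, oxygen), (0, nitrate)],  # uptake-limited fluxes
                          method="highs")
            return -res.fun

        # Agent-based layer: cells along a 1-D depth axis into the biofilm; oxygen
        # decays with depth, nitrate is either absent or supplied in the medium.
        depths = np.arange(10)
        oxygen = np.exp(-0.5 * depths)          # oxygen limitation deeper in the biofilm
        for nitrate_supply in (0.0, 0.5):
            rates = [growth_rate(o2, nitrate_supply) for o2 in oxygen]
            print(f"nitrate={nitrate_supply}: mean growth rate {np.mean(rates):.3f}")
        # Adding nitrate raises growth in the oxygen-limited interior, qualitatively
        # mirroring the behaviour reported for the full hybrid model.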

  5. Experimental investigation into effect of cutting parameters on surface integrity of hardened tool steel

    Science.gov (United States)

    Bashir, K.; Alkali, A. U.; Elmunafi, M. H. S.; Yusof, N. M.

    2018-04-01

    The recent trend towards turning hardened materials has gained popularity because of its immense machinability benefits. However, several machining processes, such as thermally assisted machining and cryogenic machining, have revealed superior machinability benefits over conventional dry turning of hardened materials. Various engineering materials have been studied; however, investigations on AISI O1 tool steel have not been widely reported. In this paper, the surface finish and surface integrity obtained when hard turning AISI O1 tool steel are analysed. The study is focused on the performance of a wiper coated ceramic tool with respect to surface roughness and surface integrity of the hardened tool steel. The hardened tool steel was machined at varying cutting speeds of 100, 155 and 210 m/min and feed rates of 0.05, 0.125 and 0.20 mm/rev. The depth of cut of 0.2 mm was kept constant throughout the machining trials. Machining was conducted by dry turning on a 200E-axis CNC lathe. The experimental study revealed that the surface finish is relatively superior at the higher cutting speed of 210 m/min. The surface finish improves as cutting speed increases, and is generally better at the lower feed rate of 0.05 mm/rev. The study also revealed that phenomena such as workpiece vibration, due to poor or improper mounting on the spindle, contributed to the higher surface roughness value of 0.66 Ra during turning at 0.2 mm/rev. Traces of white layer were observed under the optical microscope, which shows evidence of cutting effects on the turned work material at a feed rate of 0.2 mm/rev.

  6. Integrating data from the Investigational Medicinal Product Dossier/investigator's brochure. A new tool for translational integration of preclinical effects.

    Science.gov (United States)

    van Gerven, Joop; Cohen, Adam

    2018-01-30

    The first administration of a new compound in humans is an important milestone. A major source of information for the researcher is the investigator's brochure (IB). Such a document has a size of several hundred pages. The IB should enable investigators or regulators to independently assess the risk-benefit of the proposed trial, but the size and complexity make this difficult. This article offers a practical tool for the integration and subsequent communication of the complex information from the IB or other relevant data sources. This paper is accompanied by an accessible software tool to construct a single-page colour-coded overview of preclinical and clinical data. © 2018 The British Pharmacological Society.

  7. Embedding Java Types in CPN Tools

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; Westergaard, Michael

    CPN Tools is a well known editor for Colored Petri nets (CPNs) that is capable of doing state space and performance analysis. The BRITNeY Suite has added yet another feature to CPN Tools for integrating CPN models with Java programs, by providing stubs accessible from the models, to allow the modeller to call methods on Java objects. This paper is about how the stub code is generated, i.e., representing Java classes to Standard ML to be able to call Java code in the CPN models, and how the BRITNeY Suite framework handles the invocations of the stub code. The contribution of this paper is give...

  8. A framework for different levels of integration of computational models into web-based virtual patients.

    Science.gov (United States)

    Kononowicz, Andrzej A; Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-23

    Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients' interactivity by enriching them with computational models of physiological and pathological processes. The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the model. The third element is the

  9. Vertical integration increases opportunities for patient flow.

    Science.gov (United States)

    Radoccia, R A; Benvenuto, J A; Blancett, L

    1991-08-01

    New sources of patients will become more and more important in the next decade as hospitals continue to feel the squeeze of a competitive marketplace. Vertical integration, a distribution tool used in other industries, will be a significant tool for health care administrators. In the following article, the authors explain the vertical integration model that shows promise for other institutions.

  10. Predictions of titanium alloy properties using thermodynamic modeling tools

    Science.gov (United States)

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.

    2005-12-01

    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.

  11. Automated prototyping tool-kit (APT)

    OpenAIRE

    Nada, Nader; Shing, M.; Berzins, V.; Luqi

    2002-01-01

    Automated prototyping tool-kit (APT) is an integrated set of software tools that generate source programs directly from real-time requirements. The APT system uses a fifth-generation prototyping language to model the communication structure, timing constraints, I/O control, and data buffering that comprise the requirements for an embedded software system. The language supports the specification of hard real-time systems with reusable components from domain specific component libraries. APT ha...

  12. Integrating the hospital library with patient care, teaching and research: model and Web 2.0 tools to create a social and collaborative community of clinical research in a hospital setting.

    Science.gov (United States)

    Montano, Blanca San José; Garcia Carretero, Rafael; Varela Entrecanales, Manuel; Pozuelo, Paz Martin

    2010-09-01

    Research in hospital settings faces several difficulties. Information technologies and certain Web 2.0 tools may provide new models to tackle these problems, allowing for a collaborative approach and bridging the gap between clinical practice, teaching and research. We aim to gather a community of researchers involved in the development of a network of learning and investigation resources in a hospital setting. A multi-disciplinary work group analysed the needs of the research community. We studied the opportunities provided by Web 2.0 tools and finally we defined the spaces that would be developed, describing their elements, members and different access levels. WIKINVESTIGACION is a collaborative web space with the aim of integrating the management of all the hospital's teaching and research resources. It is composed of five spaces, with different access privileges. The spaces are: Research Group Space 'wiki for each individual research group', Learning Resources Centre devoted to the Library, News Space, Forum and Repositories. The Internet, and most notably the Web 2.0 movement, is introducing some overwhelming changes in our society. Research and teaching in the hospital setting will join this current and take advantage of these tools to socialise and improve knowledge management.

  13. EPR design tools. Integrated data processing tools

    International Nuclear Information System (INIS)

    Kern, R.

    1997-01-01

    In all technical areas, planning and design have been supported by electronic data processing for many years. New data processing tools had to be developed for the European Pressurized Water Reactor (EPR). The work to be performed was split between KWU and Framatome and laid down in the Basic Design contract. The entire plant was reduced to a logical data structure; the circuit diagrams and flowsheets of the systems were drafted, the central data pool was established, the outlines of building structures were defined, the layout of plant components was planned, and the electrical systems were documented. Building construction engineering was also supported by data processing. The tasks laid down in the Basic Design were completed as so-called milestones. Additional data processing tools, also based on the central data pool, are required for the phases following the Basic Design phase, i.e. Basic Design Optimization; Detailed Design; Management; Construction; and Commissioning. (orig.) [de]

  14. Designing decision support tools for targeted N-regulation

    DEFF Research Database (Denmark)

    Christensen, Andreas Aagaard; Piil, Kristoffer; Andersen, Peter Stubkjær

    2017-01-01

    ... in Denmark to develop and improve a functioning decision support tool for landscape scale N-management. The aim of the study is to evaluate how a decision support tool can best be designed in order to enable landscape scale strategic N-management practices. Methods: A prototype GIS-tool for capturing, storing, editing, displaying and modelling landscape scale farming practices and associated emission consequences was developed. The tool was designed to integrate locally held knowledge with national scale datasets in live scenario situations through the implementation of a flexible, uniform and editable data model for land use data – the dNmark landscape model. Based on input data which is corrected and edited by workshop participants, the tool estimates the effect of potential land use scenarios on nutrient emissions. The tool was tested in 5 scenario workshops in case areas in Denmark in 2016...
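
    As a hedged sketch of the scenario arithmetic such a tool could perform (the actual dNmark model and its coefficients are not given in the abstract), the Python snippet below aggregates field-level nitrogen losses for two land-use scenarios using invented per-crop leaching coefficients.

        # Hypothetical per-crop N-leaching coefficients (kg N per ha per year) and a
        # small set of fields; none of these values come from the dNmark model.
        leaching_kg_per_ha = {"winter_wheat": 55, "spring_barley": 60,
                              "grass": 25, "set_aside": 12}

        fields = [  # (field id, area in ha)
            ("F1", 12.5), ("F2", 8.0), ("F3", 20.0), ("F4", 5.5),
        ]

        scenarios = {
            "baseline":    {"F1": "winter_wheat", "F2": "spring_barley",
                            "F3": "winter_wheat", "F4": "grass"},
            "extensified": {"F1": "grass", "F2": "spring_barley",
                            "F3": "winter_wheat", "F4": "set_aside"},
        }

        def total_leaching(assignment):
            """Sum area-weighted leaching over all fields for one scenario."""
            return sum(area * leaching_kg_per_ha[assignment[fid]] for fid, area in fields)

        for name, assignment in scenarios.items():
            print(f"{name}: {total_leaching(assignment):,.0f} kg N/year")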

  15. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VV&C)

    Science.gov (United States)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation, as well as lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (VV) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VVC updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. Reporting tools have evolved over the lifetime of

  16. An Assessment Tool to Integrate Sustainability Principles into the Global Supply Chain

    Directory of Open Access Journals (Sweden)

    María Jesús Muñoz-Torres

    2018-02-01

    Full Text Available The integration of sustainability principles into the assessment of companies along the supply chains is a growing research area. However, there is an absence of a generally accepted method to evaluate corporate sustainability performance (CSP, and the models and frameworks proposed by the literature present various important challenges to be addressed. A systematic literature review on the supply chain at the corporate level has been conducted, analyzing the main strengths and gaps in the sustainability assessment literature. Therefore, this paper aims to contribute to the development of this field by proposing an assessment framework a leading company can adopt to expand sustainability principles to the rest of the members of the supply chain. This proposal is based on best practices and integrates and shares efforts with key initiatives (for instance, the Organizational Environmental Footprint from the European Commission and United Nations Environment Programme and the Society of Environmental Toxicology and Chemistry UNEP/SETAC; moreover, it overcomes important limitations of the current sustainability tools in a supply chain context consistent with the circular economy, the Sustainable Development Goals (SDGs, planetary boundaries, and social foundation requirements. The results obtained create, on the one hand, new opportunities for academics; and, on the other hand, in further research, the use of this framework could be a means of actively engaging companies in their supply chains and of achieving the implementation of practical and comprehensive CSP assessment.

  17. LEARNING TOOLS INTEROPERABILITY – A NEW STANDARD FOR INTEGRATION OF DISTANCE LEARNING PLATFORMS

    Directory of Open Access Journals (Sweden)

    Oleksandr A. Shcherbyna

    2015-06-01

    Full Text Available In information technology for education there is a recurring issue of reusing electronic educational resources and of transferring them from one virtual learning environment to another. Previously, standardized sets of files, for example SCORM packages, were used to serve this purpose. This article reviews the new Learning Tools Interoperability (LTI) standard, which allows users of one environment to access resources from another environment. This makes it possible to integrate them into a single distributed learning environment that is created and shared. The article gives examples of the practical use of the LTI standard in the Moodle learning management system using the External tool and LTI provider plugins.

  18. An Integrated Modeling and Data Management Strategy for Cellulosic Biomass Production Decisions

    Energy Technology Data Exchange (ETDEWEB)

    David J. Muth Jr.; K. Mark Bryden; Joshua B. Koch

    2012-07-01

    Emerging cellulosic bioenergy markets can provide land managers with additional options for crop production decisions. Integrating dedicated bioenergy crops such as perennial grasses and short rotation woody species within the agricultural landscape can have positive impacts on several environmental processes, including increased soil organic matter in degraded soils, reduced sediment loading in watersheds, lower greenhouse gas (GHG) fluxes, and reduced nutrient loading in watersheds. Implementing this type of diverse bioenergy production system in a way that maximizes potential environmental benefits requires a dynamic integrated modeling and data management strategy. This paper presents a strategy for designing diverse bioenergy cropping systems within the existing row crop production landscape in the midwestern United States. The integrated model developed quantifies a wide range of environmental processes, including soil erosion from wind and water, soil organic matter changes, and soil GHG fluxes, within a geospatial data management framework. This framework assembles and formats information from multiple spatial and temporal scales. The data assembled include yield and productivity data from harvesting equipment at the 1 m scale, surface topography data from LiDAR mapping at the less than 1 m scale, soil data from US soil survey databases at the 10 m to 100 m scale, and climate data at the county scale. These models and data tools are assembled into an integrated computational environment that is used to determine sustainable removal rates for agricultural residues for bioenergy production at the sub-field scale under a wide range of land management practices. Using this integrated model, innovative management practices including cover cropping are then introduced and evaluated for their impact on bioenergy production and important environmental processes. The impacts of introducing dedicated energy crops onto high-risk landscape positions currently being managed in

  19. Process integrated modelling for steelmaking Life Cycle Inventory analysis

    International Nuclear Information System (INIS)

    Iosif, Ana-Maria; Hanrot, Francois; Ablitzer, Denis

    2008-01-01

    During recent years, strict environmental regulations have been implemented by governments for the steelmaking industry in order to reduce their environmental impact. In the frame of the ULCOS project, we have developed a new methodological framework which combines the process integrated modelling approach with Life Cycle Assessment (LCA) method in order to carry out the Life Cycle Inventory of steelmaking. In the current paper, this new concept has been applied to the sinter plant which is the most polluting steelmaking process. It has been shown that this approach is a powerful tool to make the collection of data easier, to save time and to provide reliable information concerning the environmental diagnostic of the steelmaking processes

  20. CATEGORIES AND TOOLS FOR MANAGING THE INTEGRATED PROJECTS AND PROGRAMS OF INNOVATIVE DEVELOPMENT OF GEOTRIONS IN RUSSIA

    Directory of Open Access Journals (Sweden)

    Chudin Anatoly Andreyevich

    2013-05-01

    Full Text Available "An integrated approach to managing the development of the innovative three-component systems, including "population” (social sphere, economy (industrial sphere, and territory (regional sphere, has been developed." These spheres of innovative development have the earth coordinates, are linked non-continuously but inseparably, so they can be analyzed only comprehensively. Following N.D. Matrusov [1] we will call them geotrions. The most important problems in geotrion management are considered in many researches, including [2, 3, 4]. The developed approach serves to overcome many obstacles to the effective development of the innovation process in Russia. The work identifies its basic categories, tools and technologies. The developed approach has revealed the minimum full table of the most significant parameters in the management of the innovation process in the geotrion (parameters of the order and integral parameters of the interaction between the processes, which provided a simple model and the effectiveness of management, through the use of its own power.