WorldWideScience

Sample records for integrated modeling tool

  1. Integrating a Decision Management Tool with UML Modeling Tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    by proposing potential subsequent design issues. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, these decisions are typically not connected to the models created during...... integration of formerly disconnected tools improves tool usability as well as decision maker productivity....

  2. Fish habitat simulation models and integrated assessment tools

    International Nuclear Information System (INIS)

    Harby, A.; Alfredsen, K.

    1999-01-01

    With increasing human development, water use is growing in importance, and this worldwide trend is leading to an increasing number of user conflicts and a strong need for assessment tools to measure the impacts on both the ecosystem and the different users and user groups. The quantitative tools must allow a comparison of alternatives, different user groups, etc., and the tools must be integrated because impact assessments involve different disciplines. Fish species, especially juveniles, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured; impacts on user groups are not included. The focus is on fish habitat simulation models, with methods and examples considered from Norway. Some ideas on integrated modelling tools for impact assessment studies are included. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge in hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of two- and three-dimensional models include: they need no calibration, only validation; they are predictive; and they can be more cost effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored to a multi-disciplinary study. Model choice should be based on available data and possible data acquisition; available manpower, computer, and software resources; and the required output and accuracy. 58 refs

  3. Tools and Models for Integrating Multiple Cellular Networks

    Energy Technology Data Exchange (ETDEWEB)

    Gerstein, Mark [Yale Univ., New Haven, CT (United States). Gerstein Lab.

    2015-11-06

    In this grant, we have systematically investigated the integrated networks which are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of the integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open source, available for download from GitHub, and can be incorporated in the Knowledgebase. Here, we summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we have developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network. The algorithm is generally applicable to regulatory networks in prokaryotes, yeast, and higher organisms. Integrated datasets are extremely beneficial in understanding the biology of a system in a compact manner because they combine multiple layers of information. Therefore, for Aim 2 of this grant, we have developed several tools and carried out analyses for integrating system-wide genomic information. To make use of the structural data, we have developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we have organized the E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies. We then correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed...
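
    The hierarchy-finding step described for Aim 1 can be illustrated with a small sketch. This is not the published algorithm from [1]; it is a minimal stand-in that collapses strongly connected components with networkx and then layers the resulting DAG by longest path from the bottom, so that master regulators receive the highest level. The toy gene names are illustrative.

```python
import networkx as nx

def regulatory_hierarchy_levels(edges):
    """Assign a hierarchy level to each gene in a directed regulatory network.

    Minimal illustration: collapse strongly connected components, then layer
    the condensation DAG so genes far upstream of all targets sit at the top.
    """
    g = nx.DiGraph(edges)
    cond = nx.condensation(g)        # DAG whose nodes are strongly connected components
    depth = {}
    for scc in reversed(list(nx.topological_sort(cond))):
        succ = list(cond.successors(scc))
        depth[scc] = 0 if not succ else 1 + max(depth[s] for s in succ)
    return {gene: depth[scc] for scc in cond.nodes
            for gene in cond.nodes[scc]["members"]}

# toy example: a master regulator above two middle regulators above their targets
edges = [("crp", "araC"), ("crp", "malT"), ("araC", "araB"), ("malT", "malE")]
print(regulatory_hierarchy_levels(edges))   # crp ends up with the highest level
```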

  4. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  5. Integrating decision management with UML modeling concepts and tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    2009-01-01

    , but also for guiding the user by proposing subsequent decisions. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, the decisions are typically not connected to these models...... of formerly disconnected tools could improve tool usability as well as decision maker productivity....

  6. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large-scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. Therefore the requirements for software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates and thus simplifies the use of third-party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change to the code can produce collateral effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...

  7. Extending the Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2016-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power of the model for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be…

  8. Modelling Machine Tools using Structure Integrated Sensors for Fast Calibration

    Directory of Open Access Journals (Sweden)

    Benjamin Montavon

    2018-02-01

    Full Text Available Monitoring of the relative deviation between commanded and actual tool tip position, which limits the volumetric performance of the machine tool, enables the use of contemporary methods of compensation to reduce tolerance mismatch and the uncertainties of on-machine measurements. The development of a primarily optical sensor setup capable of being integrated into the machine structure without limiting its operating range is presented. The use of a frequency-modulating interferometer and photosensitive arrays in combination with a Gaussian laser beam allows for fast and automated online measurements of the axes’ motion errors and thermal conditions with comparable accuracy, lower cost, and smaller dimensions as compared to state-of-the-art optical measuring instruments for offline machine tool calibration. The development is tested through simulation of the sensor setup based on raytracing and Monte-Carlo techniques.

  9. The Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2015-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be targeted for…

  10. Gsflow-py: An integrated hydrologic model development tool

    Science.gov (United States)

    Gardner, M.; Niswonger, R. G.; Morton, C.; Henson, W.; Huntington, J. L.

    2017-12-01

    Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHM) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorologic parameters over the model domain is difficult in complex terrain at the model resolution scale, but is necessary to drive realistic simulations. Historically, development of input data for IHM models has required extensive GIS and computer programming expertise which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverages, and meteorological distribution over the model domain.
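
    As one concrete illustration of the preprocessing such a toolkit automates, the sketch below distributes station air temperatures onto a regular model grid using inverse-distance weighting plus an elevation lapse-rate correction. The function name, data layout, and lapse-rate value are illustrative assumptions for this sketch, not part of Gsflow-py.

```python
import numpy as np

def distribute_temperature(stations, grid_xy, grid_elev, lapse_rate=-0.0065):
    """Inverse-distance interpolation of station temperature onto model cells,
    with a simple elevation correction (lapse_rate in degC per metre)."""
    sx = np.array([s["x"] for s in stations])
    sy = np.array([s["y"] for s in stations])
    se = np.array([s["elev"] for s in stations])
    st = np.array([s["temp"] for s in stations])

    # reduce every station to a common (sea-level) datum before interpolating
    t_datum = st - lapse_rate * se
    dist = np.hypot(grid_xy[:, 0][:, None] - sx[None, :],
                    grid_xy[:, 1][:, None] - sy[None, :])
    w = 1.0 / np.maximum(dist, 1.0) ** 2            # inverse-distance-squared weights
    t_grid_datum = (w * t_datum).sum(axis=1) / w.sum(axis=1)

    # move back up (or down) to each cell's actual elevation
    return t_grid_datum + lapse_rate * grid_elev

# toy usage: two stations, three model cells along a transect
stations = [{"x": 0.0, "y": 0.0, "elev": 200.0, "temp": 15.0},
            {"x": 1000.0, "y": 0.0, "elev": 800.0, "temp": 11.0}]
grid_xy = np.array([[250.0, 0.0], [500.0, 0.0], [750.0, 0.0]])
grid_elev = np.array([300.0, 500.0, 700.0])
print(np.round(distribute_temperature(stations, grid_xy, grid_elev), 2))
```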

  11. An Integrated Simulation Tool for Modeling the Human Circulatory System

    Science.gov (United States)

    Asami, Ken'ichi; Kitamura, Tadashi

    This paper presents an integrated simulation of the circulatory system in physiological movement. The large circulatory system model includes principal organs and functional units in modules in which comprehensive physiological changes such as nerve reflexes, temperature regulation, acid/base balance, O2/CO2 balance, and exercise are simulated. A beat-by-beat heart model, in which the corresponding electrical circuit problems are solved by a numerical analytic method, enables calculation of pulsatile blood flow to the major organs. The integration of different perspectives on physiological changes makes this simulation model applicable for the microscopic evaluation of blood flow under various conditions in the human body.
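
    The electrical-circuit analogy underlying such a beat-by-beat heart model can be shown with a much smaller lumped-parameter example. The two-element Windkessel below is only a minimal sketch of that modelling style (aortic pressure driven by pulsatile inflow against arterial compliance and peripheral resistance); the parameter values are illustrative and are not taken from the paper.

```python
import numpy as np

def windkessel_pressure(t_end=5.0, dt=1e-3, R=1.0, C=1.3, heart_rate=75):
    """Two-element Windkessel: C * dP/dt = Q_in(t) - P / R   (mmHg, mL, s)."""
    period = 60.0 / heart_rate
    systole = 0.3 * period

    def q_in(t):
        # half-sine ejection during systole, zero inflow during diastole
        phase = t % period
        return 400.0 * np.sin(np.pi * phase / systole) if phase < systole else 0.0

    n = int(t_end / dt)
    p = np.empty(n)
    p[0] = 80.0                                   # initial aortic pressure (mmHg)
    for i in range(1, n):
        dpdt = (q_in(i * dt) - p[i - 1] / R) / C  # explicit Euler step
        p[i] = p[i - 1] + dt * dpdt
    return np.arange(n) * dt, p

t, p = windkessel_pressure()
late = p[len(p) // 2:]                            # discard the initial transient
print(f"systolic ~{late.max():.0f} mmHg, diastolic ~{late.min():.0f} mmHg")
```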

  12. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    The amount of data available on the Internet is continuously increasing; consequently, there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used....... In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling......, an important part of this technique is the attaching of OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration...

  13. Integration of Advanced Statistical Analysis Tools and Geophysical Modeling

    Science.gov (United States)

    2012-08-01

    Carin (Duke University); Douglas Oldenburg (University of British Columbia); Stephen Billings, Leonard Pasion, Laurens Beran (Sky Research). ...data processing for UXO discrimination is the time (or frequency) dependent dipole model (Bell and Barrow (2001), Pasion and Oldenburg (2001), Zhang...). ...described by a bimodal distribution (i.e. two Gaussians, see Pasion (2007)). Data features are nonetheless useful when data quality is not sufficient...

  14. A Prospective Validation Study of a Rainbow Model of Integrated Care Measurement Tool in Singapore.

    Science.gov (United States)

    Nurjono, Milawaty; Valentijn, Pim P; Bautista, Mary Ann C; Wei, Lim Yee; Vrijhoef, Hubertus Johannes Maria

    2016-04-08

    The conceptual ambiguity of the integrated care concept precludes a full understanding of what constitutes a well-integrated health system, posing a significant challenge in measuring the level of integrated care. Most available measures have been developed from a disease-specific perspective and only measure certain aspects of integrated care. Based on the Rainbow Model of Integrated Care, which provides a detailed description of the complex concept of integrated care, a measurement tool has been developed to assess integrated care within a care system as a whole from the perspectives of healthcare providers and managers. This paper describes the methodology of a study seeking to validate the Rainbow Model of Integrated Care measurement tool within and across the Singapore Regional Health System. The Singapore Regional Health System is a recent national strategy developed to provide a better-integrated health system delivering seamless and person-focused care to patients through a network of providers within a specified geographical region. The validation process includes assessment of the content of the measure and its psychometric properties. If the measure is deemed valid, the study will provide the first opportunity to measure integrated care within the Singapore Regional Health System, with results offering insights for recommendations to improve the Regional Health System and to support international comparison.

  15. From Modelling to Execution of Enterprise Integration Scenarios: The GENIUS Tool

    Science.gov (United States)

    Scheibler, Thorsten; Leymann, Frank

    One of the predominant problems IT companies are facing today is Enterprise Application Integration (EAI). Most of the infrastructures built to tackle integration issues are proprietary because no standards exist for how to model, develop, and actually execute integration scenarios. EAI patterns gain importance for non-technical business users to ease and harmonize the development of EAI scenarios. These patterns describe recurring EAI challenges and propose possible solutions in an abstract way. Therefore, one can use those patterns to describe enterprise architectures in a technology-neutral manner. However, patterns are documentation only, used by developers and systems architects to decide how to implement an integration scenario manually. Thus, patterns are not intended to stand for artefacts that will immediately be executed. This paper presents a tool supporting a method by which EAI patterns can be used to generate executable artefacts for various target platforms automatically using a model-driven development approach, hence turning patterns into something executable. To this end, we introduce a continuous tool chain beginning at the design phase and ending with executing an integration solution in a completely automated manner. For evaluation purposes we introduce a scenario demonstrating how the tool is utilized for modelling and actually executing an integration scenario.

  16. A web GIS based integrated flood assessment modeling tool for coastal urban watersheds

    Science.gov (United States)

    Kulkarni, A. T.; Mohanty, J.; Eldho, T. I.; Rao, E. P.; Mohan, B. K.

    2014-03-01

    Urban flooding has become an increasingly important issue in many parts of the world. In this study, an integrated flood assessment model (IFAM) is presented for coastal urban flood simulation. A web-based GIS framework has been adopted to organize the spatial datasets for the study area considered and to run the model within this framework. The integrated flood model consists of a mass-balance-based 1-D overland flow model, a 1-D finite-element channel flow model based on the diffusion wave approximation, and a quasi-2-D raster flood inundation model based on the continuity equation. The model code is written in MATLAB and the application is integrated within a web GIS server product, Web Gram Server™ (WGS), developed at IIT Bombay using Java, JSP and jQuery technologies. Its user interface is developed using OpenLayers, and the attribute data are stored in the MySQL open-source DBMS. The model is integrated within WGS and is called via JavaScript. The application has been demonstrated for two coastal urban watersheds of Navi Mumbai, India. Simulated flood extents for the extreme rainfall event of 26 July 2005 in the two urban watersheds of Navi Mumbai city are presented and discussed. The study demonstrates the effectiveness of the flood simulation tool in a web GIS environment to facilitate data access and visualization of GIS datasets and simulation results.
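
    A minimal sketch of the quasi-2-D raster idea (not the IFAM code itself, which is written in MATLAB and served through Web Gram Server): each raster cell stores a water depth, and at every time step volume is exchanged between neighbouring cells in proportion to the difference in water-surface elevation, which is a discrete statement of the continuity equation. The exchange coefficient and time step are illustrative and assume a sufficiently small step for stability.

```python
import numpy as np

def inundation_step(depth, ground, dt, dx, k=5.0):
    """One explicit continuity update of a quasi-2-D raster inundation sketch.

    depth, ground : 2-D arrays of water depth and ground elevation (m)
    k             : illustrative conveyance-like exchange coefficient (m/s)
    """
    wse = ground + depth                       # water-surface elevation (m)
    new_depth = depth.copy()
    pairs = ((np.s_[:, :-1], np.s_[:, 1:]),    # west / east neighbours
             (np.s_[:-1, :], np.s_[1:, :]))    # north / south neighbours
    for a, b in pairs:
        dh = wse[a] - wse[b]                   # head drop from cell a towards cell b
        donor = np.where(dh > 0, depth[a], depth[b])
        # linearised exchange, limited so a donor cell cannot give more than it holds
        dq = np.clip(k * dh * dt / dx, -donor, donor)
        new_depth[a] -= dq
        new_depth[b] += dq
    return new_depth

# toy example: 2 m of ponded water spreading over flat ground
ground = np.zeros((5, 5))
depth = np.zeros((5, 5))
depth[2, 2] = 2.0
for _ in range(50):
    depth = inundation_step(depth, ground, dt=0.1, dx=10.0)
print(np.round(depth, 3))
print("total volume preserved:", round(float(depth.sum()), 3))
```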

  17. Regional Sediment Management (RSM) Modeling Tools: Integration of Advanced Sediment Transport Tools into HEC-RAS

    Science.gov (United States)

    2014-06-01

    ...sediment transport within the USACE HEC River Analysis System (HEC-RAS) software package and to determine its applicability to Regional Sediment Management (RSM) challenges. HEC-RAS SEDIMENT MODELING BACKGROUND: HEC-RAS performs (1) one-dimensional (1D) steady and unsteady hydraulic river ...Albuquerque (SPA)), and recently, the USACE RSM Program. HEC-RAS is one of several hydraulic modeling codes available for river analysis in the...

  18. Integrated Wind Power Planning Tool

    DEFF Research Database (Denmark)

    Rosgaard, M. H.; Hahmann, Andrea N.; Nielsen, T. S.

    This poster describes the status, as of April 2012, of the Public Service Obligation (PSO) funded project PSO 10464 "Integrated Wind Power Planning Tool". The project goal is to integrate a mesoscale numerical weather prediction (NWP) model with a statistical tool in order to better predict short...... term power variation from offshore wind farms, as well as to conduct forecast error assessment studies in preparation for later implementation of such a feature in an existing simulation model. The addition of a forecast error estimation feature will further increase the value of this tool, as it...

  19. Tav4SB: integrating tools for analysis of kinetic models of biological systems.

    Science.gov (United States)

    Rybiński, Mikołaj; Lula, Michał; Banasik, Paweł; Lasota, Sławomir; Gambin, Anna

    2012-04-05

    Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases integration of software tools for life science research and provides a common workflow-based framework for computational experiments in biology. The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow the user to perform numerical simulations or model checking of, respectively, the deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of the possibilities offered by our project we apply multi-parameter sensitivity analysis. To visualize the results of model analysis a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and the SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project's Web page: http://bioputer.mimuw.edu.pl/tav4sb/. The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as the accessibility and usability of remote services.
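
    To give a flavour of the computations these services wrap, the sketch below runs a deterministic simulation of a toy two-species kinetic model with SciPy and performs a crude one-at-a-time multi-parameter sensitivity scan. It does not use the Tav4SB Web-service API (which is invoked through Taverna workflows); the model, parameter names, and perturbation sizes are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

def simulate(k_prod=1.0, k_deg=0.5, k_conv=0.2, t_end=20.0):
    """Toy kinetic model: species A is produced, degraded, and converted into B."""
    def rhs(t, y):
        a, b = y
        return [k_prod - (k_deg + k_conv) * a,   # dA/dt
                k_conv * a - k_deg * b]          # dB/dt
    sol = solve_ivp(rhs, (0.0, t_end), [0.0, 0.0])
    return sol.y[:, -1]                          # concentrations [A, B] at t_end

baseline = simulate()
for name, kwargs in [("k_prod", {"k_prod": 1.1}),   # +10% one-at-a-time perturbations
                     ("k_deg", {"k_deg": 0.55}),
                     ("k_conv", {"k_conv": 0.22})]:
    rel_change = (simulate(**kwargs) - baseline) / baseline
    print(f"{name:7s}: relative change in [A, B] at t_end = {np.round(rel_change, 3)}")
```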

  20. Open Tools for Integrated Modelling to Understand SDG development - The OPTIMUS program

    Science.gov (United States)

    Howells, Mark; Zepeda, Eduardo; Rogner, H. Holger; Sanchez, Marco; Roehrl, Alexander; Cicowiez, Matrin; Mentis, Dimitris; Korkevelos, Alexandros; Taliotis, Constantinos; Broad, Oliver; Alfstad, Thomas

    2016-04-01

    electrification simulator; a national CLEW tool allows for the optimization of national-level integrated resource use, and Macro-CLEW does the same while allowing for detailed economic-biophysical interactions. Finally, the open Model Management Infrastructure (MoManI) is presented, which allows for the rapid prototyping of additions to existing resource optimization tools or of entirely new ones. Collectively these tools provide insights into some fifteen of the SDGs and are made publicly available with support for governments and academic institutions.

  1. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database

    International Nuclear Information System (INIS)

    Quock, D.E.R.; Cianciarulo, M.B.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  2. A model of integration among prediction tools: applied study to road freight transportation

    Directory of Open Access Journals (Sweden)

    Henrique Dias Blois

    Full Text Available This study developed a scenario analysis model that integrates decision-making tools for investments: prospective scenarios (Grumbach Method) and system dynamics (hard modeling), together with multivariate analysis of expert judgment. The model was designed through the analysis and simulation of scenarios; it showed which events are most significant for the object of study and highlighted the actions that could redirect the future of the analyzed system. Moreover, predictions can be developed from the generated scenarios. The model was validated empirically with road freight transport data from the state of Rio Grande do Sul, Brazil. The results showed that the model contributes to investment analysis because it identifies the probabilities of events that affect decision making and identifies priorities for action, reducing future uncertainty. Moreover, it allows an interdisciplinary discussion that correlates different areas of knowledge, which is fundamental when greater consistency in scenario building is desired.

  3. Integrated Wind Power Planning Tool

    DEFF Research Database (Denmark)

    Rosgaard, M. H.; Giebel, Gregor; Nielsen, T. S.

    2012-01-01

    model to be developed in collaboration with ENFOR A/S, a Danish company that specialises in forecasting and optimisation for the energy sector. This integrated prediction model will allow for the description of the expected variability in wind power production in the coming hours to days, accounting......This poster presents the current state of the public service obligation (PSO) funded project PSO 10464, with the working title "Integrated Wind Power Planning Tool". The project commenced October 1, 2011, and the goal is to integrate a numerical weather prediction (NWP) model with purely...

  4. Master Middle Ware: A Tool to Integrate Water Resources and Fish Population Dynamics Models

    Science.gov (United States)

    Yi, S.; Sandoval Solis, S.; Thompson, L. C.; Kilduff, D. P.

    2017-12-01

    Linking models that investigate separate components of ecosystem processes has the potential to unify messages regarding management decisions by evaluating potential trade-offs in a cohesive framework. This project aimed to improve the ability of riparian resource managers to forecast future water availability conditions and the resulting fish habitat suitability, in order to better inform their management decisions. To accomplish this goal, we developed a middleware tool that is capable of linking and overseeing the operations of two existing models: a water resource planning tool, the Water Evaluation and Planning (WEAP) model, and a habitat-based fish population dynamics model (WEAPhish). First, we designed the Master Middle Ware (MMW) software in Visual Basic for Applications® in one Excel® file that provided a familiar framework for both data input and output. Second, MMW was used to link and jointly operate WEAP and WEAPhish, using Visual Basic for Applications (VBA) macros to implement system-level calls to run the models. To demonstrate the utility of this approach, hydrological, biological, and middleware model components were developed for the Butte Creek basin. This tributary of the Sacramento River, California, is managed for both hydropower and the persistence of a threatened population of spring-run Chinook salmon (Oncorhynchus tschawytscha). While we have demonstrated the use of MMW for a particular watershed and fish population, MMW can be customized for use with different rivers and fish populations, assuming basic data requirements are met. This model integration improves on ad hoc linkages for managing data transfer between software programs by providing a consistent, user-friendly, and familiar interface across different model implementations. Furthermore, the data-viewing capabilities of MMW facilitate the rapid interpretation of model results by hydrologists, fisheries biologists, and resource managers, in order to accelerate learning and management decision making.
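
    The coupling pattern MMW implements (a system-level call runs one model, its output file is harvested, and the result drives the next model) can be sketched in a few lines. The actual tool is a set of VBA macros inside an Excel workbook driving WEAP and WEAPhish; the Python below, with its hypothetical executable names, arguments, and file formats, only illustrates the orchestration idea.

```python
import csv
import subprocess
import sys
from pathlib import Path

def run_model(cmd, expected_output):
    """Launch an external model as a system-level call and confirm its output file."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0 or not Path(expected_output).exists():
        sys.exit(f"model run failed: {' '.join(cmd)}\n{result.stderr}")
    return expected_output

def read_flows(csv_path):
    """Read simulated monthly flows (date, flow_cms) written by the water model."""
    with open(csv_path, newline="") as f:
        return [(row["date"], float(row["flow_cms"])) for row in csv.DictReader(f)]

if __name__ == "__main__":
    # 1. water-allocation model run (hypothetical command line)
    flows_file = run_model(["weap_cli", "--scenario", "baseline", "--out", "flows.csv"],
                           "flows.csv")
    flows = read_flows(flows_file)
    print(f"water model produced {len(flows)} monthly flow records")

    # 2. habitat-based fish population model, driven by the flows just produced
    run_model(["weaphish_cli", "--flows", flows_file, "--out", "fish_abundance.csv"],
              "fish_abundance.csv")
    print("fish population model finished; results in fish_abundance.csv")
```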

  5. The Integrated Medical Model: A Risk Assessment and Decision Support Tool for Human Space Flight Missions

    Science.gov (United States)

    Kerstman, Eric L.; Minard, Charles; FreiredeCarvalho, Mary H.; Walton, Marlei E.; Myers, Jerry G., Jr.; Saile, Lynn G.; Lopez, Vilma; Butler, Douglas J.; Johnson-Throop, Kathy A.

    2011-01-01

    This slide presentation reviews the Integrated Medical Model (IMM) and its use as a risk assessment and decision support tool for human space flight missions. The IMM is an integrated, quantified, evidence-based decision support tool useful to NASA crew health and mission planners. It is intended to assist in optimizing crew health, safety and mission success within the constraints of the space flight environment for in-flight operations. It uses ISS data to assist in planning for the Exploration Program and is not intended to assist in post-flight research. The IMM was used to update the Probabilistic Risk Assessment (PRA) for the purpose of updating forecasts for the conditions requiring evacuation (EVAC) or loss of crew life (LOC) for the ISS. The IMM validation approach includes comparison with actual events and involves both qualitative and quantitative approaches. The results of these comparisons are reviewed. Another use of the IMM is to optimize the medical kits, taking into consideration the specific mission and the crew profile. An example of the use of the IMM to optimize the medical kits is reviewed.

  6. Basic data, computer codes and integral experiments: The tools for modelling in nuclear technology

    International Nuclear Information System (INIS)

    Sartori, E.

    2001-01-01

    When studying applications in nuclear technology we need to understand and be able to predict the behavior of systems manufactured by human enterprise. First, the underlying basic physical and chemical phenomena need to be understood. We then have to predict the results of the interplay of a large number of different basic events: i.e. the macroscopic effects. In order to build confidence in our modelling capability, we then need to compare these results against measurements carried out on such systems. The different levels of modelling require the solution of different types of equations using different types of parameters. The tools required for carrying out a complete validated analysis are: - the basic nuclear or chemical data; - the computer codes; and - the integral experiments. This article describes the role each component plays in a computational scheme designed for modelling purposes. It also describes which tools have been developed and are internationally available. The roles that the OECD/NEA Data Bank, the Radiation Shielding Information Computational Center (RSICC), and the IAEA Nuclear Data Section play in making these elements available to the community of scientists and engineers are described. (author)

  7. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    International Nuclear Information System (INIS)

    Ahmadi, Rouhollah; Khamehchi, Ehsan

    2013-01-01

    Conditioning stochastic simulations are very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike the previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space where all the primary and secondary data are easily mapped onto. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data

  8. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com [Amirkabir University of Technology, PhD Student at Reservoir Engineering, Department of Petroleum Engineering (Iran, Islamic Republic of); Khamehchi, Ehsan [Amirkabir University of Technology, Faculty of Petroleum Engineering (Iran, Islamic Republic of)

    2013-12-15

    Conditioning stochastic simulations are very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike the previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space where all the primary and secondary data are easily mapped onto. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  9. The Integrated Medical Model: A Risk Assessment and Decision Support Tool for Space Flight Medical Systems

    Science.gov (United States)

    Kerstman, Eric; Minard, Charles; Saile, Lynn; deCarvalho, Mary Freire; Myers, Jerry; Walton, Marlei; Butler, Douglas; Iyengar, Sriram; Johnson-Throop, Kathy; Baumann, David

    2009-01-01

    The Integrated Medical Model (IMM) is a decision support tool that is useful to mission planners and medical system designers in assessing risks and designing medical systems for space flight missions. The IMM provides an evidence based approach for optimizing medical resources and minimizing risks within space flight operational constraints. The mathematical relationships among mission and crew profiles, medical condition incidence data, in-flight medical resources, potential crew functional impairments, and clinical end-states are established to determine probable mission outcomes. Stochastic computational methods are used to forecast probability distributions of crew health and medical resource utilization, as well as estimates of medical evacuation and loss of crew life. The IMM has been used in support of the International Space Station (ISS) medical kit redesign, the medical component of the ISS Probabilistic Risk Assessment, and the development of the Constellation Medical Conditions List. The IMM also will be used to refine medical requirements for the Constellation program. The IMM outputs for ISS and Constellation design reference missions will be presented to demonstrate the potential of the IMM in assessing risks, planning missions, and designing medical systems. The implementation of the IMM verification and validation plan will be reviewed. Additional planned capabilities of the IMM, including optimization techniques and the inclusion of a mission timeline, will be discussed. Given the space flight constraints of mass, volume, and crew medical training, the IMM is a valuable risk assessment and decision support tool for medical system design and mission planning.

  10. Spatial Modelling Tools to Integrate Public Health and Environmental Science, Illustrated with Infectious Cryptosporidiosis.

    Science.gov (United States)

    Lal, Aparna

    2016-02-02

    Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change.

  11. Spatial Modelling Tools to Integrate Public Health and Environmental Science, Illustrated with Infectious Cryptosporidiosis

    Directory of Open Access Journals (Sweden)

    Aparna Lal

    2016-02-01

    Full Text Available Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change.

  12. INTEGRATION OF COST MODELS AND PROCESS SIMULATION TOOLS FOR OPTIMUM COMPOSITE MANUFACTURING PROCESS

    Energy Technology Data Exchange (ETDEWEB)

    Pack, Seongchan [General Motors; Wilson, Daniel [General Motors; Aitharaju, Venkat [General Motors; Kia, Hamid [General Motors; Yu, Hang [ESI, Group.; Doroudian, Mark [ESI Group

    2017-09-05

    Manufacturing cost of resin transfer molded composite parts is significantly influenced by the cycle time, which is strongly related to the time for both filling and curing of the resin in the mold. The time for filling can be optimized by various injection strategies, and by suitably reducing the length of the resin flow distance during the injection. The curing time can be reduced by the use of faster-curing resins, but this requires high-pressure injection equipment, which is capital intensive. Predictive manufacturing simulation tools that have recently been developed for composite materials are able to provide various scenarios of processing conditions virtually, well in advance of manufacturing the parts. In the present study, we integrate cost models with process simulation tools to study the influence of various parameters such as injection strategies, injection pressure, compression control to minimize high-pressure injection, resin curing rate, and demold time on the manufacturing cost as affected by the annual part volume. A representative automotive component was selected for the study and the results are presented in this paper.
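
    A minimal sketch of how a cost model and a process simulation can be joined: the fill, cure, and demold times that a process simulation would predict are fed into a per-part cost expression in which equipment cost is amortised over the annual part volume. The linear cost structure and every number below are illustrative assumptions, not values from the study.

```python
def part_cost(fill_time_s, cure_time_s, demold_time_s, annual_volume,
              material_cost=18.0,         # $ per part (resin + fibre), illustrative
              labor_rate=45.0,            # $ per hour
              equipment_capital=1.5e6,    # $ for press and injection equipment
              amortization_years=8,
              available_hours=4000.0):    # press hours available per year
    """Per-part manufacturing cost as a function of cycle time and annual volume."""
    cycle_time_h = (fill_time_s + cure_time_s + demold_time_s) / 3600.0
    if annual_volume * cycle_time_h > available_hours:
        raise ValueError("annual volume exceeds single-press capacity")
    equipment_per_part = equipment_capital / amortization_years / annual_volume
    labor_per_part = labor_rate * cycle_time_h
    return material_cost + labor_per_part + equipment_per_part

# cycle-time scenarios a process simulation might return (slow vs. fast-curing resin)
for label, fill_s, cure_s in [("standard resin", 120.0, 900.0),
                              ("fast-cure resin", 90.0, 180.0)]:
    cost = part_cost(fill_s, cure_s, demold_time_s=60.0, annual_volume=12000)
    print(f"{label:15s}: ${cost:6.2f} per part")
```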

  13. The systems integration operations/logistics model as a decision-support tool

    International Nuclear Information System (INIS)

    Miller, C.; Vogel, L.W.; Joy, D.S.

    1989-01-01

    Congress has enacted legislation specifying Yucca Mountain, Nevada, for characterization as the candidate site for the disposal of spent fuel and high-level wastes and has authorized a monitored retrievable storage (MRS) facility if one is warranted. Nevertheless, the exact configuration of the facilities making up the Federal Waste Management System (FWMS) was not specified. This has left the Office of Civilian Radioactive Waste Management (OCRWM) the responsibility for assuring the design of a safe and reliable disposal system. In order to assist in the analysis of potential configuration alternatives, operating strategies, and other factors for the FWMS and its various elements, a decision-support tool known as the systems integration operations/logistics model (SOLMOD) was developed. SOLMOD is a discrete event simulation model that emulates the movement and interaction of equipment and radioactive waste as it is processed through the FWMS - from pickup at reactor pools to emplacement. The model can be used to measure the impacts of different operating schedules and rules, system configurations, and equipment and other resource availabilities on the performance of processes comprising the FWMS and how these factors combine to determine overall system performance. SOLMOD can assist in identifying bottlenecks and can be used to assess capacity utilization of specific equipment and staff as well as overall system resilience
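
    The discrete-event character of such a tool can be illustrated with a toy SimPy sketch: shipments arrive at a facility, queue for a limited number of cask-handling cranes, and the run reports crane utilisation, one example of the bottleneck and capacity-utilisation measures mentioned above. The arrival rate, handling time, and resource count are invented for illustration and are not SOLMOD parameters.

```python
import random
import simpy

RANDOM_SEED = 42
SIM_HOURS = 24 * 30                      # one month of facility operation
ARRIVAL_MEAN = 10.0                      # hours between incoming shipments
HANDLING_MEAN = 6.0                      # crane-hours needed per shipment
N_CRANES = 2

busy_hours = 0.0

def shipment(env, cranes):
    global busy_hours
    with cranes.request() as req:
        yield req                        # wait for a free crane
        service = random.expovariate(1.0 / HANDLING_MEAN)
        busy_hours += service
        yield env.timeout(service)       # unload and repackage the casks

def arrivals(env, cranes):
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
        env.process(shipment(env, cranes))

random.seed(RANDOM_SEED)
env = simpy.Environment()
cranes = simpy.Resource(env, capacity=N_CRANES)
env.process(arrivals(env, cranes))
env.run(until=SIM_HOURS)

print(f"crane utilisation ~ {busy_hours / (N_CRANES * SIM_HOURS):.0%}")
```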

  14. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and comes with a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
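
    Of the three techniques, the frequency ratio is the simplest to state: for each class of a causative factor it is the share of recorded hazard occurrences falling in that class divided by the share of the study area occupied by that class. The sketch below computes it from two co-registered rasters with NumPy; it illustrates the method only and is not code from the BSM tool.

```python
import numpy as np

def frequency_ratio(factor_classes, hazard_mask):
    """Frequency ratio per class of a categorical causative-factor raster.

    factor_classes : integer array of class codes (e.g. slope class, lithology)
    hazard_mask    : boolean array of the same shape, True where hazards occurred
    """
    fr = {}
    total_cells = factor_classes.size
    total_hazard = hazard_mask.sum()
    for cls in np.unique(factor_classes):
        in_class = factor_classes == cls
        pct_hazard = hazard_mask[in_class].sum() / total_hazard
        pct_area = in_class.sum() / total_cells
        fr[int(cls)] = float(pct_hazard / pct_area)
    return fr        # FR > 1 means the class is over-represented among hazards

# toy 4x4 example: class 2 contains all of the mapped hazard cells
classes = np.array([[1, 1, 2, 2],
                    [1, 1, 2, 2],
                    [3, 3, 2, 2],
                    [3, 3, 3, 3]])
hazards = np.zeros_like(classes, dtype=bool)
hazards[[0, 1, 2], [2, 3, 3]] = True
print(frequency_ratio(classes, hazards))
```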

  15. Integrated groundwater resource management in Indus Basin using satellite gravimetry and physical modeling tools.

    Science.gov (United States)

    Iqbal, Naveed; Hossain, Faisal; Lee, Hyongki; Akhter, Gulraiz

    2017-03-01

    Reliable and frequent information on groundwater behavior and dynamics is very important for effective groundwater resource management at appropriate spatial scales. This information is rarely available in developing countries and thus poses a challenge for groundwater managers. In situ data and groundwater modeling tools are limited in their ability to cover large domains. Remote sensing technology can now be used to continuously collect information on the hydrological cycle in a cost-effective way. This study evaluates the effectiveness of a remote sensing integrated physical modeling approach for groundwater management in the Indus Basin. Gravity Recovery and Climate Experiment (GRACE) satellite-based gravity anomalies from 2003 to 2010 were processed to generate monthly groundwater storage changes using the Variable Infiltration Capacity (VIC) hydrologic model. Groundwater storage is the key parameter of interest for groundwater resource management. The spatial and temporal patterns in groundwater storage (GWS) are useful for devising appropriate groundwater management strategies. GRACE-estimated GWS information with large-scale coverage is valuable for basin-scale monitoring and decision making. This frequently available information is found useful for the identification of groundwater recharge areas and groundwater storage depletion, and for pinpointing areas where groundwater sustainability is at risk. The GWS anomalies were found to agree favorably with groundwater model simulations from Visual MODFLOW and with in situ data. Mostly, moderate to severe GWS depletion is observed, threatening the sustainability of this groundwater resource. For sustainable groundwater management, the region needs to implement groundwater policies and adopt water conservation techniques.
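
    The core bookkeeping of the GRACE-based approach is a water-balance subtraction: groundwater storage anomalies are obtained by removing the model-simulated surface stores (soil moisture, snow, canopy) from the total water storage anomaly observed by GRACE. The sketch below shows that step on monthly basin-average series in equivalent water height; the variable names are illustrative, and the real workflow also involves regridding, filtering, and leakage corrections.

```python
import numpy as np

def groundwater_storage_anomaly(tws_grace, soil_moisture, swe, canopy):
    """GWS anomaly from GRACE total water storage and modelled surface stores.

    All inputs are monthly basin-average series (cm equivalent water height).
    Each series is converted to an anomaly about its own study-period mean
    before the subtraction, mirroring how GRACE fields are reported.
    """
    def anomaly(x):
        x = np.asarray(x, dtype=float)
        return x - x.mean()
    return anomaly(tws_grace) - (anomaly(soil_moisture) + anomaly(swe) + anomaly(canopy))

# toy monthly series: a seasonal soil-moisture cycle plus a steady decline in TWS
months = np.arange(24)
soil = 5.0 * np.sin(2 * np.pi * months / 12)
swe = np.zeros_like(soil)
canopy = 0.1 * np.ones_like(soil)
tws = soil - 0.4 * months                 # declining trend on top of the seasonal signal
gws = groundwater_storage_anomaly(tws, soil, swe, canopy)
print(f"implied groundwater depletion trend ~ {np.polyfit(months, gws, 1)[0]:.2f} cm/month")
```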

  16. Examination of the low frequency limit for helicopter noise data in the Federal Aviation Administration's Aviation Environmental Design Tool and Integrated Noise Model

    Science.gov (United States)

    2010-04-19

    The Federal Aviation Administration (FAA) aircraft noise modeling tools, the Aviation Environmental Design Tool (AEDT) and the Integrated Noise Model (INM), do not currently consider noise below 50 Hz in their computations. This paper describes a preliminary ...

  17. Digital Aquifer - Integrating modeling, technical, software and policy aspects to develop a groundwater management tool

    Science.gov (United States)

    Tirupathi, S.; McKenna, S. A.; Fleming, K.; Wambua, M.; Waweru, P.; Ondula, E.

    2016-12-01

    Groundwater management has traditionally been viewed as a study of long-term policy measures to ensure that the water resource is sustainable. IBM Research, in association with the World Bank, extended this traditional analysis to include real-time groundwater management by building a context-aware water rights management and permitting system. As part of this effort, one of the primary objectives was to develop a groundwater flow model that can help policy makers with a visual overview of the current groundwater distribution. In addition, the system helps policy makers simulate a range of scenarios and check the sustainability of the groundwater resource in a given region. The system also enables a license provider to check the effect of the introduction of a new well on the existing wells in the domain as well as on the groundwater resource in general. This process simplifies how an engineer will determine whether a new well should be approved. Distances to the nearest well neighbors and the maximum decreases in water levels of nearby wells are continually assessed and presented as evidence for an engineer to make the final judgment on approving the permit. The system also provides updated insights on the amount of groundwater left in an area and advice on how water fees should be structured to balance conservation and economic development goals. In this talk, we will discuss the concept of the Digital Aquifer and the challenges in integrating modeling, technical and software aspects to develop a management system that helps policy makers and license providers with a robust decision-making tool. We will concentrate on the groundwater model, developed using the analytic element method, which plays a very important role in the decision-making aspects. Finally, the efficiency of this system and methodology is shown through a case study in Laguna Province, Philippines, which was done in collaboration with the National Water Resource Board, Philippines, and the World Bank.
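
    One of the checks described above, estimating how much a proposed well would lower water levels at neighbouring wells, can be illustrated with the classical Theis solution for a confined aquifer. This is a generic textbook calculation offered as a sketch of the idea, not the analytic element model inside the Digital Aquifer system; the transmissivity, storativity, pumping period, and drawdown threshold are illustrative.

```python
import numpy as np
from scipy.special import exp1              # exponential integral E1 = Theis well function

def theis_drawdown(q, r, t, transmissivity, storativity):
    """Drawdown (m) at radius r (m) after pumping time t (s) at rate q (m^3/s)."""
    u = r**2 * storativity / (4.0 * transmissivity * t)
    return q / (4.0 * np.pi * transmissivity) * exp1(u)

def permit_check(new_well_xy, pumping_rate, existing_wells, t=180 * 86400.0,
                 transmissivity=5e-3, storativity=2e-4, max_drawdown=1.0):
    """Flag existing wells whose predicted drawdown exceeds a policy threshold."""
    flagged = []
    for name, (x, y) in existing_wells.items():
        r = np.hypot(x - new_well_xy[0], y - new_well_xy[1])
        s = theis_drawdown(pumping_rate, r, t, transmissivity, storativity)
        if s > max_drawdown:
            flagged.append((name, round(float(s), 2)))
    return flagged

# toy usage: one nearby and one distant existing well
existing = {"W-101": (250.0, 0.0), "W-102": (1500.0, 400.0)}
print(permit_check((0.0, 0.0), pumping_rate=0.01, existing_wells=existing))
```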

  18. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, the bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and comes with a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  19. Integrating the strengths of cognitive emotion models with traditional HCI analysis tools

    OpenAIRE

    Springett, Mark; Law, Effie Lai-Chong; Coulson, Mark

    2015-01-01

    This paper reports an attempt to integrate key concepts from cognitive models of emotion into cognitive models of interaction established in the HCI literature. The aim is to transfer the strengths of interaction models to the analysis of affect-critical systems in games, e-commerce and education, thereby increasing their usefulness in these systems where affect is increasingly recognised as a key success factor. Concepts from Scherer's appraisal model and stimulus evaluation checks, along with a fr...

  20. Optimal Vehicle Design Using the Integrated System and Cost Modeling Tool Suite

    Science.gov (United States)

    2010-08-01

    Tool-suite elements referenced include: Space Vehicle Design (SMAD); Space Vehicle Propulsion; Orbit Propagation; Space Vehicle Costing (ACEIT); a new small-sat model with development & production cost and O&M cost modules; radiation exposure and radiation detector response; reliability, OML, availability, and risk; and supporting tools such as CEA, SRM Model, POST, ACEIT, Inflation Model, Rotor Blade Des, Microsoft Project, ATSV, S/1-iABP, STK, and SOAP (specific mission).

  1. A Practical Review of Integrated Urban Water Models: Applications as Decision Support Tools and Beyond

    Science.gov (United States)

    Mosleh, L.; Negahban-Azar, M.

    2017-12-01

    Integrated urban water management has become a necessity due to the high rate of urbanization, water scarcity, and climate variability. Climate and demographic changes, shifting social attitudes toward water usage, and insufficient system resilience increase the pressure on water resources. Alongside water management, the modeling of urban water systems has progressed from the traditional view to encompass alternatives such as decentralized water and wastewater systems, fit-for-purpose practice, graywater/rainwater reuse, and green infrastructure. While review papers are available that focus on the technical aspects of the models, they tend to be of most benefit to model developers. Some of the models analyze a number of scenarios considering factors such as climate change and demography and their future impacts, whereas others only focus on water quality and quantity in a supply/demand approach, for example optimizing the size of water or wastewater storage, characterizing the supply and quantity of urban stormwater and wastewater, and linking water sources to demand. A detailed and practical comparison of such models has become a necessity for practitioners and policy makers. This research compares more than seven of the most commonly used integrated urban water cycle models and critically reviews their capabilities, input requirements, outputs and applications. The output of such a detailed comparison will help policy makers in the decision process for the built environment to compare and choose the models that best meet their goals. The results of this research show that we need a transition from developing and using integrated water cycle models to integrated system models that incorporate urban water infrastructure together with ecological and economic factors. Such models can help decision makers reflect other important criteria while keeping the focus on urban water management. The research also showed that there is a need for exploring...

  2. Activity-Centred Tool Integration

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius

    2003-01-01

    This paper is concerned with integration of heterogeneous tools for system development. We argue that such tools should support concrete activities (e.g., programming, unit testing, conducting workshops) in contrast to abstract concerns (e.g., analysis, design, implementation). A consequence of this is that tools — or components — that support activities well should be integrated in ad-hoc, dynamic, and heterogeneous ways. We present a peer-to-peer architecture for this based on type-based publish subscribe and give an example of its use....

  3. The Overture Initiative Integrating Tools for VDM

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Battle, Nick; Ferreira, Miguel

    2010-01-01

    Overture is a community-based initiative that aims to develop a common open-source platform integrating a range of tools for constructing and analysing formal models of systems using VDM. The mission is to both provide an industrial-strength tool set for VDM and also to provide an environment...

  4. Advantages of integration of uranium exploration data in GIS and models as tools for decision support

    International Nuclear Information System (INIS)

    Tusveld, M.C.L.

    1997-01-01

    In many areas where uranium has been or is being explored, an enormous amount of data on geology and hydrogeology is available. When these uranium exploration data are stored in a structured way, they can be made useful for purposes other than uranium exploration alone. For instance, in the case of environmental pollution, which is often a side-effect of uranium activities such as mining and leaching, the data can be used to develop a computer model of the environment. With such a model, the impacts of different scenarios for cleaning up or isolating the pollution can be calculated. A GIS can be used to store the data, to visualize the data (map production) and to analyse the data, but also to calculate input for the models. The advantages of using GIS and models as tools for decision support are explained with the Contaminant Transport Information System (CTIS) as a case study. The CTIS has been developed for remediation operations in the uranium mining area of Straz pod Ralskem and Hamr in the Czech Republic. The CTIS consists of a GIS database, a regional groundwater flow model and a local contaminant transport model, as well as interfaces for data transfer between the components of the information system. The power of the CTIS lies in the fact that the modelling necessary for the design of a remediation operation can be carried out efficiently by using one of the two models, depending on the specific question. Thus alternative remediation scenarios can be judged easily and fairly on their consequences and effectiveness. (author)

  5. Validation of TGLF in C-Mod and DIII-D using machine learning and integrated modeling tools

    Science.gov (United States)

    Rodriguez-Fernandez, P.; White, Ae; Cao, Nm; Creely, Aj; Greenwald, Mj; Grierson, Ba; Howard, Nt; Meneghini, O.; Petty, Cc; Rice, Je; Sciortino, F.; Yuan, X.

    2017-10-01

    Predictive models for steady-state and perturbative transport are necessary to support burning plasma operations. A combination of machine learning algorithms and integrated modeling tools is used to validate TGLF in C-Mod and DIII-D. First, a new code suite, VITALS, is used to compare SAT1 and SAT0 models in C-Mod. VITALS exploits machine learning and optimization algorithms for the validation of transport codes. Unlike SAT0, the SAT1 saturation rule contains a model to capture cross-scale turbulence coupling. Results show that SAT1 agrees better with experiments, further confirming that multi-scale effects are needed to model heat transport in C-Mod L-modes. VITALS will next be used to analyze past data from DIII-D: L-mode ``Shortfall'' plasma and ECH swing experiments. A second code suite, PRIMA, allows for integrated modeling of the plasma response to Laser Blow-Off cold pulses. Preliminary results show that SAT1 qualitatively reproduces the propagation of cold pulses after LBO injections and SAT0 does not, indicating that cross-scale coupling effects play a role in the plasma response. PRIMA will be used to ``predict-first'' cold pulse experiments using the new LBO system at DIII-D, and analyze existing ECH heat pulse data. Work supported by DE-FC02-99ER54512, DE-FC02-04ER54698.

  6. VLM Tool for IDS Integration

    Directory of Open Access Journals (Sweden)

    Cǎtǎlin NAE

    2010-03-01

    Full Text Available This paper is dedicated to a very specific type of analysis tool (VLM - Vortex Lattice Method) to be integrated in an IDS - Integrated Design System, tailored for the usage of the small aircraft industry. The major interest is to have the possibility to simulate, at very low computational cost, a preliminary set of global aerodynamic characteristics (lift, drag, pitching moment) and aerodynamic derivatives for longitudinal and lateral-directional stability analysis. This work enables fast investigations of the influence of configuration changes in a very efficient computational environment. Using experimental data and/or CFD information for a specific calibration of the VLM method, the reliability of the analysis may be increased so that a first (iteration zero) aerodynamic evaluation of the preliminary 3D configuration is possible. The output of this tool is the basic-state aerodynamics and the associated stability and control derivatives, as well as a complete set of information on specific loads on major airframe components. The major interest in using and validating this type of method comes from the possibility to integrate it as a tool in an IDS system for the conceptual design phase, as considered for development in the CESAR project (IP, EU FP6).

  7. CaliBayes and BASIS: integrated tools for the calibration, simulation and storage of biological simulation models.

    Science.gov (United States)

    Chen, Yuhui; Lawless, Conor; Gillespie, Colin S; Wu, Jake; Boys, Richard J; Wilkinson, Darren J

    2010-05-01

    Dynamic simulation modelling of complex biological processes forms the backbone of systems biology. Discrete stochastic models are particularly appropriate for describing sub-cellular molecular interactions, especially when critical molecular species are thought to be present at low copy-numbers. For example, these stochastic effects play an important role in models of human ageing, where ageing results from the long-term accumulation of random damage at various biological scales. Unfortunately, realistic stochastic simulation of discrete biological processes is highly computationally intensive, requiring specialist hardware, and can benefit greatly from parallel and distributed approaches to computation and analysis. For these reasons, we have developed the BASIS system for the simulation and storage of stochastic SBML models together with associated simulation results. This system is exposed as a set of web services to allow users to incorporate its simulation tools into their workflows. Parameter inference for stochastic models is also difficult and computationally expensive. The CaliBayes system provides a set of web services (together with an R package for consuming these and formatting data) which addresses this problem for SBML models. It uses a sequential Bayesian MCMC method, which is powerful and flexible, providing very rich information. However this approach is exceptionally computationally intensive and requires the use of a carefully designed architecture. Again, these tools are exposed as web services to allow users to take advantage of this system. In this article, we describe these two systems and demonstrate their integrated use with an example workflow to estimate the parameters of a simple model of Saccharomyces cerevisiae growth on agar plates.
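
    As a rough illustration of the discrete stochastic simulation that systems such as BASIS perform (this is not code from BASIS or CaliBayes, which operate on SBML models and are exposed as web services), the Python sketch below runs Gillespie's exact stochastic simulation algorithm for a toy birth-death process; the model and its rate constants are invented for the example.

```python
import numpy as np

def gillespie_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=100.0, seed=0):
    """Exact stochastic simulation (Gillespie SSA) of a toy birth-death process.

    Illustrative only: production fires at constant rate k_birth and
    degradation at rate k_death * x, where x is the current copy number.
    """
    rng = np.random.default_rng(seed)
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        a1 = k_birth            # propensity of the birth reaction
        a2 = k_death * x        # propensity of the death reaction
        a0 = a1 + a2
        if a0 == 0.0:
            break
        t += rng.exponential(1.0 / a0)      # waiting time to the next reaction
        if rng.random() < a1 / a0:          # choose which reaction fires
            x += 1
        else:
            x -= 1
        times.append(t)
        counts.append(x)
    return np.array(times), np.array(counts)

if __name__ == "__main__":
    t, x = gillespie_birth_death()
    print(f"final copy number after {t[-1]:.1f} time units: {x[-1]}")
```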

  8. Resource Planning Model: An Integrated Resource Planning and Dispatch Tool for Regional Electric Systems

    Energy Technology Data Exchange (ETDEWEB)

    Mai, T.; Drury, E.; Eurek, K.; Bodington, N.; Lopez, A.; Perry, A.

    2013-01-01

    This report introduces a new capacity expansion model, the Resource Planning Model (RPM), with high spatial and temporal resolution that can be used for mid- and long-term scenario planning of regional power systems. Although RPM can be adapted to any geographic region, the report describes an initial version of the model adapted for the power system in Colorado. It presents examples of scenario results from the first version of the model, including an example of a 30%-by-2020 renewable electricity penetration scenario.
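
    As a minimal sketch of the kind of least-cost capacity expansion decision such a model addresses (not the RPM formulation itself, which has far higher spatial and temporal resolution), the toy linear program below chooses new capacity for two illustrative technologies subject to a peak-demand and a renewable-share constraint; all technology names and numbers are assumptions made for the example.

```python
from scipy.optimize import linprog

# Decision variables: new capacity (MW) of [gas, wind]
capital_cost = [800.0, 1500.0]     # illustrative cost coefficients per MW
peak_demand = 1000.0               # MW of new peak demand to cover
capacity_credit = [1.0, 0.3]       # fraction of nameplate counted at peak
wind_share_target = 0.3            # wind must be >= 30% of new nameplate capacity

# Minimize cost subject to:
#   credit . x >= peak_demand            ->  -credit . x <= -peak_demand
#   x_wind >= share * (x_gas + x_wind)   ->  share*x_gas - (1-share)*x_wind <= 0
A_ub = [
    [-capacity_credit[0], -capacity_credit[1]],
    [wind_share_target, -(1.0 - wind_share_target)],
]
b_ub = [-peak_demand, 0.0]

res = linprog(capital_cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("new gas, wind capacity (MW):", res.x)
```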

  9. Modelling tools for integrating geological, geophysical and contamination data for characterization of groundwater plumes

    DEFF Research Database (Denmark)

    Balbarini, Nicola

    the contaminant plume in a shallow and a deep plume. These plumes have different chemical characteristics and different migration paths to the stream. This has implications for the risk assessment of the stream and groundwater in the area. The difficulty of determining groundwater flow paths means that it is also...... receptors, including streams. Key risk assessment parameters, such as contaminant mass discharge estimates, and tools are then used to evaluate the risk. The cost of drilling often makes investigations of large and/or deep contaminant plumes unfeasible. For this reason, it is important to develop cost...... organic compounds, including pharmaceutical compounds and chlorinated ethenes. The correlation between DCIP and organic compounds is indirect and depends on the chemical composition of the contaminant plume and the transport processes. Thus, the correlations are site specific and may change between...

  10. Manufacturing scheduling systems an integrated view on models, methods and tools

    CERN Document Server

    Framinan, Jose M; Ruiz García, Rubén

    2014-01-01

    The book is devoted to the problem of manufacturing scheduling, which is the efficient allocation of jobs (orders) over machines (resources) in a manufacturing facility. It offers a comprehensive and integrated perspective on the different aspects required to design and implement systems to efficiently and effectively support manufacturing scheduling decisions. Obtaining economic and reliable schedules constitutes the core of excellence in customer service and efficiency in manufacturing operations. Therefore, scheduling forms an area of vital importance for competition in manufacturing companies. However, only a fraction of scheduling research has been translated into practice, due to several reasons. First, the inherent complexity of scheduling has led to an excessively fragmented field in which different sub problems and issues are treated in an independent manner as goals themselves, therefore lacking a unifying view of the scheduling problem. Furthermore, mathematical brilliance and elegance has sometime...

  11. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  12. High-throughput micro-scale cultivations and chromatography modeling: Powerful tools for integrated process development.

    Science.gov (United States)

    Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen

    2015-10-01

    Upstream processes are rather complex to design, and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimized system must not necessarily be the one with the highest product titers, but the one resulting in an overall superior process performance in up- and downstream processing. The methodology is presented in a case study for the Cherry-tagged enzyme Glutathione-S-Transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium), which can be fused to any target protein, allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in a 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using a chromatography modeling software developed in-house (ChromX). The suggested in silico-optimized operational modes for product capturing were validated subsequently. The overall best system was chosen based on a combination of excellent up- and downstream performance. © 2015 Wiley Periodicals, Inc.

  13. EPR design tools. Integrated data processing tools

    International Nuclear Information System (INIS)

    Kern, R.

    1997-01-01

    In all technical areas, planning and design have been supported by electronic data processing for many years. New data processing tools had to be developed for the European Pressurized Water Reactor (EPR). The work to be performed was split between KWU and Framatome and laid down in the Basic Design contract. The entire plant was reduced to a logical data structure; the circuit diagrams and flowsheets of the systems were drafted, the central data pool was established, the outlines of building structures were defined, the layout of plant components was planned, and the electrical systems were documented. Building construction engineering was also supported by data processing. The tasks laid down in the Basic Design were completed as so-called milestones. Additional data processing tools, also based on the central data pool, are required for the phases following the Basic Design phase, i.e. Basic Design Optimization, Detailed Design, Management, Construction, and Commissioning. (orig.) [de]

  14. IDMT, Integrated Decommissioning Management Tools

    International Nuclear Information System (INIS)

    Alemberti, A.; Castagna, P.; Marsiletti, M.; Orlandi, S.; Perasso, L.; Susco, M.

    2005-01-01

    Nuclear Power Plant decommissioning requires a number of demolition activities related to civil works and systems, as well as the construction of temporary facilities used for treatment and conditioning of the dismantled parts. The presence of a radiological, potentially hazardous environment due to the specific configuration and history of the plant requires a professional, expert and qualified approach approved by the national safety authority. Dismantling activities must be designed, planned and analysed in detail during an evaluation phase, taking into account different scenarios generated by possible dismantling sequences and the specific waste treatments to be implemented. The optimisation process becomes very challenging given the requirement to minimise the radiological impact on exposed workers and the public during normal and accident conditions. While remote-operated equipment and waste treatment and conditioning facilities may be designed with this primary goal in mind, a centralised management system and corresponding software tools also have to be designed and operated in order to guarantee compliance with the imposed limits as well as the traceability of wastes. Ansaldo Nuclear Division has been strongly involved in the development of a qualified and certified software environment to manage the most critical activities of a decommissioning project. The IDMT system (Integrated Decommissioning Management Tools) provides a set of stand-alone, user-friendly applications able to work in an integrated configuration to guarantee waste identification and traceability during the treatment and conditioning process, as well as location and identification at the final repository site. Additionally, the system can be used to identify, analyse and compare different operating scenarios to be optimised in terms of both economic and radiological considerations. The paper provides an overview of the different phases of

  15. MEETING IN CHICAGO: SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING, AND ENVIRONMENTAL RISK ASSESSMENT

    Science.gov (United States)

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  16. SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING AND RISK ASSESSMENT (SLIDE PRESENTATION)

    Science.gov (United States)

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  17. MEETING IN CZECH REPUBLIC: SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING, AND RISK ASSESSMENT

    Science.gov (United States)

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  18. Development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool: An Evidence-Based Model for School Garden Integration.

    Science.gov (United States)

    Burt, Kate Gardner; Koch, Pamela; Contento, Isobel

    2017-10-01

    Researchers have established the benefits of school gardens on students' academic achievement, dietary outcomes, physical activity, and psychosocial skills, yet limited research has been conducted about how school gardens become institutionalized and sustained. Our aim was to develop a tool that captures how gardens are effectively established, integrated, and sustained in schools. We conducted a sequential, exploratory, mixed-methods study. Participants were identified with the help of Grow To Learn, the organization coordinating the New York City school garden initiative, and recruited via e-mail. A stratified, purposeful sample of 21 New York City elementary and middle schools participated in this study throughout the 2013/2014 school year. The sample was stratified in their garden budgets and purposeful in that each of the schools' gardens were determined to be well integrated and sustained. The processes and strategies used by school gardeners to establish well-integrated school gardens were assessed via data collected from surveys, interviews, observations, and concept mapping. Descriptive statistics as well as multidimensional scaling and hierarchical cluster analysis were used to examine the survey and concept mapping data. Qualitative data analysis consisted of thematic coding, pattern matching, explanation building and cross-case synthesis. Nineteen components within four domains of school garden integration were found through the mixed-methods concept mapping analysis. When the analyses of other data were combined, relationships between domains and components emerged. These data resulted in the development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool. When schools with integrated and sustained gardens were studied, patterns emerged about how gardeners achieve institutionalization through different combinations of critical components. These patterns are best described by the GREEN Tool, the first framework to identify how to

  19. An integrated computational tool for precipitation simulation

    Science.gov (United States)

    Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.

    2011-07-01

    Computer aided materials design is of increasing interest because the conventional approach solely relying on experimentation is no longer viable within the constraint of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus accelerating the development of materials. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.

  20. Design Tools for Integrated Asynchronous Electronic Circuits

    National Research Council Canada - National Science Library

    Martin, Alain

    2003-01-01

    ..., simulation, verification, at the logical and physical levels. Situs has developed a business model for the commercialization of the CAD tools, and has designed the prototype of the tool suite based on this business model and the Caltech approach...

  1. Analytic tools for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, Vladimir A.

    2012-01-01

    Most powerful methods of evaluating Feynman integrals are presented. Reader will be able to apply them in practice. Contains numerous examples. The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice. This book supersedes the author's previous Springer book ''Evaluating Feynman Integrals'' and its textbook version ''Feynman Integral Calculus.'' Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added: One is on sector decomposition, while the second describes a new method by Lee. The third new chapter concerns the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book, ''Applied Asymptotic Expansions in Momenta and Masses,'' by the author. This chapter describes, on the basis of papers that appeared after the publication of said book, how to algorithmically discover the regions relevant to a given limit within the strategy of expansion by regions. In addition, the chapters on the method of Mellin-Barnes representation and on the method of integration by parts have been substantially rewritten, with an emphasis on the corresponding algorithms and computer codes.
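
    As a minimal worked illustration of the kind of integral such methods target (standard textbook material in dimensional regularization with the Minkowski metric, not the book's new techniques), a product of propagators can be combined with a Feynman parameter and reduced to the usual master formula:

\[
\frac{1}{AB}=\int_0^1\frac{dx}{\bigl[xA+(1-x)B\bigr]^2},
\qquad
\int\!\frac{d^dk}{(2\pi)^d}\,\frac{1}{(k^2-\Delta)^n}
=\frac{(-1)^n\,i}{(4\pi)^{d/2}}\,
\frac{\Gamma\!\left(n-\tfrac{d}{2}\right)}{\Gamma(n)}\,
\Delta^{\,d/2-n}.
\]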

  2. Analytic tools for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, Vladimir A. [Moscow State Univ. (Russian Federation). Skobeltsyn Inst. of Nuclear Physics

    2012-07-01

    Most powerful methods of evaluating Feynman integrals are presented. Reader will be able to apply them in practice. Contains numerous examples. The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice. This book supersedes the author's previous Springer book ''Evaluating Feynman Integrals'' and its textbook version ''Feynman Integral Calculus.'' Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added: One is on sector decomposition, while the second describes a new method by Lee. The third new chapter concerns the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book, ''Applied Asymptotic Expansions in Momenta and Masses,'' by the author. This chapter describes, on the basis of papers that appeared after the publication of said book, how to algorithmically discover the regions relevant to a given limit within the strategy of expansion by regions. In addition, the chapters on the method of Mellin-Barnes representation and on the method of integration by parts have been substantially rewritten, with an emphasis on the corresponding algorithms and computer codes.

  3. Analytic Tools for Feynman Integrals

    CERN Document Server

    Smirnov, Vladimir A

    2012-01-01

    The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice.  This book supersedes the author’s previous Springer book “Evaluating Feynman Integrals” and its textbook version “Feynman Integral Calculus.” Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added:  One is on sector decomposition, while the second describes a new method by Lee. The third new chapter concerns the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book, “Applied Asymptotic Expansions in Momenta and Masses,” by the author. This chapter describes, on t...

  4. Evaluation of Oracle Big Data Integration Tools

    OpenAIRE

    Urhan, Harun; Baranowski, Zbigniew

    2015-01-01

    The project's objective is to evaluate Oracle's Big Data Integration Tools. The project covers the evaluation of two of Oracle's tools: Oracle Data Integrator Application Adapters for Hadoop, to load data from an Oracle Database into Hadoop, and Oracle SQL Connectors for HDFS, to query data stored on a Hadoop file system using SQL statements executed on an Oracle Database.

  5. SIRSALE: integrated video database management tools

    Science.gov (United States)

    Brunie, Lionel; Favory, Loic; Gelas, J. P.; Lefevre, Laurent; Mostefaoui, Ahmed; Nait-Abdesselam, F.

    2002-07-01

    Video databases became an active field of research during the last decade. The main objective in such systems is to provide users with capabilities to search, access and play back distributed stored video data in the same way as they do for traditional distributed databases. Hence, such systems need to deal with hard issues: (a) video documents generate huge volumes of data and are time sensitive (streams must be delivered at a specific bitrate), and (b) the content of video data is very hard to extract automatically and needs to be annotated manually. To cope with these issues, many approaches have been proposed in the literature, including data models, query languages, video indexing etc. In this paper, we present SIRSALE: a set of video database management tools that allow users to manipulate video documents and streams stored in large distributed repositories. All the proposed tools are based on generic models that can be customized for specific applications using ad-hoc adaptation modules. More precisely, SIRSALE allows users to: (a) browse video documents by structures (sequences, scenes, shots) and (b) query the video database content by using a graphical tool adapted to the nature of the target video documents. This paper also presents an annotation interface which allows archivists to describe the content of video documents. All these tools are coupled to a video player integrating remote VCR functionalities and are based on active network technology. We then present how dedicated active services allow optimized transport of video streams (with Tamanoir active nodes). We then describe experiments using SIRSALE on an archive of news videos and soccer matches. The system has been demonstrated to professionals with positive feedback. Finally, we discuss open issues and present some perspectives.

  6. Advancing representation of hydrologic processes in the Soil and Water Assessment Tool (SWAT) through integration of the TOPographic MODEL (TOPMODEL) features

    Science.gov (United States)

    Chen, J.; Wu, Y.

    2012-01-01

    This paper presents a study of the integration of the Soil and Water Assessment Tool (SWAT) model and the TOPographic MODEL (TOPMODEL) features for enhancing the physical representation of hydrologic processes. In SWAT, four hydrologic processes, which are surface runoff, baseflow, groundwater re-evaporation and deep aquifer percolation, are modeled by using a group of empirical equations. The empirical equations usually constrain the simulation capability of relevant processes. To replace these equations and to model the influences of topography and water table variation on streamflow generation, the TOPMODEL features are integrated into SWAT, and a new model, the so-called SWAT-TOP, is developed. In the new model, the process of deep aquifer percolation is removed, the concept of groundwater re-evaporation is refined, and the processes of surface runoff and baseflow are remodeled. Consequently, three parameters in SWAT are discarded, and two new parameters to reflect the TOPMODEL features are introduced. SWAT-TOP and SWAT are applied to the East River basin in South China, and the results reveal that, compared with SWAT, the new model can provide a more reasonable simulation of the hydrologic processes of surface runoff, groundwater re-evaporation, and baseflow. This study evidences that an established hydrologic model can be further improved by integrating the features of another model, which is a possible way to enhance our understanding of the workings of catchments.
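
    As an illustrative sketch of the two TOPMODEL ingredients referred to above, the topographic wetness index and the exponential-store baseflow relation, the Python snippet below is not SWAT-TOP code; the grid values and parameters are invented for the example.

```python
import numpy as np

def topographic_wetness_index(upslope_area, slope_rad, cell_width=30.0):
    """TOPMODEL topographic index ln(a / tan(beta)).

    upslope_area : contributing area per cell (m^2)
    slope_rad    : local slope angle (radians)
    cell_width   : grid resolution (m), converts area to specific area a
    """
    specific_area = upslope_area / cell_width          # area per unit contour length
    tan_beta = np.maximum(np.tan(slope_rad), 1e-6)     # avoid division by zero on flats
    return np.log(specific_area / tan_beta)

def baseflow(saturation_deficit, q0=1.0, m=0.02):
    """Exponential-store baseflow Q = Q0 * exp(-S/m); units follow q0 and m."""
    return q0 * np.exp(-saturation_deficit / m)

# Toy example: three cells with different contributing areas and slopes
twi = topographic_wetness_index(np.array([900.0, 9000.0, 90000.0]),
                                np.radians([15.0, 8.0, 2.0]))
print("topographic wetness index:", twi)
print("baseflow at mean saturation deficit S = 0.05 m:", baseflow(0.05))
```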

  7. Biological data integration: wrapping data and tools.

    Science.gov (United States)

    Lacroix, Zoé

    2002-06-01

    Nowadays scientific data is inevitably digital and stored in a wide variety of formats in heterogeneous systems. Scientists need to access an integrated view of remote or local heterogeneous data sources with advanced data accessing, analyzing, and visualization tools. Building a digital library for scientific data requires accessing and manipulating data extracted from flat files or databases, documents retrieved from the Web, as well as data generated by software. We present an approach to wrapping web data sources, databases, flat files, or data generated by tools through a database view mechanism. Generally, a wrapper has two tasks: it first sends a query to the source to retrieve data and, second, builds the expected output with respect to the virtual structure. Our wrappers are composed of a retrieval component based on an intermediate object view mechanism called search views, mapping the source capabilities to attributes, and an eXtensible Markup Language (XML) engine, respectively, to perform these two tasks. The originality of the approach consists of: 1) a generic view mechanism to access seamlessly data sources with limited capabilities and 2) the ability to wrap data sources as well as the useful specific tools they may provide. Our approach has been developed and demonstrated as part of a multidatabase system supporting queries via uniform Object Protocol Model (OPM) interfaces.
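
    A generic, hedged sketch of the two-step wrapper pattern described above (retrieve against limited source capabilities, then rebuild the result as XML against a common virtual structure); the class and method names are illustrative, not the paper's API.

```python
from abc import ABC, abstractmethod
from xml.etree.ElementTree import Element, SubElement, tostring

class SourceWrapper(ABC):
    """Generic wrapper: (1) retrieve raw records from a source with limited
    query capabilities, (2) rebuild them against a common virtual structure."""

    @abstractmethod
    def retrieve(self, **query):
        """Map the query onto whatever the underlying source supports."""

    def to_xml(self, records, root_tag="result"):
        root = Element(root_tag)
        for rec in records:
            item = SubElement(root, "record")
            for attr, value in rec.items():
                SubElement(item, attr).text = str(value)
        return tostring(root, encoding="unicode")

class FlatFileWrapper(SourceWrapper):
    def __init__(self, rows):
        self.rows = rows          # e.g. parsed lines of a tab-delimited file

    def retrieve(self, **query):
        # 'search view': only exact-match filtering is exposed by this source
        return [r for r in self.rows if all(r.get(k) == v for k, v in query.items())]

rows = [{"gene": "BRCA1", "organism": "human"}, {"gene": "rad51", "organism": "yeast"}]
wrapper = FlatFileWrapper(rows)
print(wrapper.to_xml(wrapper.retrieve(organism="human")))
```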

  8. Building a bridge into the future: dynamic connectionist modeling as an integrative tool for research on intertemporal choice.

    Science.gov (United States)

    Scherbaum, Stefan; Dshemuchadse, Maja; Goschke, Thomas

    2012-01-01

    Temporal discounting denotes the fact that individuals prefer smaller rewards delivered sooner over larger rewards delivered later, often to a higher extent than suggested by normative economical theories. In this article, we identify three lines of research studying this phenomenon which aim (i) to describe temporal discounting mathematically, (ii) to explain observed choice behavior psychologically, and (iii) to predict the influence of specific factors on intertemporal decisions. We then opt for an approach integrating postulated mechanisms and empirical findings from these three lines of research. Our approach focuses on the dynamical properties of decision processes and is based on computational modeling. We present a dynamic connectionist model of intertemporal choice focusing on the role of self-control and time framing as two central factors determining choice behavior. Results of our simulations indicate that the two influences interact with each other, and we present experimental data supporting this prediction. We conclude that computational modeling of the decision process dynamics can advance the integration of different strands of research in intertemporal choice.
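
    For context, two standard descriptive forms of temporal discounting used in this line of research are exponential and hyperbolic discounting; the sketch below simply compares the two value curves and is not the connectionist model proposed in the article.

```python
import numpy as np

def exponential_discount(amount, delay, rate=0.05):
    """Normative exponential discounting: V = A * exp(-rate * delay)."""
    return amount * np.exp(-rate * delay)

def hyperbolic_discount(amount, delay, k=0.05):
    """Hyperbolic discounting (Mazur form): V = A / (1 + k * delay)."""
    return amount / (1.0 + k * delay)

delays = np.array([0, 7, 30, 180, 365])   # delays in days, chosen for illustration
for d in delays:
    print(f"delay {d:3d} d: exponential {exponential_discount(100, d):6.2f}  "
          f"hyperbolic {hyperbolic_discount(100, d):6.2f}")
```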

  9. Building a bridge into the future: Dynamic connectionist modeling as an integrative tool for research on intertemporal choice

    Directory of Open Access Journals (Sweden)

    Stefan Scherbaum

    2012-11-01

    Full Text Available Temporal discounting denotes the fact that individuals prefer smaller rewards delivered sooner over larger rewards delivered later, often to a higher extent than suggested by normative economical theories. In this article, we identify three lines of research studying this phenomenon which aim (i) to describe temporal discounting mathematically, (ii) to explain observed choice behavior psychologically, and (iii) to predict the influence of specific factors on intertemporal decisions. We then opt for an approach integrating postulated mechanisms and empirical findings from these three lines of research. Our approach focuses on the dynamical properties of decision processes and is based on computational modeling. We present a dynamic connectionist model of intertemporal choice focusing on the role of self-control and time framing as two central factors determining choice behavior. Results of our simulations indicate that the two influences interact with each other, and we present experimental data supporting this prediction. We conclude that computational modeling of the decision process dynamics can advance the integration of different strands of research in intertemporal choice.

  10. A combined reaction class approach with integrated molecular orbital+molecular orbital (IMOMO) methodology: A practical tool for kinetic modeling

    International Nuclear Information System (INIS)

    Truong, Thanh N.; Maity, Dilip K.; Truong, Thanh-Thai T.

    2000-01-01

    We present a new practical computational methodology for predicting thermal rate constants of reactions involving large molecules or a large number of elementary reactions in the same class. This methodology combines the integrated molecular orbital+molecular orbital (IMOMO) approach with our recently proposed reaction class models for tunneling. With the new methodology, we show that it is possible to significantly reduce the computational cost by several orders of magnitude while compromising the accuracy in the predicted rate constants by less than 40% over a wide range of temperatures. Another important result is that the computational cost increases only slightly as the system size increases. (c) 2000 American Institute of Physics
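
    As a hedged illustration of the quantity being predicted (and not the reaction-class/IMOMO method itself), the conventional transition-state-theory expression k(T) = (k_B T / h) exp(-ΔG‡ / RT), without any tunneling correction, can be evaluated as follows; the activation free energy is an invented example value.

```python
import numpy as np

KB = 1.380649e-23      # Boltzmann constant, J/K
H = 6.62607015e-34     # Planck constant, J*s
R = 8.314462618        # gas constant, J/(mol*K)

def tst_rate(temperature, dG_activation_kJ_per_mol):
    """Conventional transition-state-theory rate constant (1/s) from a free
    energy of activation; illustrative, no tunneling correction included."""
    return (KB * temperature / H) * np.exp(
        -dG_activation_kJ_per_mol * 1e3 / (R * temperature))

for T in (298.15, 500.0, 1000.0):
    print(f"T = {T:7.2f} K  k(T) = {tst_rate(T, 80.0):.3e} 1/s")
```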

  11. GAPIT: genome association and prediction integrated tool.

    Science.gov (United States)

    Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu

    2012-09-15

    Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.

  12. Integrated Monitoring and Modeling of Carbon Dioxide Leakage Risk Using Remote Sensing, Ground-Based Monitoring, Atmospheric Models and Risk-Indexing Tools

    Science.gov (United States)

    Burton, E. A.; Pickles, W. L.; Gouveia, F. J.; Bogen, K. T.; Rau, G. H.; Friedmann, J.

    2006-12-01

    estimating its associated risk, spatially and temporally. This requires integration of subsurface, surface and atmospheric data and models. To date, we have developed techniques to map risk based on predicted atmospheric plumes and GIS/MT (meteorologic- topographic) risk-indexing tools. This methodology was derived from study of large CO2 releases from an abandoned well penetrating a natural CO2 reservoir at Crystal Geyser, Utah. This integrated approach will provide a powerful tool to screen for high-risk zones at proposed sequestration sites, to design and optimize surface networks for site monitoring and/or to guide setting science-based regulatory compliance requirements for monitoring sequestration sites, as well as to target critical areas for first responders should a catastrophic-release event occur. This work was performed under the auspices of the U.S. Dept. of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.

  13. Integrated Data Visualization and Virtual Reality Tool

    Science.gov (United States)

    Dryer, David A.

    1998-01-01

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design and changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

  14. A Spectral Unmixing Model for the Integration of Multi-Sensor Imagery: A Tool to Generate Consistent Time Series Data

    Directory of Open Access Journals (Sweden)

    Georgia Doxani

    2015-10-01

    Full Text Available The Sentinel missions have been designed to support the operational services of the Copernicus program, ensuring long-term availability of data for a wide range of spectral, spatial and temporal resolutions. In particular, Sentinel-2 (S-2) data, with improved high spatial resolution and higher revisit frequency (five days with the pair of satellites in operation), will play a fundamental role in recording land cover types and monitoring land cover changes at regular intervals. Nevertheless, cloud coverage usually hinders the time series availability and consequently the continuous land surface monitoring. In an attempt to alleviate this limitation, the synergistic use of instruments with different features is investigated, aiming at the future synergy of the S-2 MultiSpectral Instrument (MSI) and the Sentinel-3 (S-3) Ocean and Land Colour Instrument (OLCI). To that end, an unmixing model is proposed with the intention of integrating the benefits of the two Sentinel missions, when both are in orbit, in one composite image. The main goal is to fill the data gaps in the S-2 record, based on the more frequent information of the S-3 time series. The proposed fusion model has been applied on MODIS (MOD09GA L2G) and SPOT4 (Take 5) data, and the experimental results have demonstrated that the approach has high potential. However, the different acquisition characteristics of the sensors, i.e. illumination and viewing geometry, should be taken into consideration, and bidirectional effects correction has to be performed in order to reduce noise in the reflectance time series.
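
    The core of a linear spectral unmixing step can be illustrated with a small, hedged sketch: each pixel spectrum is modeled as a non-negative mixture of endmember spectra, and the abundances are recovered by non-negative least squares. The endmember values below are synthetic and are not taken from the Sentinel, MODIS, or SPOT data discussed in the record.

```python
import numpy as np
from scipy.optimize import nnls

# Columns = endmember spectra (e.g. vegetation, soil, water) over 4 bands
endmembers = np.array([
    [0.05, 0.30, 0.02],
    [0.08, 0.35, 0.03],
    [0.45, 0.40, 0.02],
    [0.50, 0.45, 0.01],
])

true_abundances = np.array([0.6, 0.3, 0.1])
pixel = endmembers @ true_abundances + 0.005 * np.random.default_rng(1).normal(size=4)

# Non-negative least squares recovers the abundances; a sum-to-one constraint
# could be added by augmenting the system with an extra weighted row.
abundances, residual = nnls(endmembers, pixel)
print("estimated abundances:", abundances / abundances.sum())
```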

  15. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi Fukushima

    2014-11-01

    Full Text Available One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome and mathematical models are expected to integrate and expand our knowledge of complex plant metabolism.

  16. Integrated Environmental Assessment Modelling

    Energy Technology Data Exchange (ETDEWEB)

    Guardanz, R; Gimeno, B S; Bermejo, V; Elvira, S; Martin, F; Palacios, M; Rodriguez, E; Donaire, I [Ciemat, Madrid (Spain)

    2000-07-01

    This report describes the results of the Spanish participation in the project 'Coupling CORINAIR data to cost-effective emission reduction strategies based on critical thresholds' (EU/LIFE97/ENV/FIN/336). The subproject has focused on three tasks: developing tools to improve knowledge of the spatial and temporal details of air pollutant emissions in Spain; exploiting existing experimental information on plant response to air pollutants in temperate ecosystems; and integrating these findings in a modelling framework that can assess more accurately the impact of air pollutants on temperate ecosystems. The results obtained during the execution of this project have significantly improved the models of the impact of alternative emission control strategies on ecosystems and crops in the Iberian Peninsula. (Author) 375 refs.

  17. Computational Design Tools for Integrated Design

    DEFF Research Database (Denmark)

    Holst, Malene Kirstine; Kirkegaard, Poul Henning

    2010-01-01

    In an architectural conceptual sketching process, where an architect is working with the initial ideas for a design, the process is characterized by three phases: sketching, evaluation and modification. Basically the architect needs to address three areas in the conceptual sketching phase......: aesthetical, functional and technical requirements. The aim of the present paper is to address the problem of a vague or not existing link between digital conceptual design tools used by architects and designers and engineering analysis and simulation tools. Based on an analysis of the architectural design...... process different digital design methods are related to tasks in an integrated design process....

  18. Modeling tools for an Integrated River-Delta-Sea system investigation: the Pan-European Research Infrastructure DANUBIUS-RI philosophy

    Science.gov (United States)

    Umgiesser, Georg; Bellafiore, Debora; De Pascalis, Francesca; Icke, Joost; Stanica, Adrian

    2017-04-01

    The DANUBIUS Research Infrastructure (DANUBIUS-RI) is a new initiative to address the challenges and opportunities of research on large river-sea (RS) systems. DANUBIUS-RI is a distributed pan-European RI that will provide a platform for interdisciplinary research. It will deal with RS investigation through facilities and expertise from a large number of European institutions, becoming a 'one-stop shop' for knowledge exchange in managing RS systems, ranging from freshwater to marine research. Globally, RS systems are complex and dynamic, with huge environmental, social and economic value. They are poorly understood but under increasing pressure from pollution, hydraulic engineering, water supply, energy, flood control and erosion. RS systems in Europe are among the most impacted globally, after centuries of industrialisation, urbanisation and agricultural intensification. Improved understanding is essential to avoid irreversible degradation and for restoration. DANUBIUS-RI will provide, among a number of other facilities concerning observations, analyses and impact evaluation, a modelling node that will offer integrated, up-to-date tools at locations of high scientific importance and opportunity, covering RS systems from source (the upper parts of rivers and mountain lakes) to the transition with coastal seas. Modelling will be one of the major services provided by DANUBIUS-RI, relying on inputs from the whole RI. RS systems are challenging from a modelling point of view because of their complex morphology and the wide temporal and spatial range of processes occurring. Scale interaction plays a central role, considering the different hydro-eco-morphological processes at the large (basin) and small (local coast, rivers, lagoons) scales. Currently, different model applications are made for the different geographical domains, and also for subsets of the processes. For instance, there are separate models for rainfall runoff in the catchment, a sewer model for the

  19. An architecture for integration of multidisciplinary models

    DEFF Research Database (Denmark)

    Belete, Getachew F.; Voinov, Alexey; Holst, Niels

    2014-01-01

    Integrating multidisciplinary models requires linking models: that may operate at different temporal and spatial scales; developed using different methodologies, tools and techniques; different levels of complexity; calibrated for different ranges of inputs and outputs, etc. On the other hand......, Enterprise Application Integration, and Integration Design Patterns. We developed an architecture of a multidisciplinary model integration framework that brings these three aspects of integration together. Service-oriented-based platform independent architecture that enables to establish loosely coupled...

  20. A parameter optimization tool for evaluating the physical consistency of the plot-scale water budget of the integrated eco-hydrological model GEOtop in complex terrain

    Science.gov (United States)

    Bertoldi, Giacomo; Cordano, Emanuele; Brenner, Johannes; Senoner, Samuel; Della Chiesa, Stefano; Niedrist, Georg

    2017-04-01

    In mountain regions, the plot- and catchment-scale water and energy budgets are controlled by a complex interplay of different abiotic (i.e. topography, geology, climate) and biotic (i.e. vegetation, land management) controlling factors. When integrated, physically based eco-hydrological models are used in mountain areas, a large number of parameters and topographic and boundary conditions need to be chosen. However, data on soil and land-cover properties are relatively scarce and do not reflect the strong variability at the local scale. For this reason, tools for uncertainty quantification and optimal parameter identification are essential not only to improve model performance, but also to identify the most relevant parameters to be measured in the field and to evaluate the impact of different assumptions for topographic and boundary conditions (surface, lateral and subsurface water and energy fluxes), which are usually unknown. In this contribution, we present the results of a sensitivity analysis exercise for a set of 20 experimental stations located in the Italian Alps, representative of different conditions in terms of topography (elevation, slope, aspect), land use (pastures, meadows, and apple orchards), soil type and groundwater influence. Besides micrometeorological parameters, each station provides soil water content at different depths, and three stations (one for each land cover) provide eddy covariance fluxes. The aims of this work are: (I) to present an approach for improving the calibration of plot-scale soil moisture and evapotranspiration (ET); (II) to identify the most sensitive parameters and the relevant factors controlling temporal and spatial differences among sites; and (III) to identify possible model structural deficiencies or uncertainties in boundary conditions. Simulations have been performed with the GEOtop 2.0 model, which is a physically based, fully distributed, integrated eco-hydrological model that has been specifically designed for mountain

  1. WINS. Market Simulation Tool for Facilitating Wind Energy Integration

    Energy Technology Data Exchange (ETDEWEB)

    Shahidehpour, Mohammad [Illinois Inst. of Technology, Chicago, IL (United States)

    2012-10-30

    Integrating 20% or more wind energy into the system and transmitting large amounts of wind energy over long distances will require a decision-making capability that can handle very large scale power systems with tens of thousands of buses and lines. There is a need to explore innovative analytical and implementation solutions for continuing reliable operations with the most economical integration of additional wind energy in power systems. A number of wind integration solution paths involve the adoption of new operating policies, dynamic scheduling of wind power across interties, pooling integration services, and adopting new transmission scheduling practices. Such practices can be examined by the decision tool developed by this project. This project developed a very efficient decision tool called the Wind INtegration Simulator (WINS) and applied WINS to facilitate wind energy integration studies. WINS focused on augmenting the existing power utility capabilities to support collaborative planning, analysis, and wind integration project implementations. WINS also has the capability of simulating energy storage facilities so that feasibility studies of integrated wind energy system applications can be performed for systems with high wind energy penetration. The development of WINS represents a major expansion of a very efficient decision tool called the POwer Market Simulator (POMS), which was developed by IIT and has been used extensively for power system studies for decades. Specifically, WINS provides the following advantages: (1) an integrated framework is included in WINS for the comprehensive modeling of DC transmission configurations, including mono-pole, bi-pole, tri-pole, back-to-back, and multi-terminal connections, as well as AC/DC converter models including current source converters (CSC) and voltage source converters (VSC); (2) an existing shortcoming of traditional decision tools for wind integration is the limited availability of user interface, i.e., decision

  2. Laboratory informatics tools integration strategies for drug discovery: integration of LIMS, ELN, CDS, and SDMS.

    Science.gov (United States)

    Machina, Hari K; Wild, David J

    2013-04-01

    There are technologies on the horizon that could dramatically change how informatics organizations design, develop, deliver, and support applications and data infrastructures to deliver maximum value to drug discovery organizations. Effective integration of data and laboratory informatics tools promises the ability of organizations to make better informed decisions about resource allocation during the drug discovery and development process and for more informed decisions to be made with respect to the market opportunity for compounds. We propose in this article a new integration model called ELN-centric laboratory informatics tools integration.

  3. Cockpit System Situational Awareness Modeling Tool

    Science.gov (United States)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  4. Integrated Variable-Fidelity Tool Set For Modeling and Simulation of Aeroservothermoelasticity -Propulsion (ASTE-P) Effects For Aerospace Vehicles Ranging From Subsonic to Hypersonic Flight, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research program aims at developing a variable-fidelity software tool set for aeroservothermoelastic-propulsive (ASTE-P) modeling that can be routinely...

  5. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. Five evaluations are available: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations. A time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
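
    An analogous, hedged sketch of two of these evaluations (Bode frequency response and closed-loop eigenvalues) using open-source tools rather than MATRIXx; the second-order plant and the feedback gain are invented for illustration.

```python
import numpy as np
from scipy import signal

# Illustrative plant: lightly damped second-order system in state-space form
A = np.array([[0.0, 1.0], [-4.0, -0.4]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

# Bode frequency response of the open-loop plant
w, mag, phase = signal.bode(signal.StateSpace(A, B, C, D))
print("gain at lowest frequency evaluated: %.2f dB" % mag[0])

# Closed-loop eigenvalues under static output feedback u = -k*y
k = 2.0
A_cl = A - B @ (k * C)
print("closed-loop eigenvalues:", np.linalg.eigvals(A_cl))
```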

  6. Data Integration Tool: Permafrost Data Debugging

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Pulsifer, P. L.; Strawhacker, C.; Yarmey, L.; Basak, R.

    2017-12-01

    We developed a Data Integration Tool (DIT) to significantly reduce the manual processing time needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the Global Terrestrial Network-Permafrost (GTN-P). The United States National Science Foundation funded this project through the National Snow and Ice Data Center (NSIDC) with the GTN-P to improve permafrost data access and discovery. We leverage these data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets (https://github.com/PermaData/DIT). Each widget does a specific operation, such as read, multiply by a constant, sort, plot, or write data. DIT allows the user to select and order the widgets as desired to meet their specific needs, incrementally interact with and evolve the widget workflows, and save those workflows for reproducibility. Taking ideas from visual programming found in the art and design domain, debugging and iterative design principles from software engineering, and the scientific data processing and analysis power of Fortran and Python, it was written for interactive, iterative data manipulation, quality control, processing, and analysis of inconsistent data in an easily installable application. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, to nearly translate three datasets (270 sites), and is scheduled to translate 10 more datasets (1000 sites) from the legacy inactive site data holdings of the Frozen Ground Data Center (FGDC). Iterative development has provided the permafrost and wider scientific community with an extendable tool designed specifically for the iterative process of translating unruly data.
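
    A minimal, generic sketch of the widget-pipeline idea described above; the widget functions and the missing-value sentinel are hypothetical and are not DIT's actual API.

```python
from functools import reduce

def read_csv_rows(path):
    """Widget: read a comma-separated file into a list of float rows."""
    with open(path) as fh:
        return [[float(v) for v in line.split(",")] for line in fh if line.strip()]

def scale(factor):
    """Widget factory: multiply every value by a constant."""
    return lambda rows: [[v * factor for v in row] for row in rows]

def drop_missing(rows):
    """Widget: drop rows containing a missing-value sentinel (-999 here)."""
    return [row for row in rows if all(v != -999.0 for v in row)]

def run_workflow(data, widgets):
    """Apply widgets in the user-chosen order, like a saved workflow."""
    return reduce(lambda d, widget: widget(d), widgets, data)

# Demo on in-memory data (read_csv_rows would be the first widget on a file)
rows = [[1.0, 2.0], [-999.0, 3.0], [4.0, 5.0]]
print(run_workflow(rows, [drop_missing, scale(0.1)]))
```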

  7. Separations and safeguards model integration.

    Energy Technology Data Exchange (ETDEWEB)

    Cipiti, Benjamin B.; Zinaman, Owen

    2010-09-01

    Research and development of advanced reprocessing plant designs can greatly benefit from the development of a reprocessing plant model capable of transient solvent extraction chemistry. This type of model can be used to optimize the operations of a plant as well as the designs for safeguards, security, and safety. Previous work has integrated a transient solvent extraction simulation module, based on the Solvent Extraction Process Having Interaction Solutes (SEPHIS) code developed at Oak Ridge National Laboratory, with the Separations and Safeguards Performance Model (SSPM) developed at Sandia National Laboratory, as a first step toward creating a more versatile design and evaluation tool. The goal of this work was to strengthen the integration by linking more variables between the two codes. The results from this integrated model show expected operational performance through plant transients. Additionally, ORIGEN source term files were integrated into the SSPM to provide concentrations, radioactivity, neutron emission rate, and thermal power data for various spent fuels. This data was used to generate measurement blocks that can determine the radioactivity, neutron emission rate, or thermal power of any stream or vessel in the plant model. This work examined how the code could be expanded to integrate other separation steps and benchmark the results to other data. Recommendations for future work will be presented.

  8. Model integration and a theory of models

    OpenAIRE

    Dolk, Daniel R.; Kottemann, Jeffrey E.

    1993-01-01

    Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integr...

  9. A database of immunoglobulins with integrated tools: DIGIT.

    KAUST Repository

    Chailyan, Anna; Tramontano, Anna; Marcatili, Paolo

    2011-01-01

    The DIGIT (Database of ImmunoGlobulins with Integrated Tools) database (http://biocomputing.it/digit) is an integrated resource storing sequences of annotated immunoglobulin variable domains and enriched with tools for searching and analyzing them. The annotations in the database include information on the type of antigen, the respective germline sequences and on pairing information between light and heavy chains. Other annotations, such as the identification of the complementarity determining regions, assignment of their structural class and identification of mutations with respect to the germline, are computed on the fly and can also be obtained for user-submitted sequences. The system allows customized BLAST searches and automatic building of 3D models of the domains to be performed.

  10. A database of immunoglobulins with integrated tools: DIGIT.

    KAUST Repository

    Chailyan, Anna

    2011-11-10

    The DIGIT (Database of ImmunoGlobulins with Integrated Tools) database (http://biocomputing.it/digit) is an integrated resource storing sequences of annotated immunoglobulin variable domains and enriched with tools for searching and analyzing them. The annotations in the database include information on the type of antigen, the respective germline sequences and on pairing information between light and heavy chains. Other annotations, such as the identification of the complementarity determining regions, assignment of their structural class and identification of mutations with respect to the germline, are computed on the fly and can also be obtained for user-submitted sequences. The system allows customized BLAST searches and automatic building of 3D models of the domains to be performed.

  11. Integrated modeling: a look back

    Science.gov (United States)

    Briggs, Clark

    2015-09-01

    This paper discusses applications and implementation approaches used for integrated modeling of structural systems with optics over the past 30 years. While much of the development work focused on control system design, significant contributions were made in system modeling and computer-aided design (CAD) environments. Early work appended handmade line-of-sight models to traditional finite element models, such as the optical spacecraft concept from the ACOSS program. The IDEAS2 computational environment built in support of Space Station collected a wider variety of existing tools around a parametric database. Later, IMOS supported interferometer and large telescope mission studies at JPL with MATLAB modeling of structural dynamics, thermal analysis, and geometric optics. IMOS's predecessor was a simple FORTRAN command line interpreter for LQG controller design with additional functions that built state-space finite element models. Specialized language systems such as CAESY were formulated and prototyped to provide more complex object-oriented functions suited to control-structure interaction. A more recent example of optical modeling directly in mechanical CAD is used to illustrate possible future directions. While the value of directly posing the optical metric in system dynamics terms is well understood today, the potential payoff is illustrated briefly via project-based examples. It is quite likely that integrated structure thermal optical performance (STOP) modeling could be accomplished in a commercial off-the-shelf (COTS) tool set. The work flow could be adopted, for example, by a team developing a small high-performance optical or radio frequency (RF) instrument.
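
    As a generic illustration of posing an optical metric directly as a system output, the sketch below builds a toy two-mode structural model in state-space form whose output is a line-of-sight error, using scipy; it is not IMOS or CAESY, and the frequencies and sensitivities are invented.

        # Toy sketch: a two-mode modal model whose output is a line-of-sight (LOS) metric.
        import numpy as np
        from scipy.signal import StateSpace, step

        wn   = np.array([2.0*np.pi*1.5, 2.0*np.pi*6.0])   # modal frequencies [rad/s]
        zeta = 0.02                                        # modal damping ratio
        phi  = np.array([0.8, -0.3])                       # LOS sensitivity to each mode

        # States are [q1, q2, q1dot, q2dot]; the input is a disturbance acting on both modes.
        A = np.block([[np.zeros((2, 2)), np.eye(2)],
                      [-np.diag(wn**2), -2.0*zeta*np.diag(wn)]])
        B = np.vstack([np.zeros((2, 1)), np.ones((2, 1))])
        C = np.hstack([phi, np.zeros(2)]).reshape(1, 4)    # LOS = phi . q
        D = np.zeros((1, 1))

        t, los = step(StateSpace(A, B, C, D))              # LOS response to a step disturbance
        print(f"peak LOS error: {np.max(np.abs(los)):.3e}")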

  12. Knowledge Management tools integration within DLR's concurrent engineering facility

    Science.gov (United States)

    Lopez, R. P.; Soragavi, G.; Deshmukh, M.; Ludtke, D.

    The complexity of space endeavors has increased the need for Knowledge Management (KM) tools. The concept of KM involves not only the electronic storage of knowledge, but also the process of making this knowledge available, reusable and traceable. Establishing a KM concept within the Concurrent Engineering Facility (CEF) has been a research topic of the German Aerospace Centre (DLR). This paper presents the current KM tools of the CEF: the Software Platform for Organizing and Capturing Knowledge (S.P.O.C.K.), the data model Virtual Satellite (VirSat), and the Simulation Model Library (SimMoLib), and how their usage has improved the Concurrent Engineering (CE) process. This paper also presents the lessons learned from the introduction of KM practices into the CEF and outlines a roadmap for the further development of KM in CE activities at DLR. The results of applying the Knowledge Management tools have shown the potential of merging the three software platforms and their functionalities as the next step towards the full integration of KM practices into the CE process. VirSat will remain the main software platform used within a CE study, and S.P.O.C.K. and SimMoLib will be integrated into VirSat. These tools will support the data model as a reference and documentation source, and as an access point to simulation and calculation models. The use of KM tools in the CEF aims to become a basic practice during the CE process. Establishing this practice will result in a much broader exchange of knowledge and experience within the Concurrent Engineering environment and, consequently, in higher-quality designs of space systems.

  13. Integrating Computational Science Tools into a Thermodynamics Course

    Science.gov (United States)

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired with computer simulations to implement these modules has a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.

  14. INTEGRATED CORPORATE STRATEGY MODEL

    Directory of Open Access Journals (Sweden)

    CATALINA SORIANA SITNIKOV

    2014-02-01

    Full Text Available Corporations are currently operating in demanding and highly uncertain times, facing a mixture of increased macroeconomic pressure, competitive and capital market risks and, in many cases, the prospect of significant technological and regulatory change. Throughout these demanding and highly uncertain times, corporations must pay particular attention to corporate strategy. Today, corporate strategy must be perceived and used as a function of various fields, scopes, and actors, as well as a highly interactive system. For a corporation's strategy to become a competitive advantage, it is necessary to understand it and to integrate it into a holistic model that ensures sustainable progress of the corporation's activities under optimum conditions of profitability. The model proposed in this paper is aimed at integrating the two strategic models, Hoshin Kanri and the Integrated Strategy Model, as well as their consolidation with the principles of sound corporate governance set out by the OECD.

  15. Integration issues of information engineering based I-CASE tools

    OpenAIRE

    Kurbel, Karl; Schnieder, Thomas

    1994-01-01

    Problems and requirements regarding integration of methods and tools across phases of the software-development life cycle are discussed. Information engineering (IE) methodology and I-CASE (integrated CASE) tools supporting IE claim to have an integrated view across major stages of enterprise-wide information-system development: information strategy planning, business area analysis, system design, and construction. In the main part of this paper, two comprehensive I-CASE tools, ADW (Applicati...

  16. Green Infrastructure Models and Tools

    Science.gov (United States)

    The objective of this project is to modify and refine existing models and develop new tools to support decision making for the complete green infrastructure (GI) project lifecycle, including the planning and implementation of stormwater control in urban and agricultural settings,...

  17. Bond graphs : an integrating tool for design of mechatronic systems

    International Nuclear Information System (INIS)

    Ould Bouamama, B.

    2011-01-01

    Bond graphs are a powerful tool well known for dynamic modelling of multiphysics systems: this is the only modelling technique that automatically generates state-space or non-linear models using dedicated software tools (CAMP-G, 20-Sim, Symbols, Dymola...). Recently, several fundamental theories have been developed for using a bond graph model not only for modelling but also as a truly integrated tool from conceptual ideas to the optimal practical realization of a mechatronic system. This keynote presents a synthesis of these new theories, which exploit particular properties (causal, structural and behavioural) of this graphical methodology. Based on a pedagogical example, it will be shown how, starting from a physical system (not a transfer function or state equation) and using only one representation (the bond graph), the following results can be obtained: modelling (formal generation of state equations), control analysis (observability, controllability, structural I/O decouplability, dynamic decoupling, ...), diagnosis analysis (automatic generation of robust fault indicators, sensor placement, structural diagnosability) and finally actuator sizing. The presentation will be illustrated by real industrial applications. Limits and perspectives of bond graph theory conclude the keynote.
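
    As a hedged illustration of the formal state-equation generation mentioned above, the sketch below derives the state equations of the simplest mechanical bond graph (I, C and R elements on a common 1-junction, i.e. a mass-spring-damper) symbolically with sympy; it mimics the idea only and is not the output of CAMP-G, 20-Sim, Symbols or Dymola.

        # Symbolic state equations for an I-C-R bond graph on a common 1-junction.
        import sympy as sp

        t = sp.symbols("t")
        m, k, b = sp.symbols("m k b", positive=True)   # I, C (stiffness) and R parameters
        p = sp.Function("p")(t)                        # momentum stored in the I element
        q = sp.Function("q")(t)                        # displacement stored in the C element
        F = sp.Function("F")(t)                        # effort source (external force)

        v = p / m                                      # common flow on the 1-junction
        state_eqs = [
            sp.Eq(sp.diff(p, t), F - b*v - k*q),       # dp/dt = sum of efforts on the I element
            sp.Eq(sp.diff(q, t), v),                   # dq/dt = flow into the C element
        ]
        for eq in state_eqs:
            sp.pprint(eq)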

  18. Development of an integrated model for the Campaspe catchment: a tool to help improve understanding of the interaction between society, policy, farming decision, ecology, hydrology and climate

    Science.gov (United States)

    Iwanaga, Takuya; Zare, Fateme; Croke, Barry; Fu, Baihua; Merritt, Wendy; Partington, Daniel; Ticehurst, Jenifer; Jakeman, Anthony

    2018-06-01

    Management of water resources requires understanding of the hydrology and hydrogeology, as well as the policy and human drivers and their impacts. This understanding requires relevant inputs from a wide range of disciplines, which will vary depending on the specific case study. One approach to gain understanding of the impact of climate and society on water resources is through the use of an integrated modelling process that engages stakeholders and experts in problem framing, co-design of the underpinning conceptual model, and discussion of the ensuing results. In this study, we have developed such an integrated modelling process for the Campaspe basin in northern Victoria, Australia. The numerical model built has a number of components:
    - a node/link-based surface water hydrology module based on the IHACRES rainfall-streamflow model;
    - a distributed groundwater model for the lower catchment (MODFLOW);
    - a farm decision optimisation module (to determine irrigation requirements);
    - a policy module (setting conditions on the availability of water based on existing rules);
    - an ecology module (determining the impacts of available streamflow on platypus, fish and river red gum trees).
    The integrated model is component-based and has been developed in Python, with MODFLOW and the surface water hydrology model run as external programs controlled by the master program (in Python), as sketched below. The integrated model has been calibrated using historical data, with the intention of exploring the impact of various scenarios (future climate scenarios, different policy options, water management options) on the water resources. The scenarios were selected based on workshops with, and a social survey of, stakeholders in the basin regarding what would be socially acceptable and physically plausible options for changes in management. An example of such a change is the introduction of a managed aquifer recharge system to capture dam overflows, and store at least a portion of this in the aquifer
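
    A minimal sketch of this coordination pattern is given below: a Python "master" steps placeholder surface-water, policy, farm, groundwater and ecology components once per time step and passes state between them. All of the module internals are invented stand-ins, not the Campaspe model code.

        # Minimal sketch of a component-based integrated model coordinator (placeholder logic).
        def surface_water(rain_mm):                 # stand-in for the IHACRES node/link module
            return 0.4 * rain_mm                    # streamflow index

        def policy(streamflow):                     # stand-in for the allocation rules
            return min(streamflow, 25.0)            # water made available for extraction

        def farm(allocation):                       # stand-in for the farm optimisation module
            return 0.8 * allocation                 # water actually extracted for irrigation

        def groundwater(extraction):                # stand-in for the MODFLOW exchange
            return -0.05 * extraction               # change in aquifer storage index

        def ecology(streamflow, extraction):        # stand-in for the ecology module
            return max(streamflow - extraction, 0)  # flow left for platypus, fish and red gums

        def run(rainfall_series):
            results = []
            for rain in rainfall_series:
                q = surface_water(rain)
                use = farm(policy(q))
                results.append({"flow": q, "use": use,
                                "gw": groundwater(use), "eco_flow": ecology(q, use)})
            return results

        print(run([10.0, 55.0, 0.0]))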

  19. Challenges in horizontal model integration.

    Science.gov (United States)

    Kolczyk, Katrin; Conradi, Carsten

    2016-03-11

    Systems Biology has motivated dynamic models of important intracellular processes at the pathway level, for example, in signal transduction and cell cycle control. To answer important biomedical questions, however, one has to go beyond the study of isolated pathways towards the joint study of interacting signaling pathways or the joint study of signal transduction and cell cycle control. Thereby the reuse of established models is preferable, as it will generally reduce the modeling effort and increase the acceptance of the combined model in the field. Obtaining a combined model can be challenging, especially if the submodels are large and/or come from different working groups (as is generally the case, when models stored in established repositories are used). To support this task, we describe a semi-automatic workflow based on established software tools. In particular, two frequent challenges are described: identification of the overlap and subsequent (re)parameterization of the integrated model. The reparameterization step is crucial, if the goal is to obtain a model that can reproduce the data explained by the individual models. For demonstration purposes we apply our workflow to integrate two signaling pathways (EGF and NGF) from the BioModels Database.
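
    As a hedged illustration of the reparameterization step, the sketch below refits the shared rate constants of a merged toy two-species model against the data the submodels originally explained, using scipy least squares; the ODEs, data and parameter names are placeholders, not the paper's workflow or the BioModels entries.

        # Toy sketch of reparameterizing a combined model so it reproduces the original data.
        import numpy as np
        from scipy.integrate import odeint
        from scipy.optimize import least_squares

        def combined_model(y, t, k1, k2):
            a, b = y
            return [-k1*a, k1*a - k2*b]             # shared species a feeds the downstream branch

        t_obs = np.linspace(0, 10, 20)
        data = odeint(combined_model, [1.0, 0.0], t_obs, args=(0.7, 0.3))   # pseudo-data
        data += 0.02 * np.random.default_rng(0).normal(size=data.shape)

        def residuals(params):
            sim = odeint(combined_model, [1.0, 0.0], t_obs, args=tuple(params))
            return (sim - data).ravel()

        fit = least_squares(residuals, x0=[1.0, 1.0], bounds=(0.0, np.inf))
        print("reparameterized rate constants:", fit.x)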

  20. Integrating New Technologies and Existing Tools to Promote Programming Learning

    Directory of Open Access Journals (Sweden)

    Álvaro Santos

    2010-04-01

    Full Text Available In recent years, many tools have been proposed to reduce the programming learning difficulties felt by many students. Our group has contributed to this effort through the development of several tools, such as VIP, SICAS, OOP-Anim, SICAS-COL and H-SICAS. Even though we had some positive results, the utilization of these tools doesn't seem to significantly reduce weaker students' difficulties. These students need stronger support to motivate them to get engaged in learning activities, inside and outside the classroom. Nowadays, many technologies are available to create contexts that may help to accomplish this goal. We consider that a promising path goes through the integration of solutions. In this paper we analyze the features, strengths and weaknesses of the tools developed by our group. Based on these considerations we present a new environment, integrating different types of pedagogical approaches, resources, tools and technologies for programming learning support. With this environment, currently under development, it will be possible to review contents and lessons, based on video and screen captures. The support for collaborative tasks is another key point to improve and stimulate different models of teamwork. The platform will also allow the creation of various alternative models (learning objects) for the same subject, enabling personalized learning paths adapted to each student's knowledge level, needs and preferred learning styles. The learning sequences will work as a study organizer, following a suitable taxonomy, according to students' cognitive skills. Although the main goal of this environment is to support students with more difficulties, it will provide a set of resources supporting the learning of more advanced topics. Software engineering techniques and representations, object orientation and event programming are features that will be available in order to promote the learning progress of students.

  1. Force feedback facilitates multisensory integration during robotic tool use

    NARCIS (Netherlands)

    Sengül, A.; Rognini, G.; van Elk, M.; Aspell, J.E.; Bleuler, H.; Blanke, O.

    2013-01-01

    The present study investigated the effects of force feedback in relation to tool use on the multisensory integration of visuo-tactile information. Participants learned to control a robotic tool through a surgical robotic interface. Following tool-use training, participants performed a crossmodal

  2. Model Identification of Integrated ARMA Processes

    Science.gov (United States)

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
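
    SCAN and ESACF are SAS procedures; as a hedged illustration of the same identification task with open tools, the sketch below chooses the differencing order with an ADF test and the ARMA orders by information criterion using statsmodels, on a synthetic integrated series.

        # Hedged sketch of ARIMA order identification for an integrated process (not SCAN/ESACF).
        import numpy as np
        from statsmodels.tsa.stattools import adfuller
        from statsmodels.tsa.arima.model import ARIMA

        y = np.cumsum(np.random.default_rng(1).normal(size=300))   # a toy I(1) series

        d, series = 0, y.copy()
        while adfuller(series)[1] > 0.05:        # difference until the ADF test rejects a unit root
            series = np.diff(series)
            d += 1

        best = None
        for p in range(3):
            for q in range(3):
                aic = ARIMA(y, order=(p, d, q)).fit().aic
                if best is None or aic < best[0]:
                    best = (aic, (p, d, q))
        print("selected ARIMA order:", best[1])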

  3. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry already spans more than 15 years. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditionally reserved for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide a comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges toward management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.
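
    As a hedged illustration of the core model-building step such platforms automate, the sketch below computes a few structure-based descriptors with RDKit and fits a scikit-learn classifier; the SMILES strings and activity labels are toy placeholders and the snippet does not reproduce any of the reviewed systems.

        # Hedged sketch of descriptor calculation plus model building (toy data only).
        import numpy as np
        from rdkit import Chem
        from rdkit.Chem import Descriptors
        from sklearn.ensemble import RandomForestClassifier

        smiles = ["CCO", "c1ccccc1", "CC(=O)O", "CCN(CC)CC", "ClC(Cl)Cl", "CCCCCCCC"]
        labels = [0, 1, 0, 0, 1, 1]                 # placeholder activity flags

        def descriptors(smi):
            mol = Chem.MolFromSmiles(smi)
            return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
                    Descriptors.TPSA(mol), Descriptors.NumHDonors(mol)]

        X = np.array([descriptors(s) for s in smiles])
        model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
        print(model.predict([descriptors("CCOC(=O)C")]))   # prediction for a new structure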

  4. The systems integration modeling system

    International Nuclear Information System (INIS)

    Danker, W.J.; Williams, J.R.

    1990-01-01

    This paper discusses the systems integration modeling system (SIMS), an analysis tool for the detailed evaluation of the structure and related performance of the Federal Waste Management System (FWMS) and its interface with waste generators. Its use for evaluations in support of system-level decisions as to FWMS configurations, the allocation, sizing, balancing and integration of functions among elements, and the establishment of system-preferred waste selection and sequencing methods and other operating strategies is presented. SIMS includes major analysis submodels which quantify the detailed characteristics of individual waste items, loaded casks and waste packages, simulate the detailed logistics of handling and processing discrete waste items and packages, and perform detailed cost evaluations.

  5. Integrated Design Tools for Embedded Control Systems

    NARCIS (Netherlands)

    Jovanovic, D.S.; Hilderink, G.H.; Broenink, Johannes F.; Karelse, F.

    2001-01-01

    Currently, computer-based control systems are still being implemented using the same techniques as 10 years ago. The purpose of this project is the development of a design framework, consisting of tools and libraries, which allows the designer to build highly reliable heterogeneous real-time embedded

  6. Lessons learned from tool integration with OSLC

    NARCIS (Netherlands)

    Leitner, A.; Herbst, B.; Mathijssen, R.

    2016-01-01

    Today’s embedded and cyber-physical systems are getting more connected and complex. One main challenge during development is the often loose coupling between engineering tools, which could lead to inconsistencies and errors due to the manual transfer and duplication of data. Open formats and

  7. Mental model mapping as a new tool to analyse the use of information in decision-making in integrated water management

    Science.gov (United States)

    Kolkman, M. J.; Kok, M.; van der Veen, A.

    , uncertainty and disagreement) can be positioned in the framework, as can the communities of knowledge construction and valuation involved in the solution of these problems (core science, applied science, and professional consultancy, and “post-normal” science). Mental model maps, this research hypothesises, are suitable to analyse the above aspects of the problem. This hypothesis is tested for the case of the Zwolle storm surge barrier. Analysis can aid integration between disciplines, participation of public stakeholders, and can stimulate learning processes. Mental model mapping is recommended to visualise the use of knowledge, to analyse difficulties in the problem-solving process, and to aid information transfer and communication. Mental model mapping helps scientists to shape their new, post-normal responsibilities in a manner that complies with integrity when dealing with unstructured problems in complex, multifunctional systems.

  8. Integrated Design Tools for Embedded Control Systems

    OpenAIRE

    Jovanovic, D.S.; Hilderink, G.H.; Broenink, Johannes F.; Karelse, F.

    2001-01-01

    Currently, computer-based control systems are still being implemented using the same techniques as 10 years ago. The purpose of this project is the development of a design framework, consisting of tools and libraries, which allows the designer to build highly reliable heterogeneous real-time embedded systems in a very short time at a fraction of the present day costs. The ultimate focus of current research is on the transformation of control laws to efficient concurrent algorithms, with concerns about...

  9. Integrated Assessment Model Evaluation

    Science.gov (United States)

    Smith, S. J.; Clarke, L.; Edmonds, J. A.; Weyant, J. P.

    2012-12-01

    Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Similar to the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in diagnostics of future scenarios from IAMs, formal hindcast experiments are just now being contemplated as evaluation exercises. Some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context, unlike that of physical system models, in which there are few fixed, unchanging relationships. Of course strict validation of even earth system models is not possible (Oreskes et al. 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have also been grappling with some of the same questions over the last quarter century. For example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?" Of course we are still asking this question today. While, arguably, even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior. A key step is the recognition of model boundaries, that is, what is inside

  10. Data and Tools | Integrated Energy Solutions | NREL

    Science.gov (United States)

    Excerpts from NREL's data and tools listing: a cash flow model for assessing projects, designing cost-based incentives, and evaluating the impact of incentives; renewable energy resources, fuel costs, and more for a particular city, state or ZIP code; and the System Advisor Model (SAM), a performance and financial model designed to help estimate costs.

  11. Diverse methods for integrable models

    NARCIS (Netherlands)

    Fehér, G.

    2017-01-01

    This thesis is centered around three topics sharing integrability as a common theme, and explores different methods in the field of integrable models. The first two chapters are about integrable lattice models in statistical physics. The last chapter describes an integrable quantum chain.

  12. Risk Informed Design Using Integrated Vehicle Rapid Assessment Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — A successful proof of concept was performed in FY 2012 integrating the Envision tool for parametric estimates of vehicle mass and the Rapid Response Risk Assessment...

  13. Integrated Visualization Environment for Science Mission Modeling, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA is emphasizing the use of larger, more integrated models in conjunction with systems engineering tools and decision support systems. These tools place a...

  14. Integrating Risk Analyses and Tools at the DOE Hanford Site

    International Nuclear Information System (INIS)

    LOBER, R.W.

    2002-01-01

    Risk assessment and environmental impact analysis at the U.S. Department of Energy (DOE) Hanford Site in Washington State has made significant progress in refining the strategy for using risk analysis to support closing of several hundred waste sites plus 149 single-shell tanks at the Hanford Site. A Single-Shell Tank System Closure Work Plan outlines the current basis for closing the single-shell tank systems. An analogous site approach has been developed to address closure of aggregated groups of similar waste sites. Because of the complexity, decision time frames, proximity of non-tank farm waste sites to tank farms, scale, and regulatory considerations, various projects are providing integrated assessments to support risk analyses and decision-making. Projects and the tools that are being developed and applied at Hanford to support retrieval and cleanup decisions include: (1) Life Cycle Model (LCM) and Risk Receptor Model (RRM)--A site-level set of tools to support strategic analyses through scoping level risk management to assess different alternatives and options for tank closure. (2) Systems Assessment Capability for Integrated Groundwater/Vadose Zone (SAC) and the Site-Wide Groundwater Model (SWGM)--A site-wide groundwater modeling system coupled with a risk-based uncertainty analysis of inventory, vadose zone, groundwater, and river interactions for evaluating cumulative impacts from individual and aggregate waste sites. (3) Retrieval Performance Evaluation (RPE)--A site-specific, risk-based methodology developed to evaluate performance of waste retrieval, leak detection and closure on a tank-specific basis as a function of past tank leaks, potential leakage during retrieval operations, and remaining residual waste inventories following completion of retrieval operations. (4) Field Investigation Report (FIR)--A corrective action program to investigate the nature and extent of past tank leaks through characterization activities and assess future impacts to

  15. A tool to guide the process of integrating health system responses to public health problems

    Directory of Open Access Journals (Sweden)

    Tilahun Nigatu Haregu

    2015-06-01

    Full Text Available An integrated model of health system responses to public health problems is considered to be the most preferable approach. Accordingly, there are several models that stipulate what an integrated architecture should look like. However, tools that can guide the overall process of integration are lacking. This tool is designed to guide the entire process of integration of health system responses to major public health problems. It is developed by taking into account the contexts of health systems of developing countries and the emergence of a double burden of chronic diseases in these settings. Chronic diseases – HIV/AIDS and NCDs – represented the evidence base for the development of the model. System-level horizontal integration of health system responses was considered in the development of this tool.

  16. An integrated multi-criteria scenario evaluation web tool for participatory land-use planning in urbanized areas: The Ecosystem Portfolio Model

    Science.gov (United States)

    Labiosa, Bill; Forney, William M.; Hearn, Paul P.; Hogan, Dianna M.; Strong, David R.; Swain, Eric D.; Esnard, Ann-Margaret; Mitsova-Boneva, D.; Bernknopf, R.; Pearlstine, Leonard; Gladwin, Hugh

    2013-01-01

    Land-use land-cover change is one of the most important and direct drivers of changes in ecosystem functions and services. Given the complexity of the decision-making, there is a need for Internet-based decision support systems with scenario evaluation capabilities to help planners, resource managers and communities visualize, compare and consider trade-offs among the many values at stake in land use planning. This article presents details on an Ecosystem Portfolio Model (EPM) prototype that integrates ecological, socio-economic information and associated values of relevance to decision-makers and stakeholders. The EPM uses a multi-criteria scenario evaluation framework, Geographic Information Systems (GIS) analysis and spatially-explicit land-use/land-cover change-sensitive models to characterize changes in important land-cover related ecosystem values related to ecosystem services and functions, land parcel prices, and community quality-of-life (QoL) metrics. Parameters in the underlying models can be modified through the interface, allowing users in a facilitated group setting to explore simultaneously issues of scientific uncertainty and divergence in the preferences of stakeholders. One application of the South Florida EPM prototype reported in this article shows the modeled changes (which are significant) in aggregate ecological value, landscape patterns and fragmentation, biodiversity potential and ecological restoration potential for current land uses compared to the 2050 land-use scenario. Ongoing refinements to EPM, and future work especially in regard to modifiable sea level rise scenarios are also discussed.

  17. Integration of tools for binding archetypes to SNOMED CT.

    Science.gov (United States)

    Sundvall, Erik; Qamar, Rahil; Nyström, Mikael; Forss, Mattias; Petersson, Håkan; Karlsson, Daniel; Ahlfeldt, Hans; Rector, Alan

    2008-10-27

    The Archetype formalism and the associated Archetype Definition Language have been proposed as an ISO standard for specifying models of components of electronic healthcare records as a means of achieving interoperability between clinical systems. This paper presents an archetype editor with support for manual or semi-automatic creation of bindings between archetypes and terminology systems. Lexical and semantic methods are applied in order to obtain automatic mapping suggestions. Information visualisation methods are also used to assist the user in exploration and selection of mappings. An integrated tool for archetype authoring, semi-automatic SNOMED CT terminology binding assistance and terminology visualization was created and released as open source. Finding the right terms to bind is a difficult task but the effort to achieve terminology bindings may be reduced with the help of the described approach. The methods and tools presented are general, but here only bindings between SNOMED CT and archetypes based on the openEHR reference model are presented in detail.

  18. Data assimilation in integrated hydrological modelling

    DEFF Research Database (Denmark)

    Rasmussen, Jørn

    Integrated hydrological models are useful tools for water resource management and research, and advances in computational power and the advent of new observation types have resulted in the models generally becoming more complex and distributed. However, the models are often characterized by a high degree of parameterization, which results in significant model uncertainty that cannot be reduced much because observations are often scarce and often take the form of point measurements. Data assimilation shows great promise for use in integrated hydrological models, as it allows observations to be efficiently combined with models to improve model predictions, reduce uncertainty and estimate model parameters. In this thesis, a framework for assimilating multiple observation types and updating multiple components and parameters of a catchment-scale integrated hydrological model is developed and tested...
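
    As a generic illustration of the kind of update such a framework performs, the sketch below implements a stochastic ensemble Kalman filter analysis step in plain numpy; the state dimensions, observation operator and error level are invented and this is not the thesis' implementation.

        # Generic ensemble Kalman filter analysis step (illustrative only).
        import numpy as np

        def enkf_update(ensemble, H, y_obs, obs_err_std, rng):
            """ensemble: (n_state, n_members); H: (n_obs, n_state); y_obs: (n_obs,)."""
            n_obs, n_members = H.shape[0], ensemble.shape[1]
            A = ensemble - ensemble.mean(axis=1, keepdims=True)      # state anomalies
            HX = H @ ensemble
            HA = HX - HX.mean(axis=1, keepdims=True)                 # predicted-observation anomalies
            P_yy = HA @ HA.T / (n_members - 1) + (obs_err_std**2) * np.eye(n_obs)
            P_xy = A @ HA.T / (n_members - 1)
            K = P_xy @ np.linalg.inv(P_yy)                           # Kalman gain
            Y = y_obs[:, None] + rng.normal(0.0, obs_err_std, size=(n_obs, n_members))
            return ensemble + K @ (Y - HX)                           # perturbed-observation update

        rng = np.random.default_rng(0)
        ens = rng.normal(10.0, 2.0, size=(3, 50))                    # 3 states, 50 members
        H = np.array([[1.0, 0.0, 0.0]])                              # a single point observation
        print(enkf_update(ens, H, np.array([12.0]), 0.5, rng).mean(axis=1))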

  19. Integrated Medical Model Overview

    Science.gov (United States)

    Myers, J.; Boley, L.; Foy, M.; Goodenow, D.; Griffin, D.; Keenan, A.; Kerstman, E.; Melton, S.; McGuire, K.; Saile, L.; hide

    2015-01-01

    The Integrated Medical Model (IMM) Project represents one aspect of NASA's Human Research Program (HRP) to quantitatively assess medical risks to astronauts for existing operational missions as well as missions associated with future exploration and commercial space flight ventures. The IMM takes a probabilistic approach to assessing the likelihood and specific outcomes of one hundred medical conditions within the envelope of accepted space flight standards of care over a selectable range of mission capabilities. A specially developed Integrated Medical Evidence Database (iMED) maintains evidence-based, organizational knowledge across a variety of data sources. Since becoming operational in 2011, version 3.0 of the IMM, the supporting iMED, and the expertise of the IMM project team have contributed to a wide range of decision and informational processes for the space medical and human research community. This presentation provides an overview of the IMM conceptual architecture and range of application through examples of actual space flight community questions posed to the IMM project.

  20. Alien wavelength modeling tool and field trial

    DEFF Research Database (Denmark)

    Sambo, N.; Sgambelluri, A.; Secondini, M.

    2015-01-01

    A modeling tool is presented for pre-FEC BER estimation of PM-QPSK alien wavelength signals. A field trial is demonstrated and used as validation of the tool's correctness. A very close correspondence between the performance of the field trial and the one predicted by the modeling tool has been...

  1. Application of the Soil and Water Assessment Tool (SWAT) Model on a small tropical island (Great River Watershed, Jamaica) as a tool in Integrated Watershed and Coastal Zone Management

    Directory of Open Access Journals (Sweden)

    Orville P. Grey

    2014-09-01

    Full Text Available The Great River Watershed, located in north-west Jamaica, is critical for development, particularly for housing, tourism, agriculture, and mining. It is a source of sediment and nutrient loading to the coastal environment, including the Montego Bay Marine Park. We produced a modeling framework using the Soil and Water Assessment Tool (SWAT) and GIS. The calculated model performance statistics for high-flow discharge yielded a Nash-Sutcliffe Efficiency (NSE) value of 0.68 and an R² value of 0.70, suggesting good correlation between measured and simulated (calibrated) discharge. Calibration and validation results for streamflow were similar to the observed streamflows. For the dry season, the simulated urban land-use scenario predicted an increase in surface runoff in excess of 150%. During the wet season it is predicted to range from 98 to 234%, presenting a significant risk of flooding, erosion and other environmental issues. The model should be used for the remaining 25 watersheds in Jamaica and elsewhere in the Caribbean. The model suggests that projected land-use changes will have serious impacts on available water (streamflow), stream health, potable water treatment, flooding and sensitive coastal ecosystems.
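
    The two statistics quoted above follow their standard definitions; a minimal sketch of computing them from observed and simulated discharge series is given below (the numbers are toy values, not the study's data).

        # Nash-Sutcliffe Efficiency and R^2 for observed vs simulated discharge (toy data).
        import numpy as np

        def nash_sutcliffe(obs, sim):
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)

        def r_squared(obs, sim):
            return np.corrcoef(obs, sim)[0, 1]**2

        obs = [12.0, 30.5, 8.2, 45.1, 20.0, 15.3]
        sim = [10.8, 28.0, 9.5, 40.2, 22.5, 14.1]
        print(f"NSE = {nash_sutcliffe(obs, sim):.2f}, R^2 = {r_squared(obs, sim):.2f}")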

  2. The Integrated Air Transportation System Evaluation Tool

    Science.gov (United States)

    Wingrove, Earl R., III; Hees, Jing; Villani, James A.; Yackovetsky, Robert E. (Technical Monitor)

    2002-01-01

    Throughout U.S. history, our nation has generally enjoyed exceptional economic growth, driven in part by transportation advancements. Looking forward 25 years, when the national highway and skyway systems are saturated, the nation faces new challenges in creating transportation-driven economic growth and wealth. To meet the national requirement for an improved air traffic management system, NASA developed the goal of tripling throughput over the next 20 years, in all weather conditions, while maintaining safety. Analysis of the throughput goal has primarily focused on major airline operations, primarily through the hub-and-spoke system. However, many suggested concepts to increase throughput may operate outside the hub-and-spoke system. Examples of such concepts include the Small Aircraft Transportation System, civil tiltrotor, and improved rotorcraft. Proper assessment of the potential contribution of these technologies to the domestic air transportation system requires a modeling capability that includes the country's numerous smaller airports, acting as a fundamental component of the National Airspace System, and the demand for such concepts and technologies. Under this task for NASA, the Logistics Management Institute developed higher-fidelity demand models that capture the interdependence of short-haul air travel with other transportation modes and explicitly consider the costs of commercial air and other transport modes. To accomplish this work, we generated forecasts of the distribution of general aviation (GA) based aircraft and GA itinerant operations at each of nearly 3,000 airports based on changes in economic conditions and demographic trends. We also built modules that estimate the demand for travel by different modes, particularly auto, commercial air, and GA. We examined GA demand from two perspectives: top-down and bottom-up, described in detail.

  3. Modelling of Tool Wear and Residual Stress during Machining of AISI H13 Tool Steel

    Science.gov (United States)

    Outeiro, José C.; Umbrello, Domenico; Pina, José C.; Rizzuti, Stefania

    2007-05-01

    Residual stresses can enhance or impair the ability of a component to withstand loading conditions in service (fatigue, creep, stress corrosion cracking, etc.), depending on their nature: compressive or tensile, respectively. This poses enormous problems in structural assembly, as it affects the structural integrity of the whole part. In addition, tool wear issues are of critical importance in manufacturing since these affect component quality, tool life and machining cost. Therefore, prediction and control of both tool wear and the residual stresses in machining are absolutely necessary. In this work, a two-dimensional Finite Element model using an implicit Lagrangian formulation with automatic remeshing was applied to simulate the orthogonal cutting process of AISI H13 tool steel. To validate the model, the predicted and experimentally measured chip geometry, cutting forces, temperatures, tool wear and residual stresses on the machined affected layers were compared. The proposed FE model allowed us to investigate the influence of tool geometry, cutting regime parameters and tool wear on the residual stress distribution in the machined surface and subsurface of AISI H13 tool steel. The results obtained allow us to conclude that, in order to reduce the magnitude of surface residual stresses, the cutting speed should be increased, the uncut chip thickness (or feed) should be reduced, and machining with honed tools having large cutting edge radii produces better results than machining with chamfered tools. Moreover, increasing tool wear increases the magnitude of surface residual stresses.

  4. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  5. Integrative structure modeling with the Integrative Modeling Platform.

    Science.gov (United States)

    Webb, Benjamin; Viswanath, Shruthi; Bonomi, Massimiliano; Pellarin, Riccardo; Greenberg, Charles H; Saltzberg, Daniel; Sali, Andrej

    2018-01-01

    Building models of a biological system that are consistent with the myriad data available is one of the key challenges in biology. Modeling the structure and dynamics of macromolecular assemblies, for example, can give insights into how biological systems work, evolved, might be controlled, and even designed. Integrative structure modeling casts the building of structural models as a computational optimization problem, for which information about the assembly is encoded into a scoring function that evaluates candidate models. Here, we describe our open source software suite for integrative structure modeling, Integrative Modeling Platform (https://integrativemodeling.org), and demonstrate its use. © 2017 The Protein Society.

  6. Integrated Land-Water-Energy assessment using the Foreseer Tool

    Science.gov (United States)

    Allwood, Julian; Konadu, Dennis; Mourao, Zenaida; Lupton, Rick; Richards, Keith; Fenner, Richard; Skelton, Sandy; McMahon, Richard

    2016-04-01

    This study presents an integrated energy and resource modelling and visualisation approach, Foreseer™, which characterises the interdependencies and evaluates the land and water requirements of energy system pathways. The Foreseer Tool maps linked energy, water and land resource futures by outputting a set of Sankey diagrams for energy, water and land, showing the flow from basic resource (e.g. coal, surface water, and forested land) through transformations (e.g. fuel refining and desalination) to final services (e.g. sustenance, hygiene and transportation). By 'mapping' resources in this way, policy-makers can more easily understand the competing uses of each resource through the identification of the services it delivers (e.g. food production, landscaping, energy), the potential opportunities for improving the management of the resource and the connections with other resources, which are often overlooked in a traditional sector-based management strategy. This paper will present a case study of the UK Carbon Plan, and highlights the need for integrated resource planning and policy development.
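
    As a hedged illustration of the single-resource Sankey view described above, the sketch below draws one with matplotlib's Sankey class; the flow values and labels are placeholders, not UK Carbon Plan data.

        # Placeholder resource-to-service Sankey diagram (illustrative values only).
        import matplotlib.pyplot as plt
        from matplotlib.sankey import Sankey

        fig, ax = plt.subplots()
        Sankey(ax=ax, unit=None,
               flows=[1.0, -0.35, -0.40, -0.25],          # resource in, services and losses out
               labels=["primary energy", "transport", "heating", "losses"],
               orientations=[0, 1, 0, -1]).finish()
        ax.set_title("Energy flow from resource to final service (placeholder values)")
        plt.show()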

  7. Business and technology integrated model

    OpenAIRE

    Noce, Irapuan; Carvalho, João Álvaro

    2011-01-01

    There is a growing interest in business modeling and architecture in the areas of management and information systems. One of the issues in the area is the lack of integration between the modeling techniques that are employed to support business development and those used for technology modeling. This paper proposes a modeling approach that is capable of integrating the modeling of the business and of the technology. By depicting the business model, the organization structure and the technolog...

  8. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  9. Integration between a sales support system and a simulation tool

    OpenAIRE

    Wahlström, Ola

    2005-01-01

    InstantPlanner is a sales support system for the material handling industry, visualizing and calculating designs faster and more correctly than other tools on the market. AutoMod is a world-leading simulation tool used in the material handling industry to optimize and calculate appropriate configuration designs. Both applications are favorable in their own areas and provide a great platform for integration with the properties of fast designing, correct product calculations, great simulation capabi...

  10. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    Full Text Available System integration is an important element for the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated to determine whether or not it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is required. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT, by presenting the test results.

  11. An introduction to Space Weather Integrated Modeling

    Science.gov (United States)

    Zhong, D.; Feng, X.

    2012-12-01

    The need for a software toolkit that integrates space weather models and data is one of the many challenges we face when applying the models to space weather forecasting. To meet this challenge, we have developed Space Weather Integrated Modeling (SWIM), which is capable of analyzing and visualizing the results from a diverse set of space weather models. SWIM has a modular design and is written in Python, using NumPy, matplotlib, and the Visualization ToolKit (VTK). SWIM provides a data management module to read a variety of spacecraft data products and the specific data format of the Solar-Interplanetary Conservation Element/Solution Element MHD model (SIP-CESE MHD model) for the study of solar-terrestrial phenomena. Data analysis, visualization and graphical user interface modules are also presented in a user-friendly way to run the integrated models and visualize the 2-D and 3-D data sets interactively. With these tools we can rapidly analyze the model results locally or remotely, for example extracting data at specific locations in time-sequence data sets, plotting interplanetary magnetic field lines, multi-slicing of solar wind speed, volume rendering of solar wind density, animating time-sequence data sets, and comparing model results with observational data. To speed up the analysis, an in-situ visualization interface is used to support visualizing the data 'on the fly'. We also modified some critical, time-consuming analysis and visualization methods to use GPUs and multi-core CPUs. We have used this tool to visualize the data of the SIP-CESE MHD model in real time, and integrated the database model of shock arrival, the shock propagation model, the Dst forecasting model and the SIP-CESE MHD model developed by the SIGMA Weather Group at the State Key Laboratory of Space Weather/CAS.
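
    As a hedged illustration of one of the operations listed above, the sketch below extracts the time series at a fixed grid location from a time-sequence data cube and plots it with numpy and matplotlib; the synthetic "speed" array stands in for model output and is not the SIP-CESE MHD data format.

        # Extract and plot the time series at one grid point of a synthetic data cube.
        import numpy as np
        import matplotlib.pyplot as plt

        nt, nr, nphi = 48, 60, 90
        t = np.arange(nt)                                   # hourly snapshots
        speed = (400 + 50*np.sin(2*np.pi*t/24.0)[:, None, None]
                 + np.random.default_rng(2).normal(0, 5, size=(nt, nr, nphi)))

        ir, iphi = 30, 45                                   # grid indices of interest
        plt.plot(t, speed[:, ir, iphi])
        plt.xlabel("time [h]")
        plt.ylabel("solar wind speed [km/s]")
        plt.title("Speed at one grid point through the sequence (synthetic data)")
        plt.show()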

  12. Integrated Medical Model (IMM)

    Data.gov (United States)

    National Aeronautics and Space Administration — The IMM project was funded from 1 October 2005 to 31 January 2011, at which point the IMM transitioned to an operational tool used by the International Space Station...

  13. Developing an integration tool for soil contamination assessment

    Science.gov (United States)

    Anaya-Romero, Maria; Zingg, Felix; Pérez-Álvarez, José Miguel; Madejón, Paula; Kotb Abd-Elmabod, Sameh

    2015-04-01

    In the last decades, huge soil areas have been negatively influenced or altered in multiple ways. Soils and, consequently, underground water have been contaminated by the accumulation of contaminants from agricultural activities (fertilizers and pesticides), industrial activities (harmful material dumping, sludge, fly ash) and urban activities (hydrocarbons, metals from vehicle traffic, urban waste dumping). In the framework of the RECARE project, local partners across Europe are focusing on a wide range of soil threats, such as soil contamination, and aiming to develop effective prevention, remediation and restoration measures by designing and applying targeted land management strategies (van Lynden et al., 2013). In this context, the Guadiamar Green Corridor (Southern Spain) was used as a case study, aiming to obtain soil data and new information in order to assess soil contamination. The main threat in the Guadiamar valley is soil contamination after a mine spill that occurred in April 1998. About four hm3 of acid water and two hm3 of mud, rich in heavy metals, were released into the Agrio and Guadiamar rivers, affecting more than 4,600 ha of agricultural and pasture land. The main trace elements contaminating soil and water were As, Cd, Cu, Pb, Tl and Zn. The objective of the present research is to develop informatics tools that integrate soil databases, models and interactive platforms for soil contamination assessment. Preliminary results were obtained relating to the compilation of harmonized databases including geographical, hydro-meteorological, soil and socio-economic variables, based on spatial analysis and stakeholder consultation. Further research will address modelling and upscaling at the European level, in order to obtain a scientific-technical predictive tool for the assessment of soil contamination.

  14. Data Integration Tool: From Permafrost Data Translation Research Tool to A Robust Research Application

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.

    2016-12-01

    The United States National Science Foundation-funded PermaData project, led by the National Snow and Ice Data Center (NSIDC) with a team from the Global Terrestrial Network for Permafrost (GTN-P), aimed to improve permafrost data access and discovery. We developed a Data Integration Tool (DIT) to greatly reduce the manual processing time needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the GTN-P. We leverage this data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets. Each widget does a specific operation, such as read, multiply by a constant, sort, plot, and write data. DIT allows the user to select and order the widgets as desired to meet their specific needs. Originally it was written to capture a scientist's personal, iterative, data manipulation and quality control process of visually and programmatically iterating through inconsistent input data, examining it to find problems, adding operations to address the problems, and rerunning until the data could be translated into the GTN-P standard format. Iterative development of this tool led first to a Fortran/Python hybrid and then, with consideration of users, licensing, version control, packaging, and workflow, to a publicly available, robust, usable application. Transitioning to Python allowed the use of open source frameworks for the workflow core and integration with a JavaScript graphical workflow interface. DIT is targeted to automatically handle 90% of the data processing for field scientists, modelers, and non-discipline scientists. It is available as an open source tool in GitHub packaged for a subset of Mac, Windows, and UNIX systems as a desktop application with a graphical workflow manager. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets

  15. Development Life Cycle and Tools for XML Content Models

    Energy Technology Data Exchange (ETDEWEB)

    Kulvatunyou, Boonserm [ORNL]; Morris, Katherine [National Institute of Standards and Technology (NIST)]; Buhwan, Jeong [POSTECH University, South Korea]; Goyal, Puja [National Institute of Standards and Technology (NIST)]

    2004-11-01

    Many integration projects today rely on shared semantic models based on standards represented using Extensible Markup Language (XML) technologies. Shared semantic models typically evolve and require maintenance. In addition, to promote interoperability and reduce integration costs, the shared semantics should be reused as much as possible. Semantic components must be consistent and valid in terms of agreed upon standards and guidelines. In this paper, we describe an activity model for creation, use, and maintenance of a shared semantic model that is coherent and supports efficient enterprise integration. We then use this activity model to frame our research and the development of tools to support those activities. We provide overviews of these tools primarily in the context of the W3C XML Schema. At present, we focus our work on the W3C XML Schema as the representation of choice, due to its extensive adoption by industry.
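
    As a hedged illustration of the consistency and validity checking described above, the sketch below validates an XML instance against a W3C XML Schema using lxml; the schema and instance are toy placeholders, not the shared semantic model discussed in the paper.

        # Validate an XML instance document against a W3C XML Schema with lxml (toy example).
        from lxml import etree

        schema_doc = etree.XML(b"""
        <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
          <xs:element name="part">
            <xs:complexType>
              <xs:sequence>
                <xs:element name="id"   type="xs:string"/>
                <xs:element name="mass" type="xs:decimal"/>
              </xs:sequence>
            </xs:complexType>
          </xs:element>
        </xs:schema>""")
        schema = etree.XMLSchema(schema_doc)

        instance = etree.XML(b"<part><id>A-100</id><mass>2.5</mass></part>")
        if schema.validate(instance):
            print("instance conforms to the shared schema")
        else:
            print(schema.error_log)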

  16. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if it is suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and design...

  17. Advertising Can Be an Effective Integrated Marketing Tool

    Science.gov (United States)

    Lauer, Larry D.

    2007-01-01

    Advertising will not undermine the critical thinking of consumers when it is combined with other communication media, and when it is truthful. In fact, it can provide clarity about the competitive advantage of individual institutions and aid an individual's ability to choose wisely. Advertising is just one of the tools in the integrated marketing…

  18. Uni- and omnidirectional simulation tools for integrated optics

    NARCIS (Netherlands)

    Stoffer, Remco

    2001-01-01

    This thesis presents several improvements on simulation methods in integrated optics, as well as some new methods. Both uni- and omnidirectional tools are presented; for the unidirectional methods, the emphasis is on higher-order accuracy; for the omnidirectional methods, the boundary conditions are

  19. THE MANAGEMENT ACCOUNTING TOOLS AND THE INTEGRATED REPORTING

    Directory of Open Access Journals (Sweden)

    Gabriel JINGA

    2015-04-01

    Full Text Available In recent years, stakeholders have been asking for other information to be published alongside the financial data, such as risk reporting, intangibles, and social and environmental accounting. The type of corporate reporting that incorporates the elements enumerated above is integrated reporting. In this article, we argue that the information disclosed in integrated reports is prepared by management accounting, not only by financial accounting. Thus, we search for the management accounting tools used by companies that prepare integrated reports. To do this, we analytically reviewed all the reports available on the website of a selected company. Our results show that the company uses most of the management accounting tools mentioned in the literature review.

  20. An evaluation of BPMN modeling tools

    NARCIS (Netherlands)

    Yan, Z.; Reijers, H.A.; Dijkman, R.M.; Mendling, J.; Weidlich, M.

    2010-01-01

    Various BPMN modeling tools are available and it is close to impossible to understand their functional differences without simply trying them out. This paper presents an evaluation framework and presents the outcomes of its application to a set of five BPMN modeling tools. We report on various

  1. Tool Integration: Experiences and Issues in Using XMI and Component Technology

    DEFF Research Database (Denmark)

    Damm, Christian Heide; Hansen, Klaus Marius; Thomsen, Michael

    2000-01-01

    of conflicting data models, and provide architecture for doing so, based on component technology and XML Metadata Interchange. As an example, we discuss the implementation of an electronic whiteboard tool, Knight, which adds support for creative and collaborative object-oriented modeling to existing Computer-Aided...... Software Engineering through integration using our proposed architecture....

  2. Integrated catchment modelling in a Semi-arid area

    CSIR Research Space (South Africa)

    Bugan, Richard DH

    2010-09-01

    Full Text Available , will increasingly need water quality and quantity management tools to be able to make informed decisions. Integrated catchment modelling (ICM) is regarded as being a valuable tool for integrated water resource management. It enables officials and scientists to make...

  3. Integrability of the Rabi Model

    International Nuclear Information System (INIS)

    Braak, D.

    2011-01-01

    The Rabi model is a paradigm for interacting quantum systems. It couples a bosonic mode to the smallest possible quantum model, a two-level system. I present the analytical solution which allows us to consider the question of integrability for quantum systems that do not possess a classical limit. A criterion for quantum integrability is proposed which shows that the Rabi model is integrable due to the presence of a discrete symmetry. Moreover, I introduce a generalization with no symmetries; the generalized Rabi model is the first example of a nonintegrable but exactly solvable system.
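
    For orientation, the quantum Rabi Hamiltonian in its conventional form (standard notation, not quoted from this record) reads, in LaTeX notation:

        H_{\text{Rabi}} = \omega\, a^{\dagger} a + \Delta\, \sigma_z + g\, \sigma_x \left( a + a^{\dagger} \right)

    where a and a† are the bosonic mode operators, σ_z and σ_x the Pauli matrices of the two-level system, ω the mode frequency, Δ the level splitting, and g the coupling. The discrete symmetry mentioned in the abstract is the Z2 parity that simultaneously flips σ_x → -σ_x and a → -a and commutes with H.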

  4. Integrating the hospital library with patient care, teaching and research: model and Web 2.0 tools to create a social and collaborative community of clinical research in a hospital setting.

    Science.gov (United States)

    Montano, Blanca San José; Garcia Carretero, Rafael; Varela Entrecanales, Manuel; Pozuelo, Paz Martin

    2010-09-01

    Research in hospital settings faces several difficulties. Information technologies and certain Web 2.0 tools may provide new models to tackle these problems, allowing for a collaborative approach and bridging the gap between clinical practice, teaching and research. We aim to gather a community of researchers involved in the development of a network of learning and investigation resources in a hospital setting. A multi-disciplinary work group analysed the needs of the research community. We studied the opportunities provided by Web 2.0 tools and finally we defined the spaces that would be developed, describing their elements, members and different access levels. WIKINVESTIGACION is a collaborative web space with the aim of integrating the management of all the hospital's teaching and research resources. It is composed of five spaces, with different access privileges. The spaces are: Research Group Space 'wiki for each individual research group', Learning Resources Centre devoted to the Library, News Space, Forum and Repositories. The Internet, and most notably the Web 2.0 movement, is introducing some overwhelming changes in our society. Research and teaching in the hospital setting will join this current and take advantage of these tools to socialise and improve knowledge management.

  5. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
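
    The spreadsheet-driven workflow described above follows a three-step orchestration pattern. The Python sketch below mirrors those steps under assumed, placeholder executable and file names (it is not part of AnalyzeHOLE, which drives the codes from an Excel front end):

        # Sketch of the orchestration pattern described above, re-expressed in Python;
        # executable and file names are placeholders, not AnalyzeHOLE conventions.
        import subprocess
        from pathlib import Path

        def write_inputs(workdir: Path, hydraulic_conductivities):
            # (1) create model input files for the axisymmetric radial MODFLOW model
            (workdir / "aquifer.dat").write_text(
                "\n".join(f"{k:.4e}" for k in hydraulic_conductivities)
            )

        def run_models(workdir: Path):
            # (2) execute the flow model, particle tracking, and parameter estimation
            for exe in ["modflow", "modpath", "pest"]:   # placeholder executable names
                subprocess.run([exe, "run.nam"], cwd=workdir, check=True)

        def read_results(workdir: Path):
            # (3) import simulated flows and drawdowns for plotting and comparison
            return [float(x) for x in (workdir / "flows.out").read_text().split()]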

  6. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed-format Fortran.

  7. Models, Databases, and Simulation Tools Needed for the Realization of Integrated Computational Materials Engineering. Proceedings of the Symposium Held at Materials Science and Technology 2010

    Science.gov (United States)

    Arnold, Steven M. (Editor); Wong, Terry T. (Editor)

    2011-01-01

    Topics covered include: An Annotative Review of Multiscale Modeling and its Application to Scales Inherent in the Field of ICME; and A Multiscale, Nonlinear, Modeling Framework Enabling the Design and Analysis of Composite Materials and Structures.

  8. Critical chain project management and drum-buffer-rope tools integration in construction industry - case study

    Directory of Open Access Journals (Sweden)

    Piotr Cyplik

    2012-03-01

    Full Text Available Background: The concept of integrating theory of constraints tools in reorganizing the management system of a mechanical engineering company is presented in this article. The main aim of the concept is to enable the enterprise to satisfy customers' expectations at reasonable costs, which allows for making a profit and creating an agile enterprise in the long run. Methods: Due to the individual character of the production and service processes in the analyzed company, the described concept uses theory of constraints project management (CCPM) and manufacturing (DBR) tools. The authors use the performance levels conception to build an integration tool focused on the interaction and collaboration between different departments. The integration tool has been developed and verified in a Polish manufacturing company. Results: In the described model, a tool compatible with CCPM operates on the level of the customer service process. The shop floor is controlled based on the DBR method. The authors hold that the integration between the TOC tools is of key importance. The integration of TOC tools dedicated to managing customer service and to shop floor scheduling and control requires developing a mechanism for repeatedly transmitting information between them. This mechanism has been developed. Conclusions: The conducted research showed that the developed tool integrating CCPM and DBR had a positive impact on enterprise performance. It improves the company's performance in meeting target group requirements by enhancing the efficiency of the processes running in the company and the tasks processed at particular work stations. The described model has been successfully implemented in one of the Polish mechanical engineering companies.

  9. IMMIGRANTS’ INTEGRATION MODELS

    Directory of Open Access Journals (Sweden)

    CARMEN UZLĂU

    2012-05-01

    Full Text Available In the context of the European population aging trend, and while the birth rate is still at a low level, the immigrants may contribute to the support of the EU economy and to finance the national social protection systems. But this would be possible only if they have been fully integrated in the host countries, the integration policies being a task of the national governments. The European Union may still offer support and stimulation through financing, policies coordination and good practices exchange facilitation. The new measures should encourage local level actions, including cooperation between local authorities, employers, migrants’ organizations, service providers and local population. Within the EU, there live 20.1 million immigrants (approximately 4% of the entire population coming from outside European area. An important element of the common EU policy on immigration is the one regarding the development of a policy on immigrants’ integration, which should provide a fair treatment within the member states, and guarantee rights and obligations comparable with the ones of the Union citizens.

  10. Integrated Control Modeling for Propulsion Systems Using NPSS

    Science.gov (United States)

    Parker, Khary I.; Felder, James L.; Lavelle, Thomas M.; Withrow, Colleen A.; Yu, Albert Y.; Lehmann, William V. A.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS), an advanced engineering simulation environment used to design and analyze aircraft engines, has been enhanced by integrating control development tools into it. One of these tools is a generic controller interface that allows NPSS to communicate with control development software environments such as MATLAB and EASY5. The other tool is a linear model generator (LMG) that gives NPSS the ability to generate linear, time-invariant state-space models. Integrating these tools into NPSS enables it to be used for control system development. This paper will discuss the development and integration of these tools into NPSS. In addition, it will show a comparison of transient model results of a generic, dual-spool, military-type engine model that has been implemented in NPSS and Simulink. It will also show the linear model generator's ability to approximate the dynamics of a nonlinear NPSS engine model.
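
    The linear, time-invariant state-space models produced by such a generator have the generic form dx/dt = Ax + Bu, y = Cx + Du. A minimal SciPy sketch of such a model (the matrices below are made-up placeholders, not NPSS output):

        # Illustrative linear, time-invariant state-space model evaluated with SciPy;
        # matrix values are placeholders, not an engine linearization.
        import numpy as np
        from scipy import signal

        A = np.array([[-2.0, 0.5],
                      [ 0.0, -1.0]])
        B = np.array([[1.0],
                      [0.5]])
        C = np.array([[1.0, 0.0]])
        D = np.array([[0.0]])

        sys = signal.StateSpace(A, B, C, D)
        t, y = signal.step(sys)      # step response of the linearized dynamics
        print(y[-1])                 # approximate steady-state gain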

  11. MMM: A toolbox for integrative structure modeling.

    Science.gov (United States)

    Jeschke, Gunnar

    2018-01-01

    Structural characterization of proteins and their complexes may require integration of restraints from various experimental techniques. MMM (Multiscale Modeling of Macromolecules) is a Matlab-based open-source modeling toolbox for this purpose with a particular emphasis on distance distribution restraints obtained from electron paramagnetic resonance experiments on spin-labelled proteins and nucleic acids and their combination with atomistic structures of domains or whole protomers, small-angle scattering data, secondary structure information, homology information, and elastic network models. MMM integrates not only various types of restraints but also various existing modeling tools, providing a common graphical user interface to them. The types of restraints that can support such modeling and the available model types are illustrated by recent application examples. © 2017 The Protein Society.

  12. Model Checking Markov Chains: Techniques and Tools

    NARCIS (Netherlands)

    Zapreev, I.S.

    2008-01-01

    This dissertation deals with four important aspects of model checking Markov chains: the development of efficient model-checking tools, the improvement of model-checking algorithms, the efficiency of the state-space reduction techniques, and the development of simulation-based model-checking

  13. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer’s preferences, goals and processes from their interaction with a computer-aided design tool, and suggests methods and domains within game development where such a model can be applied. We describe how designer modeling could be integrated with current work on automated and mixed-initiative content creation, and envision future directions which focus on personalizing the processes to a designer’s particular wishes.

  14. Integrable quantum impurity models

    International Nuclear Information System (INIS)

    Eckle, H.P.

    1998-01-01

    By modifying some of the local L operators of the algebraic form of the Bethe Ansatz, inhomogeneous one-dimensional quantum lattice models can be constructed. This fact has recently attracted new attention, the inhomogeneities being interpreted as local impurities. The Hamiltonians of the one-dimensional quantum models so constructed have a nearest-neighbour structure, except in the vicinity of the local impurities, which involve three-site interactions. The pertinent feature of these models is the absence of backscattering at the impurities: the impurities are transparent. (Copyright (1998) World Scientific Publishing Co. Pte. Ltd)

  15. Development and Integration of an Advanced Stirling Convertor Linear Alternator Model for a Tool Simulating Convertor Performance and Creating Phasor Diagrams

    Science.gov (United States)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2013-01-01

    A simple model of the Advanced Stirling Convertors (ASC) linear alternator and an AC bus controller has been developed and combined with a previously developed thermodynamic model of the convertor for a more complete simulation and analysis of the system performance. The model was developed using Sage, a 1-D thermodynamic modeling program that now includes electro-magnetic components. The convertor, consisting of a free-piston Stirling engine combined with a linear alternator, has sufficiently sinusoidal steady-state behavior to allow for phasor analysis of the forces and voltages acting in the system. A MATLAB graphical user interface (GUI) has been developed to interface with the Sage software for simplified use of the ASC model, calculation of forces, and automated creation of phasor diagrams. The GUI allows the user to vary convertor parameters while fixing different input or output parameters and observe the effect on the phasor diagrams or system performance. The new ASC model and GUI help create a better understanding of the relationship between the electrical component voltages and mechanical forces. This allows better insight into the overall convertor dynamics and performance.
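
    Phasor analysis of this kind represents each steady-state sinusoid by a complex number so that forces and voltages can be added vectorially. A small Python sketch with purely illustrative amplitudes and phases (not ASC data):

        # Sketch of phasor bookkeeping for sinusoidal steady state: a quantity
        # x(t) = X*cos(2*pi*f*t + phi) is stored as the complex phasor X*exp(j*phi).
        # Amplitudes and phases below are illustrative only.
        import cmath
        import math

        def phasor(amplitude, phase_deg):
            return amplitude * cmath.exp(1j * math.radians(phase_deg))

        piston_force   = phasor(120.0,   0.0)   # N
        alternator_emf = phasor( 95.0, -78.0)   # V
        spring_force   = phasor( 60.0, 180.0)   # N

        net_force = piston_force + spring_force  # phasors add like vectors
        print(abs(net_force), math.degrees(cmath.phase(net_force)))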

  16. Integrating Wind Profiling Radars and Radiosonde Observations with Model Point Data to Develop a Decision Support Tool to Assess Upper-Level Winds for Space Launch

    Science.gov (United States)

    Bauman, William H., III; Flinn, Clay

    2013-01-01

    On the day of launch, the 45th Weather Squadron (45 WS) Launch Weather Officers (LWOs) monitor the upper-level winds for their launch customers. During launch operations, the payload/launch team sometimes asks the LWOs if they expect the upper-level winds to change during the countdown. The LWOs used numerical weather prediction model point forecasts to provide the information, but did not have the capability to quickly retrieve or adequately display the upper-level observations and compare them directly in the same display to the model point forecasts to help them determine which model performed the best. The LWOs requested the Applied Meteorology Unit (AMU) develop a graphical user interface (GUI) that will plot upper-level wind speed and direction observations from the Cape Canaveral Air Force Station (CCAFS) Automated Meteorological Profiling System (AMPS) rawinsondes with point forecast wind profiles from the National Centers for Environmental Prediction (NCEP) North American Mesoscale (NAM), Rapid Refresh (RAP) and Global Forecast System (GFS) models to assess the performance of these models. The AMU suggested adding observations from the NASA 50 MHz wind profiler and one of the US Air Force 915 MHz wind profilers, both located near the Kennedy Space Center (KSC) Shuttle Landing Facility, to supplement the AMPS observations with more frequent upper-level profiles. Figure 1 shows a map of KSC/CCAFS with the locations of the observation sites and the model point forecasts.

  17. Integration of g4tools in Geant4

    International Nuclear Information System (INIS)

    Hřivnáčová, Ivana

    2014-01-01

    g4tools, which is originally part of the inlib and exlib packages, provides a very light and easy-to-install set of C++ classes that can be used to perform analysis in a Geant4 batch program. It allows the user to create and manipulate histograms and ntuples and to write them in the supported file formats (ROOT, AIDA XML, CSV and HBOOK). It is integrated in Geant4 through analysis manager classes, thus providing a uniform interface to the g4tools objects and also hiding the differences between the classes for the different supported output formats. Moreover, additional features, such as histogram activation or support for Geant4 units, are implemented in the analysis classes following user requests. A set of Geant4 user interface commands allows the user to create histograms and set their properties interactively or in Geant4 macros. g4tools was first introduced in the Geant4 9.5 release, where its use was demonstrated in one basic example, and it is already used in a majority of the Geant4 examples within the Geant4 9.6 release. In this paper, we give an overview and the present status of the integration of g4tools in Geant4 and report on upcoming new features.

  18. Gravitational interactions of integrable models

    International Nuclear Information System (INIS)

    Abdalla, E.; Abdalla, M.C.B.

    1995-10-01

    We couple non-linear σ-models to Liouville gravity, showing that integrability properties of symmetric space models still hold for the matter sector. Using similar arguments for the fermionic counterpart, namely Gross-Neveu-type models, we verify that such conclusions must also hold for them, as recently suggested. (author). 18 refs

  19. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work... of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted.

  20. Spatial Modeling Tools for Cell Biology

    National Research Council Canada - National Science Library

    Przekwas, Andrzej; Friend, Tom; Teixeira, Rodrigo; Chen, Z. J; Wilkerson, Patrick

    2006-01-01

    .... Scientific potentials and military relevance of computational biology and bioinformatics have inspired DARPA/IPTO's visionary BioSPICE project to develop computational framework and modeling tools for cell biology...

  1. Integrated Debugging of Modelica Models

    Directory of Open Access Journals (Sweden)

    Adrian Pop

    2014-04-01

    Full Text Available The high abstraction level of equation-based object-oriented (EOO languages such as Modelica has the drawback that programming and modeling errors are often hard to find. In this paper we present integrated static and dynamic debugging methods for Modelica models and a debugger prototype that addresses several of those problems. The goal is an integrated debugging framework that combines classical debugging techniques with special techniques for equation-based languages partly based on graph visualization and interaction. To our knowledge, this is the first Modelica debugger that supports both equation-based transformational and algorithmic code debugging in an integrated fashion.

  2. Mental model mapping as a new tool to analyse the use of information in decision-making in integrated water management

    NARCIS (Netherlands)

    Kolkman, Rien; Kok, Matthijs; van der Veen, A.

    2005-01-01

    The solution of complex, unstructured problems is faced with policy controversy and dispute, unused and misused knowledge, project delay and failure, and decline of public trust in governmental decisions. Mental model mapping (also called concept mapping) is a technique to analyse these difficulties

  3. Integrated Inflammatory Stress (ITIS) Model

    DEFF Research Database (Denmark)

    Bangsgaard, Elisabeth O.; Hjorth, Poul G.; Olufsen, Mette S.

    2017-01-01

    maintains a long-term level of the stress hormone cortisol which is also anti-inflammatory. A new integrated model of the interaction between these two subsystems of the inflammatory system is proposed and coined the integrated inflammatory stress (ITIS) model. The coupling mechanisms describing....... A constant activation results in elevated levels of the variables in the model while a prolonged change of the oscillations in ACTH and cortisol concentrations is the most pronounced result of different LPS doses predicted by the model....

  4. Application of an integrated Weather Research and Forecasting (WRF)/CALPUFF modeling tool for source apportionment of atmospheric pollutants for air quality management: A case study in the urban area of Benxi, China.

    Science.gov (United States)

    Wu, Hao; Zhang, Yan; Yu, Qi; Ma, Weichun

    2018-04-01

    In this study, the authors endeavored to develop an effective framework for improving local urban air quality on meso-micro scales in cities in China that are experiencing rapid urbanization. Within this framework, the integrated Weather Research and Forecasting (WRF)/CALPUFF modeling system was applied to simulate the concentration distributions of typical pollutants (particulate matter with an aerodynamic diameter …) that affect air quality to different degrees. According to the type-based classification, which categorized the pollution sources as belonging to the Bengang Group, large point sources, small point sources, and area sources, the source apportionment showed that the Bengang Group, the large point sources, and the area sources had considerable impacts on urban air quality. Finally, combined with the industrial characteristics, detailed control measures were proposed with which local policy makers could improve the urban air quality in Benxi. In summary, the results of this study showed that this framework has credibility for effectively improving urban air quality, based on the source apportionment of atmospheric pollutants. The authors endeavored to build an effective framework based on the integrated WRF/CALPUFF system to improve the air quality in many cities on meso-micro scales in China. Via this framework, the integrated modeling tool is used to study the characteristics of meteorological fields, concentration fields, and source apportionments of pollutants in the target area. The impacts of classified sources on air quality, together with the industrial characteristics, can provide more effective control measures for improving air quality. Through the case study, the technical framework developed in this study, particularly the source apportionment, could provide important data and technical support for policy makers to assess air pollution on the scale of a city in China or even the world.

  5. Comparison of two different modelling tools

    DEFF Research Database (Denmark)

    Brix, Wiebke; Elmegaard, Brian

    2009-01-01

    In this paper a test case is solved using two different modelling tools, Engineering Equation Solver (EES) and WinDali, in order to compare the tools. The system of equations solved, is a static model of an evaporator used for refrigeration. The evaporator consists of two parallel channels......, and it is investigated how a non-uniform airflow influences the refrigerant mass flow rate distribution and the total cooling capacity of the heat exchanger. It is shown that the cooling capacity decreases significantly with increasing maldistribution of the airflow. Comparing the two simulation tools it is found...

  6. Ground Vehicle System Integration (GVSI) and Design Optimization Model

    National Research Council Canada - National Science Library

    Horton, William

    1996-01-01

    This report documents the Ground Vehicle System Integration (GVSI) and Design Optimization Model GVSI is a top-level analysis tool designed to support engineering tradeoff studies and vehicle design optimization efforts...

  7. Integrated Medical Model – Chest Injury Model

    Data.gov (United States)

    National Aeronautics and Space Administration — The Exploration Medical Capability (ExMC) Element of NASA's Human Research Program (HRP) developed the Integrated Medical Model (IMM) to forecast the resources...

  8. Integrated Modelling - the next steps (Invited)

    Science.gov (United States)

    Moore, R. V.

    2010-12-01

    Integrated modelling (IM) has made considerable advances over the past decade but it has not yet been taken up as an operational tool in the way that its proponents had hoped. The reasons why will be discussed in Session U17. This talk will propose topics for a research and development programme and suggest an institutional structure which, together, could overcome the present obstacles. Their combined aim would be first to make IM into an operational tool usable by competent public authorities and commercial companies and, in time, to see it evolve into the modelling equivalent of Google Maps, something accessible and usable by anyone with a PC or an iPhone and an internet connection. In a recent study, a number of government agencies, water authorities and utilities applied integrated modelling to operational problems. While the project demonstrated that IM could be used in an operational setting and had benefit, it also highlighted the advances that would be required for its widespread uptake. These were: greatly improving the ease with which models could be a) made linkable, b) linked and c) run; developing a methodology for applying integrated modelling; developing practical options for calibrating and validating linked models; addressing the science issues that arise when models are linked; extending the range of modelling concepts that can be linked; enabling interface standards to pass uncertainty information; making the interface standards platform independent; extending the range of platforms to include those for high performance computing; developing the concept of modelling components as web services; separating simulation code from the model’s GUI, so that all the results from the linked models can be viewed through a single GUI; developing scenario management systems so that there is an audit trail of the version of each model and dataset used in each linked model run. In addition to the above, there is a need to build a set of integrated

  9. Integrated Model of Bioenergy and Agriculture System

    DEFF Research Database (Denmark)

    Sigurjonsson, Hafthor Ægir; Elmegaard, Brian; Clausen, Lasse Røngaard

    2015-01-01

    Due to the increased burden on the environment caused by human activities, focus on industrial ecology designs is gaining more attention. In that perspective an environmentally effective integration of bioenergy and agriculture systems has significant potential. This work introduces a modeling approach that builds on Life Cycle Inventory and carries out Life Cycle Impact Assessment for a consequential Life Cycle Assessment on integrated bioenergy and agriculture systems. The model framework is built in Python, which connects various freely available software that handle different aspects of the overall model. C-TOOL and Yasso07 are used in the carbon balance of agriculture, Dynamic Network Analysis is used for the energy simulation, and Brightway2 is used to build a Life Cycle Inventory compatible database and process it for various impact assessment methods. The model is successfully...
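
    A minimal sketch of how a single functional unit might be scored with Brightway2 (assuming a project, an inventory database and the LCIA method are already installed; the project, database and activity names below are placeholders, not those used in the paper):

        # Hedged sketch of a Brightway2 LCA score; names are placeholders.
        import brightway2 as bw

        bw.projects.set_current("bioenergy_agriculture")      # placeholder project
        db = bw.Database("agriculture_system")                 # placeholder database
        activity = db.get("straw_to_district_heat")            # placeholder activity code
        method = ("IPCC 2013", "climate change", "GWP 100a")   # a commonly installed method

        lca = bw.LCA({activity: 1.0}, method)
        lca.lci()     # build the life cycle inventory
        lca.lcia()    # characterize it with the chosen impact method
        print(lca.score)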

  10. The integrated economic model

    International Nuclear Information System (INIS)

    Syrota, J.; Cirelli, J.F.; Brimont, S.; Lyle, C.; Nossent, G.; Moraleda, P.

    2005-01-01

    The setting up of the European energy market has triggered a radical change in the context within which the energy players operate. The natural markets of the incumbent operators, which were formerly demarcated by national and even regional borders, have extended to at least the scale of the European Union. In addition to their geographical development strategy, gas undertakings are diversifying their portfolios towards both upstream and downstream activities of the gas chain, and/or extending their offers to other energies and services. Energy players' strategies are rather complex and sometimes give the impression of being based on contradictory decisions. Some operators widen their field of operations, whereas others specialize in a limited number of activities. This Round Table provides an opportunity to compare business models as adopted by the major gas undertakings in response to structural changes observed in various countries over recent years.

  11. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. In total, the concept of managing the information available in this data repository is known as Business Intelligence or BI. This paper describes the concepts used in Business Intelligence, their importance to modern Radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.
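
    As a toy illustration of the data-warehouse idea (not the authors' prototype; the schema and values are invented), workflow events from several source systems can be consolidated into one fact table that a dashboard then queries:

        # Invented example: consolidate exam workflow events into a small SQLite
        # "data mart" and run a BI-style aggregate query against it.
        import sqlite3

        conn = sqlite3.connect("radiology_bi.db")
        conn.execute("""CREATE TABLE IF NOT EXISTS exam_fact (
                          accession TEXT, modality TEXT, source_system TEXT,
                          ordered_at TEXT, finalized_at TEXT)""")
        conn.executemany(
            "INSERT INTO exam_fact VALUES (?, ?, ?, ?, ?)",
            [("A1001", "CT", "RIS",  "2010-01-05 08:10", "2010-01-05 09:02"),
             ("A1002", "MR", "PACS", "2010-01-05 08:25", "2010-01-05 11:40")])

        # Example composite-view query: average turnaround (hours) per modality.
        for row in conn.execute("""SELECT modality,
                                          AVG(julianday(finalized_at) - julianday(ordered_at)) * 24
                                   FROM exam_fact GROUP BY modality"""):
            print(row)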

  12. An Integrative Review of Pediatric Fall Risk Assessment Tools.

    Science.gov (United States)

    DiGerolamo, Kimberly; Davis, Katherine Finn

    Patient fall prevention begins with accurate risk assessment. However, sustained improvements in prevention and quality of care include use of validated fall risk assessment tools (FRATs). The goal of FRATs is to identify patients at highest risk. Adult FRATs are often borrowed from when creating tools for pediatric patients. Though factors associated with pediatric falls in the hospital setting are similar to those in adults, such as mobility, medication use, and cognitive impairment, adult FRATs and the factors associated with them do not adequately assess risk in children. Articles were limited to English language, ages 0-21 years, and publication date 2006-2015. The search yielded 22 articles. Ten were excluded as the population was primarily adult or lacked discussion of a FRAT. Critical appraisal and findings were synthesized using the Johns Hopkins Nursing evidence appraisal system. Twelve articles relevant to fall prevention in the pediatric hospital setting that discussed fall risk assessment and use of a FRAT were reviewed. Comparison between and accuracy of FRATs is challenged when different classifications, definitions, risk stratification, and inclusion criteria are used. Though there are several pediatric FRATs published in the literature, none have been found to be reliable and valid across institutions and diverse populations. This integrative review highlights the importance of choosing a FRAT based on an institution's identified risk factors and validating the tool for one's own patient population, as well as using the tool in conjunction with nursing clinical judgment to guide interventions. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Integrated environmental decision support tool based on GIS technology

    International Nuclear Information System (INIS)

    Doctor, P.G.; O'Neil, T.K.; Sackschewsky, M.R.; Becker, J.M.; Rykiel, E.J.; Walters, T.B.; Brandt, C.A.; Hall, J.A.

    1995-01-01

    Environmental restoration and management decisions facing the US Department of Energy require balancing trade-offs between diverse land uses and impacts over multiple spatial and temporal scales. Many types of environmental data have been collected for the Hanford Site and the Columbia River in Washington State over the past fifty years. Pacific Northwest National Laboratory (PNNL) is integrating these data into a Geographic Information System (GIS) based computer decision support tool. This tool provides a comprehensive and concise description of the current environmental landscape that can be used to evaluate the ecological and monetary trade-offs between future land use, restoration and remediation options before action is taken. Ecological impacts evaluated include effects to individual species of concern and habitat loss and fragmentation. Monetary impacts include those associated with habitat mitigation. The tool is organized as both a browsing tool for educational purposes, and as a framework that leads a project manager through the steps needed to be in compliance with environmental requirements

  14. Graph and model transformation tools for model migration : empirical results from the transformation tool contest

    NARCIS (Netherlands)

    Rose, L.M.; Herrmannsdoerfer, M.; Mazanek, S.; Van Gorp, P.M.E.; Buchwald, S.; Horn, T.; Kalnina, E.; Koch, A.; Lano, K.; Schätz, B.; Wimmer, M.

    2014-01-01

    We describe the results of the Transformation Tool Contest 2010 workshop, in which nine graph and model transformation tools were compared for specifying model migration. The model migration problem—migration of UML activity diagrams from version 1.4 to version 2.2—is non-trivial and practically

  15. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

    , called ForSyDe. ForSyDe is available under the open Source approach, which allows small and medium enterprises (SME) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system level modeling of a simple industrial use case, and we...

  16. The european Trans-Tools transport model

    NARCIS (Netherlands)

    Rooijen, T. van; Burgess, A.

    2008-01-01

    The paper presents the use of ArcGIS in the Transtools Transport Model, TRANS-TOOLS, created by an international consortium for the European Commission. The model describes passenger as well as freight transport in Europe with all medium and long distance modes (cars, vans, trucks, train, inland

  17. A new tool for man/machine integration

    International Nuclear Information System (INIS)

    Sommer, W.C.

    1981-01-01

    A popular term within the nuclear power industry today, as a result of TMI, is man/machine interface. It has been determined that greater acknowledgement of this interface is necessary within the industry to integrate the design and operational aspects of a system. What is required is an operational tool that can be used early in the engineering stages of a project and passed on later in time to those who will be responsible to operate that particular system. This paper discusses one such fundamental operations tool that is applied to a process system, its display devices, and its operator actions in a methodical fashion to integrate the machine for man's understanding and proper use. This new tool, referred to as an Operational Schematic, is shown and described. Briefly, it unites, in one location, the important operational display devices with the system process devices. A man can now see the beginning and end of each information and control loop to better understand its function within the system. A method is presented whereby in designing for operability, the schematic is utilized in three phases. The method results in two basic documents, one describes ''what'' is to be operated and the other ''how'' it is to be operated. This integration concept has now considered the hardware spectrum from sensor-to-display and operated the display (on paper) to confirm its operability. Now that the design aspects are complete, the later-in-time operational aspects need to be addressed for the man using the process system. Training personnel in operating and testing the process system is as important as the original design. To accomplish these activities, documents are prepared to instruct personnel how to operate (and test) the system under a variety of circumstances

  18. West-Life, Tools for Integrative Structural Biology

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Structural biology is the part of molecular biology focused on determining the structure of macromolecules inside living cells and cell membranes. As macromolecules determine most of the functions of cells, this structural knowledge is very useful for further research in metabolism and physiology, through to applications in pharmacology. As macromolecules are too small to be observed directly with a light microscope, other methods are used to determine their structure, including nuclear magnetic resonance (NMR), X-ray crystallography, cryo-electron microscopy and others. Each method has its advantages and disadvantages in terms of availability, sample preparation and resolution. The West-Life project has the ambition to facilitate an integrative approach using the multiple techniques mentioned above. As there are already a lot of software tools to process the data produced by these techniques, the challenge is to integrate them in a way that they can be used by experts in one technique who are not experts in the other techniques. One product ...

  19. Enabling model customization and integration

    Science.gov (United States)

    Park, Minho; Fishwick, Paul A.

    2003-09-01

    Until fairly recently, the idea of dynamic model content and presentation were treated synonymously. For example, if one was to take a data flow network, which captures the dynamics of a target system in terms of the flow of data through nodal operators, then one would often standardize on rectangles and arrows for the model display. The increasing web emphasis on XML, however, suggests that the network model can have its content specified in an XML language, and then the model can be represented in a number of ways depending on the chosen style. We have developed a formal method, based on styles, that permits a model to be specified in XML and presented in 1D (text), 2D, and 3D. This method allows for customization and personalization to exert their benefits beyond e-commerce, to the area of model structures used in computer simulation. This customization leads naturally to solving the bigger problem of model integration - the act of taking models of a scene and integrating them with that scene so that there is only one unified modeling interface. This work focuses mostly on customization, but we address the integration issue in the future work section.

  20. Developing Integrated Care: Towards a development model for integrated care

    NARCIS (Netherlands)

    M.M.N. Minkman (Mirella)

    2012-01-01

    The thesis addresses the phenomenon of integrated care. The implementation of integrated care for patients with a stroke or dementia is studied. Because a generic quality management model for integrated care is lacking, the study works towards building a development model for integrated

  1. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  2. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  3. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation.

    Directory of Open Access Journals (Sweden)

    Huu-Tho Nguyen

    Full Text Available Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role to improve the productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP which determines the weights of attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. Presentation of the proposed model application is provided by a numerical example based on the collection of data by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool and provide effective multi-attribute decision-making for evaluating the machine tool in the uncertain environment.
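
    The ranking step can be illustrated with a simplified, crisp (non-fuzzy) COPRAS calculation; the paper itself uses fuzzy numbers and fuzzy-AHP-derived weights, and the machine-tool data below are invented:

        # Simplified, crisp COPRAS ranking sketch with made-up machine-tool data.
        import numpy as np

        X = np.array([[ 9.0, 0.8, 62000.0],      # rows: candidate machine tools
                      [ 7.5, 0.6, 48000.0],      # cols: capacity, precision, price
                      [ 8.2, 0.7, 55000.0]])
        w = np.array([0.4, 0.35, 0.25])          # attribute weights (e.g., from AHP)
        benefit = np.array([True, True, False])  # price is a cost attribute

        D = w * X / X.sum(axis=0)                # weighted, sum-normalized matrix
        S_plus  = D[:,  benefit].sum(axis=1)     # contribution of benefit attributes
        S_minus = D[:, ~benefit].sum(axis=1)     # contribution of cost attributes

        Q = S_plus + S_minus.sum() / (S_minus * (1.0 / S_minus).sum())
        N = 100.0 * Q / Q.max()                  # utility degree of each alternative
        print(np.argsort(-Q), N)                 # ranking, best first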

  4. Integration of distributed system simulation tools for a holistic approach to integrated building and system design

    NARCIS (Netherlands)

    Radosevic, M.; Hensen, J.L.M.; Wijsman, A.J.T.M.; Hensen, J.L.M.; Lain, M.

    2004-01-01

    Advanced architectural developments require an integrated approach to design, where the simulation tools available today deal only with a small subset of the overall problem. The aim of this study is to enable run-time exchange of the necessary data at a suitable frequency between different simulation

  5. An online model composition tool for system biology models.

    Science.gov (United States)

    Coskun, Sarp A; Cicek, A Ercument; Lai, Nicola; Dash, Ranjan K; Ozsoyoglu, Z Meral; Ozsoyoglu, Gultekin

    2013-09-05

    There are multiple representation formats for Systems Biology computational models, and the Systems Biology Markup Language (SBML) is one of the most widely used. SBML is used to capture, store, and distribute computational models by Systems Biology data sources (e.g., the BioModels Database) and researchers. Therefore, there is a need for all-in-one web-based solutions that support advance SBML functionalities such as uploading, editing, composing, visualizing, simulating, querying, and browsing computational models. We present the design and implementation of the Model Composition Tool (Interface) within the PathCase-SB (PathCase Systems Biology) web portal. The tool helps users compose systems biology models to facilitate the complex process of merging systems biology models. We also present three tools that support the model composition tool, namely, (1) Model Simulation Interface that generates a visual plot of the simulation according to user's input, (2) iModel Tool as a platform for users to upload their own models to compose, and (3) SimCom Tool that provides a side by side comparison of models being composed in the same pathway. Finally, we provide a web site that hosts BioModels Database models and a separate web site that hosts SBML Test Suite models. Model composition tool (and the other three tools) can be used with little or no knowledge of the SBML document structure. For this reason, students or anyone who wants to learn about systems biology will benefit from the described functionalities. SBML Test Suite models will be a nice starting point for beginners. And, for more advanced purposes, users will able to access and employ models of the BioModels Database as well.
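
    A first, much simpler step toward composition is to inspect which species two SBML models have in common; a sketch using python-libsbml (not the PathCase-SB code; the file names are placeholders):

        # Sketch: load two SBML models and list shared species identifiers,
        # a typical preliminary when merging models. File names are placeholders.
        import libsbml

        def species_ids(path):
            doc = libsbml.readSBML(path)
            model = doc.getModel()
            return {model.getSpecies(i).getId() for i in range(model.getNumSpecies())}

        shared = species_ids("glycolysis.xml") & species_ids("tca_cycle.xml")
        print("species to merge on:", sorted(shared))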

  6. Integrated Tools for Future Distributed Engine Control Technologies

    Science.gov (United States)

    Culley, Dennis; Thomas, Randy; Saus, Joseph

    2013-01-01

    Turbine engines are highly complex mechanical systems that are becoming increasingly dependent on control technologies to achieve system performance and safety metrics. However, the contribution of controls to these measurable system objectives is difficult to quantify due to a lack of tools capable of informing the decision makers. This shortcoming hinders technology insertion in the engine design process. NASA Glenn Research Center is developing a Hardware-in-the-Loop (HIL) platform and analysis tool set that will serve as a focal point for new control technologies, especially those related to the hardware development and integration of distributed engine control. The HIL platform is intended to enable rapid and detailed evaluation of new engine control applications, from conceptual design through hardware development, in order to quantify their impact on engine systems. This paper discusses the complex interactions of the control system, within the context of the larger engine system, and how new control technologies are changing that paradigm. The conceptual design of the new HIL platform is then described as a primary tool to address those interactions and how it will help feed the insertion of new technologies into future engine systems.

  7. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for a program predicting atmospheric chemical kinetics with sensitivity analysis is presented. A new direct method of calculating the first-order sensitivity coefficients, applying sparse matrix technology to chemical kinetics, is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines of the model equation, the sensitivity coefficient equations, and their analytical Jacobian expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines in packages such as SLODE, modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
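
    The direct sensitivity method can be illustrated on a single first-order reaction: for dy/dt = -k*y, the sensitivity s = dy/dk obeys ds/dt = -k*s - y, and the two equations are integrated together. A Python/SciPy sketch (illustrative only; the package described above is FORTRAN-based):

        # Direct sensitivity method for a single first-order reaction:
        # integrate the model equation dy/dt = -k*y together with its coupled
        # sensitivity equation ds/dt = (df/dy)*s + df/dk = -k*s - y, s = dy/dk.
        import numpy as np
        from scipy.integrate import solve_ivp

        k = 0.5

        def rhs(t, z):
            y, s = z
            return [-k * y, -k * s - y]

        sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], dense_output=True)
        t = np.linspace(0.0, 10.0, 5)
        y, s = sol.sol(t)
        print(s)                        # first-order sensitivity dy/dk along the trajectory
        print(-t * np.exp(-k * t))      # analytic check: y = exp(-k t), so dy/dk = -t*exp(-k t)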

  8. Integrated Variable-Fidelity Tool Set for Modeling and Simulation of Aeroservothermoelasticity-Propulsion (ASTE-P) Effects for Aerospace Vehicles Ranging From Subsonic to Hypersonic Flight, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research program aims at developing a variable-fidelity software tool set for aeroservothermoelastic-propulsive (ASTE-P) modeling that can be routinely...

  9. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Full Text Available Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is

  10. Integrated waste management and the tool of life cycle inventory : a route to sustainable waste management

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, F.R.; White, P.R. [Procter and Gamble Newcastle Technical Centre, Newcastle (United Kingdom). Corporate Sustainable Development

    2000-07-01

    An overall approach to municipal waste management which integrates sustainable development principles was discussed. The three elements of sustainability which have to be balanced are environmental effectiveness, economic affordability and social acceptability. An integrated waste management (IWM) system considers different treatment options and deals with the entire waste stream. A life cycle inventory (LCI) and life cycle assessment (LCA) is used to determine the environmental burdens associated with IWM systems. LCIs for waste management are currently available for use in Europe, the United States, Canada and elsewhere. LCI is being used by waste management companies to assess the environmental attributes of future contract tenders. The models are used as benchmarking tools to assess the current environmental profile of a waste management system. They are also a comparative planning and communication tool. The authors are currently looking into publishing, at a future date, the experience of users of this LCI environmental management tool. 12 refs., 3 figs.

  11. Clinical results of HIS, RIS, PACS integration using data integration CASE tools

    Science.gov (United States)

    Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.

    1995-05-01

    Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly with limited retrieval functionality. Software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications), and easy system maintenance (excellent documentation, easy to perform changes, and centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the `middle-ware' in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous databases and communication protocols.
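
    The record above describes a client-mediator-server architecture in which 'middle-ware' hides the heterogeneity of the HIS, RIS and PACS back ends. A minimal sketch of that mediator idea follows; the class names, fields and in-memory stand-ins for the three systems are hypothetical and are not generated by the CASE tools mentioned in the abstract.

```python
# Minimal sketch of the client-mediator-server idea: the mediator translates one
# logical patient query into source-specific queries and merges the results.
# All class names, fields and the in-memory "databases" are hypothetical.

class HISSource:
    def find_patient(self, patient_id):
        return {"name": "DOE^JANE", "dob": "1970-01-01"}          # demographics

class RISSource:
    def find_reports(self, patient_id):
        return [{"accession": "A123", "modality": "CT", "status": "final"}]

class PACSSource:
    def find_images(self, accession):
        return [{"series": 1, "images": 120}]

class Mediator:
    """Single entry point that hides the heterogeneous back-end systems."""
    def __init__(self, his, ris, pacs):
        self.his, self.ris, self.pacs = his, ris, pacs

    def patient_record(self, patient_id):
        record = {"patient": self.his.find_patient(patient_id), "studies": []}
        for report in self.ris.find_reports(patient_id):
            report["images"] = self.pacs.find_images(report["accession"])
            record["studies"].append(report)
        return record

print(Mediator(HISSource(), RISSource(), PACSSource()).patient_record("P0001"))
```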

  12. Competency-based evaluation tools for integrative medicine training in family medicine residency: a pilot study

    Directory of Open Access Journals (Sweden)

    Schneider Craig

    2007-04-01

    Full Text Available Abstract Background As more integrative medicine educational content is integrated into conventional family medicine teaching, the need for effective evaluation strategies grows. Through the Integrative Family Medicine program, a six-site pilot program of a four-year residency training model combining integrative medicine and family medicine training, we have developed and tested a set of competency-based evaluation tools to assess residents' skills in integrative medicine history-taking and treatment planning. This paper presents the results from the implementation of direct observation and treatment plan evaluation tools, as well as the results of two Objective Structured Clinical Examinations (OSCEs) developed for the program. Methods The direct observation (DO) and treatment plan (TP) evaluation tools developed for the IFM program were implemented by faculty at each of the six sites during the PGY-4 year (n = 11 on DO and n = 8 on TP). The OSCE I was implemented first in 2005 (n = 6), revised and then implemented with a second class of IFM participants in 2006 (n = 7). OSCE II was implemented in fall 2005 with only one class of IFM participants (n = 6). Data from the initial implementation of these tools are described using descriptive statistics. Results Results from the implementation of these tools at the IFM sites suggest that we need more emphasis in our curriculum on incorporating spirituality into history-taking and treatment planning, and more training for IFM residents on effective assessment of readiness for change and strategies for delivering integrative medicine treatment recommendations. Focusing our OSCE assessment more narrowly on integrative medicine history-taking skills was much more effective in delineating strengths and weaknesses in our residents' performance than using the OSCE for both integrative and more basic communication competencies. Conclusion As these tools are refined further they will be of value both in improving

  13. Tool wear modeling using abductive networks

    Science.gov (United States)

    Masory, Oren

    1992-09-01

    A tool wear model based on Abductive Networks, which consist of a network of `polynomial' nodes, is described. The model relates the cutting parameters, components of the cutting force, and machining time to flank wear. Thus, real-time measurements of the cutting force can be used to monitor the machining process. The model is obtained by a training process in which the connectivity between the network's nodes and the polynomial coefficients of each node are determined by optimizing a performance criterion. Actual wear measurements of coated and uncoated carbide inserts were used for training and evaluating the established model.
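
    As a rough illustration of the modelling idea, the sketch below fits a low-order polynomial relating cutting parameters, cutting force and machining time to flank wear using ordinary least squares on synthetic data. It is a stand-in for the abductive (GMDH-style) network of polynomial nodes described above, not a reimplementation of it; the variable ranges and the 'true' wear relation are invented.

```python
# Sketch of the underlying idea: fit a low-order polynomial mapping cutting
# parameters, cutting force and machining time to flank wear. Plain least
# squares on synthetic data stands in for the abductive network of the paper.
import numpy as np

rng = np.random.default_rng(0)
n = 200
speed = rng.uniform(100, 300, n)       # cutting speed [m/min]
force = rng.uniform(200, 800, n)       # cutting force component [N]
time = rng.uniform(0, 30, n)           # machining time [min]
# Hypothetical "true" wear relation plus noise, for illustration only.
wear = 1e-4 * speed * time + 5e-5 * force + 2e-7 * force * time + rng.normal(0, 0.01, n)

# Quadratic feature expansion (the kind of polynomial node GMDH networks stack).
X = np.column_stack([np.ones(n), speed, force, time,
                     speed * time, force * time, speed**2, force**2, time**2])
coef, *_ = np.linalg.lstsq(X, wear, rcond=None)

# Predict flank wear for one new cutting condition.
x_new = np.array([1.0, 250.0, 600.0, 15.0, 250 * 15, 600 * 15, 250**2, 600**2, 15**2])
print("predicted flank wear [mm]:", x_new @ coef)
```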

  14. SWIM (Soil and Water Integrated Model)

    Energy Technology Data Exchange (ETDEWEB)

    Krysanova, V; Wechsung, F; Arnold, J; Srinivasan, R; Williams, J

    2000-12-01

    The model SWIM (Soil and Water Integrated Model) was developed in order to provide a comprehensive GIS-based tool for hydrological and water quality modelling in mesoscale and large river basins (from 100 to 10,000 km{sup 2}), which can be parameterised using regionally available information. The model was developed mainly for use in Europe and the temperate zone, though its application in other regions is possible as well. SWIM is based on two previously developed tools - SWAT and MATSALU (see more explanations in section 1.1). The model integrates hydrology, vegetation, erosion, and nutrient dynamics at the watershed scale. SWIM has a three-level disaggregation scheme 'basin - sub-basins - hydrotopes' and is coupled to the Geographic Information System GRASS (GRASS, 1993). A robust approach is suggested for nitrogen and phosphorus modelling in mesoscale watersheds. SWIM runs under the UNIX environment. Model testing and validation were performed sequentially for hydrology, crop growth, nitrogen and erosion in a number of mesoscale watersheds in the German part of the Elbe drainage basin. A comprehensive scheme of spatial disaggregation into sub-basins and hydrotopes, combined with a reasonable restriction on sub-basin area, allows the assessment of water resources and water quality with SWIM in mesoscale river basins. The modest data requirements represent an important advantage of the model. Direct connection to land use and climate data provides a possibility to use the model for analysis of climate change and land use change impacts on hydrology, agricultural production, and water quality. (orig.)
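
    The three-level 'basin - sub-basins - hydrotopes' disaggregation can be pictured as area-weighted aggregation of hydrotope responses up to sub-basins and then to the basin outlet. The sketch below shows only that bookkeeping step with made-up areas and runoff depths; it is not SWIM code and omits all process modelling.

```python
# Sketch of the three-level "basin - sub-basins - hydrotopes" scheme:
# hydrotope responses are area-weighted into sub-basin totals, which are then
# summed at the basin outlet. The numbers and structure below are made up.

basin = {
    "sub1": [  # (area [km2], runoff depth [mm])
        (12.0, 35.0),
        (8.0, 20.0),
    ],
    "sub2": [
        (25.0, 10.0),
        (5.0, 50.0),
    ],
}

def subbasin_runoff(hydrotopes):
    """Area-weighted mean runoff depth [mm] and total volume [1000 m3]."""
    area = sum(a for a, _ in hydrotopes)
    volume = sum(a * r for a, r in hydrotopes)          # km2 * mm == 1000 m3
    return volume / area, volume

basin_volume = 0.0
for name, hydrotopes in basin.items():
    depth, volume = subbasin_runoff(hydrotopes)
    basin_volume += volume
    print(f"{name}: mean depth {depth:.1f} mm, volume {volume:.0f} x 1000 m3")
print(f"basin outlet volume: {basin_volume:.0f} x 1000 m3")
```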

  15. DR-Integrator: a new analytic tool for integrating DNA copy number and gene expression data.

    Science.gov (United States)

    Salari, Keyan; Tibshirani, Robert; Pollack, Jonathan R

    2010-02-01

    DNA copy number alterations (CNA) frequently underlie gene expression changes by increasing or decreasing gene dosage. However, only a subset of genes with altered dosage exhibit concordant changes in gene expression. This subset is likely to be enriched for oncogenes and tumor suppressor genes, and can be identified by integrating these two layers of genome-scale data. We introduce DNA/RNA-Integrator (DR-Integrator), a statistical software tool to perform integrative analyses on paired DNA copy number and gene expression data. DR-Integrator identifies genes with significant correlations between DNA copy number and gene expression, and implements a supervised analysis that captures genes with significant alterations in both DNA copy number and gene expression between two sample classes. DR-Integrator is freely available for non-commercial use from the Pollack Lab at http://pollacklab.stanford.edu/ and can be downloaded as a plug-in application to Microsoft Excel and as a package for the R statistical computing environment. The R package is available under the name 'DRI' at http://cran.r-project.org/. An example analysis using DR-Integrator is included as supplemental material. Supplementary data are available at Bioinformatics online.
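
    The core of such an integrative analysis is a per-gene association measure between paired copy number and expression profiles. A conceptual sketch in Python is given below using synthetic matrices and Pearson correlation; it illustrates the idea only and is not the 'DRI' R package or its Excel plug-in.

```python
# Conceptual sketch of the core analysis: per-gene Pearson correlation between
# paired DNA copy number and expression profiles across samples.
# Synthetic data for illustration only.
import numpy as np

rng = np.random.default_rng(1)
n_genes, n_samples = 1000, 60
copy_number = rng.normal(0, 0.5, (n_genes, n_samples))          # log2 ratios
expression = 0.8 * copy_number + rng.normal(0, 1.0, (n_genes, n_samples))

def rowwise_pearson(a, b):
    """Pearson correlation of each row of a with the matching row of b."""
    a = a - a.mean(axis=1, keepdims=True)
    b = b - b.mean(axis=1, keepdims=True)
    return (a * b).sum(axis=1) / np.sqrt((a**2).sum(axis=1) * (b**2).sum(axis=1))

r = rowwise_pearson(copy_number, expression)
top = np.argsort(-r)[:5]
print("genes with the strongest dosage effect:", top, r[top].round(2))
```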

  16. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

    This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering...... activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration...... with a major international engineering company....

  17. Animal models: an important tool in mycology.

    Science.gov (United States)

    Capilla, Javier; Clemons, Karl V; Stevens, David A

    2007-12-01

    Animal models of fungal infections are, and will remain, a key tool in the advancement of medical mycology. Many different types of animal models of fungal infection have been developed, with murine models the most frequently used, for studies of pathogenesis, virulence, immunology, diagnosis, and therapy. The ability to control numerous variables in these models allows us to mimic human disease states and quantitatively monitor the course of the disease. However, no single model can answer all questions, and different animal species or different routes of infection can show somewhat different results. Thus, the choice of which animal model to use must be made carefully, addressing issues of the type of human disease to mimic, the parameters to follow, and the collection of the appropriate data to answer the questions being asked. This review addresses a variety of uses for animal models in medical mycology. It focuses on the most clinically important diseases affecting humans and cites various examples of the different types of studies that have been performed. Overall, animal models of fungal infection will continue to be valuable tools in addressing questions concerning fungal infections and contribute to our deeper understanding of how these infections occur, progress and can be controlled and eliminated.

  18. Integrated model of destination competitiveness

    Directory of Open Access Journals (Sweden)

    Armenski Tanja

    2011-01-01

    Full Text Available The aim of this paper is to determine the weakest points of Serbia's competitiveness as a tourist destination in comparison with its main competitors. The paper is organized as follows. A short introduction to previous research on destination competitiveness is followed by a description of the Integrated model of destination competitiveness (Dwyer et al., 2003), which was used as the main reference framework. Section three is devoted to previous studies on the competitiveness of Serbian tourism, while section four outlines the statistical methodology employed in this study and presents and interprets the empirical results. The results showed that Serbia is more competitive in its natural, cultural and created resources than in destination management, while, according to the Integrated model, Serbia is least competitive in demand conditions, which refer to the image and awareness of the destination itself.

  19. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
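
    A simple way to combine per-technology results into an integrated probability of detection, assuming (purely for illustration) that the subsystems respond independently, is shown below. IVSEM's actual algorithm may treat correlations, location accuracy and medium interfaces differently; the subsystem probabilities used here are invented.

```python
# Sketch of combining per-technology detection probabilities for one event,
# assuming independent subsystems. The probabilities below are made up and
# this is not necessarily IVSEM's exact algorithm.
p_subsystem = {"seismic": 0.80, "infrasound": 0.35, "radionuclide": 0.55, "hydroacoustic": 0.10}

p_miss = 1.0
for tech, p in p_subsystem.items():
    p_miss *= (1.0 - p)          # event is missed only if every subsystem misses it

print("integrated probability of detection:", round(1.0 - p_miss, 3))
```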

  20. Model-based setup assistant for progressive tools

    Science.gov (United States)

    Springer, Robert; Gräler, Manuel; Homberg, Werner; Henke, Christian; Trächtler, Ansgar

    2018-05-01

    In the field of production systems, globalization and technological progress lead to increasing requirements regarding part quality, delivery time and costs. Hence, today's production is challenged much more than a few years ago: it has to be very flexible and produce small batch sizes economically to satisfy consumers' demands and avoid unnecessary stock. Furthermore, a trend towards increasing functional integration continues to lead to an ongoing miniaturization of sheet metal components. In the electric connectivity industry, for example, miniaturized connectors are manufactured by progressive tools, which are usually used for very large batches. These tools are installed in mechanical presses and then set up by a technician, who has to manually adjust a wide range of punch-bending operations. Disturbances like material thickness, temperature, lubrication or tool wear complicate the setup procedure. Given the increasing demand for production flexibility, this time-consuming process has to be carried out more and more often. In this paper, a new model-based setup assistant is proposed as a solution and exemplarily applied in combination with a progressive tool. First, progressive tools, and more specifically their setup process, are described, and the associated challenges are pointed out. Based on this, a systematic process for setting up the machines is introduced. The process is then investigated with an FE analysis regarding the effects of the disturbances. In the next step, design of experiments is used to systematically develop a regression model of the system's behaviour. This model is integrated within an optimization in order to calculate optimal machine parameters and the necessary subsequent adjustment of the progressive tool under the disturbances. Finally, the assistant is tested in a production environment and the results are discussed.
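
    The chain "design of experiments, regression model, optimization of machine parameters" can be sketched as follows. The response surface, parameter names and bounds are invented; the sketch only illustrates the workflow, not the FE-based model used in the paper.

```python
# Sketch of the model-based setup idea: fit a regression model of part quality
# as a function of machine parameters (as from a designed experiment), then
# search it for the best setting. Response surface and bounds are made up.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
X = rng.uniform([-1, -1], [1, 1], size=(30, 2))          # coded press stroke, punch depth
y = 1.0 + 0.3 * X[:, 0] - 0.5 * X[:, 1] + 0.8 * X[:, 0]**2 + 0.6 * X[:, 1]**2 \
    + rng.normal(0, 0.02, 30)                            # measured deviation from target

# Full quadratic regression model fitted by least squares.
A = np.column_stack([np.ones(30), X, X**2, X[:, [0]] * X[:, [1]]])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

def predicted_deviation(x):
    f = np.array([1.0, x[0], x[1], x[0]**2, x[1]**2, x[0] * x[1]])
    return float(f @ beta)

# Search the fitted model for the machine setting with the smallest deviation.
res = minimize(predicted_deviation, x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)])
print("suggested machine setting (coded):", res.x.round(3))
```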

  1. Exclusion statistics and integrable models

    International Nuclear Information System (INIS)

    Mashkevich, S.

    1998-01-01

    The definition of exclusion statistics, as given by Haldane, allows for a statistical interaction between distinguishable particles (multi-species statistics). The thermodynamic quantities for such statistics can be evaluated exactly. The explicit expressions for the cluster coefficients are presented. Furthermore, single-species exclusion statistics is realized in one-dimensional integrable models. The interesting questions of generalizing this correspondence to the higher-dimensional and the multi-species cases remain essentially open
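
    For context, the multi-species state-counting formula usually associated with Haldane's definition is (standard textbook form, not reproduced from the record above):

    \[
    W = \prod_i \frac{\left[d_i + N_i - 1 - \sum_j g_{ij}\,(N_j - \delta_{ij})\right]!}{N_i!\,\left[d_i - 1 - \sum_j g_{ij}\,(N_j - \delta_{ij})\right]!},
    \]

    which for a single species reduces to \(W = \binom{d + (1-g)(N-1)}{N}\), recovering bosons for \(g = 0\) and fermions for \(g = 1\).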

  2. COMSY- A Software Tool For Aging And Plant Life Management With An Integrated Documentation Tool

    International Nuclear Information System (INIS)

    Baier, Roman; Zander, Andre

    2008-01-01

    For aging and plant life management, the integrity of the mechanical components and structures is one of the key objectives. In order to ensure this integrity it is essential to implement a comprehensive aging management programme. This should be applied to all safety relevant mechanical systems or components, civil structures, electrical systems as well as instrumentation and control (I and C). The following aspects should be covered: - Identification and assessment of relevant degradation mechanisms; - Verification and evaluation of the quality status of all safety relevant systems, structures and components (SSCs); - Verification and modernization of I and C and electrical systems; - Reliable and up-to-date documentation. To support this task, AREVA NP GmbH has developed the computer program COMSY, which draws on more than 30 years of research and operational experience. The program provides the option to perform a plant-wide screening to identify system areas that are sensitive to specific degradation mechanisms. Another objective is the administration and evaluation of NDE measurements from different techniques. An integrated documentation tool makes document management and maintenance fast, reliable and independent of individual staff availability. (authors)

  3. Exclusion statistics and integrable models

    International Nuclear Information System (INIS)

    Mashkevich, S.

    1998-01-01

    The definition of exclusion statistics that was given by Haldane admits a 'statistical interaction' between distinguishable particles (multispecies statistics). For such statistics, thermodynamic quantities can be evaluated exactly; explicit expressions are presented here for cluster coefficients. Furthermore, single-species exclusion statistics is realized in one-dimensional integrable models of the Calogero-Sutherland type. The interesting questions of generalizing this correspondence to the higher-dimensional and the multispecies cases remain essentially open; however, our results provide some hints as to searches for the models in question

  4. Integrated materials–structural models

    DEFF Research Database (Denmark)

    Stang, Henrik; Geiker, Mette Rica

    2008-01-01

    , repair works and strengthening methods for structures. A very significant part of the infrastructure consists of reinforced concrete structures. Even though reinforced concrete structures typically are very competitive, certain concrete structures suffer from various types of degradation. A framework...... should define a framework in which materials research results eventually should fit in and on the other side the materials research should define needs and capabilities in structural modelling. Integrated materials-structural models of a general nature are almost non-existent in the field of cement based...

  5. The integrated environmental control model

    Energy Technology Data Exchange (ETDEWEB)

    Rubin, E.S.; Berkenpas, M.B.; Kalagnanam, J.R. [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1995-11-01

    The capability to estimate the performance and cost of emission control systems is critical to a variety of planning and analysis requirements faced by utilities, regulators, researchers and analysts in the public and private sectors. The computer model described in this paper has been developed for DOE to provide an up-to-date capability for analyzing a variety of pre-combustion, combustion, and post-combustion options in an integrated framework. A unique capability allows performance and costs to be modeled probabilistically, which allows explicit characterization of uncertainties and risks.
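
    The probabilistic capability mentioned above amounts to propagating input uncertainties through the performance and cost relations, for example by Monte Carlo sampling. The sketch below shows that pattern with invented distributions and a toy cost relation; it is not the model's actual formulation.

```python
# Sketch of probabilistic performance/cost modelling: sample uncertain inputs,
# push them through a (here trivial) cost relation, and report the resulting
# distribution. Distributions and the cost relation are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
removal_eff = rng.triangular(0.90, 0.95, 0.99, n)        # pollutant removal efficiency
reagent_cost = rng.normal(35.0, 5.0, n)                  # $/tonne reagent
capital_factor = rng.uniform(0.9, 1.2, n)                # cost escalation factor

annual_cost = capital_factor * (4.0e6 + 2.0e5 * removal_eff / (1.0 - removal_eff)) \
              + 1.5e4 * reagent_cost                     # $/yr, toy relation

print("median cost   [$M/yr]:", round(np.median(annual_cost) / 1e6, 2))
print("95th pct cost [$M/yr]:", round(np.percentile(annual_cost, 95) / 1e6, 2))
```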

  6. Process Improvement Through Tool Integration in Aero-Mechanical Design

    Science.gov (United States)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  7. APMS: An Integrated Set of Tools for Measuring Safety

    Science.gov (United States)

    Statler, Irving C.; Reynard, William D. (Technical Monitor)

    1996-01-01

    statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.

  8. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

  9. Integrated climate and hydrology modelling

    DEFF Research Database (Denmark)

    Larsen, Morten Andreas Dahl

    To ensure optimal management and sustainable strategies for water resources, infrastructures, food production and ecosystems there is a need for an improved understanding of feedback and interaction mechanisms between the atmosphere and the land surface. This is especially true in light of expected...... global warming and increased frequency of extreme events. The skill in developing projections of both the present and future climate depends essentially on the ability to numerically simulate the processes of atmospheric circulation, hydrology, energy and ecology. Previous modelling efforts of climate...... and hydrology models to more directly include the interaction between the atmosphere and the land surface. The present PhD study is motivated by an ambition of developing and applying a modelling tool capable of including the interaction and feedback mechanisms between the atmosphere and the land surface...

  10. Tools for integrating environmental objectives into policy and practice: What works where?

    Energy Technology Data Exchange (ETDEWEB)

    Runhaar, Hens

    2016-07-15

    An abundance of approaches, strategies, and instruments – in short: tools – have been developed that intend to stimulate or facilitate the integration of a variety of environmental objectives into development planning, national or regional sectoral policies, international agreements, business strategies, etc. These tools include legally mandatory procedures, such as Environmental Impact Assessment and Strategic Environmental Assessment; more voluntary tools such as environmental indicators developed by scientists and planning tools; green budgeting, etc. A relatively underexplored question is what integration tool fits what particular purposes and contexts, in short: “what works where?”. This paper intends to contribute to answering this question, by first providing conceptual clarity about what integration entails, by suggesting and illustrating a classification of integration tools, and finally by summarising some of the lessons learned about how and why integration tools are (not) used and with what outcomes, particularly in terms of promoting the integration of environmental objectives.

  11. Tools for integrating environmental objectives into policy and practice: What works where?

    International Nuclear Information System (INIS)

    Runhaar, Hens

    2016-01-01

    An abundance of approaches, strategies, and instruments – in short: tools – have been developed that intend to stimulate or facilitate the integration of a variety of environmental objectives into development planning, national or regional sectoral policies, international agreements, business strategies, etc. These tools include legally mandatory procedures, such as Environmental Impact Assessment and Strategic Environmental Assessment; more voluntary tools such as environmental indicators developed by scientists and planning tools; green budgeting, etc. A relatively underexplored question is what integration tool fits what particular purposes and contexts, in short: “what works where?”. This paper intends to contribute to answering this question, by first providing conceptual clarity about what integration entails, by suggesting and illustrating a classification of integration tools, and finally by summarising some of the lessons learned about how and why integration tools are (not) used and with what outcomes, particularly in terms of promoting the integration of environmental objectives.

  12. Cotangent Models for Integrable Systems

    Science.gov (United States)

    Kiesenhofer, Anna; Miranda, Eva

    2017-03-01

    We associate cotangent models to a neighbourhood of a Liouville torus in symplectic and Poisson manifolds focusing on b-Poisson/b-symplectic manifolds. The semilocal equivalence with such models uses the corresponding action-angle theorems in these settings: the theorem of Liouville-Mineur-Arnold for symplectic manifolds and an action-angle theorem for regular Liouville tori in Poisson manifolds (Laurent-Gengoux et al., Int. Math. Res. Notices IMRN 8: 1839-1869, 2011). Our models comprise regular Liouville tori of Poisson manifolds but also consider the Liouville tori on the singular locus of a b-Poisson manifold. For this latter class of Poisson structures we define a twisted cotangent model. The equivalence with this twisted cotangent model is given by an action-angle theorem recently proved by the authors and Scott (Math. Pures Appl. (9) 105(1):66-85, 2016). This viewpoint of cotangent models provides a new machinery to construct examples of integrable systems, which are especially valuable in the b-symplectic case where not many sources of examples are known. At the end of the paper we introduce non-degenerate singularities as lifted cotangent models on b-symplectic manifolds and discuss some generalizations of these models to general Poisson manifolds.

  13. A vacuum microgripping tool with integrated vibration releasing capability

    Energy Technology Data Exchange (ETDEWEB)

    Rong, Weibin; Fan, Zenghua, E-mail: zenghua-fan@163.com; Wang, Lefeng; Xie, Hui; Sun, Lining [State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin, Heilongjiang (China)

    2014-08-01

    Pick-and-place of micro-objects is a basic task in various micromanipulation demands. Reliable releasing of micro-objects is usually disturbed due to strong scale effects. This paper focuses on a vacuum micro-gripper with vibration releasing functionality, which was designed and assembled for reliable micromanipulation tasks. Accordingly, a vibration releasing strategy of implementing a piezoelectric actuator on the vacuum microgripping tool is presented to address the releasing problem. The releasing mechanism was illustrated using a dynamic micro contact model. This model was developed via theoretical analysis, simulations and pull-off force measurement using atomic force microscopy. Micromanipulation experiments were conducted to verify the performance of the vacuum micro-gripper. The results show that, with the assistance of the vibration releasing, the vacuum microgripping tool can achieve reliable release of micro-objects. A releasing location accuracy of 4.5±0.5 μm and a successful releasing rate of around 100% (which is based on 110 trials) were achieved for manipulating polystyrene microspheres with radius of 35–100 μm.

  14. A vacuum microgripping tool with integrated vibration releasing capability

    International Nuclear Information System (INIS)

    Rong, Weibin; Fan, Zenghua; Wang, Lefeng; Xie, Hui; Sun, Lining

    2014-01-01

    Pick-and-place of micro-objects is a basic task in various micromanipulation demands. Reliable releasing of micro-objects is usually disturbed due to strong scale effects. This paper focuses on a vacuum micro-gripper with vibration releasing functionality, which was designed and assembled for reliable micromanipulation tasks. Accordingly, a vibration releasing strategy of implementing a piezoelectric actuator on the vacuum microgripping tool is presented to address the releasing problem. The releasing mechanism was illustrated using a dynamic micro contact model. This model was developed via theoretical analysis, simulations and pull-off force measurement using atomic force microscopy. Micromanipulation experiments were conducted to verify the performance of the vacuum micro-gripper. The results show that, with the assistance of the vibration releasing, the vacuum microgripping tool can achieve reliable release of micro-objects. A releasing location accuracy of 4.5±0.5 μm and a successful releasing rate of around 100% (which is based on 110 trials) were achieved for manipulating polystyrene microspheres with radius of 35–100 μm

  15. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and using antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings are collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured into a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment

  16. Tool life and surface integrity aspects when drilling nickel alloy

    Science.gov (United States)

    Kannan, S.; Pervaiz, S.; Vincent, S.; Karthikeyan, R.

    2018-04-01

    . Overall the results indicate that the effect of drilling and milling parameters is most marked in terms of surface quality in the circumferential direction. Material removal rates and tool flank wear must be maintained within the control limits to maintain hole integrity.

  17. Vertically Integrated Models for Carbon Storage Modeling in Heterogeneous Domains

    Science.gov (United States)

    Bandilla, K.; Celia, M. A.

    2017-12-01

    Numerical modeling is an essential tool for studying the impacts of geologic carbon storage (GCS). Injection of carbon dioxide (CO2) into deep saline aquifers leads to multi-phase flow (injected CO2 and resident brine), which can be described by a set of three-dimensional governing equations, including mass-balance equations, volumetric flux equations (modified Darcy), and constitutive equations. This is the modeling approach on which commonly used reservoir simulators such as TOUGH2 are based. Due to the large density difference between CO2 and brine, GCS models can often be simplified by assuming buoyant segregation and integrating the three-dimensional governing equations in the vertical direction. The integration leads to a set of two-dimensional equations coupled with reconstruction operators for vertical profiles of saturation and pressure. Vertically-integrated approaches have been shown to give results of comparable quality to three-dimensional reservoir simulators when applied to realistic CO2 injection sites such as the upper sand wedge at the Sleipner site. However, vertically-integrated approaches usually rely on homogeneous properties over the thickness of a geologic layer. Here, we investigate the impact of general (vertical and horizontal) heterogeneity in intrinsic permeability, relative permeability functions, and capillary pressure functions. We consider formations involving complex fluvial deposition environments and compare the performance of vertically-integrated models to full three-dimensional models for a set of hypothetical test cases consisting of high permeability channels (streams) embedded in a low permeability background (floodplains). The domains are randomly generated assuming that stream channels can be represented by sinusoidal waves in the plan-view and by parabolas for the streams' cross-sections. Stream parameters such as width, thickness and wavelength are based on values found at the Ketzin site in Germany. Results from the
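
    Schematically, the vertical integration step that underlies such models (written here in the standard form used in the vertical-equilibrium literature, not quoted from the abstract) takes the 3-D mass balance for phase \(\alpha\) to a 2-D equation:

    \[
    \frac{\partial}{\partial t}\int_{\zeta_B}^{\zeta_T} \phi\,\rho_\alpha s_\alpha \,\mathrm{d}z
    \;+\; \nabla_{\parallel}\cdot\int_{\zeta_B}^{\zeta_T} \rho_\alpha \mathbf{u}_\alpha \,\mathrm{d}z
    \;=\; \int_{\zeta_B}^{\zeta_T} q_\alpha \,\mathrm{d}z ,
    \]

    so that with \(\Phi = \int \phi\,\mathrm{d}z\), \(S_\alpha = \Phi^{-1}\int \phi\,s_\alpha\,\mathrm{d}z\) and \(\mathbf{U}_\alpha = \int \mathbf{u}_\alpha\,\mathrm{d}z\) one solves \(\partial_t(\Phi\,\rho_\alpha S_\alpha) + \nabla_{\parallel}\cdot(\rho_\alpha\mathbf{U}_\alpha) = Q_\alpha\) in the horizontal plane, while the saturation and pressure profiles over the formation thickness are reconstructed from the assumed buoyant (vertical) equilibrium.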

  18. Integrating neuroinformatics tools in TheVirtualBrain.

    Science.gov (United States)

    Woodman, M Marmaduke; Pezard, Laurent; Domide, Lia; Knock, Stuart A; Sanz-Leon, Paula; Mersmann, Jochen; McIntosh, Anthony R; Jirsa, Viktor

    2014-01-01

    TheVirtualBrain (TVB) is a neuroinformatics Python package representing the convergence of clinical, systems, and theoretical neuroscience in the analysis, visualization and modeling of neural and neuroimaging dynamics. TVB is composed of a flexible simulator for neural dynamics measured across scales from local populations to large-scale dynamics measured by electroencephalography (EEG), magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI), and core analytic and visualization functions, all accessible through a web browser user interface. A datatype system modeling neuroscientific data ties together these pieces with persistent data storage, based on a combination of SQL and HDF5. These datatypes combine with adapters allowing TVB to integrate other algorithms or computational systems. TVB provides infrastructure for multiple projects and multiple users, possibly participating under multiple roles. For example, a clinician might import patient data to identify several potential lesion points in the patient's connectome. A modeler, working on the same project, tests these points for viability through whole brain simulation, based on the patient's connectome, and subsequent analysis of dynamical features. TVB also drives research forward: the simulator itself represents the culmination of several simulation frameworks in the modeling literature. The availability of the numerical methods, set of neural mass models and forward solutions allows for the construction of a wide range of brain-scale simulation scenarios. This paper briefly outlines the history and motivation for TVB, describing the framework and simulator, giving usage examples in the web UI and Python scripting.
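
    To give a flavour of the kind of brain-scale simulation described above, the sketch below integrates a few phase oscillators coupled through a weighted connectome. It deliberately avoids the TVB API; the parameters, the Kuramoto-style coupling and the region count are arbitrary choices for illustration.

```python
# Generic sketch of a brain-scale simulation of the kind TVB performs: a few
# neural-mass-like oscillators coupled through a weighted connectome and
# integrated with Euler steps. This is NOT the TVB API; all values are made up.
import numpy as np

rng = np.random.default_rng(4)
n_regions = 8
W = rng.uniform(0, 1, (n_regions, n_regions))
np.fill_diagonal(W, 0.0)                                 # no self-coupling

theta = rng.uniform(0, 2 * np.pi, n_regions)             # phase of each regional oscillator
omega = rng.normal(2 * np.pi * 10, 1.0, n_regions)       # ~10 Hz natural frequencies
k, dt, steps = 0.5, 1e-3, 5000

trace = np.empty((steps, n_regions))
for t in range(steps):
    coupling = (W * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta = theta + dt * (omega + k * coupling)          # Kuramoto-style update
    trace[t] = np.sin(theta)                             # proxy "regional activity"

print("simulated activity array:", trace.shape)
```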

  19. Integrating neuroinformatics tools in TheVirtualBrain

    Directory of Open Access Journals (Sweden)

    M Marmaduke Woodman

    2014-04-01

    Full Text Available TheVirtualBrain (TVB) is a neuroinformatics Python package representing the convergence of clinical, systems, and theoretical neuroscience in the analysis, visualization and modeling of neural and neuroimaging dynamics. TVB is composed of a flexible simulator for neural dynamics measured across scales from local populations to large-scale dynamics measured by electroencephalography (EEG), magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI), and core analytic and visualization functions, all accessible through a web browser user interface. A datatype system modeling neuroscientific data ties together these pieces with persistent data storage, based on a combination of SQL & HDF5. These datatypes combine with adapters allowing TVB to integrate other algorithms or computational systems. TVB provides infrastructure for multiple projects and multiple users, possibly participating under multiple roles. For example, a clinician might import patient data to identify several potential lesion points in the patient's connectome. A modeler, working on the same project, tests these points for viability through whole brain simulation, based on the patient's connectome, and subsequent analysis of dynamical features. TVB also drives research forward: the simulator itself represents the culmination of several simulation frameworks in the modeling literature. The availability of the numerical methods, set of neural mass models and forward solutions allows for the construction of a wide range of brain-scale simulation scenarios. This paper briefly outlines the history and motivation for TVB, describing the framework and simulator, giving usage examples in the web UI and Python scripting.

  20. Integration of numerical analysis tools for automated numerical optimization of a transportation package design

    International Nuclear Information System (INIS)

    Witkowski, W.R.; Eldred, M.S.; Harding, D.C.

    1994-01-01

    The use of state-of-the-art numerical analysis tools to determine the optimal design of a radioactive material (RAM) transportation container is investigated. The design of a RAM package's components involves a complex coupling of structural, thermal, and radioactive shielding analyses. The final design must adhere to very strict design constraints. The current technique used by cask designers is uncoupled and involves designing each component separately with respect to its driving constraint. With the use of numerical optimization schemes, the complex couplings can be considered directly, and the performance of the integrated package can be maximized with respect to the analysis conditions. This can lead to more efficient package designs. Thermal and structural accident conditions are analyzed in the shape optimization of a simplified cask design. In this paper, details of the integration of numerical analysis tools, development of a process model, nonsmoothness difficulties with the optimization of the cask, and preliminary results are discussed
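
    The coupled design problem can be pictured as a constrained optimization: minimize a mass-like objective subject to thermal and structural limits. In the sketch below, the closed-form 'response' functions stand in for the finite-element thermal and stress analyses; all functions, limits and units are invented.

```python
# Sketch of the coupled design problem: minimize cask wall mass subject to
# thermal and structural constraints. The closed-form "response" functions
# below stand in for the thermal and stress analyses; they are invented.
import numpy as np
from scipy.optimize import minimize

def mass(x):                      # x = [steel thickness, shielding thickness] in cm
    return 7.8 * x[0] + 11.3 * x[1]

def peak_temperature(x):          # toy thermal response: thicker walls run hotter
    return 120.0 + 8.0 * x[0] + 5.0 * x[1]

def impact_stress(x):             # toy structural response: thicker steel lowers stress
    return 900.0 / (1.0 + 0.6 * x[0])

cons = [{"type": "ineq", "fun": lambda x: 200.0 - peak_temperature(x)},   # T <= 200 C
        {"type": "ineq", "fun": lambda x: 350.0 - impact_stress(x)}]      # sigma <= 350 MPa

res = minimize(mass, x0=[3.0, 2.0], bounds=[(1.0, 20.0), (1.0, 30.0)],
               constraints=cons, method="SLSQP")
print("optimal thicknesses [cm]:", res.x.round(2), " mass index:", round(res.fun, 1))
```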

  1. ExEP yield modeling tool and validation test results

    Science.gov (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with the physics-based tests such as photometry and integration time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4 m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters, such as working angle, photon counts and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, the test revealed problems in the L2 halo orbit and the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers of EXOSIMS and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.
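
    The unit-test flavour of the validation plan can be illustrated with a self-contained example: a toy exposure-time relation checked against a hand-computed value and a known scaling law. The relation and numbers are illustrative assumptions, not EXOSIMS's photometry model or its actual test suite.

```python
# Sketch of the unit-test style validation described above: check a simple
# exposure-time relation against a hand-computed value and a scaling law.
import unittest

def integration_time(snr, count_rate_planet, count_rate_background):
    """Time needed for a target SNR with Poisson noise from planet + background."""
    return snr**2 * (count_rate_planet + count_rate_background) / count_rate_planet**2

class TestIntegrationTime(unittest.TestCase):
    def test_known_value(self):
        # By hand: 5^2 * (0.1 + 0.4) / 0.1^2 = 25 * 0.5 / 0.01 = 1250 s.
        self.assertAlmostEqual(integration_time(5.0, 0.1, 0.4), 1250.0, places=6)

    def test_scaling_with_snr(self):
        # Doubling the requested SNR should quadruple the required time.
        t1 = integration_time(5.0, 0.1, 0.4)
        t2 = integration_time(10.0, 0.1, 0.4)
        self.assertAlmostEqual(t2 / t1, 4.0, places=6)

if __name__ == "__main__":
    unittest.main()
```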

  2. KAIKObase: An integrated silkworm genome database and data mining tool

    Directory of Open Access Journals (Sweden)

    Nagaraju Javaregowda

    2009-10-01

    Full Text Available Abstract Background The silkworm, Bombyx mori, is one of the most economically important insects in many developing countries owing to its large-scale cultivation for silk production. With the development of genomic and biotechnological tools, B. mori has also become an important bioreactor for production of various recombinant proteins of biomedical interest. In 2004, two genome sequencing projects for B. mori were reported independently by Chinese and Japanese teams; however, the datasets were insufficient for building long genomic scaffolds which are essential for unambiguous annotation of the genome. Now, both the datasets have been merged and assembled through a joint collaboration between the two groups. Description Integration of the two data sets of silkworm whole-genome-shotgun sequencing by the Japanese and Chinese groups together with newly obtained fosmid- and BAC-end sequences produced the best continuity (~3.7 Mb in N50 scaffold size) among the sequenced insect genomes and provided a high degree of nucleotide coverage (88%) of all 28 chromosomes. In addition, a physical map of BAC contigs constructed by fingerprinting BAC clones and a SNP linkage map constructed using BAC-end sequences were available. In parallel, proteomic data from two-dimensional polyacrylamide gel electrophoresis in various tissues and developmental stages were compiled into a silkworm proteome database. Finally, a Bombyx trap database was constructed for documenting insertion positions and expression data of transposon insertion lines. Conclusion For efficient usage of genome information for functional studies, genomic sequences, physical and genetic map information and EST data were compiled into KAIKObase, an integrated silkworm genome database which consists of 4 map viewers, a gene viewer, and sequence, keyword and position search systems to display results and data at the level of nucleotide sequence, gene, scaffold and chromosome. Integration of the

  3. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing...... processes from the viewpoint of combined materials and process modelling are presented: solidification of thin walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...

  4. Adaptation in integrated assessment modeling: where do we stand?

    OpenAIRE

    Patt, A.; van Vuuren, D.P.; Berkhout, F.G.H.; Aaheim, A.; Hof, A.F.; Isaac, M.; Mechler, R.

    2010-01-01

    Adaptation is an important element on the climate change policy agenda. Integrated assessment models, which are key tools to assess climate change policies, have begun to address adaptation, either by including it implicitly in damage cost estimates, or by making it an explicit control variable. We analyze how modelers have chosen to describe adaptation within an integrated framework, and suggest many ways they could improve the treatment of adaptation by considering more of its bottom-up cha...

  5. Integrating environmental component models. Development of a software framework

    NARCIS (Netherlands)

    Schmitz, O.

    2014-01-01

    Integrated models consist of interacting component models that represent various natural and social systems. They are important tools to improve our understanding of environmental systems, to evaluate cause–effect relationships of human–natural interactions, and to forecast the behaviour of
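
    A minimal picture of such a framework is a driver loop that steps each component model and passes exchange variables between them. The sketch below uses two invented toy components and is only meant to illustrate the coupling pattern, not the framework developed in the thesis.

```python
# Minimal sketch of coupling two environmental component models through a
# shared driver loop: each component exposes update() plus exchange variables.
# Component equations and names are invented for illustration.

class RainfallModel:
    def __init__(self):
        self.rain_mm = 0.0
    def update(self, day):
        self.rain_mm = 5.0 if day % 3 == 0 else 0.0      # toy forcing

class SoilMoistureModel:
    def __init__(self):
        self.storage_mm = 50.0
    def update(self, rain_mm):
        et = 2.0                                          # toy evapotranspiration
        self.storage_mm = max(0.0, self.storage_mm + rain_mm - et)

rain, soil = RainfallModel(), SoilMoistureModel()
for day in range(10):                                     # driver: exchange, then step
    rain.update(day)
    soil.update(rain.rain_mm)
print("soil storage after 10 days [mm]:", soil.storage_mm)
```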

  6. Integration of design applications with building models

    DEFF Research Database (Denmark)

    Eastman, C. M.; Jeng, T. S.; Chowdbury, R.

    1997-01-01

    This paper reviews various issues in the integration of applications with a building model... (Truncated.)

  7. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanical analysis. The applicability of the engineering integrity assessment system MASI for the evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which limit its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of ductile failure behaviour of cracked structures. (author)
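
    For reference, the yield surface of the Gurson model in its common Gurson-Tvergaard form is (standard formulation, not taken from the record above):

    \[
    \Phi = \left(\frac{\sigma_{\mathrm{eq}}}{\sigma_y}\right)^{2}
    + 2\,q_1\, f \,\cosh\!\left(\frac{3\,q_2\,\sigma_m}{2\,\sigma_y}\right)
    - 1 - \left(q_1 f\right)^{2} = 0 ,
    \]

    where \(\sigma_{\mathrm{eq}}\) is the von Mises equivalent stress, \(\sigma_m\) the mean stress, \(\sigma_y\) the matrix yield stress, \(f\) the void volume fraction, and \(q_1, q_2\) Tvergaard's calibration parameters (the original Gurson model corresponds to \(q_1 = q_2 = 1\)). Transferring these parameters from tensile tests to cracked geometries is precisely the issue raised above.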

  8. Qualitative Analysis of Integration Adapter Modeling

    OpenAIRE

    Ritter, Daniel; Holzleitner, Manuel

    2015-01-01

    Integration Adapters are a fundamental part of an integration system, since they provide (business) applications access to its messaging channel. However, their modeling and configuration remain under-represented. In previous work, the integration control and data flow syntax and semantics have been expressed in the Business Process Model and Notation (BPMN) as a semantic model for message-based integration, while adapter and the related quality of service modeling were left for further studi...

  9. Assess the flood resilience tools integration in the landuse projects

    Science.gov (United States)

    Moulin, E.; Deroubaix, J.-F.

    2012-04-01

    Despite strict regulations on building in flood-prone areas, 80% of these areas in Greater Paris (Paris, Val-de-Marne, Hauts-de-Seine and Seine-Saint-Denis) are already built up. Land use in flood-prone areas is presented as one of the main ways of relieving the ongoing real-estate pressure. For instance, some of the industrial wastelands located along the river are currently being redeveloped and residential buildings are planned, so land use in flood-prone areas is currently a key issue in the development of the Greater Paris area. To deal with floods there are resilience tools, whether structural (such as perimeter barriers or building aperture barriers) or non-structural (such as warning systems). The technical solutions are available and efficient most of the time. Still, we observe that these tools are rarely implemented: stakeholders and inhabitants alike seem largely uninterested. This paper focuses on the integration of resilience tools in urban projects. Indeed, one of the obstacles to an effective flood risk prevention policy is the lack of concern of land-use stakeholders and inhabitants for the risk. Based on a large number of interviews with stakeholders involved in various urban projects, we assess to what extent improving flood resilience is treated as a main issue in the execution of an urban project, how this concern is or could be maintained throughout the project, and whether it becomes diluted. We rely on a case study: the "Ardoines" project, which aims at redeveloping an industrial site in south-east Paris into residential and office buildings and other amenities. In order to elaborate the master plan, the urban planning authority brought together flood risk experts. According to the comments of the experts, the architect in charge of the

  10. WMT: The CSDMS Web Modeling Tool

    Science.gov (United States)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can: design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, the database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged

  11. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. It enables knowledge engineers to perform their tasks consistently, from knowledge acquisition through knowledge verification.

  12. Continued development of modeling tools and theory for RF heating

    International Nuclear Information System (INIS)

    1998-01-01

    Mission Research Corporation (MRC) is pleased to present the Department of Energy (DOE) with its renewal proposal to the Continued Development of Modeling Tools and Theory for RF Heating program. The objective of the program is to continue and extend the earlier work done by the proposed principal investigator in the field of modeling radio frequency (RF) heating experiments in the large tokamak fusion experiments, particularly the Tokamak Fusion Test Reactor (TFTR) device located at Princeton Plasma Physics Laboratory (PPPL). An integral part of this work is the investigation and, in some cases, resolution of theoretical issues which pertain to accurate modeling. MRC is nearing the successful completion of the specified tasks of the Continued Development of Modeling Tools and Theory for RF Heating project. The following tasks are either completed or nearing completion: (1) anisotropic temperature and rotation upgrades; (2) modeling for relativistic ECRH; (3) further documentation of SHOOT and SPRUCE. As a result of the progress achieved under this project, MRC has been urged to continue this effort. Specifically, during the performance of this project two topics were identified by PPPL personnel as new applications of the existing RF modeling tools. These two topics concern (a) future fast-wave current drive experiments on the large tokamaks, including TFTR, and (b) the interpretation of existing and future RF probe data from TFTR. Addressing each of these topics requires some modification or enhancement of the existing modeling tools, and the first topic requires resolution of certain theoretical issues to produce self-consistent results. This work falls within the scope of the original project and is more suited to the project's renewal than to the initiation of a new project

  13. Climbing the ladder: capability maturity model integration level 3

    Science.gov (United States)

    Day, Bryce; Lutteroth, Christof

    2011-02-01

    This article details the attempt to form a complete workflow model for an information and communication technologies (ICT) company in order to achieve a capability maturity model integration (CMMI) maturity rating of 3. During this project, business processes across the company's core and auxiliary sectors were documented and extended using modern enterprise modelling tools and The Open Group Architecture Framework (TOGAF) methodology. Different challenges were encountered with regard to process customisation and tool support for enterprise modelling. In particular, problems with the reuse of process models, the integration of different project management methodologies and the integration of the Rational Unified Process development process framework had to be solved. We report on these challenges and the perceived effects of the project on the company. Finally, we point out research directions that could help to improve the situation in the future.

  14. Surgical Technology Integration with Tools for Cognitive Human Factors (STITCH)

    Science.gov (United States)

    2010-10-01

    Measurement Tool: We conducted another round of data collection using the daVinci Surgical System at the University of Kentucky Hospital in May. ... Tools and Display Technology: ... considering cognitive and environmental factors such as mental workload, stress, situation awareness, and level of comfort with complex tools.

  15. Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis

    Directory of Open Access Journals (Sweden)

    Esther Suter

    2017-11-01

    Full Text Available Background: Despite far reaching support for integrated care, conceptualizing and measuring integrated care remains challenging. This knowledge synthesis aimed to identify indicator domains and tools to measure progress towards integrated care. Methods: We used an established framework and a Delphi survey with integration experts to identify relevant measurement domains. For each domain, we searched and reviewed the literature for relevant tools. Findings: From 7,133 abstracts, we retrieved 114 unique tools. We found many quality tools to measure care coordination, patient engagement and team effectiveness/performance. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals and resource allocation. The search yielded 12 tools that measure overall integration or three or more indicator domains. Discussion: Our findings highlight a continued gap in tools to measure foundational components that support integrated care. In the absence of such targeted tools, “overall integration” tools may be useful for a broad assessment of the overall state of a system. Conclusions: Continued progress towards integrated care depends on our ability to evaluate the success of strategies across different levels and context. This study has identified 114 tools that measure integrated care across 16 domains, supporting efforts towards a unified measurement framework.

  16. Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis

    Science.gov (United States)

    Oelke, Nelly D.; da Silva Lima, Maria Alice Dias; Stiphout, Michelle; Janke, Robert; Witt, Regina Rigatto; Van Vliet-Brown, Cheryl; Schill, Kaela; Rostami, Mahnoush; Hepp, Shelanne; Birney, Arden; Al-Roubaiai, Fatima; Marques, Giselda Quintana

    2017-01-01

    Background: Despite far reaching support for integrated care, conceptualizing and measuring integrated care remains challenging. This knowledge synthesis aimed to identify indicator domains and tools to measure progress towards integrated care. Methods: We used an established framework and a Delphi survey with integration experts to identify relevant measurement domains. For each domain, we searched and reviewed the literature for relevant tools. Findings: From 7,133 abstracts, we retrieved 114 unique tools. We found many quality tools to measure care coordination, patient engagement and team effectiveness/performance. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals and resource allocation. The search yielded 12 tools that measure overall integration or three or more indicator domains. Discussion: Our findings highlight a continued gap in tools to measure foundational components that support integrated care. In the absence of such targeted tools, “overall integration” tools may be useful for a broad assessment of the overall state of a system. Conclusions: Continued progress towards integrated care depends on our ability to evaluate the success of strategies across different levels and context. This study has identified 114 tools that measure integrated care across 16 domains, supporting efforts towards a unified measurement framework. PMID:29588637

  17. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    In the last 30 years there have been numerous model-generated software systems offered, targeting problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims with regard to Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending with a code generator (possibly using a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from the manually created ones, and a transformation definition editor to define new transformations for the most advanced ones. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  18. An integrated simulation tool for analyzing the Operation and Interdependency of Natural Gas and Electric Power Systems

    OpenAIRE

    PAMBOUR Kwabena A.; CAKIR BURCIN; BOLADO LAVIN Ricardo; DIJKEMA Gerard

    2016-01-01

    In this paper, we present an integrated simulation tool for analyzing the interdependency of natural gas and electric power systems in terms of security of energy supply. In the first part, we develop mathematical models for the individual systems. In part two, we identify the interconnections between both systems and propose a method for coupling the combined simulation model. Next, we develop the algorithm for solving the combined system and integrate this algorithm into a simulation softwa...

  19. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.
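
    The backtracking step described above can be pictured as a backward traversal over a cause-effect graph. The sketch below is only a schematic stand-in for the semiautomated LFM procedure: it collects the basic causes (nodes with no further causes) that can reach a chosen top event; the node names and graph are hypothetical.

    ```python
    # Each node maps to the set of its direct causes; leaf nodes have no entry.
    causes = {
        "top_event": {"sw_timeout", "sensor_fault"},
        "sw_timeout": {"task_overrun", "lost_interrupt"},
        "sensor_fault": {"wiring_open"},
    }

    def basic_causes(node, graph, seen=None):
        """Backtrack from `node` and return the basic (leaf) causes that reach it."""
        seen = set() if seen is None else seen
        if node in seen:            # guard against loops in the graph
            return set()
        seen.add(node)
        direct = graph.get(node, set())
        if not direct:              # a leaf: a basic event
            return {node}
        leaves = set()
        for c in direct:
            leaves |= basic_causes(c, graph, seen)
        return leaves

    print(sorted(basic_causes("top_event", causes)))
    # ['lost_interrupt', 'task_overrun', 'wiring_open']
    ```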

  20. Wave and Wind Model Performance Metrics Tools

    Science.gov (United States)

    Choi, J. K.; Wang, D. W.

    2016-02-01

    Continual improvements and upgrades of Navy ocean wave and wind models are essential to the assurance of battlespace environment predictability of ocean surface wave and surf conditions in support of Naval global operations. Thus, constant verification and validation of model performance is equally essential to assure the progress of model developments and maintain confidence in the predictions. Global and regional scale model evaluations may require large areas and long periods of time. For observational data to compare against, altimeter winds and waves along the tracks from past and current operational satellites as well as moored/drifting buoys can be used for global and regional coverage. Using data and model runs in previous trials such as the planned experiment, the Dynamics of the Adriatic in Real Time (DART), we demonstrated the use of accumulated altimeter wind and wave data over several years to obtain an objective evaluation of the performance of the SWAN (Simulating Waves Nearshore) model running in the Adriatic Sea. The assessment provided detailed performance of wind and wave models by using cell-averaged statistical variables maps with spatial statistics including slope, correlation, and scatter index to summarize model performance. Such a methodology is easily generalized to other regions and at global scales. Operational technology currently used by subject matter experts evaluating the Navy Coastal Ocean Model and the Hybrid Coordinate Ocean Model can be expanded to evaluate wave and wind models using tools developed for ArcMAP, a GIS application developed by ESRI. Recent inclusion of altimeter and buoy data into a format through the Naval Oceanographic Office's (NAVOCEANO) quality control system and the netCDF standards applicable to all model output makes it possible for the fusion of these data and direct model verification. Also, procedures were developed for the accumulation of match-ups of modelled and observed parameters to form a database
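
    The cell-averaged statistics mentioned above (bias, correlation, scatter index, regression slope) can be computed from match-ups of modelled and observed values. The following is a minimal sketch, assuming simple collocated arrays rather than the operational NAVOCEANO match-up database; the example wave heights are made up.

    ```python
    import numpy as np

    def matchup_stats(model, obs):
        """Basic skill metrics for collocated model/observation pairs."""
        model, obs = np.asarray(model, float), np.asarray(obs, float)
        diff = model - obs
        bias = diff.mean()
        rmse = np.sqrt((diff ** 2).mean())
        corr = np.corrcoef(model, obs)[0, 1]
        scatter_index = rmse / obs.mean()      # often quoted as a percentage
        slope = np.polyfit(obs, model, 1)[0]   # least-squares regression slope
        return {"bias": bias, "rmse": rmse, "corr": corr,
                "scatter_index": scatter_index, "slope": slope}

    # Illustrative significant wave heights (metres):
    print(matchup_stats(model=[1.2, 2.0, 3.1, 1.8], obs=[1.0, 2.2, 2.9, 1.7]))
    ```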

  1. An integrated development environment for PMESII model authoring, integration, validation, and debugging

    Science.gov (United States)

    Pioch, Nicholas J.; Lofdahl, Corey; Sao Pedro, Michael; Krikeles, Basil; Morley, Liam

    2007-04-01

    To foster shared battlespace awareness in Air Operations Centers supporting the Joint Forces Commander and Joint Force Air Component Commander, BAE Systems is developing a Commander's Model Integration and Simulation Toolkit (CMIST), an Integrated Development Environment (IDE) for model authoring, integration, validation, and debugging. CMIST is built on the versatile Eclipse framework, a widely used open development platform comprised of extensible frameworks that enable development of tools for building, deploying, and managing software. CMIST provides two distinct layers: 1) a Commander's IDE for supporting staff to author models spanning the Political, Military, Economic, Social, Infrastructure, Information (PMESII) taxonomy; integrate multiple native (third-party) models; validate model interfaces and outputs; and debug the integrated models via intuitive controls and time series visualization, and 2) a PMESII IDE for modeling and simulation developers to rapidly incorporate new native simulation tools and models to make them available for use in the Commander's IDE. The PMESII IDE provides shared ontologies and repositories for world state, modeling concepts, and native tool characterization. CMIST includes extensible libraries for 1) reusable data transforms for semantic alignment of native data with the shared ontology, and 2) interaction patterns to synchronize multiple native simulations with disparate modeling paradigms, such as continuous-time system dynamics, agent-based discrete event simulation, and aggregate solution methods such as Monte Carlo sampling over dynamic Bayesian networks. This paper describes the CMIST system architecture, our technical approach to addressing these semantic alignment and synchronization problems, and initial results from integrating Political-Military-Economic models of post-war Iraq spanning multiple modeling paradigms.

  2. INTEGRATED SPEED ESTIMATION MODEL FOR MULTILANE EXPRESSWAYS

    Science.gov (United States)

    Hong, Sungjoon; Oguchi, Takashi

    In this paper, an integrated speed-estimation model is developed based on empirical analyses for the basic sections of intercity multilane expressway under the uncongested condition. This model enables a speed estimation for each lane at any site under arbitrary highway-alignment, traffic (traffic flow and truck percentage), and rainfall conditions. By combining this model and a lane-use model which estimates traffic distribution on the lanes by each vehicle type, it is also possible to estimate an average speed across all the lanes of one direction from a traffic demand by vehicle type under specific highway-alignment and rainfall conditions. This model is expected to be a tool for the evaluation of traffic performance for expressways when the performance measure is travel speed, which is necessary for Performance-Oriented Highway Planning and Design. Regarding the highway-alignment condition, two new estimators, called effective horizontal curvature and effective vertical grade, are proposed in this paper which take into account the influence of upstream and downstream alignment conditions. They are applied to the speed-estimation model, and it shows increased accuracy of the estimation.
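
    As a purely illustrative sketch of the kind of relationship such a model captures (the linear form and coefficients below are assumptions for orientation, not the calibrated model from the paper), the lane speed can be written as a decreasing function of the effective alignment estimators and the traffic and rainfall variables:

    \[ v_{\ell} = v_{0,\ell} - \beta_1 C_{\mathrm{eff}} - \beta_2 G_{\mathrm{eff}} - \beta_3 q - \beta_4 p_{\mathrm{truck}} - \beta_5 r \]

    where \(v_{\ell}\) is the estimated speed on lane \(\ell\), \(v_{0,\ell}\) a baseline free-flow speed, \(C_{\mathrm{eff}}\) and \(G_{\mathrm{eff}}\) the effective horizontal curvature and effective vertical grade, \(q\) the traffic flow, \(p_{\mathrm{truck}}\) the truck percentage, and \(r\) the rainfall intensity.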

  3. Integrated Modeling of Complex Optomechanical Systems

    Science.gov (United States)

    Andersen, Torben; Enmark, Anita

    2011-09-01

    Mathematical modeling and performance simulation are playing an increasing role in large, high-technology projects. There are two reasons: first, projects are now larger than they were before, and the high cost calls for detailed performance prediction before construction. Second, in particular for space-related designs, it is often difficult to test systems under realistic conditions beforehand, and mathematical modeling is then needed to verify in advance that a system will work as planned. Computers have become much more powerful, permitting calculations that were not possible before. At the same time mathematical tools have been further developed and found acceptance in the community. Particular progress has been made in the fields of structural mechanics, optics and control engineering, where new methods have gained importance over the last few decades. Also, methods for combining optical, structural and control system models into global models have found widespread use. Such combined models are usually called integrated models and were the subject of this symposium. The objective was to bring together people working in the fields of ground-based optical telescopes, ground-based radio telescopes, and space telescopes. We succeeded in doing so and had 39 interesting presentations and many fruitful discussions during coffee and lunch breaks and social arrangements. We are grateful that so many top ranked specialists found their way to Kiruna and we believe that these proceedings will prove valuable during much future work.

  4. Development of an integrated cost model for nuclear plant decommissioning

    International Nuclear Information System (INIS)

    Amos, G.; Roy, R.

    2003-01-01

    A need for an integrated cost estimating tool for nuclear decommissioning and associated waste processing and storage facilities for Intermediate Level Waste (ILW) was defined during the author's recent MSc studies. In order to close the identified gap, a prototype tool was developed using logically derived CERs and cost driver variables. The challenge in developing this was to produce a model that could generate realistic cost estimates from the limited levels of historic cost data available for analysis. The model is an Excel-based tool supported by three-point risk estimating output and is suitable for producing strategic or optional cost estimates (±30%) early in the conceptual stage of a decommissioning project. The model was validated using a minimal number of case studies supported by expert opinion discussion. The model provides an enhanced approach to integrated decommissioning estimates, which can be produced concurrently with strategic options analysis on a nuclear site

  5. Integrable models of quantum optics

    Directory of Open Access Journals (Sweden)

    Yudson Vladimir

    2017-01-01

    Full Text Available We give an overview of exactly solvable many-body models of quantum optics. Among them is a system of two-level atoms which interact with photons propagating in a one-dimensional (1D) chiral waveguide; exact eigenstates of this system can be explicitly constructed. This approach is used also for a system of closely located atoms in the usual (non-chiral) waveguide or in 3D space. Moreover, it is shown that for an arbitrary atomic system with a cascade spontaneous radiative decay, the fluorescence spectrum can be described by an exact analytic expression which accounts for interference of emitted photons. Open questions related with broken integrability are discussed.

  6. Rapid HIS, RIS, PACS Integration Using Graphical CASE Tools

    Science.gov (United States)

    Taira, Ricky K.; Breant, Claudine M.; Stepczyk, Frank M.; Kho, Hwa T.; Valentino, Daniel J.; Tashima, Gregory H.; Materna, Anthony T.

    1994-05-01

    We describe the clinical requirements of the integrated federation of databases and present our client-mediator-server design. The main body of the paper describes five important aspects of integrating information systems: (1) global schema design, (2) establishing sessions with remote database servers, (3) development of schema translators, (4) integration of global system triggers, and (5) development of job workflow scripts.

  7. Topological quantum theories and integrable models

    International Nuclear Information System (INIS)

    Keski-Vakkuri, E.; Niemi, A.J.; Semenoff, G.; Tirkkonen, O.

    1991-01-01

    The path-integral generalization of the Duistermaat-Heckman integration formula is investigated for integrable models. It is shown that for models with periodic classical trajectories the path integral reduces to a form similar to the finite-dimensional Duistermaat-Heckman integration formula. This provides a relation between exactness of the stationary-phase approximation and Morse theory. It is also argued that certain integrable models can be related to topological quantum theories. Finally, it is found that in general the stationary-phase approximation presumes that the initial and final configurations are in different polarizations. This is exemplified by the quantization of the SU(2) coadjoint orbit

  8. [Development method of healthcare information system integration based on business collaboration model].

    Science.gov (United States)

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information construction due to complexity of the healthcare environment. Currently, during the process of healthcare information system integration, people participating in integration project usually communicate by free-format document, which impairs the efficiency and adaptability of integration. A method utilizing business process model and notation (BPMN) to model integration requirement and automatically transforming it to executable integration configuration was proposed in this paper. Based on the method, a tool was developed to model integration requirement and transform it to integration configuration. In addition, an integration case in radiology scenario was used to verify the method.
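
    To make the transformation step concrete, the sketch below reads the task and sequence-flow elements of a BPMN 2.0 file with the Python standard library and emits a simple integration configuration. It is only a schematic of the general approach (the actual mapping rules of the tool described above are not reproduced); the file name is hypothetical, and the namespace is assumed to be the standard BPMN 2.0 model namespace.

    ```python
    import json
    import xml.etree.ElementTree as ET

    # Assumed standard BPMN 2.0 model namespace
    NS = {"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"}

    def bpmn_to_config(path):
        """Map BPMN tasks and sequence flows to a simple integration configuration."""
        root = ET.parse(path).getroot()
        steps = {t.get("id"): t.get("name", "")
                 for t in root.findall(".//bpmn:task", NS)}
        order = [(f.get("sourceRef"), f.get("targetRef"))
                 for f in root.findall(".//bpmn:sequenceFlow", NS)]
        return {"steps": steps, "order": order}

    # Hypothetical usage with a radiology-scenario model file:
    # print(json.dumps(bpmn_to_config("radiology_integration.bpmn"), indent=2))
    ```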

  9. Freiburg RNA Tools: a web server integrating INTARNA, EXPARNA and LOCARNA.

    Science.gov (United States)

    Smith, Cameron; Heyne, Steffen; Richter, Andreas S; Will, Sebastian; Backofen, Rolf

    2010-07-01

    The Freiburg RNA tools web server integrates three tools for the advanced analysis of RNA in a common web-based user interface. The tools IntaRNA, ExpaRNA and LocARNA support the prediction of RNA-RNA interaction, exact RNA matching and alignment of RNA, respectively. The Freiburg RNA tools web server and the software packages of the stand-alone tools are freely accessible at http://rna.informatik.uni-freiburg.de.

  10. Collaboro: a collaborative (meta modeling tool

    Directory of Open Access Journals (Sweden)

    Javier Luis Cánovas Izquierdo

    2016-10-01

    Full Text Available Software development is becoming more and more collaborative, emphasizing the role of end-users in the development process to make sure the final product will satisfy customer needs. This is especially relevant when developing Domain-Specific Modeling Languages (DSMLs), which are modeling languages specifically designed to carry out the tasks of a particular domain. While end-users are actually the experts of the domain for which a DSML is developed, their participation in the DSML specification process is still rather limited nowadays. In this paper, we propose a more community-aware language development process by enabling the active participation of all community members (both developers and end-users) from the very beginning. Our proposal, called Collaboro, is based on a DSML itself enabling the representation of change proposals during the language design and the discussion (and trace back) of possible solutions, comments and decisions arising during the collaboration. Collaboro also incorporates a metric-based recommender system to help community members to define high-quality notations for the DSMLs. We also show how Collaboro can be used at the model-level to facilitate the collaborative specification of software models. Tool support is available both as an Eclipse plug-in and as a web-based solution.

  11. Integrated Space Asset Management Database and Modeling

    Science.gov (United States)

    MacLeod, Todd; Gagliano, Larry; Percy, Thomas; Mason, Shane

    2015-01-01

    Effective Space Asset Management is one key to addressing the ever-growing issue of space congestion. It is imperative that agencies around the world have access to data regarding the numerous active assets and pieces of space junk currently tracked in orbit around the Earth. At the center of this issue is the effective management of data of many types related to orbiting objects. As the population of tracked objects grows, so too should the data management structure used to catalog technical specifications, orbital information, and metadata related to those populations. Marshall Space Flight Center's Space Asset Management Database (SAM-D) was implemented in order to effectively catalog a broad set of data related to known objects in space by ingesting information from a variety of databases and processing those data into useful technical information. Using the universal NORAD number as a unique identifier, the SAM-D processes two-line element data into orbital characteristics and cross-references this technical data with metadata related to functional status, country of ownership, and application category. The SAM-D began as an Excel spreadsheet and was later upgraded to an Access database. While SAM-D performs its task very well, it is limited by its current platform and is not available outside of the local user base. Further, while modeling and simulation can be powerful tools to exploit the information contained in SAM-D, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. This paper provides a summary of SAM-D development efforts to date and outlines a proposed data management infrastructure that extends SAM-D to support the larger data sets to be generated. A service-oriented architecture model using an information sharing platform named SIMON will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interface for
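
    To illustrate the kind of processing described (turning two-line element sets into orbital characteristics keyed by NORAD number), here is a minimal sketch. It assumes the standard TLE line-2 column layout and derives the semi-major axis from the mean motion; it is not the SAM-D code, and the catalog-building comment at the end is only a suggestion of how the result might be keyed.

    ```python
    import math

    MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2

    def tle_line2_to_elements(line2):
        """Parse selected fields of TLE line 2 and derive the semi-major axis."""
        norad = int(line2[2:7])
        inclination_deg = float(line2[8:16])
        raan_deg = float(line2[17:25])
        eccentricity = float("0." + line2[26:33].strip())   # decimal point implied
        mean_motion_rev_day = float(line2[52:63])
        n_rad_s = mean_motion_rev_day * 2.0 * math.pi / 86400.0
        semi_major_axis_km = (MU_EARTH / n_rad_s ** 2) ** (1.0 / 3.0)
        return {
            "norad": norad,
            "inclination_deg": inclination_deg,
            "raan_deg": raan_deg,
            "eccentricity": eccentricity,
            "mean_motion_rev_day": mean_motion_rev_day,
            "semi_major_axis_km": semi_major_axis_km,
        }

    # A catalog can then be keyed by NORAD number and cross-referenced with metadata:
    # catalog[elements["norad"]] = {**elements, "status": "active", "country": "..."}
    ```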

  12. Collaborative Inquiry Learning: Models, tools, and challenges

    Science.gov (United States)

    Bell, Thorsten; Urhahne, Detlef; Schanze, Sascha; Ploetzner, Rolf

    2010-02-01

    Collaborative inquiry learning is one of the most challenging and exciting ventures for today's schools. It aims at bringing a new and promising culture of teaching and learning into the classroom where students in groups engage in self-regulated learning activities supported by the teacher. It is expected that this way of learning fosters students' motivation and interest in science, that they learn to perform steps of inquiry similar to scientists and that they gain knowledge on scientific processes. Starting from general pedagogical reflections and science standards, the article reviews some prominent models of inquiry learning. This comparison results in a set of inquiry processes being the basis for cooperation in the scientific network NetCoIL. Inquiry learning is conceived in several ways with emphasis on different processes. For an illustration of the spectrum, some main conceptions of inquiry and their focuses are described. In the next step, the article describes exemplary computer tools and environments from within and outside the NetCoIL network that were designed to support processes of collaborative inquiry learning. These tools are analysed by describing their functionalities as well as effects on student learning known from the literature. The article closes with challenges for further developments elaborated by the NetCoIL network.

  13. Metabolic engineering tools in model cyanobacteria.

    Science.gov (United States)

    Carroll, Austin L; Case, Anna E; Zhang, Angela; Atsumi, Shota

    2018-03-26

    Developing sustainable routes for producing chemicals and fuels is one of the most important challenges in metabolic engineering. Photoautotrophic hosts are particularly attractive because of their potential to utilize light as an energy source and CO2 as a carbon substrate through photosynthesis. Cyanobacteria are unicellular organisms capable of photosynthesis and CO2 fixation. While engineering in heterotrophs, such as Escherichia coli, has resulted in a plethora of tools for strain development and hosts capable of producing valuable chemicals efficiently, these techniques are not always directly transferable to cyanobacteria. However, recent efforts have led to an increase in the scope and scale of chemicals that cyanobacteria can produce. Adaptations of important metabolic engineering tools have also been optimized to function in photoautotrophic hosts, which include Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)-Cas9, 13C Metabolic Flux Analysis (MFA), and Genome-Scale Modeling (GSM). This review explores innovations in cyanobacterial metabolic engineering, and highlights how photoautotrophic metabolism has shaped their development. Copyright © 2018 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  14. Testing periodically integrated autoregressive models

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); M.J. McAleer (Michael)

    1997-01-01

    Periodically integrated time series require a periodic differencing filter to remove the stochastic trend. A non-periodic integrated time series needs the first-difference filter for similar reasons. When the changing seasonal fluctuations for the non-periodic integrated series can be
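
    For readers unfamiliar with periodic integration, a compact way to state the idea is the periodic first-order autoregression with seasons \(s = 1, \dots, S\) (a standard formulation; the notation here is ours, not necessarily the authors'):

    \[ y_t = \alpha_{s(t)}\, y_{t-1} + \varepsilon_t, \qquad \prod_{s=1}^{S} \alpha_s = 1, \]

    so that the periodic differencing filter \((1 - \alpha_{s(t)} B)\) removes the stochastic trend; the non-periodic integrated case is recovered when \(\alpha_s = 1\) for all \(s\), in which case the filter reduces to the usual first difference \((1 - B)\).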

  15. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    Science.gov (United States)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  16. Techniques to Access Databases and Integrate Data for Hydrologic Modeling

    International Nuclear Information System (INIS)

    Whelan, Gene; Tenney, Nathan D.; Pelton, Mitchell A.; Coleman, Andre M.; Ward, Duane L.; Droppo, James G.; Meyer, Philip D.; Dorow, Kevin E.; Taira, Randal Y.

    2009-01-01

    This document addresses techniques to access and integrate data for defining site-specific conditions and behaviors associated with ground-water and surface-water radionuclide transport applicable to U.S. Nuclear Regulatory Commission reviews. Environmental models typically require input data from multiple internal and external sources that may include, but are not limited to, stream and rainfall gage data, meteorological data, hydrogeological data, habitat data, and biological data. These data may be retrieved from a variety of organizations (e.g., federal, state, and regional) and source types (e.g., HTTP, FTP, and databases). Available data sources relevant to hydrologic analyses for reactor licensing are identified and reviewed. The data sources described can be useful to define model inputs and parameters, including site features (e.g., watershed boundaries, stream locations, reservoirs, site topography), site properties (e.g., surface conditions, subsurface hydraulic properties, water quality), and site boundary conditions, input forcings, and extreme events (e.g., stream discharge, lake levels, precipitation, recharge, flood and drought characteristics). Available software tools for accessing established databases, retrieving the data, and integrating it with models were identified and reviewed. The emphasis in this review was on existing software products with minimal required modifications to enable their use with the FRAMES modeling framework. The ability of four of these tools to access and retrieve the identified data sources was reviewed. These four software tools were the Hydrologic Data Acquisition and Processing System (HDAPS), Integrated Water Resources Modeling System (IWRMS) External Data Harvester, Data for Environmental Modeling Environmental Data Download Tool (D4EM EDDT), and the FRAMES Internet Database Tools. The IWRMS External Data Harvester and the D4EM EDDT were identified as the most promising tools based on their ability to access and
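
    As a generic illustration of the data-harvesting step (not the HDAPS, IWRMS, D4EM, or FRAMES code, and with a hypothetical endpoint and column names), the sketch below retrieves a CSV time series over HTTP with the Python standard library and reshapes it into the kind of date-value input a hydrologic model expects.

    ```python
    import csv
    import io
    import urllib.request

    def fetch_timeseries(url, date_col="date", value_col="discharge_cms"):
        """Download a CSV over HTTP and return a list of (date, value) pairs."""
        with urllib.request.urlopen(url) as resp:
            text = resp.read().decode("utf-8")
        reader = csv.DictReader(io.StringIO(text))
        return [(row[date_col], float(row[value_col])) for row in reader]

    # Hypothetical usage; the URL and column names are placeholders, not a real service:
    # series = fetch_timeseries("https://example.org/gage/01234567.csv")
    ```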

  17. Techniques to Access Databases and Integrate Data for Hydrologic Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Whelan, Gene; Tenney, Nathan D.; Pelton, Mitchell A.; Coleman, Andre M.; Ward, Duane L.; Droppo, James G.; Meyer, Philip D.; Dorow, Kevin E.; Taira, Randal Y.

    2009-06-17

    This document addresses techniques to access and integrate data for defining site-specific conditions and behaviors associated with ground-water and surface-water radionuclide transport applicable to U.S. Nuclear Regulatory Commission reviews. Environmental models typically require input data from multiple internal and external sources that may include, but are not limited to, stream and rainfall gage data, meteorological data, hydrogeological data, habitat data, and biological data. These data may be retrieved from a variety of organizations (e.g., federal, state, and regional) and source types (e.g., HTTP, FTP, and databases). Available data sources relevant to hydrologic analyses for reactor licensing are identified and reviewed. The data sources described can be useful to define model inputs and parameters, including site features (e.g., watershed boundaries, stream locations, reservoirs, site topography), site properties (e.g., surface conditions, subsurface hydraulic properties, water quality), and site boundary conditions, input forcings, and extreme events (e.g., stream discharge, lake levels, precipitation, recharge, flood and drought characteristics). Available software tools for accessing established databases, retrieving the data, and integrating it with models were identified and reviewed. The emphasis in this review was on existing software products with minimal required modifications to enable their use with the FRAMES modeling framework. The ability of four of these tools to access and retrieve the identified data sources was reviewed. These four software tools were the Hydrologic Data Acquisition and Processing System (HDAPS), Integrated Water Resources Modeling System (IWRMS) External Data Harvester, Data for Environmental Modeling Environmental Data Download Tool (D4EM EDDT), and the FRAMES Internet Database Tools. The IWRMS External Data Harvester and the D4EM EDDT were identified as the most promising tools based on their ability to access and

  18. A Model-Driven Visualization Tool for Use with Model-Based Systems Engineering Projects

    Science.gov (United States)

    Trase, Kathryn; Fink, Eric

    2014-01-01

    Model-Based Systems Engineering (MBSE) promotes increased consistency between a system's design and its design documentation through the use of an object-oriented system model. The creation of this system model facilitates data presentation by providing a mechanism from which information can be extracted by automated manipulation of model content. Existing MBSE tools enable model creation, but are often too complex for the unfamiliar model viewer to easily use. These tools do not yet provide many opportunities for easing into the development and use of a system model when system design documentation already exists. This study creates a Systems Modeling Language (SysML) Document Traceability Framework (SDTF) for integrating design documentation with a system model, and develops an Interactive Visualization Engine for SysML Tools (InVEST) that exports consistent, clear, and concise views of SysML model data. These exported views are each meaningful to a variety of project stakeholders with differing subjects of concern and depth of technical involvement. InVEST allows a model user to generate multiple views and reports from a MBSE model, including wiki pages and interactive visualizations of data. System data can also be filtered to present only the information relevant to the particular stakeholder, resulting in a view that is both consistent with the larger system model and other model views. Viewing the relationships between system artifacts and documentation, and filtering through data to see specialized views improves the value of the system as a whole, as data becomes information

  19. Enabling Integrated Decision Making for Electronic-Commerce by Modelling an Enterprise's Sharable Knowledge.

    Science.gov (United States)

    Kim, Henry M.

    2000-01-01

    An enterprise model, a computational model of knowledge about an enterprise, is a useful tool for integrated decision-making by e-commerce suppliers and customers. Sharable knowledge, once represented in an enterprise model, can be integrated by the modeled enterprise's e-commerce partners. Presents background on enterprise modeling, followed by…

  20. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives identified as developing tools for clinical information modeling, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  1. A model for integrated dictionaries of fixed expressions

    DEFF Research Database (Denmark)

    Bergenholtz, Henning; Bothma, Theo; Gouws, Rufus

    2011-01-01

    This paper discusses a project for the creation of a theoretical model for integrated e-dictionaries, illustrated by means of an e-information tool for the presentation and treatment of fixed expressions using Afrikaans as example language. To achieve this a database of fixed expressions...

  2. Adaptation in integrated assessment modeling: where do we stand?

    NARCIS (Netherlands)

    Patt, A.; van Vuuren, D.P.; Berkhout, F.G.H.; Aaheim, A.; Hof, A.F.; Isaac, M.; Mechler, R.

    2010-01-01

    Adaptation is an important element on the climate change policy agenda. Integrated assessment models, which are key tools to assess climate change policies, have begun to address adaptation, either by including it implicitly in damage cost estimates, or by making it an explicit control variable. We

  3. An integrated environment for developing object-oriented CAE tools

    Energy Technology Data Exchange (ETDEWEB)

    Hofmann, P.; Ryba, M.; Baitinger, U.G. [Integrated System Engeneering, Stuttgart (Germany)

    1996-12-31

    This paper presents how object-oriented techniques can be applied to improve the development of CAE tools. For the design of modular and reusable software systems we use predefined and well-tested building blocks. These building blocks are reusable software components based on object-oriented technology which allow the assembling of software systems. Today's CAE tools are typically very complex and computationally intensive. Therefore we need a concept that joins the advantages of the object-oriented paradigm with the advantages of parallel and distributed programming. We present a design environment for the development of concurrent object-oriented CAE tools called CoDO.

  4. Teaching Students How to Integrate and Assess Social Networking Tools in Marketing Communications

    Science.gov (United States)

    Schlee, Regina Pefanis; Harich, Katrin R.

    2013-01-01

    This research is based on two studies that focus on teaching students how to integrate and assess social networking tools in marketing communications. Study 1 examines how students in marketing classes utilize social networking tools and explores their attitudes regarding the use of such tools for marketing communications. Study 2 focuses on an…

  5. A review of computer tools for analysing the integration of renewable energy into various energy systems

    DEFF Research Database (Denmark)

    Connolly, D.; Lund, Henrik; Mathiesen, Brian Vad

    2010-01-01

    This paper includes a review of the different computer tools that can be used to analyse the integration of renewable energy. Initially 68 tools were considered, but 37 were included in the final analysis, which was carried out in collaboration with the tool developers or recommended points of contact. The results in this paper provide the information necessary to identify a suitable energy tool for analysing the integration of renewable energy into various energy-systems under different objectives. It is evident from this paper that there is no energy tool that addresses all issues related to integrating renewable energy; instead, the 'ideal' energy tool is highly dependent on the specific objectives that must be fulfilled. The typical applications for the 37 tools reviewed (from analysing single-building systems to national energy-systems), combined with numerous other factors...

  6. Multidisciplinary Modelling Tools for Power Electronic Circuits

    DEFF Research Database (Denmark)

    Bahman, Amir Sajjad

    in reliability assessment of power modules, a three-dimensional lumped thermal network is proposed to be used for fast, accurate and detailed temperature estimation of a power module in dynamic operation and under different boundary conditions. Since an important issue in the reliability of power electronics...... environment to be used for optimization of cooling system layout with respect to thermal resistance and pressure drop reductions. Finally, extraction of electrical parasitics in multi-chip power modules will be investigated. As the switching frequency of power devices increases, the size of passive...... components is reduced considerably, which leads to increased power density and cost reduction. However, electrical parasitics become more challenging with increasing switching frequency and paralleled chips in integrated and denser packages. Therefore, electrical parasitic models are analyzed based...

  7. Integrating Technology Tools for Students Struggling with Written Language

    Science.gov (United States)

    Fedora, Pledger

    2015-01-01

    This exploratory study was designed to assess the experience of preservice teachers when integrating written language technology and their likelihood of applying that technology in their future classrooms. Results suggest that after experiencing technology integration, preservice teachers are more likely to use it in their future teaching.

  8. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    Science.gov (United States)

    Franco, P.; Estrems, M.; Faura, F.

    2007-04-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on cutting tool surface in turning processes due to wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along cutting tool surface can be analysed, and the worn surface shape during the workpiece machining can be determined. The proposed model analyses the gradual degradation of cutting tool during turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained for description of material loss on cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from bibliography and experimental observation for AISI 4340 steel turning with WC-Co cutting tools.
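
    Wear models of the kind described are typically built from wear-rate laws integrated over cutting time. As a hedged illustration (a standard adhesive/diffusive wear-rate form of this general type, not necessarily the specific equations of this paper), the local wear rate per unit contact area is often written as

    \[ \frac{dW}{dt} = C_1\, \sigma_n\, v_s\, \exp\!\left(-\frac{C_2}{T}\right), \]

    where \(\sigma_n\) is the normal stress on the tool face, \(v_s\) the local sliding velocity, \(T\) the absolute interface temperature, and \(C_1, C_2\) material constants. Integrating such a rate over the engaged flank and rake surfaces yields quantities such as the flank wear-land width VB(t) and the crater depth KT(t) as functions of cutting time.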

  9. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    International Nuclear Information System (INIS)

    Franco, P.; Estrems, M.; Faura, F.

    2007-01-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on cutting tool surface in turning processes due to wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along cutting tool surface can be analysed, and the worn surface shape during the workpiece machining can be determined. The proposed model analyses the gradual degradation of cutting tool during turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained for description of material loss on cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from bibliography and experimental observation for AISI 4340 steel turning with WC-Co cutting tools

  10. Integrable models in classical and quantum mechanics

    International Nuclear Information System (INIS)

    Jurco, B.

    1991-01-01

    Integrable systems are investigated, especially the rational and trigonometric Gaudin models. The Gaudin models are diagonalized for the case of classical Lie algebras. Their relation to the other integrable models and to the quantum inverse scattering method is investigated. Applications in quantum optics and plasma physics are discussed. (author). 94 refs

  11. Innovative R.E.A. tools for integrated bathymetric survey

    Science.gov (United States)

    Demarte, Maurizio; Ivaldi, Roberta; Sinapi, Luigi; Bruzzone, Gabriele; Caccia, Massimo; Odetti, Angelo; Fontanelli, Giacomo; Masini, Andrea; Simeone, Emilio

    2017-04-01

    The REA (Rapid Environmental Assessment) concept is a methodology aimed at acquiring environmental information, processing it and returning it in standard paper-chart or standard digital format. The acquired data thus become available for ingestion or exploitation by the Civilian Protection Emergency Organization or the Rapid Response Forces. The use of Remotely Piloted Aircraft Systems (RPAS) with miniaturized multispectral or hyperspectral cameras gives the operator the capability to react in a short time, together with the capacity to collect a large amount of diverse data and to deliver a very large number of products. The proposed methodology incorporates data collected from remote and autonomous sensors that acquire data over areas in a cost-effective manner. The hyperspectral sensors are able to map seafloor morphology, seabed structure, depth of the bottom surface and an estimate of sediment development. The relevant spectral portions are selected using an appropriate configuration of hyperspectral cameras to maximize the spectral resolution. Data acquired by the hyperspectral camera are geo-referenced synchronously with an Attitude and Heading Reference System (AHRS) sensor. The data can be subjected to a first step of on-board processing on the unmanned vehicle before being transferred through the Ground Control Station (GCS) to a Processing Exploitation Dissemination (PED) system. The recent introduction of Data Distribution System (DDS) capabilities in PED allows a cooperative distributed approach to modern decision making. Two platforms are used in our project, a Remotely Piloted Aircraft System (RPAS) and an Unmanned Surface Vehicle (USV). The two platforms mutually interact to cover a surveyed area wider than the ones that could be covered by the single vehicles. The USV, especially designed to work in very shallow water, has a modular structure and an open hardware and software architecture allowing for an easy installation and integration of various

  12. Modelling Spark Integration in Science Classroom

    Directory of Open Access Journals (Sweden)

    Marie Paz E. Morales

    2014-02-01

    Full Text Available The study critically explored how a PASCO-designed technology (the SPARK Science Learning System) can be meaningfully integrated into the teaching of selected topics in Earth and Environmental Science. It focuses on modelling the effectiveness of using the SPARK Science Learning System as a primary tool in learning science, as shown in the learning and achievement of the students. The data and observations gathered, and the correlation between the technology's ability to develop high intrinsic motivation and student achievement, were used to design a framework for meaningfully integrating the SPARK Science Learning System into the teaching of Earth and Environmental Science. The research instruments used in this study were adopted from standardized questionnaires available in the literature. An achievement test and an evaluation form were developed and validated for the purpose of deducing the data needed for the study. Interviews were conducted to delve into the deeper thoughts and emotions of the respondents. Data from the interviews served to validate all numerical data culled from this study. A cross-case analysis of the data was done to reveal recurring themes, problems and benefits derived by the students in using the SPARK Science Learning System, to further establish its effectiveness in the curriculum as a forerunner of the shift towards 21st Century Learning.

  13. Integrative change model in psychotherapy: Perspectives from Indian thought.

    Science.gov (United States)

    Manickam, L S S

    2013-01-01

    Different psychotherapeutic approaches claim positive changes in patients as a result of therapy. Explanations related to the change process have led to different change models. Some of the change models are experimentally oriented whereas some are theoretical. Apart from the core behavioral, psychodynamic, humanistic, cognitive and spiritually oriented models, there are specific models within psychotherapy that explain the change process. The integrative theory of the person as depicted in Indian thought provides a common ground for the integration of various therapies. An integrative model of change based on Indian thought, with specific reference to psychological concepts in the Upanishads, Ayurveda, the Bhagavad Gita and Yoga, is presented. Appropriate psychological tools may be developed in order to help clinicians choose the techniques that match the problem and the origin of the dimension. Explorations have to be conducted to develop more techniques that are culturally appropriate and clinically useful. Research has to be initiated to validate the identified concepts.

  14. Theory, modeling, and integrated studies in the Arase (ERG) project

    Science.gov (United States)

    Seki, Kanako; Miyoshi, Yoshizumi; Ebihara, Yusuke; Katoh, Yuto; Amano, Takanobu; Saito, Shinji; Shoji, Masafumi; Nakamizo, Aoi; Keika, Kunihiro; Hori, Tomoaki; Nakano, Shin'ya; Watanabe, Shigeto; Kamiya, Kei; Takahashi, Naoko; Omura, Yoshiharu; Nose, Masahito; Fok, Mei-Ching; Tanaka, Takashi; Ieda, Akimasa; Yoshikawa, Akimasa

    2018-02-01

    Understanding of underlying mechanisms of drastic variations of the near-Earth space (geospace) is one of the current focuses of the magnetospheric physics. The science target of the geospace research project Exploration of energization and Radiation in Geospace (ERG) is to understand the geospace variations with a focus on the relativistic electron acceleration and loss processes. In order to achieve the goal, the ERG project consists of the three parts: the Arase (ERG) satellite, ground-based observations, and theory/modeling/integrated studies. The role of theory/modeling/integrated studies part is to promote relevant theoretical and simulation studies as well as integrated data analysis to combine different kinds of observations and modeling. Here we provide technical reports on simulation and empirical models related to the ERG project together with their roles in the integrated studies of dynamic geospace variations. The simulation and empirical models covered include the radial diffusion model of the radiation belt electrons, GEMSIS-RB and RBW models, CIMI model with global MHD simulation REPPU, GEMSIS-RC model, plasmasphere thermosphere model, self-consistent wave-particle interaction simulations (electron hybrid code and ion hybrid code), the ionospheric electric potential (GEMSIS-POT) model, and SuperDARN electric field models with data assimilation. ERG (Arase) science center tools to support integrated studies with various kinds of data are also briefly introduced.
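
    For context on the radial diffusion models listed above, the electron phase space density \(f\) is commonly evolved with the standard radial diffusion equation (a textbook form, given here for orientation rather than as the exact equation solved by any one ERG model):

    \[ \frac{\partial f}{\partial t} = L^2 \frac{\partial}{\partial L}\!\left(\frac{D_{LL}}{L^2}\,\frac{\partial f}{\partial L}\right) - \frac{f}{\tau}, \]

    where \(L\) is the (Roederer) L-parameter, \(D_{LL}\) the radial diffusion coefficient, and \(\tau\) an effective loss lifetime.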

  15. Advanced Manufacturing Technologies (AMT): Composites Integrated Modeling

    Data.gov (United States)

    National Aeronautics and Space Administration — The Composites Integrated Modeling (CIM) Element developed low cost, lightweight, and efficient composite structures, materials and manufacturing technologies with...

  16. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  17. State-of-the-art Review : Vol. 2B. Methods and Tools for Designing Integrated Building Concepts

    DEFF Research Database (Denmark)

    van der Aa, Ad; Andresen, Inger; Asada, Hideo

    of integrated building concepts and responsive building elements. At last, the report gives a description of uncertainty modelling in building performance assessment. The descriptions of the design methods and tools include an explanation of how the methods may be applied, any experiences gained by using...

  18. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper

    2003-01-01

    Experimental modelling is an important tool for study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied...

  19. Modeling integrated biomass gasification business concepts

    Science.gov (United States)

    Peter J. Ince; Ted Bilek; Mark A. Dietenberger

    2011-01-01

    Biomass gasification is an approach to producing energy and/or biofuels that could be integrated into existing forest product production facilities, particularly at pulp mills. Existing process heat and power loads tend to favor integration at existing pulp mills. This paper describes a generic modeling system for evaluating integrated biomass gasification business...

  20. An ensemble model of QSAR tools for regulatory risk assessment.

    Science.gov (United States)

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

    Quantitative structure activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model allows a cut-off parameter to be varied, enabling selection of the desired trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) to predict carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). Leave-one-out cross validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8 % and 80.4 %, and balanced accuracy: 80.6 % and 80.8 %) and highest inter-rater agreement [kappa (κ): 0
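
    The ensemble idea can be made concrete with a small naive-Bayes combiner over the binary calls of several QSAR tools: each tool contributes a likelihood ratio built from its sensitivity and specificity, and a tunable cut-off on the posterior trades sensitivity against specificity. This is only a sketch of the general technique; the tool names, performance numbers, prior, and cut-off below are illustrative assumptions, not values from the paper.

    ```python
    def ensemble_posterior(predictions, performance, prior_pos=0.5):
        """Naive-Bayes combination of binary tool predictions (1 = toxic, 0 = non-toxic)."""
        odds = prior_pos / (1.0 - prior_pos)
        for tool, pred in predictions.items():
            sens, spec = performance[tool]      # P(pred=1|pos), P(pred=0|neg)
            if pred == 1:
                odds *= sens / (1.0 - spec)     # likelihood ratio of a positive call
            else:
                odds *= (1.0 - sens) / spec     # likelihood ratio of a negative call
        return odds / (1.0 + odds)

    # Illustrative (assumed) sensitivity/specificity per hypothetical tool:
    performance = {"tool_A": (0.85, 0.70), "tool_B": (0.75, 0.80), "tool_C": (0.65, 0.90)}
    preds = {"tool_A": 1, "tool_B": 1, "tool_C": 0}

    p = ensemble_posterior(preds, performance)
    cutoff = 0.4        # a lower cut-off favours sensitivity (fewer false negatives)
    print(p, "toxic" if p >= cutoff else "non-toxic")
    ```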

  1. Effect of different machining processes on the tool surface integrity and fatigue life

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Chuan Liang [College of Mechanical and Electrical Engineering, Nanchang University, Nanchang (China); Zhang, Xianglin [School of Materials Science and Engineering, Huazhong University of Science and Technology, Wuhan (China)

    2016-08-15

    Ultra-precision grinding, wire-cut electro discharge machining and lapping are often used to machine the tools in the fine blanking industry. The surface integrity produced by these machining processes is of great concern in the research field. To study the effect of the machined surface integrity on fine blanking tool life, the surface integrity of different tool materials under different processing conditions and its influence on fatigue life were thoroughly analyzed in the present study. The results show that the surface integrity of different materials was quite different under the same processing condition. For the same tool material, the surface integrity under varying processing conditions was also quite different and strongly influenced the fatigue life.

  2. Integrating research tools to support the management of social-ecological systems under climate change

    Science.gov (United States)

    Miller, Brian W.; Morisette, Jeffrey T.

    2014-01-01

    Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.

  3. Modeling, methodologies and tools for molecular and nano-scale communications modeling, methodologies and tools

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    (Preliminary) The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront in their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  4. Developing a Modeling Tool Using Eclipse

    NARCIS (Netherlands)

    Kirtley, Nick; Waqas Kamal, Ahmad; Avgeriou, Paris

    2008-01-01

    Tool development using an open source platform provides autonomy to users to change, use, and develop cost-effective software with freedom from licensing requirements. However, open source tool development poses a number of challenges, such as poor documentation and continuous evolution. In this

  5. An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System

    Science.gov (United States)

    1989-09-01

    An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System. Thesis, AFIT/GCA/LSQ/89S-5, Caroline L. Hanson, Major, USAF.

  6. Novel Multiscale Modeling Tool Applied to Pseudomonas aeruginosa Biofilm Formation

    OpenAIRE

    Biggs, Matthew B.; Papin, Jason A.

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid mod...

  7. A Fractionally Integrated Wishart Stochastic Volatility Model

    NARCIS (Netherlands)

    M. Asai (Manabu); M.J. McAleer (Michael)

    2013-01-01

    textabstractThere has recently been growing interest in modeling and estimating alternative continuous time multivariate stochastic volatility models. We propose a continuous time fractionally integrated Wishart stochastic volatility (FIWSV) process. We derive the conditional Laplace transform of

  8. Integration of eHealth Tools in the Process of Workplace Health Promotion: Proposal for Design and Implementation

    Science.gov (United States)

    2018-01-01

    Background Electronic health (eHealth) and mobile health (mHealth) tools can support and improve the whole process of workplace health promotion (WHP) projects. However, several challenges and opportunities have to be considered while integrating these tools in WHP projects. Currently, a large number of eHealth tools are developed for changing health behavior, but these tools can support the whole WHP process, including group administration, information flow, assessment, intervention development process, or evaluation. Objective To support a successful implementation of eHealth tools in the whole WHP processes, we introduce a concept of WHP (life cycle model of WHP) with 7 steps and present critical and success factors for the implementation of eHealth tools in each step. Methods We developed a life cycle model of WHP based on the World Health Organization (WHO) model of healthy workplace continual improvement process. We suggest adaptations to the WHO model to demonstrate the large number of possibilities to implement eHealth tools in WHP as well as possible critical points in the implementation process. Results eHealth tools can enhance the efficiency of WHP in each of the 7 steps of the presented life cycle model of WHP. Specifically, eHealth tools can support by offering easier administration, providing an information and communication platform, supporting assessments, presenting and discussing assessment results in a dashboard, and offering interventions to change individual health behavior. Important success factors include the possibility to give automatic feedback about health parameters, create incentive systems, or bring together a large number of health experts in one place. Critical factors such as data security, anonymity, or lack of management involvement have to be addressed carefully to prevent nonparticipation and dropouts. Conclusions Using eHealth tools can support WHP, but clear regulations for the usage and implementation of these tools at the

  9. How to define the tool kit for the corrective maintenance service? : a tool kit definition model under the service performance criterion

    NARCIS (Netherlands)

    Chen, Denise

    2009-01-01

    Currently, the rules for defining tool kits vary and are oriented mainly toward the engineer's perspective. However, defining the tool kit is a trade-off problem between cost and service performance. This project is designed to develop a model that can integrate the engineer's preferences

  10. Integrating Human Terrain reasoning and tooling in C2 systems

    NARCIS (Netherlands)

    Reus, N. de; Grand, N. le; Kwint, M.; Reniers, F.; Anthonie van Lieburg, A. van

    2010-01-01

    Within an operational staff the ‘core business’ of the Intelligence Cell is to initiate, collect, process, analyze and disseminate relevant information. This Intelligence Preparation of the Environment addresses the environmental evaluation, threat evaluation and results in an integrated overview of

  11. An Integrated Pest Management Tool for Evaluating Schools

    Science.gov (United States)

    Bennett, Blake; Hurley, Janet; Merchant, Mike

    2016-01-01

    Having the ability to assess pest problems in schools is essential for a successful integrated pest management (IPM) program. However, such expertise can be costly and is not available to all school districts across the United States. The web-based IPM Calculator was developed to address this problem. By answering questions about the condition of…

  12. Data requirements for integrated near field models

    International Nuclear Information System (INIS)

    Wilems, R.E.; Pearson, F.J. Jr.; Faust, C.R.; Brecher, A.

    1981-01-01

    The coupled nature of the various processes in the near field requires that integrated models be employed to assess the long-term performance of the waste package and repository. The nature of the integrated near field models being compiled under the SCEPTER program is discussed. The interfaces between these near field models and far field models are described. Finally, near field data requirements are outlined in sufficient detail to indicate overall programmatic guidance for data gathering activities

  13. Map algebra and model algebra for integrated model building

    NARCIS (Netherlands)

    Schmitz, O.; Karssenberg, D.J.; Jong, K. de; Kok, J.-L. de; Jong, S.M. de

    2013-01-01

    Computer models are important tools for the assessment of environmental systems. A seamless workflow of construction and coupling of model components is essential for environmental scientists. However, currently available software packages are often tailored either to the construction of model

  14. Indico central - events organisation, ergonomics and collaboration tools integration

    International Nuclear Information System (INIS)

    Gonzalez Lopez, Jose Benito; Ferreira, Jose Pedro; Baron, Thomas

    2010-01-01

    While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version which is the result of a careful maturation process includes improvements which will set a new reference in its domain. The presentation will focus on the description of the new features of the tool, the user feedback process which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around the event organisation (registration epayment, audiovisual recording, webcast, room booking, and videoconference support)

  15. Indico central - events organisation, ergonomics and collaboration tools integration

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez Lopez, Jose Benito; Ferreira, Jose Pedro; Baron, Thomas, E-mail: jose.benito.gonzalez@cern.c, E-mail: jose.pedro.ferreira@cern.c, E-mail: thomas.baron@cern.c [CERN IT-UDS-AVC, 1211 Geneve 23 (Switzerland)

    2010-04-01

    While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version which is the result of a careful maturation process includes improvements which will set a new reference in its domain. The presentation will focus on the description of the new features of the tool, the user feedback process which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around the event organisation (registration epayment, audiovisual recording, webcast, room booking, and videoconference support)

  16. Indico Central - Events Organisation, Ergonomics and Collaboration Tools Integration

    CERN Document Server

    Gonzalez Lopez, J B; Baron, T; CERN. Geneva. IT Department

    2010-01-01

    While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version which is the result of a careful maturation process includes improvements which will set a new reference in its domain. The presentation will focus on the description of the new features of the tool, the user feedback process which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around the event organisation (registration epayment, audiovisual recording, webcast, room booking, and videoconference support)

  17. Integrated Environmental Modelling: Human decisions, human challenges

    Science.gov (United States)

    Glynn, Pierre D.

    2015-01-01

    Integrated Environmental Modelling (IEM) is an invaluable tool for understanding the complex, dynamic ecosystems that house our natural resources and control our environments. Human behaviour affects the ways in which the science of IEM is assembled and used for meaningful societal applications. In particular, human biases and heuristics reflect adaptation and experiential learning to issues with frequent, sharply distinguished, feedbacks. Unfortunately, human behaviour is not adapted to the more diffusely experienced problems that IEM typically seeks to address. Twelve biases are identified that affect IEM (and science in general). These biases are supported by personal observations and by the findings of behavioural scientists. A process for critical analysis is proposed that addresses some human challenges of IEM and solicits explicit description of (1) represented processes and information, (2) unrepresented processes and information, and (3) accounting for, and cognizance of, potential human biases. Several other suggestions are also made that generally complement maintaining attitudes of watchful humility, open-mindedness, honesty and transparent accountability. These suggestions include (1) creating a new area of study in the behavioural biogeosciences, (2) using structured processes for engaging the modelling and stakeholder communities in IEM, and (3) using ‘red teams’ to increase resilience of IEM constructs and use.

  18. System dynamics models as decision-making tools in agritourism

    Directory of Open Access Journals (Sweden)

    Jere Jakulin Tadeja

    2016-12-01

    Full Text Available Agritourism, as a type of niche tourism, is a complex and softly defined phenomenon. The demand for fast, integrated decisions regarding agritourism and its interconnections with the environment, the economy (investments, traffic) and social factors (tourists) is urgent. Many different methodologies and methods address softly structured questions and dilemmas with global and local properties. Here we present methods of systems thinking and system dynamics, which were first brought into force in the educational and training area in the form of different computer simulations and later as tools for decision-making and organisational re-engineering. We develop system dynamics models in order to demonstrate the accuracy of the methodology. These models are essentially simple and serve only to describe the basic mutual influences among variables. We pay particular attention to the methodology for determining model parameter values and to the so-called mental model, which is the basis of the causal connections among model variables. At the end, we establish a connection between qualitative and quantitative models in the frame of system dynamics.
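
    A minimal stock-and-flow sketch in the spirit of the system dynamics approach described above. The variables, feedback structure and parameter values are invented for illustration and are not the agritourism model from the paper.

```python
# Minimal system dynamics sketch: one stock (tourists) with a reinforcing
# word-of-mouth inflow and a balancing crowding feedback (assumed parameters).
def simulate(steps=50, dt=1.0, capacity=1000.0, growth=0.08, churn=0.03):
    tourists = 50.0           # the stock
    history = []
    for _ in range(steps):
        attractiveness = max(0.0, 1.0 - tourists / capacity)  # crowding feedback
        inflow = growth * tourists * attractiveness           # word-of-mouth arrivals
        outflow = churn * tourists                            # visitors lost per period
        tourists += dt * (inflow - outflow)                   # Euler integration of the stock
        history.append(tourists)
    return history

if __name__ == "__main__":
    print(round(simulate()[-1], 1))
```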

  19. Topology and boundary shape optimization as an integrated design tool

    Science.gov (United States)

    Bendsoe, Martin Philip; Rodrigues, Helder Carrico

    1990-01-01

    The optimal topology of a two dimensional linear elastic body can be computed by regarding the body as a domain of the plane with a high density of material. Such an optimal topology can then be used as the basis for a shape optimization method that computes the optimal form of the boundary curves of the body. This results in an efficient and reliable design tool, which can be implemented via common FEM mesh generator and CAD type input-output facilities.

  20. Learning Asset Technology Integration Support Tool Design Document

    Science.gov (United States)

    2010-05-11

    language known as Hypertext Preprocessor (PHP) and by MySQL – a relational database management system that can also be used for content management. The LATIST tool will be implemented utilizing a WordPress platform with MySQL as the database. When designing the LATIST system there are several considerations which must be accounted for in the working prototype. These include: DAU

  1. Watershed modeling tools and data for prognostic and diagnostic

    Science.gov (United States)

    Chambel-Leitao, P.; Brito, D.; Neves, R.

    2009-04-01

    Watershed models are widely used throughout the world and are characterized by the high number of processes they simulate. Estimating these processes is also data intensive, requiring data on topography, land use/land cover, agricultural practices, soil type, precipitation, temperature, relative humidity, wind and radiation. Every year new data become available, namely from satellites, which has allowed improving the quality of model input as well as the calibration of the models (Galvão et al., 2004b). Tools to cope with this vast amount of data have been developed: data formatting, data retrieval, databases and metadata bases. The high number of processes simulated in watershed models also makes their output very broad. The SWAT model outputs were modified to produce MOHID-compliant result files (time series and HDF). These changes maintained the integrity of the original model, thus guaranteeing that results remain equal to those of the original version of SWAT. This allowed results to be output in MOHID format, making it possible to process them immediately with MOHID visualization and data analysis tools (Chambel-Leitão et al., 2007; Trancoso et al., 2009). Because SWAT was modified to produce result files in HDF5 format, watershed properties modeled by SWAT can be visualized in animated maps using MOHID GIS. The modified version of SWAT described here has been applied in various national and European projects. Results of applying this modified version of SWAT to estimate hydrology and nutrient loads to estuaries and water bodies will be shown (Chambel-Leitão, 2008; Yarrow & Chambel-Leitão, 2008; Chambel-Leitão et al., 2008; Yarrow & Chambel-Leitão, 2007; Coelho et al., 2008). Keywords: Watershed models, SWAT, MOHID LAND, Hydrology, Nutrient Loads. Arnold, J. G. and Fohrer, N. (2005). SWAT2000: current capabilities and research opportunities in applied watershed modeling. Hydrol. Process. 19, 563
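
    A hedged sketch of the output step described above, writing a simulated time series to an HDF5 file so that visualization tools can pick it up. The group and dataset names are assumptions for illustration only and do not reproduce the actual MOHID-compliant layout.

```python
# Write a daily runoff series to HDF5 with h5py (illustrative layout, assumed names).
import numpy as np
import h5py

days = np.arange("2003-01-01", "2003-01-11", dtype="datetime64[D]")
runoff_mm = np.array([1.2, 0.8, 0.0, 3.4, 5.1, 2.2, 0.9, 0.0, 0.3, 1.7])

with h5py.File("watershed_results.h5", "w") as f:
    grp = f.create_group("Results/Runoff")                      # assumed group path
    grp.create_dataset("values_mm", data=runoff_mm)
    dates = np.array([str(d) for d in days], dtype="S10")       # dates as fixed-length bytes
    grp.create_dataset("dates", data=dates)
    grp["values_mm"].attrs["units"] = "mm/day"
```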

  2. SIMULATION TOOLS FOR ELECTRICAL MACHINES MODELLING ...

    African Journals Online (AJOL)

    Dr Obe

    ABSTRACT. Simulation tools are used both for research and teaching to allow a good ... The solution provides an easy way of determining the dynamic .... incorporate an in-built numerical algorithm, ... to learn, versatile in application, enhanced.

  3. Metadata and Tools for Integration and Preservation of Cultural Heritage 3D Information

    Directory of Open Access Journals (Sweden)

    Achille Felicetti

    2011-12-01

    Full Text Available In this paper we investigate many of the storage, portability and interoperability issues arising among archaeologists and cultural heritage people when dealing with 3D technologies. On the one side, the available digital repositories often seem unable to guarantee adequate features for the management of 3D models and their metadata; on the other side, most of the available data formats for 3D encoding seem unsatisfactory for the portability that 3D information nowadays requires across different systems. We propose a set of possible solutions to show how integration can be achieved through the use of well-known and widely accepted standards for data encoding and data storage. Using a set of 3D models acquired during various archaeological campaigns and a number of open source tools, we have implemented a straightforward encoding process to generate meaningful semantic data and metadata. We also present the interoperability process carried out to integrate the encoded 3D models and the geographic features produced by the archaeologists. Finally, we report the preliminary (rather encouraging) development of a semantically enabled and persistent digital repository, where 3D models (but also any kind of digital data) and metadata can easily be stored, retrieved and shared with the content of other digital archives.

  4. MEASURING INFORMATION INTEGRATION MODEL FOR CAD/CMM

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    A CAD/CMM workpiece modeling system based on the IGES file is proposed. The modeling system is implemented using a new method for labelling the tolerance items of a 3D workpiece, built around the concept of the "feature face". First, the CAD data of the workpiece are extracted and recognized automatically. Then a workpiece model is generated, which is the integration of the pure 3D geometric form with its corresponding inspection items. The principle of workpiece modeling is also presented. Finally, the experimental results are shown and the correctness of the model is verified.

  5. Property Integration: Componentless Design Techniques and Visualization Tools

    DEFF Research Database (Denmark)

    El-Halwagi, Mahmoud M; Glasgow, I.M.; Eden, Mario Richard

    2004-01-01

    Property integration is defined as a functionality-based, holistic approach to the allocation and manipulation of streams and processing units, which is based on tracking, adjusting, assigning, and matching functionalities throughout the process. Revised lever arm rules are devised to allow optimal allocation while maintaining intra- and interstream conservation of the property-based clusters. The property integration problem is mapped into the cluster domain. This dual problem is solved in terms of clusters and then mapped to the primal problem in the property domain. Several new rules are derived for graphical techniques, particularly systematic rules and visualization techniques for the identification of optimal mixing of streams and their allocation to units. Furthermore, a derivation of the correspondence between clustering arms and fractional contribution of streams is presented. This correspondence

  6. Advanced ion trap structures with integrated tools for qubit manipulation

    Science.gov (United States)

    Sterk, J. D.; Benito, F.; Clark, C. R.; Haltli, R.; Highstrete, C.; Nordquist, C. D.; Scott, S.; Stevens, J. E.; Tabakov, B. P.; Tigges, C. P.; Moehring, D. L.; Stick, D.; Blain, M. G.

    2012-06-01

    We survey the ion trap fabrication technologies available at Sandia National Laboratories. These include four metal layers, precision backside etching, and low profile wirebonds. We demonstrate loading of ions in a variety of ion traps that utilize these technologies. Additionally, we present progress towards integration of on-board filtering with trench capacitors, photon collection via an optical cavity, and integrated microwave electrodes for localized hyperfine qubit control and magnetic field gradient quantum gates. [4pt] This work was supported by Sandia's Laboratory Directed Research and Development (LDRD) Program and the Intelligence Advanced Research Projects Activity (IARPA). Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the US Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  7. Making eco logic and models work : An integrative approach to lake ecosystem modelling

    NARCIS (Netherlands)

    Kuiper, Jan Jurjen

    2016-01-01

    Dynamical ecosystem models are important tools that can help ecologists understand complex systems, and turn understanding into predictions of how these systems respond to external changes. This thesis revolves around PCLake, an integrated ecosystem model of shallow lakes that is used by both

  8. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, the Joule-Thomson valve, the turbo-expander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction of the plant considering the constraints on other parameters. The analysis results so obtained give a clear idea for deciding various parameter values before implementation of the actual plant in the field. It also gives an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient and productive plant.
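
    As background to the liquefaction analysis described above, the sketch below evaluates the textbook energy balance of a simple Linde-Hampson liquefier, y = (h1 - h2) / (h1 - hf), where h1 and h2 are the gas enthalpies at ambient temperature at the low and high pressures and hf is the saturated-liquid enthalpy. The enthalpy values are illustrative placeholders, not results from the Aspen HYSYS study.

```python
# Liquid yield of an ideal Linde-Hampson cycle from an enthalpy balance
# (placeholder enthalpies in kJ/kg, roughly representative of air).
def liquid_yield(h1_kj_per_kg: float, h2_kj_per_kg: float, hf_kj_per_kg: float) -> float:
    """Fraction of the compressed gas that leaves the cycle as liquid."""
    return (h1_kj_per_kg - h2_kj_per_kg) / (h1_kj_per_kg - hf_kj_per_kg)

if __name__ == "__main__":
    y = liquid_yield(h1_kj_per_kg=426.0, h2_kj_per_kg=390.0, hf_kj_per_kg=-120.0)
    print(f"liquid fraction ~ {y:.3f}")   # about 0.066 for these placeholder values
```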

  9. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, the Joule-Thomson valve, the turbo-expander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction of the plant considering the constraints on other parameters. The analysis results so obtained give a clear idea for deciding various parameter values before implementation of the actual plant in the field. It also gives an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient and productive plant

  10. Integrating Thermal Tools Into the Mechanical Design Process

    Science.gov (United States)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  11. The integration of FMEA with other problem solving tools: A review of enhancement opportunities

    Science.gov (United States)

    Ng, W. C.; Teh, S. Y.; Low, H. C.; Teoh, P. C.

    2017-09-01

    Failure Mode and Effect Analysis (FMEA) is one of the most effective and widely accepted problem solving (PS) tools in companies around the world. Since FMEA was first introduced in 1949, practitioners have implemented it in various industries for their quality improvement initiatives. However, studies have shown that there are drawbacks that hinder the effectiveness of FMEA for continuous quality improvement from product design to manufacturing. Therefore, FMEA is integrated with other PS tools such as the inventive problem solving methodology (TRIZ), Quality Function Deployment (QFD), Root Cause Analysis (RCA) and the seven basic tools of quality to address these drawbacks. This study begins by identifying the drawbacks in FMEA. A comprehensive literature review on the integration of FMEA with other tools is then carried out to categorise the integrations based on the drawbacks identified. The three categories are inefficiency of failure analysis, psychological inertia and neglect of the customers' perspective. This study concludes by discussing the gaps and opportunities in the integration for future research.
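
    For readers unfamiliar with the FMEA step that the integrations above build on, the sketch below computes and ranks Risk Priority Numbers (severity x occurrence x detection). The failure modes and scores are invented for illustration and are not taken from the paper.

```python
# Rank hypothetical failure modes by Risk Priority Number (RPN = S * O * D).
failure_modes = [
    {"mode": "seal leakage",   "severity": 8, "occurrence": 4, "detection": 6},
    {"mode": "sensor drift",   "severity": 5, "occurrence": 7, "detection": 3},
    {"mode": "fastener crack", "severity": 9, "occurrence": 2, "detection": 8},
]

for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

# Highest RPN first: these are the modes a team would attack with TRIZ/QFD/RCA.
for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
    print(f"{fm['mode']:<15} RPN={fm['rpn']}")
```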

  12. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    National Research Council Canada - National Science Library

    Petre, P; Visher, J; Shringarpure, R; Valley, F; Swaminathan, M

    2005-01-01

    Automated design tools and integrated design flow methodologies were developed that demonstrated more than an order-of-magnitude reduction in cycle time and cost for mixed signal (digital/analog/RF...

  13. Teacher Models of Technology Integration.

    Science.gov (United States)

    Peterman, Leinda

    2003-01-01

    Provides examples of best practices in technology integration from five Technology Innovation Challenge Grant (TICG) programs, funded through the Department of Education to meet the No Child Left Behind technology goals. Highlights include professional development activities in Louisiana and New Mexico; collaborative learning applications; and…

  14. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...

  15. The Innsbruck/ESO sky models and telluric correction tools*

    Directory of Open Access Journals (Sweden)

    Kimeswenger S.

    2015-01-01

    While the ground based astronomical observatories just have to correct for the line-of-sight integral of these effects, the Čerenkov telescopes use the atmosphere as the primary detector. The measured radiation originates at lower altitudes and does not pass through the entire atmosphere. Thus, a decent knowledge of the profile of the atmosphere at any time is required. The latter cannot be achieved by photometric measurements of stellar sources. We show here the capabilities of our sky background model and data reduction tools for ground-based optical/infrared telescopes. Furthermore, we discuss the feasibility of monitoring the atmosphere above any observing site, and thus, the possible application of the method for Čerenkov telescopes.

  16. Integrated simulation tools for collimation cleaning in HL-LHC

    CERN Document Server

    Bruce, R; Cerutti, F; Ferrari, A; Lechner, A; Marsili, A; Mirarchi, D; Ortega, P G; Redaelli, S; Rossi, A; Salvachua, B; Sinuela, D P; Tambasco, C; Vlachoudis, V; Mereghetti, A; Assmann, R; Lari, L; Gibson, S M; Nevay, LJ; Appleby, R B; Molson, J; Serluca, M; Barlow, R J; Rafique, H; Toader, A

    2014-01-01

    The Large Hadron Collider is designed to accommodate an unprecedented stored beam energy of 362 MJ in the nominal configuration and about the double in the high-luminosity upgrade HL-LHC that is presently under study. This requires an efficient collimation system to protect the superconducting magnets from quenches. During the design, it is therefore very important to accurately predict the expected beam loss distributions and cleaning efficiency. For this purpose, there are several ongoing efforts in improving the existing simulation tools or developing new ones. This paper gives a brief overview and status of the different available codes.

  17. Picante: R tools for integrating phylogenies and ecology.

    Science.gov (United States)

    Kembel, Steven W; Cowan, Peter D; Helmus, Matthew R; Cornwell, William K; Morlon, Helene; Ackerly, David D; Blomberg, Simon P; Webb, Campbell O

    2010-06-01

    Picante is a software package that provides a comprehensive set of tools for analyzing the phylogenetic and trait diversity of ecological communities. The package calculates phylogenetic diversity metrics, performs trait comparative analyses, manipulates phenotypic and phylogenetic data, and performs tests for phylogenetic signal in trait distributions, community structure and species interactions. Picante is a package for the R statistical language and environment written in R and C, released under a GPL v2 open-source license, and freely available on the web (http://picante.r-forge.r-project.org) and from CRAN (http://cran.r-project.org).

  18. Integrating information technologies as tools for surgical research.

    Science.gov (United States)

    Schell, Scott R

    2005-10-01

    Surgical research is dependent upon information technologies. Selection of the computer, operating system, and software tool that best support the surgical investigator's needs requires careful planning before research commences. This manuscript presents a brief tutorial on how surgical investigators can best select these information technologies, with comparisons and recommendations between existing systems, software, and solutions. Privacy concerns, based upon HIPAA and other regulations, now require careful proactive attention to avoid legal penalties, civil litigation, and financial loss. Security issues are included as part of the discussions related to selection and application of information technology. This material was derived from a segment of the Association for Academic Surgery's Fundamentals of Surgical Research course.

  19. An Integrated Tool for Low Thrust Optimal Control Orbit Transfers in Interplanetary Trajectories

    Science.gov (United States)

    Dargent, T.; Martinot, V.

    In recent years significant progress has been made in optimal control orbit transfers using low thrust electrical propulsion for interplanetary missions. The system objective is always the same: decrease the transfer duration and increase the useful satellite mass. The optimum control strategy for a minimum-time or minimum-fuel transfer requires the use of sophisticated mathematical tools, most of the time dedicated to a specific mission and therefore hardly reusable. To improve this situation and enable Alcatel Space to perform rather quick trajectory design as requested by mission analysis, we have developed a software tool, T-3D, dedicated to optimal control orbit transfers which integrates various initial and terminal rendezvous conditions - e.g. fixed arrival time for planet encounter - and engine thrust profiles - e.g. thrust law variation with respect to the distance to the Sun. This single and quite versatile tool makes it possible to perform analyses such as minimum consumption for orbit insertion around a planet from a hyperbolic trajectory, interplanetary orbit transfers, low thrust minimum time multiple revolution orbit transfers, etc. From a mathematical point of view, the software relies on the minimum principle formulation to find the necessary conditions of optimality. The satellite dynamics is a two-body model and relies on an equinoctial formulation of the Gauss equations. This choice has been made for numerical reasons and to solve the two-point boundary value problem more quickly. In order to handle the classical problem of co-state variable initialization, problems simpler than the actual one can be solved straightforwardly by the tool and the values of the co-state variables are kept as a first guess for a more complex problem. Finally, a synthesis of the test cases is presented to illustrate the capabilities of the tool, mixing examples of interplanetary missions, orbit insertions, and multiple revolution orbit transfers

  20. Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration.

    Science.gov (United States)

    Cosco, Francesco; Garre, Carlos; Bruno, Fabio; Muzzupappa, Maurizio; Otaduy, Miguel A

    2013-01-01

    Visuo-haptic mixed reality consists of adding to a real scene the ability to see and touch virtual objects. It requires the use of see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. Unfortunately, the use of commodity haptic devices poses obstruction and misalignment issues that complicate the correct integration of a virtual tool and the user's real hand in the mixed reality scene. In this work, we propose a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, using commodity haptic devices, and with a visually consistent integration of the user's hand and the virtual tool. We discuss the visual obstruction and misalignment issues introduced by commodity haptic devices, and then propose a solution that relies on four simple technical steps: color-based segmentation of the hand, tracking-based segmentation of the haptic device, background repainting using image-based models, and misalignment-free compositing of the user's hand. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects and interact with them in the context of a real scene, and we have evaluated the impact on user performance of obstruction and misalignment correction.

  1. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  2. Integration of facility modeling capabilities for nuclear nonproliferation analysis

    International Nuclear Information System (INIS)

    Garcia, Humberto; Burr, Tom; Coles, Garill A.; Edmunds, Thomas A.; Garrett, Alfred; Gorensek, Maximilian; Hamm, Luther; Krebs, John; Kress, Reid L.; Lamberti, Vincent; Schoenwald, David; Tzanos, Constantine P.; Ward, Richard C.

    2012-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  3. Integration Of Facility Modeling Capabilities For Nuclear Nonproliferation Analysis

    International Nuclear Information System (INIS)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.
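
    A minimal, purely illustrative sketch of the inverse problem these records describe: inferring a monitored facility's operating state from a few observations via a simple Bayesian update. The modes, observables and likelihood values below are invented and do not come from the papers.

```python
# Infer a hypothetical facility operating mode from observations (assumed likelihoods).
OBS_LIKELIHOOD = {
    # P(observation | facility mode)
    "normal_ops":  {"high_power": 0.7, "truck_traffic": 0.3, "stack_plume": 0.6},
    "maintenance": {"high_power": 0.1, "truck_traffic": 0.6, "stack_plume": 0.2},
    "undeclared":  {"high_power": 0.8, "truck_traffic": 0.8, "stack_plume": 0.7},
}

def infer(observations, prior=None):
    """Sequential Bayesian update assuming conditionally independent observations."""
    modes = list(OBS_LIKELIHOOD)
    post = dict(prior or {m: 1.0 / len(modes) for m in modes})
    for obs in observations:
        for m in modes:
            post[m] *= OBS_LIKELIHOOD[m][obs]
        total = sum(post.values())
        post = {m: p / total for m, p in post.items()}
    return post

if __name__ == "__main__":
    print(infer(["high_power", "truck_traffic"]))
```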

  4. An Integrated Framework to Specify Domain-Specific Modeling Languages

    DEFF Research Database (Denmark)

    Zarrin, Bahram; Baumeister, Hubert

    2018-01-01

    In this paper, we propose an integrated framework that can be used by DSL designers to implement their desired graphical domain-specific languages. This framework relies on Microsoft DSL Tools, a meta-modeling framework to build graphical domain-specific languages, and an extension of ForSpec, a logic-based specification language. The drawback of MS DSL Tools is it does not provide a formal and rigorous approach for semantics specifications. In this framework, we use Microsoft DSL Tools to define the metamodel and graphical notations of DSLs, and an extended version of ForSpec as a formal language to define their semantics. Integrating these technologies under the umbrella of Microsoft Visual Studio IDE allows DSL designers to utilize a single development environment for developing their desired domain-specific languages.

  5. Modeling and Tool Wear in Routing of CFRP

    International Nuclear Information System (INIS)

    Iliescu, D.; Fernandez, A.; Gutierrez-Orrantia, M. E.; Lopez de Lacalle, L. N.; Girot, F.

    2011-01-01

    This paper presents the prediction and evaluation of the feed force in routing of carbon composite material. In order to extend tool life and improve the quality of the machined surface, a better understanding of uncoated and coated tool behaviors is required. This work describes (1) the optimization of the geometry of multiple-tooth tools to minimize the tool wear and the feed force, (2) the optimization of the tool coating and (3) the development of a phenomenological model between the feed force, the routing parameters and the tool wear. The experimental results indicate that the feed rate, the cutting speed and the tool wear are the most significant factors affecting the feed force. In the case of multiple-tooth tools, a particular geometry with 14 teeth right helix right cut and 11 teeth left helix right cut gives the best results. A thick AlTiN coating or a diamond coating can dramatically improve the tool life while minimizing the axial force, roughness and delamination. A wear model has then been developed based on an abrasive behavior of the tool. The model links the feed force to the tool geometry parameters (tool diameter), to the process parameters (feed rate, cutting speed and depth of cut) and to the wear. The model presented has been verified by experimental tests.
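
    A hedged sketch of how a phenomenological feed-force model of the kind described above might be fitted by least squares. The linear form, the data points and the resulting coefficients are invented for illustration and do not reproduce the paper's model.

```python
# Fit F = a0 + a1*feed + a2*speed + a3*wear to made-up routing measurements.
import numpy as np

# columns: feed rate (mm/min), cutting speed (m/min), flank wear (mm)
X = np.array([
    [500.0, 150.0, 0.05],
    [800.0, 150.0, 0.10],
    [500.0, 250.0, 0.15],
    [800.0, 250.0, 0.20],
    [650.0, 200.0, 0.12],
])
feed_force_N = np.array([38.0, 55.0, 46.0, 68.0, 52.0])

A = np.column_stack([np.ones(len(X)), X])          # add intercept column
coeffs, *_ = np.linalg.lstsq(A, feed_force_N, rcond=None)
print("fitted coefficients:", np.round(coeffs, 3))
```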

  6. Open source integrated modeling environment Delta Shell

    Science.gov (United States)

    Donchyts, G.; Baart, F.; Jagers, B.; van Putten, H.

    2012-04-01

    In the last decade, integrated modelling has become a very popular topic in environmental modelling, since it helps solve problems that are difficult to address with a single model. However, managing the complexity of integrated models and minimizing the time required for their setup remain challenging tasks. The integrated modelling environment Delta Shell simplifies this task. The software components of Delta Shell are easy to reuse separately from each other as well as part of an integrated environment that can run in a command-line or a graphical user interface mode. Most components of Delta Shell are developed using the C# programming language and include libraries used to define, save and visualize various scientific data structures as well as coupled model configurations. Here we present two examples showing how Delta Shell simplifies the process of setting up integrated models from the end user and developer perspectives. The first example shows coupling of a rainfall-runoff, a river flow and a run-time control model. The second example shows how a coastal morphological database integrates with the coastal morphological model (XBeach) and a custom nourishment designer. Delta Shell is also available as open-source software released under the LGPL license and accessible via http://oss.deltares.nl.
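
    A language-neutral sketch (written in Python here, although Delta Shell itself is largely C#-based) of the kind of time-stepped coupling in the first example above, with a rainfall-runoff component feeding a river-flow component each step. The component interfaces and formulas are invented for illustration.

```python
# Two toy model components exchanged data each time step (illustrative coupling only).
class RainfallRunoff:
    def __init__(self, runoff_coefficient=0.4):
        self.c = runoff_coefficient

    def step(self, rainfall_mm: float) -> float:
        return self.c * rainfall_mm           # runoff delivered to the river (mm)

class RiverFlow:
    def __init__(self, storage=0.0, recession=0.1):
        self.storage, self.k = storage, recession

    def step(self, inflow: float) -> float:
        self.storage += inflow
        outflow = self.k * self.storage       # simple linear-reservoir routing
        self.storage -= outflow
        return outflow

runoff_model, river_model = RainfallRunoff(), RiverFlow()
for rain in [0.0, 12.0, 5.0, 0.0, 20.0]:
    discharge = river_model.step(runoff_model.step(rain))
    print(f"rain={rain:>5.1f} mm  discharge={discharge:6.2f}")
```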

  7. Upgrade and integration of the configuration and monitoring tools for the ATLAS Online farm

    CERN Document Server

    Ballestrero, S; The ATLAS collaboration; Darlea, G L; Dumitru, I; Scannicchio, DA; Twomey, M S; Valsan, M L; Zaytsev, A

    2012-01-01

    The ATLAS Online farm is a non-homogeneous cluster of nearly 3000 PCs which run the data acquisition, trigger and control of the ATLAS detector. The systems are configured and monitored by a combination of open-source tools, such as Quattor and Nagios, and tools developed in-house, such as ConfDB. We report on the ongoing introduction of new provisioning and configuration tools, Puppet and ConfDB v2 which are more flexible and allow automation for previously uncovered needs, and on the upgrade and integration of the monitoring and alerting tools, including the interfacing of these with the TDAQ Shifter Assistant software and their integration with configuration tools. We discuss the selection of the tools and the assessment of their functionality and performance, and how they enabled the introduction of virtualization for selected services.

  8. Upgrade and integration of the configuration and monitoring tools for the ATLAS Online farm

    International Nuclear Information System (INIS)

    Ballestrero, S; Darlea, G–L; Twomey, M S; Brasolin, F; Dumitru, I; Valsan, M L; Scannicchio, D A; Zaytsev, A

    2012-01-01

    The ATLAS Online farm is a non-homogeneous cluster of nearly 3000 systems which run the data acquisition, trigger and control of the ATLAS detector. The systems are configured and monitored by a combination of open-source tools, such as Quattor and Nagios, and tools developed in-house, such as ConfDB. We report on the ongoing introduction of new provisioning and configuration tools, Puppet and ConfDB v2, which are more flexible and allow automation for previously uncovered needs, and on the upgrade and integration of the monitoring and alerting tools, including the interfacing of these with the TDAQ Shifter Assistant software and their integration with configuration tools. We discuss the selection of the tools and the assessment of their functionality and performance, and how they enabled the introduction of virtualization for selected services.

  9. Integrated management tool for controls software problems, requests and project tasking at SLAC

    International Nuclear Information System (INIS)

    Rogind, D.; Allen, W.; Colocho, W.; DeContreras, G.; Gordon, J.; Pandey, P.; Shoaee, H.

    2012-01-01

    The Accelerator Directorate (AD) Instrumentation and Controls (ICD) Software (SW) Department at SLAC, with its service center model, continuously receives engineering requests to design, build and support controls for accelerator systems lab-wide. Each customer request can vary in complexity from a small software engineering change to a major enhancement. SLAC's Accelerator Improvement Projects (AIPs), along with DOE Construction projects, also contribute heavily to the work load. The various customer requests and projects, paired with the ongoing operational maintenance and problem reports, place a demand on the department that consistently exceeds the capacity of available resources. A centralized repository - comprised of all requests, project tasks, and problems - available to physicists, operators, managers, and engineers alike, is essential to capture, communicate, prioritize, assign, schedule, track, and finally, commission all work components. The Software Department has recently integrated request / project tasking into SLAC's custom online problem tracking 'Comprehensive Accelerator Tool for Enhancing Reliability' (CATER) tool. This paper discusses the newly implemented software request management tool - the workload it helps to track, its structure, features, reports, work-flow and its many usages. (authors)

  10. Integration of Web 2.0 Tools in Learning a Programming Course

    Science.gov (United States)

    Majid, Nazatul Aini Abd

    2014-01-01

    Web 2.0 tools are expected to assist students to acquire knowledge effectively in their university environment. However, the lack of effort from lecturers in planning the learning process can make it difficult for the students to optimize their learning experiences. The aim of this paper is to integrate Web 2.0 tools with learning strategy in…

  11. Application of a faith-based integration tool to assess mental and physical health interventions.

    Science.gov (United States)

    Saunders, Donna M; Leak, Jean; Carver, Monique E; Smith, Selina A

    2017-01-01

    To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed.

  12. Business Model Innovation: An Integrative Conceptual Framework

    Directory of Open Access Journals (Sweden)

    Bernd Wirtz

    2017-01-01

    Full Text Available Purpose: The point of departure of this exploratory study is the gap between the increasing importance of business model innovation (BMI) in science and management and the limited conceptual assistance available. Therefore, the study identifies and explores scattered BMI insights and deduces them into an integrative framework to enhance our understanding about this phenomenon and to present a helpful guidance for researchers and practitioners. Design/Methodology/Approach: The study identifies BMI insights through a literature-based investigation and consolidates them into an integrative BMI framework that presents the key elements and dimensions of BMI as well as their presumed relationships. Findings: The study enhances our understanding about the key elements and dimensions of BMI, presents further conceptual insights into the BMI phenomenon, supplies implications for science and management, and may serve as a helpful guidance for future research. Practical Implications: The presented framework provides managers with a tool to identify critical BMI issues and can serve as a conceptual BMI guideline. Research limitations: Given the vast amount of academic journals, it is unlikely that every applicable scientific publication is included in the analysis. The illustrative examples are descriptive in nature, and thus do not provide empirical validity. Several implications for future research are provided. Originality/Value: The study’s main contribution lies in the unifying approach of the dispersed BMI knowledge. Since our understanding of BMI is still limited, this study should provide the necessary insights and conceptual assistance to further develop the concept and guide its practical application.

  13. Integrated Baseline System (IBS) Version 2.0: Models guide

    Energy Technology Data Exchange (ETDEWEB)

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Models Guide summarizes the IBS use of several computer models for predicting the results of emergency situations. These include models for predicting dispersion/doses of airborne contaminants, traffic evacuation, explosion effects, heat radiation from a fire, and siren sound transmission. The guide references additional technical documentation on the models when such documentation is available from other sources. The audience for this manual is chiefly emergency management planners and analysts, but also data managers and system managers.

  14. Student Model Tools Code Release and Documentation

    DEFF Research Database (Denmark)

    Johnson, Matthew; Bull, Susan; Masci, Drew

    of its strengths and areas of improvement (Section 6). Several key appendices are attached to this report including user manuals for teacher and students (Appendix 3). Fundamentally, all relevant information is included in the report for those wishing to do further development work with the tool...

  15. The Webinar Integration Tool: A Framework for Promoting Active Learning in Blended Environments

    Science.gov (United States)

    Lieser, Ping; Taf, Steven D.; Murphy-Hagan, Anne

    2018-01-01

    This paper describes a three-stage process of developing a webinar integration tool to enhance the interaction of teaching and learning in blended environments. In the context of medical education, we emphasize three factors of effective webinar integration in blended learning: fostering better solutions for faculty and students to interact…

  16. Using Modeling Tools to Better Understand Permafrost Hydrology

    Directory of Open Access Journals (Sweden)

    Clément Fabre

    2017-06-01

    Full Text Available Modification of the hydrological cycle and, subsequently, of other global cycles is expected in Arctic watersheds owing to global change. Future climate scenarios imply widespread permafrost degradation caused by an increase in air temperature, and the expected effect on permafrost hydrology is immense. This study aims at analyzing and quantifying the daily water transfer in the largest Arctic river system, the Yenisei River in central Siberia, Russia, partially underlain by permafrost. The semi-distributed SWAT (Soil and Water Assessment Tool) hydrological model has been calibrated and validated at a daily time step in historical discharge simulations for the 2003–2014 period. The model parameters have been adjusted to embrace the hydrological features of permafrost. SWAT is shown to be capable of estimating water fluxes at a daily time step, especially during unfrozen periods, once specific climatic and soil conditions adapted to a permafrost watershed are taken into account. The model simulates an average annual contribution to runoff of 263 millimeters per year (mm yr−1), distributed as 152 mm yr−1 (58%) of surface runoff, 103 mm yr−1 (39%) of lateral flow and 8 mm yr−1 (3%) of return flow from the aquifer. These results are integrated over a reduced basin area downstream from large dams and are closer to observations than previous modeling exercises.
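
    A quick sanity check of the water-balance figures quoted in this abstract (a minimal sketch using only the numbers reported above; not part of the published SWAT workflow):

```python
# Sanity check of the annual water balance reported for the Yenisei SWAT run.
# Values are taken directly from the abstract above; nothing else is assumed.
surface_runoff = 152   # mm per year
lateral_flow   = 103   # mm per year
return_flow    = 8     # mm per year

total = surface_runoff + lateral_flow + return_flow
print(f"Total contribution to runoff: {total} mm/yr")   # 263 mm/yr, as reported

for name, value in [("surface runoff", surface_runoff),
                    ("lateral flow", lateral_flow),
                    ("return flow", return_flow)]:
    print(f"{name}: {100 * value / total:.0f}% of total")  # ~58%, 39%, 3%
```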

  17. Integrating Philips' extreme UV source in the alpha-tools

    Science.gov (United States)

    Pankert, Joseph; Apetz, Rolf; Bergmann, Klaus; Derra, Guenther; Janssen, Maurice; Jonkers, Jeroen; Klein, Jurgen; Kruecken, Thomas; List, Andreas; Loeken, Michael; Metzmacher, Christof; Neff, Willi; Probst, Sven; Prummer, Ralph; Rosier, Oliver; Seiwert, Stefan; Siemons, Guido; Vaudrevange, Dominik; Wagemann, Dirk; Weber, Achim; Zink, Peter; Zitzen, Oliver

    2005-05-01

    The paper describes recent progress in the development of the Philips EUV source. Progress has been realized on many fronts: Integration of the source into a scanner has primarily been studied on the Xe source because it has a high degree of maturity. We report on integration with a collector, associated collector lifetime and optical characteristics. Collector lifetime in excess of 1 billion shots could be demonstrated. Next, an active dose control system was developed and tested on the Xe lamp. Resulting dose stability data are less than 0.2% for an exposure window of 100 pulses. The second part of the paper reports on progress in the development of the Philips Sn source. First, the details of the concept are described. It is based on a laser-triggered vacuum arc, an extension of previous designs. The source is fitted with rotating electrodes covered with a Sn film that is constantly regenerated. Hence, by the very design of the source, it is scalable to very high power levels and has fundamentally solved the notorious problem of electrode erosion. Power values of 260 W in 2π sr are reported, along with stable, long-life operation of the lamp. The paper also addresses the problem of debris generation and mitigation for the Sn source. The problem is attacked by a combined strategy of protecting the collector by traditional means (e.g. fields, foil traps) and of designing the gas atmosphere according to the principles of the well-known halogen cycles in incandescent lamps. These principles have been studied in the lighting industry for decades and rely on the very high vapor pressures of metal halides. Transferred to the Sn source, this approach allows tin residues that would otherwise irreversibly deposit on the collector to be pumped away.

  18. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    Science.gov (United States)

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.

  19. QFD: a methodological tool for integration of ergonomics at the design stage.

    Science.gov (United States)

    Marsot, Jacques

    2005-03-01

    As a marked increase in the number of musculoskeletal disorders was noted in many industrialized countries and more specifically in companies that require the use of hand tools, the French National Research and Safety Institute launched in 1999 a research program on the topic of integrating ergonomics into hand tool design. After a brief review of the problems of integrating ergonomics at the design stage, the paper shows how the "Quality Function Deployment" method has been applied to the design of a boning knife and it highlights the difficulties encountered. Then, it demonstrates how this method can be a methodological tool geared to greater ergonomics consideration in product design.

  20. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    Science.gov (United States)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extent of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. Therefore, it is of significant importance to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms combines low-level image processing and geospatial information handling tools with high-level workflows. In particular, two main products are released under the GPL license: the source code, oriented to developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: In the absence of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models such as the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools References [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30th January 2014
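
    As a hedged illustration of the kind of vulnerability proxy described above, the sketch below computes the built-up fraction of a classified land-cover raster with GDAL and NumPy. The file name and the class code for built-up pixels are hypothetical; the actual SENSUM/RASOR tools implement far richer, validated workflows.

```python
# Minimal sketch of a vulnerability-proxy computation: the fraction of built-up
# pixels in a classified land-cover raster. File name and class code are assumed.
import numpy as np
from osgeo import gdal

BUILT_UP_CLASS = 1  # assumed code for built-up pixels in the classification

ds = gdal.Open("landcover_classified.tif")        # hypothetical input raster
classes = ds.GetRasterBand(1).ReadAsArray()

built_up_fraction = np.mean(classes == BUILT_UP_CLASS)
print(f"Built-up fraction of the scene: {built_up_fraction:.2%}")
```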

  1. Social Ecological Model Analysis for ICT Integration

    Science.gov (United States)

    Zagami, Jason

    2013-01-01

    ICT integration of teacher preparation programmes was undertaken by the Australian Teaching Teachers for the Future (TTF) project in all 39 Australian teacher education institutions and highlighted the need for guidelines to inform systemic ICT integration approaches. A Social Ecological Model (SEM) was used to positively inform integration…

  2. Useful tools for non-linear systems: Several non-linear integral inequalities

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mohammadpour, A.; Mesiar, Radko; Vaezpour, M. S.

    2013-01-01

    Roč. 49, č. 1 (2013), s. 73-80 ISSN 0950-7051 R&D Projects: GA ČR GAP402/11/0378 Institutional support: RVO:67985556 Keywords : Monotone measure * Comonotone functions * Integral inequalities * Universal integral Subject RIV: BA - General Mathematics Impact factor: 3.058, year: 2013 http://library.utia.cas.cz/separaty/2013/E/mesiar-useful tools for non-linear systems several non-linear integral inequalities.pdf
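
    For orientation only, the classical Chebyshev integral inequality below is a standard prototype of the kind of result that inequalities for comonotone functions and universal integrals generalize; it is quoted as a textbook statement, not taken from the cited paper.

```latex
% Classical Chebyshev integral inequality (a prototype of the comonotone case):
% if f and g are both nondecreasing on [0,1], then
\[
  \int_0^1 f(x)\,g(x)\,dx \;\ge\;
  \left(\int_0^1 f(x)\,dx\right)\left(\int_0^1 g(x)\,dx\right).
\]
```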

  3. Using urban forest assessment tools to model bird habitat potential

    Science.gov (United States)

    Lerman, Susannah B.; Nislow, Keith H.; Nowak, David J.; DeStefano, Stephen; King, David I.; Jones-Farrand, D. Todd

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure the urban forest provides habitat to sustain bird and other wildlife populations. The primary goal of this study was to integrate wildlife suitability indices into an existing national urban forest assessment tool, i-Tree. We quantified available habitat characteristics of urban forests for ten northeastern U.S. cities, and summarized bird habitat relationships from the literature in terms of variables that were represented in the i-Tree datasets. With these data, we generated habitat suitability equations for nine bird species, representing a range of life history traits and conservation status, that predict habitat suitability based on i-Tree data. We applied these equations to the urban forest datasets to calculate the overall habitat suitability for each city and the habitat suitability for different types of land-use (e.g., residential, commercial, parkland) for each bird species. The proposed habitat models will help guide wildlife managers, urban planners, and landscape designers who require specific information, such as desirable habitat conditions within an urban management project, to help improve the suitability of urban forests for birds.
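
    The sketch below illustrates, in schematic form, how a habitat suitability index can be computed from i-Tree-style stand variables. The variable names and weights are hypothetical placeholders and are not the species-specific equations developed in the study.

```python
# Illustrative sketch only: a habitat suitability index (0-1) computed from
# i-Tree-style stand variables. Variable names and weights are hypothetical.
def habitat_suitability(percent_canopy_cover, native_tree_fraction, large_tree_density):
    """Combine normalized habitat variables into a 0-1 suitability score."""
    score = (0.5 * percent_canopy_cover / 100.0 +
             0.3 * native_tree_fraction +
             0.2 * min(large_tree_density / 25.0, 1.0))  # saturate at 25 large trees/ha
    return max(0.0, min(1.0, score))

# Example: a residential land-use class in one city (made-up values)
print(habitat_suitability(percent_canopy_cover=35,
                          native_tree_fraction=0.6,
                          large_tree_density=10))
```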

  4. Parametric design and analysis framework with integrated dynamic models

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    2014-01-01

    of building energy and indoor environment, are generally confined to late in the design process. Consequence based design is a framework intended for the early design stage. It involves interdisciplinary expertise that secures validity and quality assurance with a simulationist while sustaining autonomous...... control with the building designer. Consequence based design is defined by the specific use of integrated dynamic modeling, which includes the parametric capabilities of a scripting tool and building simulation features of a building performance simulation tool. The framework can lead to enhanced...

  5. An Integrated Tool for Calculating and Reducing Institution Carbon and Nitrogen Footprints

    Science.gov (United States)

    Galloway, James N.; Castner, Elizabeth A.; Andrews, Jennifer; Leary, Neil; Aber, John D.

    2017-01-01

    Abstract The development of nitrogen footprint tools has allowed a range of entities to calculate and reduce their contribution to nitrogen pollution, but these tools represent just one aspect of environmental pollution. For example, institutions have been calculating their carbon footprints to track and manage their greenhouse gas emissions for over a decade. This article introduces an integrated tool that institutions can use to calculate, track, and manage their nitrogen and carbon footprints together. It presents the methodology for the combined tool, describes several metrics for comparing institution nitrogen and carbon footprint results, and discusses management strategies that reduce both the nitrogen and carbon footprints. The data requirements for the two tools overlap substantially, although integrating the two tools does necessitate the calculation of the carbon footprint of food. Comparison results for five institutions suggest that the institution nitrogen and carbon footprints correlate strongly, especially in the utilities and food sectors. Scenario analyses indicate benefits to both footprints from a range of utilities and food footprint reduction strategies. Integrating these two footprints into a single tool will account for a broader range of environmental impacts, reduce data entry and analysis, and promote integrated management of institutional sustainability. PMID:29350217
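
    The following sketch illustrates the shared-data idea described above: the same activity data (food purchases, utility energy) feed both a nitrogen and a carbon footprint. All emission and loss factors are invented placeholders, not the tool's actual factors.

```python
# Minimal sketch of integrated N and C footprint accounting from shared inputs.
# All factor values below are hypothetical placeholders.
food_purchases_kg = {"beef": 12_000, "vegetables": 40_000}
electricity_kwh = 5_000_000

N_FACTORS_KG_N_PER_KG   = {"beef": 0.10, "vegetables": 0.01}   # assumed
C_FACTORS_KG_CO2_PER_KG = {"beef": 27.0, "vegetables": 2.0}    # assumed
GRID_KG_CO2_PER_KWH = 0.4                                      # assumed
GRID_KG_N_PER_KWH = 0.0002                                     # assumed

n_footprint = sum(m * N_FACTORS_KG_N_PER_KG[k] for k, m in food_purchases_kg.items()) \
              + electricity_kwh * GRID_KG_N_PER_KWH
c_footprint = sum(m * C_FACTORS_KG_CO2_PER_KG[k] for k, m in food_purchases_kg.items()) \
              + electricity_kwh * GRID_KG_CO2_PER_KWH

print(f"Nitrogen footprint: {n_footprint / 1000:.1f} t N")
print(f"Carbon footprint:   {c_footprint / 1000:.1f} t CO2e")
```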

  6. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  7. Storm Water Management Model Climate Adjustment Tool (SWMM-CAT)

    Science.gov (United States)

    The US EPA’s newest tool, the Stormwater Management Model (SWMM) – Climate Adjustment Tool (CAT) is meant to help municipal stormwater utilities better address potential climate change impacts affecting their operations. SWMM, first released in 1971, models hydrology and hydrauli...

  8. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia; Harmandaris, Vagelis; Katsoulakis, Markos A.; Plechac, Petr

    2015-01-01

    We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is considered by proposing parametrized coarse grained dynamics

  9. Offshore Wind Farm Clusters - Towards new integrated Design Tool

    DEFF Research Database (Denmark)

    Hasager, Charlotte Bay; Réthoré, Pierre-Elouan; Peña, Alfredo

    In EERA DTOC testing of existing wind farm wake models against four validation data test sets from large offshore wind farms is carried out. This includes Horns Rev-1 in the North Sea, Lillgrund in the Baltic Sea, Roedsand-2 in the Baltic Sea and from 10 large offshore wind farms in Northern Euro...

  10. Integrated tool for NPP lifetime management in Spain

    Energy Technology Data Exchange (ETDEWEB)

    Francia, L. [UNESA, Madrid (Spain)]; Lopez de Santa Maria, J. [ASCO-Vandellos 2 NPPs, l'Hospitalet de l'Infant, Tarragona (Spain)]; Cardoso, A. [Tecnatom SA, Madrid (Spain)]

    2001-07-01

    The project for the Integrated Nuclear Power Plant Lifetime Management System SIGEVI (Sistema Integrado de GEstion de VIda de Centrales Nucleares) was initiated in April 1998 and finalized in December 2000, the main objective of the project being to develop a computer application facilitating the assessment of the condition and lifetime of nuclear power plant components. This constituted the second phase of a further-reaching project on NPP Lifetime Management. During the first phase of this project, carried out between 1992 and 1995, the methodology and strategy for the lifetime management of the Spanish NPP's were developed. Among others, degradation phenomena were assessed and the most adequate methods for their monitoring were defined. The SIGEVI Project has been performed under the management of UNESA (Spanish Electricity Association) and with the collaboration of different engineering firms and research institutes (Tecnatom, Empresarios Agrupados, Ufisa, Initec and IIT), with Vandellos II as the pilot plant. The rest of the Spanish NPP's have also actively participated through the Project Steering Committee. The following sections describe the scope, the structure and the main functionalities of the system SIGEVI. (authors)

  11. Integrated tool for NPP lifetime management in Spain

    International Nuclear Information System (INIS)

    Francia, L.; Lopez de Santa Maria, J.; Cardoso, A.

    2001-01-01

    The project for the Integrated Nuclear Power Plant Lifetime Management System SIGEVI (Sistema Integrado de GEstion de VIda de Centrales Nucleares) was initiated in April 1998 and finalized in December 2000, the main objective of the project being to develop a computer application facilitating the assessment of the condition and lifetime of nuclear power plant components. This constituted the second phase of a further-reaching project on NPP Lifetime Management. During the first phase of this project, carried out between 1992 and 1995, the methodology and strategy for the lifetime management of the Spanish NPP's were developed. Among others, degradation phenomena were assessed and the most adequate methods for their monitoring were defined. The SIGEVI Project has been performed under the management of UNESA (Spanish Electricity Association) and with the collaboration of different engineering firms and research institutes (Tecnatom, Empresarios Agrupados, Ufisa, Initec and IIT), with Vandellos II as the pilot plant. The rest of the Spanish NPP's have also actively participated through the Project Steering Committee. The following sections describe the scope, the structure and the main functionalities of the system SIGEVI. (authors)

  12. MARKET EVALUATION MODEL: TOOL FOR BUSINESS DECISIONS

    OpenAIRE

    Porlles Loarte, José; Yenque Dedios, Julio; Lavado Soto, Aurelio

    2014-01-01

    In the present work, the concepts of potential market and global market are analyzed as the basis for strategic market decisions with a long-term perspective, when the establishment of a business in a certain geographic area is evaluated. On this conceptual frame, a methodological tool is proposed to evaluate a commercial decision, taking as reference the case of the brewing industry in Peru, considering that this industry faces in the region entrepreneurial reorderings withi...

  13. Integrable lattice models and quantum groups

    International Nuclear Information System (INIS)

    Saleur, H.; Zuber, J.B.

    1990-01-01

    These lectures aim at introducing some basic algebraic concepts on lattice integrable models, in particular quantum groups, and to discuss some connections with knot theory and conformal field theories. The list of contents is: Vertex models and Yang-Baxter equation; Quantum sl(2) algebra and the Yang-Baxter equation; U q sl(2) as a symmetry of statistical mechanical models; Face models; Face models attached to graphs; Yang-Baxter equation, braid group and link polynomials

  14. Slab2 - Updated Subduction Zone Geometries and Modeling Tools

    Science.gov (United States)

    Moore, G.; Hayes, G. P.; Portner, D. E.; Furtney, M.; Flamme, H. E.; Hearne, M. G.

    2017-12-01

    The U.S. Geological Survey database of global subduction zone geometries (Slab1.0) is a highly utilized dataset that has been applied to a wide range of geophysical problems. In 2017, these models have been improved and expanded upon as part of the Slab2 modeling effort. With a new data driven approach that can be applied to a broader range of tectonic settings and geophysical data sets, we have generated a model set that will serve as a more comprehensive, reliable, and reproducible resource for three-dimensional slab geometries at all of the world's convergent margins. The newly developed framework of Slab2 is guided by: (1) a large integrated dataset, consisting of a variety of geophysical sources (e.g., earthquake hypocenters, moment tensors, active-source seismic survey images of the shallow slab, tomography models, receiver functions, bathymetry, trench ages, and sediment thickness information); (2) a dynamic filtering scheme aimed at constraining incorporated seismicity to only slab related events; (3) a 3-D data interpolation approach which captures both high resolution shallow geometries and instances of slab rollback and overlap at depth; and (4) an algorithm which incorporates uncertainties of contributing datasets to identify the most probable surface depth over the extent of each subduction zone. Further layers will also be added to the base geometry dataset, such as historic moment release, earthquake tectonic provenance, and interface coupling. Along with access to several queryable data formats, all components have been wrapped into an open source library in Python, such that suites of updated models can be released as further data becomes available. This presentation will discuss the extent of Slab2 development, as well as the current availability of the model and modeling tools.

  15. Geoinformation Systems as a Tool of the Integrated Tourist Spaces Management

    Directory of Open Access Journals (Sweden)

    Kolesnikovich Victor

    2014-09-01

    Full Text Available Introduction. Tourist activity management currently requires the creation of special conditions for the development of integrated management tools based on a common information and analytical base. Material and methods. The architecture and content of geoinformation and hybrid information systems are oriented toward the use of Integrated Tourist Spaces Management (ITSM), which places specific demands on the features of the management model. The authors developed the concept of tourist space. The information and analytical system is used to create an information model of tourist space. The information support of the ITSM system is a kind of hybrid system: an expert system built on the basis of GIS. Results and conclusions. By means of GIS, the collection, storage, analysis and graphic visualization of spatial data and of the related information on the objects represented in the expert system are provided. The proposed approach leads to the formation of an information system and analytical support not only for human decision-making, but it also promotes the creation of new tourist products based on increasingly differentiated client requirements or on the ratio of price and quality (from the point of view of satisfying those requirements).

  16. Consequence Based Design. An approach for integrating computational collaborative models (Integrated Dynamic Models) in the building design phase

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    relies on various advancements in the area of integrated dynamic models. It also relies on the application and test of the approach in practice to evaluate the Consequence based design and the use of integrated dynamic models. As a result, the Consequence based design approach has been applied in five...... and define new ways to implement integrated dynamic models for the following project. In parallel, seven different developments of new methods, tools and algorithms have been performed to support the application of the approach. The developments concern: Decision diagrams – to clarify goals and the ability...... affect the design process and collaboration between building designers and simulationists. Within the limits of applying the approach of Consequence based design to five case studies, followed by documentation based on interviews, surveys and project related documentations derived from internal reports...

  17. Applying Modeling Tools to Ground System Procedures

    Science.gov (United States)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  18. MoGIRE: A Model for Integrated Water Management

    Science.gov (United States)

    Reynaud, A.; Leenhardt, D.

    2008-12-01

    Climate change and growing water needs have resulted, in many parts of the world, in water scarcity problems that must be managed by public authorities. Hence, policy-makers are increasingly asked to define and implement water allocation rules between competing users. This requires the development of new tools for designing those rules under various scenarios of context (climatic, agronomic, economic). While models have been developed for each type of water use, very few integrated frameworks link these different uses, although such an integrated approach is essential for designing regional water and land policies. The lack of such integrated models can be explained by the difficulty of integrating models developed by very different disciplines and by the problem of scale change (collecting data over large areas, arbitrating between the computational tractability of models and their level of aggregation). However, modelers are increasingly asked to deal with large basin scales while analyzing policy impacts at very detailed levels. These conflicting objectives require the development of new modeling tools. The CALVIN economically-driven optimization model developed for managing water in California is a good example of this type of framework, Draper et al. (2003). Recent reviews of the literature on integrated water management at the basin level include Letcher et al. (2007) or Cai (2008). We present here an original framework for integrated water management at the river basin scale called MoGIRE ("Modèle pour la Gestion Intégrée de la Ressource en Eau"). It is intended to optimize water use at the river basin level and to evaluate scenarios (agronomic, climatic or economic) for better planning of agricultural and non-agricultural water use. MoGIRE includes a nodal representation of the water network. Agricultural, urban and environmental water uses are also represented using mathematical programming and econometric approaches. The model then

  19. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools
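
    A minimal sketch of the software-connector idea described above: an adapter that maps fields of a source record onto reference-ontology terms using declarative transformation rules. Field names, ontology term identifiers and conversion rules here are hypothetical.

```python
# Illustrative connector: rewrite source fields into reference-ontology fields
# using declarative transformation rules. All names and term IDs are assumed.
TRANSFORMATION_RULES = {
    "gene_symbol": ("hgnc:symbol", lambda v: v.upper()),   # normalize gene symbols
    "expr_value":  ("onto:expression_level", float),       # hypothetical term; cast to float
    "sample":      ("onto:specimen", str),                 # hypothetical term
}

def connect(record: dict) -> dict:
    """Map a source record onto reference-ontology fields."""
    out = {}
    for source_field, (target_term, convert) in TRANSFORMATION_RULES.items():
        if source_field in record:
            out[target_term] = convert(record[source_field])
    return out

print(connect({"gene_symbol": "tp53", "expr_value": "7.3", "sample": "S1"}))
```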

  20. Water footprint as a tool for integrated water resources management

    Science.gov (United States)

    Aldaya, Maite; Hoekstra, Arjen

    2010-05-01

    together with the water footprint concept could thus provide an appropriate framework to support more optimal water management practices by informing production and trade decisions and the development and adoption of water efficient technology. In order to move towards better water governance however a further integration of water-related concerns into water-related sectoral policies is paramount. This will require a concerted effort by all stakeholders, the willingness to adopt a total resource view where water is seen as a key, cross-sectoral input for development and growth, a mix of technical approaches, and the courage to undertake and fund water sector reforms. We are convinced that the water footprint analysis can provide a sufficiently robust fact base for meaningful stakeholder dialogue and action towards solutions.

  1. Shape: A 3D Modeling Tool for Astrophysics.

    Science.gov (United States)

    Steffen, Wolfgang; Koning, Nicholas; Wenger, Stephan; Morisset, Christophe; Magnor, Marcus

    2011-04-01

    We present a flexible interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data type, and amount that is required for a reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model prediction with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been successfully used in a number of astrophysical research projects.

  2. Integrated Decision Tools for Sustainable Watershed/Ground Water and Crop Health using Predictive Weather, Remote Sensing, and Irrigation Decision Tools

    Science.gov (United States)

    Jones, A. S.; Andales, A.; McGovern, C.; Smith, G. E. B.; David, O.; Fletcher, S. J.

    2017-12-01

    US agricultural and Govt. lands have a unique co-dependent relationship, particularly in the Western US. More than 30% of all irrigated US agricultural output comes from lands sustained by the Ogallala Aquifer in the western Great Plains. Six US Forest Service National Grasslands reside within the aquifer region, consisting of over 375,000 ha (3,759 km2) of USFS managed lands. Likewise, National Forest lands are the headwaters to many intensive agricultural regions. Our Ogallala Aquifer team is enhancing crop irrigation decision tools with predictive weather and remote sensing data to better manage water for irrigated crops within these regions. An integrated multi-model software framework is used to link irrigation decision tools, resulting in positive management benefits on natural water resources. Teams and teams-of-teams can build upon these multi-disciplinary multi-faceted modeling capabilities. For example, the CSU Catalyst for Innovative Partnerships program has formed a new multidisciplinary team that will address "Rural Wealth Creation" focusing on the many integrated links between economic, agricultural production and management, natural resource availabilities, and key social aspects of govt. policy recommendations. By enhancing tools like these with predictive weather and other related data (like in situ measurements, hydrologic models, remotely sensed data sets, and (in the near future) linking to agro-economic and life cycle assessment models) this work demonstrates an integrated data-driven future vision of inter-meshed dynamic systems that can address challenging multi-system problems. We will present the present state of the work and opportunities for future involvement.

  3. An artificial intelligence tool for complex age-depth models

    Science.gov (United States)

    Bradley, E.; Anderson, K. A.; de Vesine, L. R.; Lai, V.; Thomas, M.; Nelson, T. H.; Weiss, I.; White, J. W. C.

    2017-12-01

    CSciBox is an integrated software system for age modeling of paleoenvironmental records. It incorporates an array of data-processing and visualization facilities, ranging from 14C calibrations to sophisticated interpolation tools. Using CSciBox's GUI, a scientist can build custom analysis pipelines by composing these built-in components or adding new ones. Alternatively, she can employ CSciBox's automated reasoning engine, Hobbes, which uses AI techniques to perform an in-depth, autonomous exploration of the space of possible age-depth models and presents the results—both the models and the reasoning that was used in constructing and evaluating them—to the user for her inspection. Hobbes accomplishes this using a rulebase that captures the knowledge of expert geoscientists, which was collected over the course of more than 100 hours of interviews. It works by using these rules to generate arguments for and against different age-depth model choices for a given core. Given a marine-sediment record containing uncalibrated 14C dates, for instance, Hobbes tries CALIB-style calibrations using a choice of IntCal curves, with reservoir age correction values chosen from the 14CHRONO database using the lat/long information provided with the core, and finally composes the resulting age points into a full age model using different interpolation methods. It evaluates each model—e.g., looking for outliers or reversals—and uses that information to guide the next steps of its exploration, and presents the results to the user in human-readable form. The most powerful of CSciBox's built-in interpolation methods is BACON, a Bayesian sedimentation-rate algorithm—a powerful but complex tool that can be difficult to use. Hobbes adjusts BACON's many parameters autonomously to match the age model to the expectations of expert geoscientists, as captured in its rulebase. It then checks the model against the data and iteratively re-calculates until it is a good fit to the data.
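
    The toy sketch below shows only the last step described above, composing dated horizons into an age-depth model by simple linear interpolation; the depths and ages are invented, and CSciBox/Hobbes use calibrated 14C dates and far more sophisticated interpolation methods (e.g., BACON).

```python
# Toy age-depth model: linear interpolation between dated horizons.
# Depths and ages below are invented for illustration only.
import numpy as np

depths_cm = np.array([10.0, 55.0, 120.0, 240.0])       # dated horizons (hypothetical)
ages_cal_bp = np.array([800.0, 2100.0, 4500.0, 9100.0])

query_depths = np.arange(0, 250, 10.0)
# Linear interpolation; depths outside the dated range are clamped to the endpoints.
modelled_ages = np.interp(query_depths, depths_cm, ages_cal_bp)

for d, a in zip(query_depths, modelled_ages):
    print(f"{d:5.0f} cm  ->  {a:7.0f} cal yr BP")
```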

  4. Integrated Exoplanet Modeling with the GSFC Exoplanet Modeling & Analysis Center (EMAC)

    Science.gov (United States)

    Mandell, Avi M.; Hostetter, Carl; Pulkkinen, Antti; Domagal-Goldman, Shawn David

    2018-01-01

    Our ability to characterize the atmospheres of extrasolar planets will be revolutionized by JWST, WFIRST and future ground- and space-based telescopes. In preparation, the exoplanet community must develop an integrated suite of tools with which we can comprehensively predict and analyze observations of exoplanets, in order to characterize the planetary environments and ultimately search them for signs of habitability and life. The GSFC Exoplanet Modeling and Analysis Center (EMAC) will be a web-accessible high-performance computing platform with science support for modelers and software developers to host and integrate their scientific software tools, with the goal of leveraging the scientific contributions from the entire exoplanet community to improve our interpretations of future exoplanet discoveries. Our suite of models will include stellar models, models for star-planet interactions, atmospheric models, planet system science models, telescope models, instrument models, and finally models for retrieving signals from observational data. By integrating this suite of models, the community will be able to self-consistently calculate the emergent spectra from the planet whether from emission, scattering, or in transmission, and use these simulations to model the performance of current and new telescopes and their instrumentation. The EMAC infrastructure will not only provide a repository for planetary and exoplanetary community models, modeling tools and intermodel comparisons, but it will include a "run-on-demand" portal with each software tool hosted on a separate virtual machine. The EMAC system will eventually include a means of running or “checking in” new model simulations that are in accordance with the community-derived standards. Additionally, the results of intermodel comparisons will be used to produce open source publications that quantify the model comparisons and provide an overview of community consensus on model uncertainties on the climates of

  5. Ontology modeling in physical asset integrity management

    CERN Document Server

    Yacout, Soumaya

    2015-01-01

    This book presents cutting-edge applications of, and up-to-date research on, ontology engineering techniques in the physical asset integrity domain. Through a survey of state-of-the-art theory and methods on ontology engineering, the authors emphasize essential topics including data integration modeling, knowledge representation, and semantic interpretation. The book also reflects novel topics dealing with the advanced problems of physical asset integrity applications such as heterogeneity, data inconsistency, and interoperability existing in design and utilization. With a distinctive focus on applications relevant in heavy industry, Ontology Modeling in Physical Asset Integrity Management is ideal for practicing industrial and mechanical engineers working in the field, as well as researchers and graduate students concerned with ontology engineering in physical systems life cycles. This book also: Introduces practicing engineers, research scientists, and graduate students to ontology engineering as a modeling techniqu...

  6. Open discovery: An integrated live Linux platform of Bioinformatics tools.

    Science.gov (United States)

    Vetrivel, Umashankar; Pilla, Kalabharath

    2008-01-01

    Historically, live Linux distributions for Bioinformatics have paved the way for portability of the Bioinformatics workbench in a platform-independent manner. Moreover, most of the existing live Linux distributions limit their usage to sequence analysis and basic molecular visualization programs and are devoid of data persistence. Hence, Open Discovery, a live Linux distribution, has been developed with the capability to perform complex tasks like molecular modeling, docking and molecular dynamics in a swift manner. Furthermore, it is also equipped with a complete sequence analysis environment and is capable of running Windows executable programs in a Linux environment. Open Discovery portrays the advanced customizable configuration of Fedora, with data persistency accessible via USB drive or DVD. Open Discovery is distributed free under the Academic Free License (AFL) and can be downloaded from http://www.OpenDiscovery.org.in.

  7. An integrated urban drainage system model for assessing renovation scheme.

    Science.gov (United States)

    Dong, X; Zeng, S; Chen, J; Zhao, D

    2012-01-01

    Due to sustained economic growth in China over the last three decades, urbanization has been on a rapidly expanding track. In recent years, regional industrial relocations have also accelerated across the country, from the east coast to the west inland. These changes have led to a large-scale redesign of urban infrastructures, including the drainage system. To steer the reconstructed infrastructures towards better sustainability, a tool is required for assessing the efficiency and environmental performance of different renovation schemes. This paper developed an integrated dynamic modeling tool, which consists of three models describing the sewer, the wastewater treatment plant (WWTP) and the receiving water body, respectively. Three auxiliary modules were also incorporated to conceptualize the model, calibrate the simulations, and analyze the results. The developed integrated modeling tool was applied to a case study in Shenzhen City, one of the most dynamic cities in China and one facing considerable environmental degradation challenges. The renovation scheme proposed to improve the environmental performance of Shenzhen City's urban drainage system was modeled and evaluated. The simulation results supplied some suggestions for the further improvement of the renovation scheme.
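
    A schematic sketch of the sewer-WWTP-river chain described above, with each sub-model reduced to a one-line placeholder; all coefficients are assumed values for illustration and do not come from the Shenzhen case study.

```python
# Schematic three-model chain: sewer -> WWTP -> receiving water.
# All coefficients below are assumed placeholder values.
def sewer(rainfall_mm, dry_weather_flow_m3):
    runoff = 0.6 * rainfall_mm * 1_000           # assumed runoff coefficient and area scaling
    return dry_weather_flow_m3 + runoff           # combined flow reaching the WWTP (m3/d)

def wwtp(inflow_m3, cod_in_mg_l, capacity_m3=50_000, removal=0.85):
    treated = min(inflow_m3, capacity_m3)
    overflow = inflow_m3 - treated                # untreated combined sewer overflow
    effluent_cod = cod_in_mg_l * (1 - removal)
    return treated, overflow, effluent_cod

def river_load_kg(treated, overflow, effluent_cod, cod_in_mg_l):
    return (treated * effluent_cod + overflow * cod_in_mg_l) / 1_000  # COD load to the river

t, o, cod = wwtp(sewer(rainfall_mm=20, dry_weather_flow_m3=30_000), cod_in_mg_l=400)
print(f"COD load to receiving water: {river_load_kg(t, o, cod, 400):.0f} kg/d")
```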

  8. MODexplorer: an integrated tool for exploring protein sequence, structure and function relationships.

    KAUST Repository

    Kosinski, Jan

    2013-02-08

    SUMMARY: MODexplorer is an integrated tool aimed at exploring the sequence, structural and functional diversity in protein families useful in homology modeling and in analyzing protein families in general. It takes as input either the sequence or the structure of a protein and provides alignments with its homologs along with a variety of structural and functional annotations through an interactive interface. The annotations include sequence conservation, similarity scores, ligand-, DNA- and RNA-binding sites, secondary structure, disorder, crystallographic structure resolution and quality scores of models implied by the alignments to the homologs of known structure. MODexplorer can be used to analyze sequence and structural conservation among the structures of similar proteins, to find structures of homologs solved in different conformational state or with different ligands and to transfer functional annotations. Furthermore, if the structure of the query is not known, MODexplorer can be used to select the modeling templates taking all this information into account and to build a comparative model. AVAILABILITY AND IMPLEMENTATION: Freely available on the web at http://modorama.biocomputing.it/modexplorer. Website implemented in HTML and JavaScript with all major browsers supported. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.

  9. MODexplorer: an integrated tool for exploring protein sequence, structure and function relationships.

    KAUST Repository

    Kosinski, Jan; Barbato, Alessandro; Tramontano, Anna

    2013-01-01

    SUMMARY: MODexplorer is an integrated tool aimed at exploring the sequence, structural and functional diversity in protein families useful in homology modeling and in analyzing protein families in general. It takes as input either the sequence or the structure of a protein and provides alignments with its homologs along with a variety of structural and functional annotations through an interactive interface. The annotations include sequence conservation, similarity scores, ligand-, DNA- and RNA-binding sites, secondary structure, disorder, crystallographic structure resolution and quality scores of models implied by the alignments to the homologs of known structure. MODexplorer can be used to analyze sequence and structural conservation among the structures of similar proteins, to find structures of homologs solved in different conformational state or with different ligands and to transfer functional annotations. Furthermore, if the structure of the query is not known, MODexplorer can be used to select the modeling templates taking all this information into account and to build a comparative model. AVAILABILITY AND IMPLEMENTATION: Freely available on the web at http://modorama.biocomputing.it/modexplorer. Website implemented in HTML and JavaScript with all major browsers supported. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.

  10. Integrating systems biology models and biomedical ontologies.

    Science.gov (United States)

    Hoehndorf, Robert; Dumontier, Michel; Gennari, John H; Wimalaratne, Sarala; de Bono, Bernard; Cook, Daniel L; Gkoutos, Georgios V

    2011-08-11

    Systems biology is an approach to biology that emphasizes the structure and dynamic behavior of biological systems and the interactions that occur within them. To succeed, systems biology crucially depends on the accessibility and integration of data across domains and levels of granularity. Biomedical ontologies were developed to facilitate such an integration of data and are often used to annotate biosimulation models in systems biology. We provide a framework to integrate representations of in silico systems biology with those of in vivo biology as described by biomedical ontologies and demonstrate this framework using the Systems Biology Markup Language. We developed the SBML Harvester software that automatically converts annotated SBML models into OWL and we apply our software to those biosimulation models that are contained in the BioModels Database. We utilize the resulting knowledge base for complex biological queries that can bridge levels of granularity, verify models based on the biological phenomenon they represent and provide a means to establish a basic qualitative layer on which to express the semantics of biosimulation models. We establish an information flow between biomedical ontologies and biosimulation models and we demonstrate that the integration of annotated biosimulation models and biomedical ontologies enables the verification of models as well as expressive queries. Establishing a bi-directional information flow between systems biology and biomedical ontologies has the potential to enable large-scale analyses of biological systems that span levels of granularity from molecules to organisms.

  11. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Directory of Open Access Journals (Sweden)

    Matthew B Biggs

    Full Text Available Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  12. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Science.gov (United States)

    Biggs, Matthew B; Papin, Jason A

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  13. New tools for aquatic habitat modeling

    Science.gov (United States)

    D. Tonina; J. A. McKean; C. Tang; P. Goodwin

    2011-01-01

    Modeling of aquatic microhabitat in streams has typically been done over short channel reaches using one-dimensional simulations, partly because of a lack of high-resolution subaqueous topographic data to better define model boundary conditions. The Experimental Advanced Airborne Research Lidar (EAARL) is an airborne aquatic-terrestrial sensor that allows simultaneous...

  14. Experiences in applying Bayesian integrative models in interdisciplinary modeling: the computational and human challenges

    DEFF Research Database (Denmark)

    Kuikka, Sakari; Haapasaari, Päivi Elisabet; Helle, Inari

    2011-01-01

    We review the experience obtained in using integrative Bayesian models in interdisciplinary analysis focusing on sustainable use of marine resources and environmental management tasks. We have applied Bayesian models to both fisheries and environmental risk analysis problems. Bayesian belief...... be time consuming and research projects can be difficult to manage due to unpredictable technical problems related to parameter estimation. Biology, sociology and environmental economics have their own scientific traditions. Bayesian models are becoming traditional tools in fisheries biology, where...

  15. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    Full Text Available The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  16. Modeling energy-economy interactions using integrated models

    International Nuclear Information System (INIS)

    Uyterlinde, M.A.

    1994-06-01

    Integrated models are defined as economic energy models that consist of several submodels, either coupled by an interface module, or embedded in one large model. These models can be used for energy policy analysis. Using integrated models yields the following benefits. They provide a framework in which energy-economy interactions can be better analyzed than in stand-alone models. Integrated models can represent both energy sector technological details, as well as the behaviour of the market and the role of prices. Furthermore, the combination of modeling methodologies in one model can compensate weaknesses of one approach with strengths of another. These advantages motivated this survey of the class of integrated models. The purpose of this literature survey therefore was to collect and to present information on integrated models. To carry out this task, several goals were identified. The first goal was to give an overview of what is reported on these models in general. The second one was to find and describe examples of such models. Other goals were to find out what kinds of models were used as component models, and to examine the linkage methodology. Solution methods and their convergence properties were also a subject of interest. The report has the following structure. In chapter 2, a 'conceptual framework' is given. In chapter 3 a number of integrated models is described. In a table, a complete overview is presented of all described models. Finally, in chapter 4, the report is summarized, and conclusions are drawn regarding the advantages and drawbacks of integrated models. 8 figs., 29 refs
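
    As a hedged illustration of coupling sub-models through an interface module, the sketch below iterates a toy economy module (energy demand as a function of price) against a toy energy module (price as a function of demand) until the exchanged variables converge. All functional forms and numbers are invented.

```python
# Toy illustration of energy-economy coupling through an interface module,
# iterated to a fixed point. Functional forms and numbers are invented.
def economy_energy_demand(price):          # demand falls as the energy price rises
    return 100.0 / (1.0 + 0.05 * price)

def energy_sector_price(demand):           # price rises as demand grows (rising supply curve)
    return 10.0 + 0.2 * demand

price = 30.0                               # initial guess handed to the interface module
for iteration in range(100):
    demand = economy_energy_demand(price)
    new_price = energy_sector_price(demand)
    if abs(new_price - price) < 1e-6:      # converged energy-economy equilibrium
        break
    price = new_price

print(f"Converged after {iteration} iterations: price={price:.2f}, demand={demand:.2f}")
```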

  17. Integrated Heat Air & Moisture Modeling and control

    NARCIS (Netherlands)

    Schijndel, van A.W.M.

    2007-01-01

    The paper presents a recently developed Heat Air & Moisture Laboratory in SimuLink. The simulation laboratory facilitates the integration of the following models: (1) a whole building model; (2) Heating Venting and Air-Conditioning and primary systems; (3) 2D indoor airflow, 3D Heat Air & Moisture

  18. Development of a generalized integral jet model

    DEFF Research Database (Denmark)

    Duijm, Nijs Jan; Kessler, A.; Markert, Frank

    2017-01-01

    Integral type models to describe stationary plumes and jets in cross-flows (wind) have been developed since about 1970. These models are widely used for risk analysis, to describe the consequences of many different scenarios. Alternatively, CFD codes are being applied, but computational requirements still limit the number of scenarios that can be dealt with using CFD only. The integral models, however, are not suited to handle transient releases, such as releases from pressurized equipment, where the initially high release rate decreases rapidly with time. Further, on gas ignition, a second model is needed to describe the rapid combustion of the flammable part of the plume (flash fire) and a third model has to be applied for the remaining jet fire. The objective of this paper is to describe the first steps of the development of an integral-type model describing the transient development...
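
    For illustration, the sketch below integrates the simplest possible integral jet model, a steady, non-buoyant round jet in still air with top-hat profiles and a constant entrainment coefficient; the source conditions and entrainment coefficient are assumed values, and the model developed in the paper additionally treats cross-flow, transients and combustion.

```python
# Minimal integral jet model: steady, non-buoyant round jet in still air,
# top-hat profiles, constant entrainment coefficient. Inputs are assumed values.
import numpy as np
from scipy.integrate import solve_ivp

rho = 1.2                                 # ambient density, kg/m3 (assumed)
alpha = 0.08                              # entrainment coefficient (typical literature value)
d0, u0 = 0.05, 100.0                      # source diameter (m) and exit velocity (m/s), assumed
m0 = rho * np.pi * (d0 / 2) ** 2 * u0     # initial mass flux (kg/s)
M0 = m0 * u0                              # momentum flux, conserved for a pure jet (N)

def dmds(s, m):
    u = M0 / m[0]                         # top-hat velocity from conserved momentum flux
    b = m[0] / np.sqrt(rho * np.pi * M0)  # jet radius from mass and momentum fluxes
    return [2 * np.pi * rho * alpha * b * u]   # entrainment hypothesis: dm/ds = 2*pi*b*rho*alpha*u

sol = solve_ivp(dmds, (0.0, 5.0), [m0], dense_output=True)
for s in (0.5, 1.0, 2.0, 5.0):
    dilution = sol.sol(s)[0] / m0
    print(f"s = {s:4.1f} m: mass flux = {dilution:5.1f} x source (dilution)")
```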

  19. Jack Human Modelling Tool: A Review

    Science.gov (United States)

    2010-01-01

    design and evaluation [8] and evolved into the Computerised Biomechanical Man Model (Combiman), shown in Figure 2. Combiman was developed at the...unrealistic arrangement of tetrahedra (Figure 7) to a highly realistic human model based on current anthropometric, anatomical and biomechanical data...has long legs and a short torso may find it difficult to adjust the seat and rudder pedals to achieve the required over the nose vision, reach to

  20. SQL Server 2012 data integration recipes solutions for integration services and other ETL tools

    CERN Document Server

    Aspin, Adam

    2012-01-01

    SQL Server 2012 Data Integration Recipes provides focused and practical solutions to real world problems of data integration. Need to import data into SQL Server from an outside source? Need to export data and send it to another system? SQL Server 2012 Data Integration Recipes has your back. You'll find solutions for importing from Microsoft Office data stores such as Excel and Access, from text files such as CSV files, from XML, from other database brands such as Oracle and MySQL, and even from other SQL Server databases. You'll learn techniques for managing metadata, transforming data to mee
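
    The book's recipes work through Integration Services and related ETL tools; as a rough, hedged illustration of the same CSV-to-table import task outside that toolchain, the following Python sketch assumes a reachable SQL Server instance, the ODBC Driver 17 driver, and a hypothetical customers.csv file and Customers table.

      # Hedged sketch only: connection string, file, and table names are hypothetical.
      import pandas as pd
      from sqlalchemy import create_engine

      engine = create_engine(
          "mssql+pyodbc://user:password@myserver/mydb"
          "?driver=ODBC+Driver+17+for+SQL+Server"
      )

      df = pd.read_csv("customers.csv")           # read source data from a text file
      df.to_sql("Customers", engine,              # write rows into a SQL Server table
                if_exists="append", index=False)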

  1. Model Fusion Tool - the Open Environmental Modelling Platform Concept

    Science.gov (United States)

    Kessler, H.; Giles, J. R.

    2010-12-01

    data to groundwater models but these models are only aimed at solving one specific part of the earth’s system, e.g. the flow of groundwater to an abstraction borehole or the availability of water for irrigation. Particular problems arise when model data from two or more disciplines are incompatible in terms of data formats, scientific concepts or language. Other barriers include the cultural segregation within and between science disciplines as well as impediments to data exchange due to ownership and copyright restrictions. OpenMI and GeoSciML are initiatives that are trying to overcome these barriers by building international communities that share vocabularies and data formats. This paper gives examples of the successful merging of geological and hydrological models from the UK and will introduce the vision of an open Environmental Modelling Platform which aims to link data, knowledge and concepts seamlessly to numerical process models. Last but not least there is an urgent need to create a Subsurface Information System akin to a Geographic Information System in which all results of subsurface modelling can be visualised and analysed in an integrated manner and thereby become useful for decision makers.
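
    To make the linking idea concrete, here is a deliberately simplified, hypothetical sketch of OpenMI-style component coupling (it is not the OpenMI API, and all quantities and units are invented): each model exposes a per-time-step update, and a driver hands the output of one discipline to the next.

      class GroundwaterModel:
          def __init__(self):
              self.head = 10.0                       # groundwater head [m]
          def update(self, recharge):                # recharge [mm/day] from another model
              self.head += 0.001 * recharge - 0.01   # toy storage/discharge balance
              return self.head

      class RechargeModel:
          def update(self, rainfall):                # rainfall [mm/day]
              return 0.3 * rainfall                  # toy infiltration fraction

      rainfall_series = [0.0, 5.0, 12.0, 3.0, 0.0]
      recharge_model, gw_model = RechargeModel(), GroundwaterModel()

      for rain in rainfall_series:                   # driver: run both components per step
          recharge = recharge_model.update(rain)     # output of one component ...
          head = gw_model.update(recharge)           # ... becomes input of the next
          print(f"rain={rain:5.1f}  recharge={recharge:5.2f}  head={head:6.3f}")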

  2. A comparison of tools for modeling freshwater ecosystem services.

    Science.gov (United States)

    Vigerstol, Kari L; Aukema, Juliann E

    2011-10-01

    Interest in ecosystem services has grown tremendously among a wide range of sectors, including government agencies, NGOs, and the business community. Ecosystem services entailing freshwater (e.g. flood control, the provision of hydropower, and water supply), as well as carbon storage and sequestration, have received the greatest attention in both scientific and on-the-ground applications. Given the newness of the field and the variety of tools for predicting water-based services, it is difficult to know which tools to use for different questions. There are two types of freshwater-related tools--traditional hydrologic tools and newer ecosystem services tools. Here we review two of the most prominent tools of each type and their possible applications. In particular, we compare the data requirements, ease of use, questions addressed, and interpretability of results among the models. We discuss the strengths, challenges and most appropriate applications of the different models. Traditional hydrological tools provide more detail whereas ecosystem services tools tend to be more accessible to non-experts and can provide a good general picture of these ecosystem services. We also suggest gaps in the modeling toolbox that would provide the greatest advances by improving existing tools. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory

    Science.gov (United States)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.
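
    A heavily simplified sketch of the kind of loop described above, with a numerical optimizer wrapped around coupled disciplinary analyses, might look as follows; the stub structural and thermal models, limits, and bounds are invented for illustration, whereas LLIMAS itself couples far richer commercial solvers.

      import numpy as np
      from scipy.optimize import minimize

      def structural_mass(x):          # x = [panel_thickness_mm, rib_spacing_mm]
          return 2.0 * x[0] + 0.05 * x[1]

      def thermal_deflection(x):       # toy thermo-elastic response (smaller is better)
          return 50.0 / (x[0] * np.sqrt(x[1]))

      def objective(x):                # minimize mass ...
          return structural_mass(x)

      constraints = [{"type": "ineq",  # ... subject to deflection <= 1.5 (arbitrary units)
                      "fun": lambda x: 1.5 - thermal_deflection(x)}]

      result = minimize(objective, x0=[5.0, 100.0], method="SLSQP",
                        bounds=[(1.0, 20.0), (50.0, 400.0)], constraints=constraints)
      print(result.x, result.fun)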

  4. An Assessment Tool to Integrate Sustainability Principles into the Global Supply Chain

    Directory of Open Access Journals (Sweden)

    María Jesús Muñoz-Torres

    2018-02-01

    Full Text Available The integration of sustainability principles into the assessment of companies along the supply chains is a growing research area. However, there is an absence of a generally accepted method to evaluate corporate sustainability performance (CSP), and the models and frameworks proposed by the literature present various important challenges to be addressed. A systematic literature review on the supply chain at the corporate level has been conducted, analyzing the main strengths and gaps in the sustainability assessment literature. Therefore, this paper aims to contribute to the development of this field by proposing an assessment framework a leading company can adopt to expand sustainability principles to the rest of the members of the supply chain. This proposal is based on best practices and integrates and shares efforts with key initiatives (for instance, the Organizational Environmental Footprint from the European Commission and the United Nations Environment Programme and Society of Environmental Toxicology and Chemistry, UNEP/SETAC); moreover, it overcomes important limitations of the current sustainability tools in a supply chain context consistent with the circular economy, the Sustainable Development Goals (SDGs), planetary boundaries, and social foundation requirements. The results obtained create, on the one hand, new opportunities for academics; and, on the other hand, in further research, the use of this framework could be a means of actively engaging companies in their supply chains and of achieving the implementation of practical and comprehensive CSP assessment.

  5. Opportunites for Integrated Landscape Planning – the Broker, the Arena, the Tool

    Directory of Open Access Journals (Sweden)

    Julia Carlsson

    2017-12-01

    Full Text Available As an integrated social and ecological system, the forest landscape includes multiple values. The need for a landscape approach in land use planning is being increasingly advocated in research, policy and practice. This paper explores how institutional conditions in the forest policy and management sector can be developed to meet demands for a multifunctional landscape perspective. Departing from obstacles recognised in collaborative planning literature, we build an analytical framework which is operationalised in a Swedish context at municipal level. Our case illustrating this is Vilhelmina Model Forest, where actual barriers and opportunities for a multiple-value landscape approach are identified through 32 semi-structured interviews displaying stakeholders' views on forest values, ownership rights and willingness to consider multiple values, forest policy and management premises, and collaboration. As an opportunity to overcome the barriers, we suggest and discuss three key components by which an integrated landscape planning approach could be realized in forest management planning: the need for a landscape coordinator (broker), the need for a collaborative forum (arena), and the development of the existing forest management plan into an advanced multifunctional landscape plan (tool).

  6. Methodology, Algorithms, and Emerging Tool for Automated Design of Intelligent Integrated Multi-Sensor Systems

    Directory of Open Access Journals (Sweden)

    Andreas König

    2009-11-01

    Full Text Available The emergence of novel sensing elements, computing nodes, wireless communication and integration technology provides unprecedented possibilities for the design and application of intelligent systems. Each new application system must be designed from scratch, employing sophisticated methods ranging from conventional signal processing to computational intelligence. Currently, a significant part of this overall algorithmic chain of the computational system model still has to be assembled manually by experienced designers in a time- and labor-consuming process. In this research work, this challenge is taken up, and a methodology and algorithms for the automated design of intelligent, integrated and resource-aware multi-sensor systems employing multi-objective evolutionary computation are introduced. The proposed methodology tackles the challenge of rapid prototyping of such systems under realization constraints and, additionally, includes features of system-instance-specific self-correction for sustained operation of a large volume and in a dynamically changing environment. The extension of these concepts to the reconfigurable hardware platform yields so-called self-x sensor systems, where x stands, e.g., for self-monitoring, -calibrating, -trimming, and -repairing/-healing. Selected experimental results prove the applicability and effectiveness of our proposed methodology and emerging tool. By our approach, competitive results were achieved with regard to classification accuracy, flexibility, and design speed under additional design constraints.
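
    The multi-objective selection step at the heart of such automated design can be illustrated with a minimal, self-contained Pareto filter; the candidate error/cost pairs below are invented, and a real evolutionary run would generate and re-evaluate such candidates over many generations.

      def dominates(a, b):
          """True if candidate a is no worse in all objectives and better in at least one."""
          return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

      def pareto_front(candidates):
          return [c for c in candidates
                  if not any(dominates(other, c) for other in candidates if other is not c)]

      # (classification_error, resource_cost) for hypothetical sensor-processing chains
      candidates = [(0.08, 120), (0.05, 300), (0.12, 60), (0.05, 250), (0.20, 40)]
      print(pareto_front(candidates))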

  7. Integrated Space Asset Management Database and Modeling

    Science.gov (United States)

    Gagliano, L.; MacLeod, T.; Mason, S.; Percy, T.; Prescott, J.

    The Space Asset Management Database (SAM-D) was implemented in order to effectively track known objects in space by ingesting information from a variety of databases and performing calculations to determine the expected position of the object at a specified time. While SAM-D performs this task very well, it is limited by technology and is not available outside of the local user base. Modeling and simulation can be powerful tools to exploit the information contained in SAM-D. However, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. A more capable data management infrastructure would extend SAM-D to support the larger data sets to be generated by the COI. A service-oriented architecture model will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and a user interface for visualizations. Based on a web-centric approach, the entire COI will be able to access the data and related analytics. In addition, tight control of the information sharing policy will increase confidence in the system, which would encourage industry partners to provide commercial data. SIMON is a Government off-the-Shelf information sharing platform in use throughout DoD and DHS information sharing and situational awareness communities. SIMON provides fine-grained control to data owners, allowing them to determine exactly how and when their data is shared. SIMON supports a micro-service approach to system development, meaning M&S and analytic services can be easily built or adapted. It is uniquely positioned to fill this need as an information-sharing platform with a proven track record of successful situational awareness system deployments. Combined with the integration of new and legacy M&S tools, a SIMON-based architecture will provide a robust SA environment for the NASA SA COI that can be extended and expanded indefinitely.

  8. Conceptual model of integrated apiarian consultancy

    OpenAIRE

    Bodescu, Dan; Stefan, Gavril; Paveliuc Olariu, Codrin; Magdici, Maria

    2010-01-01

    The socio-economic field researches have indicated the necessity of realizing an integrated consultancy service for beekeepers that will supply technical-economic solutions with a practical character for ensuring the lucrativeness and viability of the apiaries. Consequently, an integrated apiarian consultancy model has been built holding the following features: it realizes the diagnosis of the meliferous resources and supplies solutions for its optimal administration; it realizes the technica...

  9. Aligning building information model tools and construction management methods

    NARCIS (Netherlands)

    Hartmann, Timo; van Meerveld, H.J.; Vossebeld, N.; Adriaanse, Adriaan Maria

    2012-01-01

    Few empirical studies exist that can explain how different Building Information Model (BIM) based tool implementation strategies work in practical contexts. To help overcome this gap, this paper describes the implementation of two BIM based tools, the first, to support the activities at an

  10. Scratch as a Computational Modelling Tool for Teaching Physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  11. Advanced REACH tool: A Bayesian model for occupational exposure assessment

    NARCIS (Netherlands)

    McNally, K.; Warren, N.; Fransman, W.; Entink, R.K.; Schinkel, J.; Van Tongeren, M.; Cherrie, J.W.; Kromhout, H.; Schneider, T.; Tielemans, E.

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate
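
    As a hedged illustration of Bayesian exposure updating in the same spirit (this is not the ART model itself, and all numbers are hypothetical), one can combine a prior estimate of the geometric mean exposure with a few measurements under a log-normal assumption, using a conjugate normal update on the log scale.

      import math

      measurements = [0.8, 1.4, 0.6, 1.1]        # mg/m^3, hypothetical personal samples
      log_y = [math.log(v) for v in measurements]

      mu0, tau0 = math.log(1.0), 0.7             # prior: geometric mean 1.0 mg/m^3, prior sd on log scale
      sigma = 0.5                                # assumed known within-scenario log sd

      n = len(log_y)
      post_var = 1.0 / (1.0 / tau0**2 + n / sigma**2)
      post_mean = post_var * (mu0 / tau0**2 + sum(log_y) / sigma**2)

      print("posterior geometric mean exposure:", math.exp(post_mean))
      print("posterior sd (log scale):", math.sqrt(post_var))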

  12. Agent Based Modeling as an Educational Tool

    Science.gov (United States)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model based curriculum in the classroom given current and anticipated core and content standards. (Figure captions: simulation using California GIS data; simulation of high school student lunch popularity using an aerial photograph on top of a terrain value map.)
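
    A classroom-sized sketch of such a model (the rules and numbers below are invented, not the authors' curriculum) shows how little code is needed for students to watch population-level behaviour emerge from local rules: agents move randomly on a grid and a "trend" spreads by contact.

      import random

      random.seed(1)
      SIZE, STEPS, N = 20, 50, 60

      agents = [{"x": random.randrange(SIZE), "y": random.randrange(SIZE),
                 "adopter": i < 3} for i in range(N)]          # 3 initial adopters

      for _ in range(STEPS):
          for a in agents:                                      # random walk on the grid
              a["x"] = (a["x"] + random.choice([-1, 0, 1])) % SIZE
              a["y"] = (a["y"] + random.choice([-1, 0, 1])) % SIZE
          for a in agents:                                      # contact-based adoption
              if not a["adopter"]:
                  if any(b["adopter"] and b["x"] == a["x"] and b["y"] == a["y"]
                         for b in agents):
                      a["adopter"] = random.random() < 0.5

      print("adopters after", STEPS, "steps:", sum(a["adopter"] for a in agents))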

  13. Molecular dynamics simulation of subnanometric tool-workpiece contact on a force sensor-integrated fast tool servo for ultra-precision microcutting

    International Nuclear Information System (INIS)

    Cai, Yindi; Chen, Yuan-Liu; Shimizu, Yuki; Ito, So; Gao, Wei; Zhang, Liangchi

    2016-01-01

    Highlights: • Subnanometric contact between a diamond tool and a copper workpiece surface is investigated by MD simulation. • A multi-relaxation time technique is proposed to eliminate the influence of the atom vibrations. • The accuracy of the elastic-plastic transition contact depth estimation is improved by observing the residual defects. • The simulation results are beneficial for optimization of the next-generation microcutting instruments. - Abstract: This paper investigates the contact characteristics between a copper workpiece and a diamond tool in a force sensor-integrated fast tool servo (FS-FTS) for single point diamond microcutting and in-process measurement of ultra-precision surface forms of the workpiece. Molecular dynamics (MD) simulations are carried out to identify the subnanometric elastic-plastic transition contact depth, at which the plastic deformation in the workpiece is initiated. This critical depth can be used to optimize the FS-FTS as well as the cutting/measurement process. It is clarified that the vibrations of the copper atoms in the MD model have a great influence on the subnanometric MD simulation results. A multi-relaxation time method is then proposed to reduce the influence of the atom vibrations based on the fact that the dominant vibration component has a certain period determined by the size of the MD model. It is also identified that for a subnanometric contact depth, the position of the tool tip for the contact force to be zero during the retracting operation of the tool does not correspond to the final depth of the permanent contact impression on the workpiece surface. The accuracy for identification of the transition contact depth is then improved by observing the residual defects on the workpiece surface after the tool retracting.

  14. Knowledge modelling and reliability processing: presentation of the Figaro language and associated tools

    International Nuclear Information System (INIS)

    Bouissou, M.; Villatte, N.; Bouhadana, H.; Bannelier, M.

    1991-12-01

    EDF has for several years been developing an integrated set of knowledge-based and algorithmic tools for automating the reliability assessment of complex (especially sequential) systems. In this environment, the reliability expert has at his disposal powerful software tools for qualitative and quantitative processing; in addition, he has various means to generate the inputs for these tools automatically, through the acquisition of graphical data. The development of these tools has been based on FIGARO, a specific language that was built to obtain a homogeneous system modelling. Various compilers and interpreters translate a FIGARO model into conventional models such as fault trees, Markov chains and Petri nets. In this report, we introduce the main basics of the FIGARO language, illustrating them with examples
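
    The simplest conventional target of such a translation is a Markov model; as a toy illustration (rates invented, unrelated to any EDF system), a two-state availability chain compiled from a component description reduces to a one-line steady-state balance.

      failure_rate = 1e-4   # per hour (component failure)
      repair_rate  = 1e-2   # per hour (repair completion)

      # steady state of the two-state chain: P(up) * lambda = P(down) * mu, P(up) + P(down) = 1
      availability = repair_rate / (failure_rate + repair_rate)
      print(f"steady-state availability: {availability:.4f}")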

  15. Using registries to integrate bioinformatics tools and services into workbench environments

    DEFF Research Database (Denmark)

    Ménager, Hervé; Kalaš, Matúš; Rapacki, Kristoffer

    2016-01-01

    The diversity and complexity of bioinformatics resources presents significant challenges to their localisation, deployment and use, creating a need for reliable systems that address these issues. Meanwhile, users demand increasingly usable and integrated ways to access and analyse data, especially......, a software component that will ease the integration of bioinformatics resources in a workbench environment, using their description provided by the existing ELIXIR Tools and Data Services Registry....

  16. Predictions of titanium alloy properties using thermodynamic modeling tools

    Science.gov (United States)

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.

    2005-12-01

    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.

  17. An integrated model of the lithium/thionyl chloride battery

    Energy Technology Data Exchange (ETDEWEB)

    Jungst, R.G.; Nagasubramanian, G.; Ingersoll, D.; O'Gorman, C.C.; Paez, T.L. [Sandia National Labs., Albuquerque, NM (United States); Jain, M.; Weidner, J.W. [Univ. of South Carolina, Columbia, SC (United States)

    1998-06-08

    The desire to reduce the time and cost of design engineering on new components or to validate existing designs in new applications is stimulating the development of modeling and simulation tools. The authors are applying a model-based design approach to low and moderate rate versions of the Li/SOCl₂ D-size cell with success. Three types of models are being constructed and integrated to achieve maximum capability and flexibility in the final simulation tool. A phenomenology based electrochemical model links performance and the cell design, chemical processes, and material properties. An artificial neural network model improves computational efficiency and fills gaps in the simulation capability when fundamental cell parameters are too difficult to measure or the forms of the physical relationships are not understood. Finally, a PSpice-based model provides a simple way to test the cell under realistic electrical circuit conditions. Integration of these three parts allows a complete link to be made between fundamental battery design characteristics and the performance of the rest of the electrical subsystem.

  18. A tool for model based diagnostics of the AGS Booster

    International Nuclear Information System (INIS)

    Luccio, A.

    1993-01-01

    A model-based algorithmic tool was developed to search for lattice errors by a systematic analysis of orbit data in the AGS Booster synchrotron. The algorithm employs transfer matrices calculated with MAD between points in the ring. Iterative model fitting of the data allows one to find and eventually correct magnet displacements and angles or field errors. The tool, implemented on a HP-Apollo workstation system, has proved very general and of immediate physical interpretation
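
    The fitting step can be sketched as a linear least-squares problem; the response matrix and BPM readings below are invented stand-ins for MAD-computed transfer matrices and measured orbit data, kept deliberately small.

      import numpy as np

      # response[i, j] = orbit shift at BPM i per unit kick error at magnet j (model-derived)
      response = np.array([[ 2.1, -0.4,  1.0],
                           [ 0.3,  1.8, -0.7],
                           [-1.2,  0.5,  2.2],
                           [ 0.9, -1.1,  0.4]])

      measured_orbit = np.array([0.42, -0.15, 0.61, 0.08])   # mm, hypothetical BPM data

      kicks, residuals, rank, _ = np.linalg.lstsq(response, measured_orbit, rcond=None)
      print("estimated kick errors (mrad):", kicks)
      print("rms residual (mm):", np.sqrt(np.mean((response @ kicks - measured_orbit)**2)))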

  19. Integrated facilities modeling using QUEST and IGRIP

    International Nuclear Information System (INIS)

    Davis, K.R.; Haan, E.R.

    1995-01-01

    A QUEST model and associated detailed IGRIP models were developed and used to simulate several workcells in a proposed Plutonium Storage Facility (PSF). The models are being used by team members assigned to the program to improve communication and to assist in evaluating concepts and in performing trade-off studies which will result in recommendations and a final design. The model was designed so that it could be changed easily. The added flexibility techniques used to make changes easily are described in this paper in addition to techniques for integrating the QUEST and IGRIP products. Many of these techniques are generic in nature and can be applied to any modeling endeavor

  20. Electricity market models and RES integration: The Greek case

    International Nuclear Information System (INIS)

    Simoglou, Christos K.; Biskas, Pandelis N.; Vagropoulos, Stylianos I.; Bakirtzis, Anastasios G.

    2014-01-01

    This paper presents an extensive analysis of the Greek electricity market for the next 7-year period (2014–2020) based on an hour-by-hour simulation considering five different RES technologies, namely wind, PV, small hydro, biomass and CHP with emphasis on PV integration. The impact of RES penetration on the electricity market operation is evaluated under two different models regarding the organization of the Greek wholesale day-ahead electricity market: a mandatory power pool for year 2014 (current market design) and a power exchange for the period 2015–2020 (Target Model). An integrated software tool is used for the simulation of the current and the future day-ahead market clearing algorithm of the Greek wholesale electricity market. Simulation results indicate the impact of the anticipated large-scale RES integration, in conjunction with each market model, on specific indicators of the Greek electricity market in the long-term. - Highlights: • Analysis of the Greek electricity market for the next 7-year period (2014–2020) based on hour-by-hour simulation. • Five different RES technologies are considered with emphasis on PV integration. • A power pool (for 2014) and a power exchange (for 2015–2020) are considered. • Various market indicators are used for the analysis of the impact of the RES integration on the Greek electricity market. • Two alternative tariff schemes for the compensation of the new ground-mounted PV units from 2015 onwards are investigated
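
    The hour-by-hour market clearing underlying such studies can be caricatured with a few lines of merit-order dispatch; all capacities, costs, and load/RES profiles below are invented and far smaller than the Greek system, and the real market-clearing algorithm handles many more constraints.

      units = [("lignite", 800, 45.0), ("gas_ccgt", 600, 70.0), ("gas_ocgt", 300, 110.0)]
      # (name, capacity_MW, marginal_cost_eur_per_MWh)

      load = [5200, 5000, 4800, 5100, 5600, 6000]   # MW, six sample hours
      res  = [1200, 1500, 2100, 2600, 1900, 1000]   # MW of wind + PV in those hours
      hydro_and_must_run = 3000                      # MW assumed always running

      for hour, (d, r) in enumerate(zip(load, res)):
          residual = max(d - r - hydro_and_must_run, 0.0)   # residual demand after RES
          price, dispatched = 0.0, 0.0
          for name, cap, cost in sorted(units, key=lambda u: u[2]):
              take = min(cap, residual - dispatched)
              if take > 0:
                  dispatched += take
                  price = cost                              # marginal unit sets the price
          print(f"hour {hour}: residual load {residual:6.0f} MW, clearing price {price:5.1f} EUR/MWh")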

  1. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  2. Mixed-Dimensionality VLSI-Type Configurable Tools for Virtual Prototyping of Biomicrofluidic Devices and Integrated Systems

    Science.gov (United States)

    Makhijani, Vinod B.; Przekwas, Andrzej J.

    2002-10-01

    This report presents results of a DARPA/MTO Composite CAD Project aimed at developing a comprehensive microsystem CAD environment, CFD-ACE+ Multiphysics, for bio and microfluidic devices and complete microsystems. The project began in July 1998, and was a three-year team effort between CFD Research Corporation, California Institute of Technology (CalTech), University of California, Berkeley (UCB), and Tanner Research, with Mr. Don Verlee from Abbott Labs participating as a consultant on the project. The overall objective of this project was to develop, validate and demonstrate several applications of a user-configurable VLSI-type mixed-dimensionality software tool for the design of biomicrofluidic devices and integrated systems. The developed tool would provide high-fidelity 3-D multiphysics modeling capability, 1-D fluidic circuit modeling, a SPICE interface for system-level simulations, and mixed-dimensionality design. It would combine tools for layouts and process fabrication, geometric modeling, and automated grid generation, and interfaces to EDA tools (e.g. Cadence) and MCAD tools (e.g. ProE).

  3. Fluid Survival Tool: A Model Checker for Hybrid Petri Nets

    NARCIS (Netherlands)

    Postema, Björn Frits; Remke, Anne Katharina Ingrid; Haverkort, Boudewijn R.H.M.; Ghasemieh, Hamed

    2014-01-01

    Recently, algorithms for model checking Stochastic Time Logic (STL) on Hybrid Petri nets with a single general one-shot transition (HPNG) have been introduced. This paper presents a tool for model checking HPNG models against STL formulas. A graphical user interface (GUI) not only helps to

  4. Integrated modelling of two xenobiotic organic compounds

    DEFF Research Database (Denmark)

    Lindblom, Erik Ulfson; Gernaey, K.V.; Henze, Mogens

    2006-01-01

    This paper presents a dynamic mathematical model that describes the fate and transport of two selected xenobiotic organic compounds (XOCs) in a simplified representation of an integrated urban wastewater system. A simulation study, where the xenobiotics bisphenol A and pyrene are used as reference compounds, is carried out. Sorption and specific biological degradation processes are integrated with standardised water process models to model the fate of both compounds. Simulated mass flows of the two compounds during one dry weather day and one wet weather day are compared for realistic influent flow rate and concentration profiles. The wet weather day induces resuspension of stored sediments, which increases the pollutant load on the downstream system. The potential of the model to elucidate important phenomena related to origin and fate of the model compounds is demonstrated.
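
    The two fate processes named above can be sketched with a minimal mass balance (hypothetical rate constants, not the paper's calibrated values): linear sorption fixes the dissolved fraction, and only that fraction is biodegraded, stepped forward here with explicit Euler.

      Kd   = 0.4    # L/g, sorption partition coefficient
      TSS  = 0.25   # g/L, suspended solids concentration
      kbio = 0.05   # 1/h, biodegradation rate of the dissolved fraction
      dt, hours = 0.1, 48

      f_dissolved = 1.0 / (1.0 + Kd * TSS)   # equilibrium dissolved fraction

      c = 10.0                                # ug/L, total xenobiotic concentration
      t = 0.0
      while t < hours:
          c -= kbio * f_dissolved * c * dt    # only the dissolved compound degrades
          t += dt

      print(f"remaining concentration after {hours} h: {c:.2f} ug/L")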

  5. An integrative model of organizational safety behavior.

    Science.gov (United States)

    Cui, Lin; Fan, Di; Fu, Gui; Zhu, Cherrie Jiuhua

    2013-06-01

    This study develops an integrative model of safety management based on social cognitive theory and the total safety culture triadic framework. The purpose of the model is to reveal the causal linkages between a hazardous environment, safety climate, and individual safety behaviors. Based on primary survey data from 209 front-line workers in one of the largest state-owned coal mining corporations in China, the model is tested using structural equation modeling techniques. An employee's perception of a hazardous environment is found to have a statistically significant impact on employee safety behaviors through a psychological process mediated by the perception of management commitment to safety and individual beliefs about safety. The integrative model developed here leads to a comprehensive solution that takes into consideration the environmental, organizational and employees' psychological and behavioral aspects of safety management. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.

  6. Modern model of integrated corporate communication

    Directory of Open Access Journals (Sweden)

    Milica Slijepčević

    2018-03-01

    Full Text Available The main purpose of this paper is to present the modern model of integrated corporate communication. Besides this, the authors describe the changes occurring in the corporate environment and the importance of changing the model of corporate communication. This paper also discusses the importance of implementation of the suggested model, the use of new media and the effects of these changes on corporations. The approach used in this paper is a literature review. The authors explore the importance of implementation of the suggested model and the new media in corporate communication, both internal and external, addressing all the stakeholders and communication contents. The paper recommends implementation of a modern model of integrated corporate communication as a response to the constant development of the new media and the generation changes taking place. Practical implications: the modern model of integrated corporate communication can be used as an upgrade of the conventional communication models. This modern model empowers companies to sustain and build up the existing relationships with stakeholders, and to find out and create new relationships with stakeholders who were previously inaccessible and invisible.

  7. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    Science.gov (United States)

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools available among these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations on the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results in different datasets and configurations. All this variation creates a complicated scenario for researchers when deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms from different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available in the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics .
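
    The co-occurrence integration step can be illustrated with a small, hypothetical example (this is not MetaMeta's implementation, and the profiles below are invented): taxa reported by at least a minimum number of tools are kept, and their abundances averaged and renormalised into one consensus profile.

      from collections import defaultdict

      profiles = {   # hypothetical per-tool relative-abundance profiles
          "toolA": {"E. coli": 0.40, "B. subtilis": 0.35, "S. aureus": 0.25},
          "toolB": {"E. coli": 0.50, "B. subtilis": 0.30, "P. putida": 0.20},
          "toolC": {"E. coli": 0.45, "S. aureus": 0.30, "B. subtilis": 0.25},
      }

      def merge(profiles, min_tools=2):
          hits = defaultdict(list)
          for prof in profiles.values():
              for taxon, abundance in prof.items():
                  hits[taxon].append(abundance)
          kept = {t: sum(a) / len(a) for t, a in hits.items() if len(a) >= min_tools}
          total = sum(kept.values())                      # renormalise to sum to 1
          return {t: a / total for t, a in kept.items()}

      print(merge(profiles))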

  8. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, have been presented in the first part of the paper. In this part, they are applied for testing modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools have first been used to identify the parts of the model that can really be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for model behaviour improvement has finally been obtained by optimisation techniques. This example of application shows how model parameter space analysis is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residuals analysis techniques. (author)
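
    A one-at-a-time screening of the kind used to decide which parameters the data can actually test might look like the following sketch; the toy steady-state heat-loss model and parameter values are invented, not the Task 22 test building.

      def heat_loss(p):
          """Toy model: transmission + ventilation losses for a 20 K indoor-outdoor difference."""
          return (p["U_wall"] * p["A_wall"] + p["U_window"] * p["A_window"]
                  + 0.33 * p["ach"] * p["volume"]) * 20.0

      nominal = {"U_wall": 0.35, "A_wall": 120.0, "U_window": 1.8,
                 "A_window": 18.0, "ach": 0.6, "volume": 250.0}

      base = heat_loss(nominal)
      for name in nominal:
          perturbed = dict(nominal, **{name: nominal[name] * 1.10})   # +10% on one parameter
          effect = 100.0 * (heat_loss(perturbed) - base) / base
          print(f"{name:9s}: +10% -> {effect:5.2f}% change in heat loss")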

  9. Transparent Model Transformation: Turning Your Favourite Model Editor into a Transformation Tool

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald; Strüber, Daniel

    2015-01-01

    Current model transformation languages are supported by dedicated editors, often closely coupled to a single execution engine. We introduce Transparent Model Transformation, a paradigm enabling modelers to specify transformations using a familiar tool: their model editor. We also present VMTL, th...... model transformation tool sharing the model editor’s benefits, transparently....

  10. Tools for Resilience Management: Multidisciplinary Development of State-and-Transition Models for Northwest Colorado

    Directory of Open Access Journals (Sweden)

    Emily J. Kachergis

    2013-12-01

    Full Text Available Building models is an important way of integrating knowledge. Testing and updating models of social-ecological systems can inform management decisions and, ultimately, improve resilience. We report on the outcomes of a six-year, multidisciplinary model development process in the sagebrush steppe, USA. We focused on creating state-and-transition models (STMs), conceptual models of ecosystem change that represent nonlinear dynamics and are being adopted worldwide as tools for managing ecosystems. STM development occurred in four steps with four distinct sets of models: (1) local knowledge elicitation using semistructured interviews; (2) ecological data collection using an observational study; (3) model integration using participatory workshops; and (4) model simplification upon review of the literature by a multidisciplinary team. We found that different knowledge types are ultimately complementary. Many of the benefits of the STM-building process flowed from the knowledge integration steps, including improved communication, identification of uncertainties, and production of more broadly credible STMs that can be applied in diverse situations. The STM development process also generated hypotheses about sagebrush steppe dynamics that could be tested by future adaptive management and research. We conclude that multidisciplinary development of STMs has great potential for producing credible, useful tools for managing resilience of social-ecological systems. Based on this experience, we outline a streamlined, participatory STM development process that integrates multiple types of knowledge and incorporates adaptive management.

  11. Renewed mer model of integral management

    Directory of Open Access Journals (Sweden)

    Janko Belak

    2015-12-01

    Full Text Available Background: The research work on entrepreneurship, enterprise's policy and management, which started in 1992, successfully continued in the following years. Between 1992 and 2011, more than 400 academics and other researchers participated in research work (the MER research program) whose main orientation has been the creation of their own model of integral management. Results: In past years, academics (researchers and authors of published papers) from Austria, Belgium, Bosnia and Herzegovina, Bulgaria, Byelorussia, Canada, the Czech Republic, Croatia, Estonia, France, Germany, Hungary, Italy, Poland, Romania, Russia, the Slovak Republic, Slovenia, Switzerland, Ukraine, and the US have cooperated in MER programs, coming from more than fifty institutions. Thus, scientific doctrines of different universities influenced the development of the MER model, which is based on both horizontal and vertical integration of the enterprises' governance and management processes, instruments and institutions into a consistently operating unit. Conclusions: The presented MER model is based on the multi-layer integration of governance and management with an enterprise and its environment, considering the fundamental desires for the enterprises' existence and, thus, their quantitative as well as qualitative changes. The process, instrumental, and institutional integrity of the governance and management is also the initial condition for the implementation of all other integration factors.

  12. epsilon : A tool to find a canonical basis of master integrals

    Science.gov (United States)

    Prausa, Mario

    2017-10-01

    In 2013, Henn proposed a special basis for a certain class of master integrals, which are expressible in terms of iterated integrals. In this basis, the master integrals obey a differential equation, where the right hand side is proportional to ɛ in d = 4 - 2 ɛ space-time dimensions. An algorithmic approach to find such a basis was found by Lee. We present the tool epsilon, an efficient implementation of Lee's algorithm based on the Fermat computer algebra system as computational back end.
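
    In the notation commonly used for such canonical ('ε-form') bases (written schematically here, not taken from the tool's own documentation), the differential equation and its iterated-integral solution read:

      \partial_x \, \vec{f}(x,\epsilon) \;=\; \epsilon \, A(x)\, \vec{f}(x,\epsilon),
      \qquad
      \vec{f}(x,\epsilon) \;=\; \mathrm{P}\exp\!\left( \epsilon \int_{\gamma} A(x')\, \mathrm{d}x' \right) \vec{f}_0(\epsilon)

    so that expanding the path-ordered exponential order by order in ε directly yields the iterated integrals mentioned in the abstract.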

  13. Modeling Tools for Drilling, Reservoir Navigation, and Formation Evaluation

    Directory of Open Access Journals (Sweden)

    Sushant Dutta

    2012-06-01

    Full Text Available The oil and gas industry routinely uses borehole tools for measuring or logging rock and fluid properties of geologic formations to locate hydrocarbons and maximize their production. Pore fluids in formations of interest are usually hydrocarbons or water. Resistivity logging is based on the fact that oil and gas have a substantially higher resistivity than water. The first resistivity log was acquired in 1927, and resistivity logging is still the foremost measurement used for drilling and evaluation. However, the acquisition and interpretation of resistivity logging data has grown in complexity over the years. Resistivity logging tools operate in a wide range of frequencies (from DC to GHz) and encounter extremely high (several orders of magnitude) conductivity contrast between the metal mandrel of the tool and the geologic formation. Typical challenges include arbitrary angles of tool inclination, full tensor electric and magnetic field measurements, and interpretation of complicated anisotropic formation properties. These challenges combine to form some of the most intractable computational electromagnetic problems in the world. Reliable, fast, and convenient numerical modeling of logging tool responses is critical for tool design, sensor optimization, virtual prototyping, and log data inversion. This spectrum of applications necessitates both depth and breadth of modeling software—from blazing fast one-dimensional (1-D) modeling codes to advanced three-dimensional (3-D) modeling software, and from in-house developed codes to commercial modeling packages. In this paper, with the help of several examples, we demonstrate our approach for using different modeling software to address different drilling and evaluation applications. In one example, fast 1-D modeling provides proactive geosteering information from a deep-reading azimuthal propagation resistivity measurement. In the second example, a 3-D model with multiple vertical resistive fractures

  14. Developing Indicators for a Classroom Observation Tool on Pedagogy and Technology Integration: A Delphi Study

    Science.gov (United States)

    Elmendorf, Douglas C.; Song, Liyan

    2015-01-01

    Rapid advances in technology and increased access to technology tools have created new instructional demands and expectations on teachers. Due to the ubiquitous presence of technology in K-12 schools, teachers are being observed on both their pedagogical and technology integration practices. Applying the technological pedagogical and content…

  15. Integrating Wikis as Educational Tools for the Development of a Community of Inquiry

    Science.gov (United States)

    Eteokleous, Nikleia; Ktoridou, Despo; Orphanou, Maria

    2014-01-01

    This article describes a study that attempted to evaluate the integration of wikis as an educational tool in successfully achieving the learning objectives of a fifth-grade linguistics and literature course. A mixed-method approach was employed--data were collected via questionnaires, reflective journals, observations, and interviews. The results…

  16. Six sigma tools in integrating internal operations of a retail pharmacy: a case study.

    Science.gov (United States)

    Kumar, Sameer; Kwong, Anthony M

    2011-01-01

    This study was initiated to integrate information and enterprise-wide healthcare delivery system issues specifically within an inpatient retail pharmacy operation in a U.S. community hospital. Six Sigma tools were used to examine the effects on an inpatient retail pharmacy service process. The tools used include service blueprints, a cause-and-effect diagram, gap analysis derived from customer and employee surveys, and mistake proofing; these were applied in various business situations, and the results were analyzed to identify and propose process improvements and integration. The research indicates that the Six Sigma tools in this discussion are very applicable and quite effective in helping to streamline and integrate the pharmacy process flow. Additionally, gap analysis derived from two different surveys was used to estimate the primary areas of focus to increase customer and employee satisfaction. The results of this analysis were useful in initiating discussions of how to effectively narrow these service gaps. This retail pharmaceutical service study serves as a framework for the process that should occur for successful process improvement tool evaluation and implementation. Pharmaceutical service operations in the U.S. that use this integration framework must tailor it to their individual situations to maximize their chances for success.

  17. The Integration of Digital Tools during Strategic and Interactive Writing Instruction

    Science.gov (United States)

    Kilpatrick, Jennifer Renée; Saulsburry, Rachel; Dostal, Hannah M.; Wolbers, Kimberly A.; Graham, Steve

    2014-01-01

    The purpose of this chapter is to gain insight from the ways a group of elementary teachers of the deaf and hard of hearing chose to integrate digital tools into evidence-based writing instruction and the ways these technologies were used to support student learning. After professional development that exposed these teachers to twelve new digital…

  18. Integrating Social Networking Tools into ESL Writing Classroom: Strengths and Weaknesses

    Science.gov (United States)

    Yunus, Melor Md; Salehi, Hadi; Chenzi, Chen

    2012-01-01

    With the rapid development of world and technology, English learning has become more important. Teachers frequently use teacher-centered pedagogy that leads to lack of interaction with students. This paper aims to investigate the advantages and disadvantages of integrating social networking tools into ESL writing classroom and discuss the ways to…

  19. Omics Informatics: From Scattered Individual Software Tools to Integrated Workflow Management Systems.

    Science.gov (United States)

    Ma, Tianle; Zhang, Aidong

    2017-01-01

    Omic data analyses pose great informatics challenges. As an emerging subfield of bioinformatics, omics informatics focuses on analyzing multi-omic data efficiently and effectively, and is gaining momentum. There are two underlying trends in the expansion of the omics informatics landscape: the explosion of scattered individual omics informatics tools, each of which focuses on a specific task in both single- and multi-omic settings, and the fast-evolving integrated software platforms such as workflow management systems that can assemble multiple tools into pipelines and streamline integrative analysis for complicated tasks. In this survey, we give a holistic view of omics informatics, from scattered individual informatics tools to integrated workflow management systems. We not only outline the landscape and challenges of omics informatics, but also sample a number of widely used and cutting-edge algorithms in omics data analysis to give readers a fine-grained view. We survey various workflow management systems (WMSs), classify them into three levels of WMSs from simple software toolkits to integrated multi-omic analytical platforms, and point out the emerging needs for developing intelligent workflow management systems. We also discuss the challenges, strategies and some existing work in systematic evaluation of omics informatics tools. We conclude by providing future perspectives of emerging fields and new frontiers in omics informatics.

  20. Collaborative Digital Games as Mediation Tool to Foster Intercultural Integration in Primary Dutch Schools

    NARCIS (Netherlands)

    A. Paz Alencar (Amanda); T. de la Hera Conde-Pumpido (Teresa)

    2015-01-01

    In the Netherlands, the growing presence of immigrant children in schools has fueled scholarly interest in and concerns for examining the process of integration in school environments. The use of digital games has been found to be an effective tool to reinforce teaching/learning practices.

  1. Towards an integrated model of international migration

    Directory of Open Access Journals (Sweden)

    Douglas S. MASSEY

    2012-12-01

    Full Text Available Demographers have yet to develop a suitable integrated model of international migration and consequently have been very poor at forecasting immigration. This paper outlines the basic elements of an integrated model and surveys recent history to suggest the key challenges to model construction. A comprehensive theory must explain the structural forces that create a supply of people prone to migrate internationally, the structural origins of labour demand in receiving countries, the motivations of those who respond to these forces by choosing to migrate internationally, the growth and structure of transnational networks that arise to support international movement, the behaviour of states in response to immigrant flows, and the influence of state actions on the behaviour of migrants. Recent history suggests that a good model needs to respect the salience of markets, recognize the circularity of migrant flows, appreciate the power of feedback effects, and be alert to unanticipated consequences of policy actions.

  2. Quiver gauge theories and integrable lattice models

    International Nuclear Information System (INIS)

    Yagi, Junya

    2015-01-01

    We discuss connections between certain classes of supersymmetric quiver gauge theories and integrable lattice models from the point of view of topological quantum field theories (TQFTs). The relevant classes include 4d N=1 theories known as brane box and brane tiling models, 3d N=2 and 2d N=(2,2) theories obtained from them by compactification, and 2d N=(0,2) theories closely related to these theories. We argue that their supersymmetric indices carry structures of TQFTs equipped with line operators, and as a consequence, are equal to the partition functions of lattice models. The integrability of these models follows from the existence of an extra dimension in the TQFTs, which emerges after the theories are embedded in M-theory. The Yang-Baxter equation expresses the invariance of supersymmetric indices under Seiberg duality and its lower-dimensional analogs.
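
    For reference, the Yang-Baxter equation mentioned above is, in its standard R-matrix form (textbook notation with spectral parameters; the paper's own conventions may differ):

      R_{12}(u-v)\, R_{13}(u)\, R_{23}(v) \;=\; R_{23}(v)\, R_{13}(u)\, R_{12}(u-v)

    where the subscripts indicate on which pair of lattice spaces each R-matrix acts.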

  3. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

    Full Text Available While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  4. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
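
    The logical (Boolean) models that students assemble in such platforms can be mimicked in a few lines of Python; the three-node network below is invented purely for illustration, with each node's next state given by a logic rule over the current states and all nodes updated synchronously.

      rules = {
          "antigen":   lambda s: s["antigen"],                       # external input, held fixed
          "tcr":       lambda s: s["antigen"] and not s["inhibitor"],
          "inhibitor": lambda s: s["tcr"],                           # delayed negative feedback
      }

      state = {"antigen": True, "tcr": False, "inhibitor": False}

      for step in range(6):
          print(step, state)
          state = {node: rule(state) for node, rule in rules.items()}  # synchronous update

    Running the loop shows the activation/inhibition pair settling into an oscillation, the kind of emergent dynamics the courses use such models to demonstrate.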

  5. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user/developer identify the relevant processing field at the top of the sequence and send to the computing module only the data related to the requested result. The remaining data is not relevant and would only slow down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and cost. To do so, the processing sequence must be reviewed and several modeling tools added. Existing processing models do not take this aspect into consideration and focus on raw calculation performance, which increases computing time and cost. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  6. Which coordinate system for modelling path integration?

    Science.gov (United States)

    Vickerstaff, Robert J; Cheung, Allen

    2010-03-21

    Path integration is a navigation strategy widely observed in nature where an animal maintains a running estimate, called the home vector, of its location during an excursion. Evidence suggests it is both ancient and ubiquitous in nature, and has been studied for over a century. In that time, canonical and neural network models have flourished, based on a wide range of assumptions, justifications and supporting data. Despite the importance of the phenomenon, consensus and unifying principles appear lacking. A fundamental issue is the neural representation of space needed for biological path integration. This paper presents a scheme to classify path integration systems on the basis of the way the home vector records and updates the spatial relationship between the animal and its home location. Four extended classes of coordinate systems are used to unify and review both canonical and neural network models of path integration, from the arthropod and mammalian literature. This scheme demonstrates analytical equivalence between models which may otherwise appear unrelated, and distinguishes between models which may superficially appear similar. A thorough analysis is carried out of the equational forms of important facets of path integration including updating, steering, searching and systematic errors, using each of the four coordinate systems. The type of available directional cue, namely allothetic or idiothetic, is also considered. It is shown that on balance, the class of home vectors which includes the geocentric Cartesian coordinate system, appears to be the most robust for biological systems. A key conclusion is that deducing computational structure from behavioural data alone will be difficult or impossible, at least in the absence of an analysis of random errors. Consequently it is likely that further theoretical insights into path integration will require an in-depth study of the effect of noise on the four classes of home vectors. Copyright 2009 Elsevier Ltd
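
    The geocentric Cartesian home vector singled out as most robust can be written down in a few lines (step lengths and headings below are arbitrary examples): each step's displacement is accumulated in fixed world coordinates, and the home vector is simply the negative of the running sum.

      import math

      position = [0.0, 0.0]
      steps = [(1.0, 0.0), (1.0, 30.0), (0.5, 90.0), (2.0, 200.0)]   # (distance, heading in degrees)

      for distance, heading in steps:
          position[0] += distance * math.cos(math.radians(heading))  # accumulate in world frame
          position[1] += distance * math.sin(math.radians(heading))

      home_vector = (-position[0], -position[1])
      home_distance = math.hypot(*home_vector)
      home_bearing = math.degrees(math.atan2(home_vector[1], home_vector[0]))
      print(f"home vector: {home_vector}, distance {home_distance:.2f}, bearing {home_bearing:.1f} deg")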

  7. Topological matter, integrable models and fusion rings

    International Nuclear Information System (INIS)

    Nemeschansky, D.; Warner, N.P.

    1992-01-01

    We show how topological G_k/G_k models can be embedded into the topological matter models that are obtained by perturbing the twisted N = 2 supersymmetric, hermitian symmetric, coset models. In particular, this leads to an embedding of the fusion ring of G as a sub-ring of the perturbed, chiral primary ring. The perturbation of the twisted N = 2 model that leads to the fusion ring is also shown to lead to an integrable N = 2 supersymmetric field theory when the untwisted N = 2 superconformal field theory is perturbed by the same operator and its hermitian conjugate. (orig.)

  8. An Integrated Development Tool for a safety application using FBD language

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jun; Lee, Jang Soo; Lee, Dong Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-05-15

    With the digitalization of nuclear instrumentation and control systems, the application program responsible for the safety functions of nuclear I and C systems shall ensure the robustness of the safety function through development, testing, and validation activities over the software life cycle. The importance of software in nuclear systems increases continuously. The integrated engineering tools used to develop, test, and validate safety application programs must handle increasingly complex parts among a number of components within nuclear digital I and C systems. This paper introduces the integrated engineering tool (SafeCASE-PLC) developed by our project. The SafeCASE-PLC is a software engineering tool to develop, test, and validate the nuclear application program performed in an automatic controller

  9. Towards an Integrative Model of Knowledge Transfer

    DEFF Research Database (Denmark)

    Turcan, Romeo V.; Heslop, Ben

    This paper aims to contribute towards the advancement of an efficient architecture of a single market for knowledge through the development of an integrative model of knowledge transfer. Within this aim, several points of departure can be singled out. One, the article builds on the call...... business and academia, and implementing the respective legislature are enduring. The research objectives were to explore (i) the process of knowledge transfer in universities, including the nature of tensions, obstacles and incentives, (ii) the relationships between key stakeholders in the KT market...... of the emergent integrative model of knowledge transfer. In an attempt to bring it to a higher level of generalizability, the integrative model of KT is further conceptualized from a ‘sociology of markets’ perspective resulting in an emergent architecture of a single market for knowledge. Future research...

  10. International Summit on Integrated Environmental Modeling

    Science.gov (United States)

    This report describes the International Summit on Integrated Environmental Modeling (IEM), held in Washington, DC 7th-9th December 2010. The meeting brought together 57 scientists and managers from leading US and European government and non-governmental organizations, universitie...

  11. Accurate Electromagnetic Modeling Methods for Integrated Circuits

    NARCIS (Netherlands)

    Sheng, Z.

    2010-01-01

    The present development of modern integrated circuits (IC’s) is characterized by a number of critical factors that make their design and verification considerably more difficult than before. This dissertation addresses the important questions of modeling all electromagnetic behavior of features on

  12. Rethinking School Bullying: Towards an Integrated Model

    Science.gov (United States)

    Dixon, Roz; Smith, Peter K.

    2011-01-01

    What would make anti-bullying initiatives more successful? This book offers a new approach to the problem of school bullying. The question of what constitutes a useful theory of bullying is considered and suggestions are made as to how priorities for future research might be identified. The integrated, systemic model of school bullying introduced…

  13. Tools of integration of innovation-oriented machine-building enterprises in industrial park environment

    Directory of Open Access Journals (Sweden)

    К.О. Boiarynova

    2017-08-01

    Full Text Available The research is devoted to the development of tools for the integration of innovation-oriented mechanical engineering enterprises into the environment of an industrial park as functional economic systems, which are capable, on the basis of their own development, of supporting the development of resident enterprises. The article analyzes the opportunities for the development of mechanical engineering enterprises. The structure of the mechanism for integrating mechanical engineering enterprises as functional economic systems into the industrial park environment is based on: 1) the development of programs for the participation of mechanical engineering enterprises in the industrial park as innovation-oriented partners, which foresees the development of the enterprise itself as well as of other residents; 2) the provision of high-tech equipment to resident enterprises of industrial parks; 3) the creation of subsidiary spin-out enterprises of large mechanical engineering enterprises for high-tech production in the industrial park. The author proposes a road map that sets out the procedures for the integration and functioning of the investigated enterprises through interaction both in the ecosystem of the industrial park and in the general ecosystem of functioning, and the tools for providing economic functionality through economic and organizational measures at the preventive, partner and resident phases of integration. These tools allow innovation-oriented mechanical engineering enterprises to integrate into territorial structures such as industrial parks, which in turn allows them to fulfil their purpose in developing the real sector of the economy.

  14. Integrated fuel-cycle models for fast breeder reactors

    International Nuclear Information System (INIS)

    Ott, K.O.; Maudlin, P.J.

    1981-01-01

    Breeder-reactor fuel-cycle analysis can be divided into four different areas or categories. The first category concerns questions about the spatial variation of the fuel composition for single loading intervals. Questions of the variations in the fuel composition over several cycles represent a second category. Third, there is a need for a determination of the breeding capability of the reactor. The fourth category concerns the investigation of breeding and long-term fuel logistics. Two fuel-cycle models used to answer questions in the third and fourth area are presented. The space- and time-dependent actinide balance, coupled with criticality and fuel-management constraints, is the basis for both the Discontinuous Integrated Fuel-Cycle Model and the Continuous Integrated Fuel-Cycle Model. The results of the continuous model are compared with results obtained from detailed two-dimensional space and multigroup depletion calculations. The continuous model yields nearly the same results as the detailed calculation, and this is with a comparatively insignificant fraction of the computational effort needed for the detailed calculation. Thus, the integrated model presented is an accurate tool for answering questions concerning reactor breeding capability and long-term fuel logistics. (author)

  15. Data Integration for the Generation of High Resolution Reservoir Models

    Energy Technology Data Exchange (ETDEWEB)

    Albert Reynolds; Dean Oliver; Gaoming Li; Yong Zhao; Chaohui Che; Kai Zhang; Yannong Dong; Chinedu Abgalaka; Mei Han

    2009-01-07

    The goal of this three-year project was to develop a theoretical basis and practical technology for the integration of geologic, production and time-lapse seismic data in a way that makes best use of the information for reservoir description and reservoir performance predictions. The methodology and practical tools for data integration that were developed in this research project have been incorporated into computational algorithms that are feasible for large scale reservoir simulation models. As the integration of production and seismic data require calibrating geological/geostatistical models to these data sets, the main computational tool is an automatic history matching algorithm. The following specific goals were accomplished during this research. (1) We developed algorithms for calibrating the location of the boundaries of geologic facies and the distribution of rock properties so that production and time-lapse seismic data are honored. (2) We developed and implemented specific procedures for conditioning reservoir models to time-lapse seismic data. (3) We developed and implemented algorithms for the characterization of measurement errors which are needed to determine the relative weights of data when conditioning reservoir models to production and time-lapse seismic data by automatic history matching. (4) We developed and implemented algorithms for the adjustment of relative permeability curves during the history matching process. (5) We developed algorithms for production optimization which accounts for geological uncertainty within the context of closed-loop reservoir management. (6) To ensure the research results will lead to practical public tools for independent oil companies, as part of the project we built a graphical user interface for the reservoir simulator and history matching software using Visual Basic.
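
    The project's history-matching algorithms are not reproduced here, but the underlying idea of calibrating model parameters to production data, with measurement-error characterization supplying the data weights, can be sketched in a few lines of Python (the toy exponential-decline forward model and parameter names are assumptions):

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def forward_model(params, times):
        """Toy 'reservoir' response: exponential production decline.

        Stands in for a full reservoir simulator; params are the quantities
        being calibrated (initial rate and decline constant).
        """
        q0, k = params
        return q0 * np.exp(-k * times)

    times = np.linspace(0.0, 10.0, 20)
    observed = forward_model([100.0, 0.3], times) + np.random.normal(0.0, 2.0, times.size)
    obs_std = 2.0 * np.ones_like(observed)   # measurement-error characterization

    def weighted_residuals(params):
        # Residuals scaled by measurement error act as the relative data weights.
        return (forward_model(params, times) - observed) / obs_std

    fit = least_squares(weighted_residuals, x0=[50.0, 0.1])
    print("calibrated parameters:", fit.x)
    ```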

  16. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    Science.gov (United States)

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  17. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    Full Text Available The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  18. miRQuest: integration of tools on a Web server for microRNA research.

    Science.gov (United States)

    Aguiar, R R; Ambrosio, L A; Sepúlveda-Hermosilla, G; Maracaja-Coutinho, V; Paschoal, A R

    2016-03-28

    This report describes miRQuest - a novel middleware available on a Web server that allows the end user to do miRNA research in a user-friendly way. Many prediction tools for microRNA (miRNA) identification exist, using different programming languages and methods to accomplish this task. It is difficult to understand each tool and apply it to the diverse datasets and organisms available for miRNA analysis. miRQuest can easily be used by biologists and researchers with limited experience in bioinformatics. We built it using a middleware architecture on a Web platform for miRNA research that performs two main functions: i) integration of different miRNA prediction tools for miRNA identification in a user-friendly environment; and ii) comparison of these prediction tools. In both cases, the user provides sequences (in FASTA format) as an input set for the analysis and comparisons. All the tools were selected on the basis of a survey of the literature on the available tools for miRNA prediction. As results, three different use cases of the tool are described, one of which is miRNA identification analysis in 30 different species. Finally, miRQuest seems to be a novel and useful tool; and it is freely available for both benchmarking and miRNA identification at http://mirquest.integrativebioinformatics.me/.

  19. Integration of facility modeling capabilities for nuclear nonproliferation analysis

    International Nuclear Information System (INIS)

    Burr, Tom; Gorensek, M.B.; Krebs, John; Kress, Reid L.; Lamberti, Vincent; Schoenwald, David; Ward, Richard C.

    2012-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  20. Nonlinear integral equations for the sausage model

    Science.gov (United States)

    Ahn, Changrim; Balog, Janos; Ravanini, Francesco

    2017-08-01

    The sausage model, first proposed by Fateev, Onofri, and Zamolodchikov, is a deformation of the O(3) sigma model preserving integrability. The target space is deformed from the sphere to a ‘sausage’ shape by a deformation parameter ν. This model is defined by a factorizable S-matrix which is obtained by deforming that of the O(3) sigma model by a parameter λ. Clues for the deformed sigma model are provided by various pieces of UV and IR information through the thermodynamic Bethe ansatz (TBA) analysis based on the S-matrix. Application of the TBA to the sausage model is, however, limited to the case where 1/λ is an integer, for which the coupled integral equations can be truncated to a finite number. In this paper, we propose a finite set of nonlinear integral equations (NLIEs) which are applicable to generic values of λ. Our derivation is based on T-Q relations extracted from the truncated TBA equations. For a consistency check, we compute next-to-leading order corrections of the vacuum energy and extract the S-matrix information in the IR limit. We also solve the NLIEs both analytically and numerically in the UV limit to obtain the effective central charge, and compare it with that of the zero-mode dynamics to obtain the exact relation between ν and λ. Dedicated to the memory of Petr Petrovich Kulish.

  1. Processing: A Python Framework for the Seamless Integration of Geoprocessing Tools in QGIS

    Directory of Open Access Journals (Sweden)

    Anita Graser

    2015-10-01

    Full Text Available Processing is an object-oriented Python framework for the popular open source Geographic Information System QGIS, which provides a seamless integration of geoprocessing tools from a variety of different software libraries. In this paper, we present the development history, software architecture and features of the Processing framework, which make it a versatile tool for the development of geoprocessing algorithms and workflows, as well as an efficient integration platform for algorithms from different sources. Using real-world application examples, we furthermore illustrate how the Processing architecture enables typical geoprocessing use cases in research and development, such as automating and documenting workflows, combining algorithms from different software libraries, as well as developing and integrating custom algorithms. Finally, we discuss how Processing can facilitate reproducible research and provide an outlook towards future development goals.
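
    As a brief illustration of the kind of scripted geoprocessing the framework enables, the Python sketch below calls a buffer algorithm through the Processing API as exposed in QGIS 3.x (the paper describes the QGIS 2.x era, where the entry point was processing.runalg; the algorithm id, parameter names, and file paths here are illustrative assumptions).

    ```python
    # To be run from the QGIS Python console, where the `processing` module is available.
    import processing

    # Buffer a vector layer by 100 map units. 'input.shp' and 'buffered.gpkg' are
    # placeholder paths; optional parameters are left at their defaults.
    result = processing.run(
        "native:buffer",
        {
            "INPUT": "input.shp",
            "DISTANCE": 100.0,
            "OUTPUT": "buffered.gpkg",
        },
    )
    print(result["OUTPUT"])  # path (or layer) produced by the algorithm
    ```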

  2. Models of Russia's Participation in Regional Economic Integration

    Directory of Open Access Journals (Sweden)

    Darya I. Ushkalova

    2014-01-01

    Full Text Available The article analyses the models and mechanisms of Russia's participation in integration processes in the post-Soviet space in recent years. The article examines the integration model of the Customs Union / Common Economic Space / Eurasian Economic Union and particular mechanisms of its realization. It also examines key challenges to the further development of integration within the framework of the Eurasian Economic Union, including the exhaustion of short-term and medium-term integration effects against a background of a low level of economic cooperation and the lack of an effective mechanism of interest coordination and decision-making, similar to qualified majority voting. It concludes that the deterioration of mutual trade dynamics in the Customs Union is determined by fundamental factors, first of all the exhaustion of medium-term integration effects, which expand mutual trade immediately after the creation of a Customs Union but do not change its qualitative characteristics in the long-term outlook. The author shows the absence of significant long-term integration effects, which would be based on an increase of domestic market capacity due to a modification of economic structure. It is found that the appearance of such long-term integration effects is possible only in the context of the coalescence of national economies at the microlevel, based on the development of a system of links between enterprises, including intrasectoral industrial cooperation. The article also analyses the results of the realization of Russia's strategy of interaction with states beyond the Eurasian Economic Union, based on the open regionalism concept. The paper presents recommendations on improving the tools of integration within and outside the Eurasian Economic Union. In particular, the creation of a system of decentralized organizations is proposed for the implementation of specific cooperation projects in selected areas, taking into account the multiplier effect of such "point-aimed" actions.

  3. A new assessment model and tool for pediatric nurse practitioners.

    Science.gov (United States)

    Burns, C

    1992-01-01

    This article presents a comprehensive assessment model for pediatric nurse practitioner (PNP) practice that integrates familiar elements of the classical medical history, Gordon's Functional Health Patterns, and developmental fields into one system. This model drives the diagnostic reasoning process toward consideration of a broad range of disease, daily living (nursing diagnosis), and developmental diagnoses, which represents PNP practice better than the medical model does.

  4. Integrated Baseline System (IBS) Version 1.03: Models guide

    Energy Technology Data Exchange (ETDEWEB)

    1993-01-01

    The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the models guide for the IBS and explains how to use the emergency-related computer models. This document provides information for the experienced system user and is the primary reference for the computer modeling software supplied with the system. It is designed for emergency managers and planners, and others familiar with the concepts of computer modeling. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available. The IBS manuals reference such documentation where necessary.

  5. Integrated Surface/subsurface flow modeling in PFLOTRAN

    Energy Technology Data Exchange (ETDEWEB)

    Painter, Scott L [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-10-01

    Understanding soil water, groundwater, and shallow surface water dynamics as an integrated hydrological system is critical for understanding the Earth’s critical zone, the thin outer layer at our planet’s surface where vegetation, soil, rock, and gases interact to regulate the environment. Computational tools that take this view of soil moisture and shallow surface flows as a single integrated system are typically referred to as integrated surface/subsurface hydrology models. We extend the open-source, highly parallel, subsurface flow and reactive transport simulator PFLOTRAN to accommodate surface flows. In contrast to most previous implementations, we do not represent a distinct surface system. Instead, the vertical gradient in hydraulic head at the land surface is neglected, which allows the surface flow system to be eliminated and incorporated directly into the subsurface system. This tight coupling approach leads to a robust capability and also greatly simplifies implementation in existing subsurface simulators such as PFLOTRAN. Successful comparisons to independent numerical solutions build confidence in the approximation and implementation. Example simulations of the Walker Branch and East Fork Poplar Creek watersheds near Oak Ridge, Tennessee demonstrate the robustness of the approach in geometrically complex applications. The lack of a robust integrated surface/subsurface hydrology capability had been a barrier to PFLOTRAN’s use in critical zone studies. This work addresses that capability gap, thus enabling PFLOTRAN as a community platform for building integrated models of the critical zone.
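
    A hedged sketch of the approximation described above, in our notation rather than PFLOTRAN's exact formulation: the subsurface obeys Richards' equation, the surface obeys a shallow-flow balance, and neglecting the vertical head gradient at the land surface identifies the ponded depth with the surface value of the subsurface head, so the surface balance can be folded into the top subsurface cell.

    ```latex
    % Subsurface (Richards' equation): theta = water content, h = pressure head,
    % K = hydraulic conductivity, z = elevation, q_s = sources/sinks
    \frac{\partial \theta(h)}{\partial t} = \nabla \cdot \left[ K(h)\,\nabla (h + z) \right] + q_s
    % Surface (diffusive-wave balance): d = ponded depth, v = depth-averaged velocity,
    % q_r = rainfall, q_e = exchange flux with the subsurface
    \frac{\partial d}{\partial t} + \nabla \cdot (d\,\mathbf{v}) = q_r - q_e
    % Coupling assumption: no vertical head gradient across the land surface, so
    d = \max\!\left(h_{\mathrm{surf}},\, 0\right)
    % and the surface storage term is added directly to the top subsurface cell.
    ```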

  6. Modeling for Integrated Science Management and Resilient Systems Development

    Science.gov (United States)

    Shelhamer, M.; Mindock, J.; Lumpkins, S.

    2014-01-01

    Many physiological, environmental, and operational risks exist for crewmembers during spaceflight. An understanding of these risks from an integrated perspective is required to provide effective and efficient mitigations during future exploration missions that typically have stringent limitations on resources available, such as mass, power, and crew time. The Human Research Program (HRP) is in the early stages of developing collaborative modeling approaches for the purposes of managing its science portfolio in an integrated manner to support cross-disciplinary risk mitigation strategies and to enable resilient human and engineered systems in the spaceflight environment. In this talk, we will share ideas being explored from fields such as network science, complexity theory, and system-of-systems modeling. Initial work on tools to support these explorations will be discussed briefly, along with ideas for future efforts.

  7. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report

    Science.gov (United States)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; KramerWhite, Julie A.; Labbe, Steve G.; Rotter, Hank A.

    2007-01-01

    In Spring of 2005, the NASA Engineering Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and other tools are empirically-derived. Each tool was created for a specific use and timeframe, including certification, real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC teams conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  8. Mass generation in perturbed massless integrable models

    International Nuclear Information System (INIS)

    Controzzi, D.; Mussardo, G.

    2005-01-01

    We extend form-factor perturbation theory to non-integrable deformations of massless integrable models, in order to address the problem of mass generation in such systems. With respect to the standard renormalisation group analysis this approach is more suitable for studying the particle content of the perturbed theory. Analogously to the massive case, interesting information can be obtained already at first order, such as the identification of the operators which create a mass gap and those which induce the confinement of the massless particles in the perturbed theory

  9. Paradox of integration-A computational model

    Science.gov (United States)

    Krawczyk, Małgorzata J.; Kułakowski, Krzysztof

    2017-02-01

    The paradoxical aspect of integration of a social group has been highlighted by Blau (1964). During the integration process, the group members simultaneously compete for social status and play the role of the audience. Here we show that when the competition prevails over the desire of approval, a sharp transition breaks all friendly relations. However, as was described by Blau, people with high status are inclined to bother more with acceptance of others; this is achieved by praising others and revealing her/his own weak points. In our model, this action smooths the transition and improves interpersonal relations.

  10. Integrated assessment models of global climate change

    International Nuclear Information System (INIS)

    Parson, E.A.; Fisher-Vanden, K.

    1997-01-01

    The authors review recent work in the integrated assessment modeling of global climate change. This field has grown rapidly since 1990. Integrated assessment models seek to combine knowledge from multiple disciplines in formal integrated representations; inform policy-making, structure knowledge, and prioritize key uncertainties; and advance knowledge of broad system linkages and feedbacks, particularly between socio-economic and bio-physical processes. They may combine simplified representations of the socio-economic determinants of greenhouse gas emissions, the atmosphere and oceans, impacts on human activities and ecosystems, and potential policies and responses. The authors summarize current projects, grouping them according to whether they emphasize the dynamics of emissions control and optimal policy-making, uncertainty, or spatial detail. They review the few significant insights that have been claimed from work to date and identify important challenges for integrated assessment modeling in its relationships to disciplinary knowledge and to broader assessment seeking to inform policy- and decision-making. 192 refs., 2 figs

  11. Integrating best evidence into patient care: a process facilitated by a seamless integration with informatics tools.

    Science.gov (United States)

    Giuse, Nunzia B; Williams, Annette M; Giuse, Dario A

    2010-07-01

    The Vanderbilt University paper discusses how the Eskind Biomedical Library at Vanderbilt University Medical Center transitioned from a simplistic approach that linked resources to the institutional electronic medical record system, StarPanel, to a value-added service that is designed to deliver highly relevant information. Clinical teams formulate complex patient-specific questions via an evidence-based medicine literature request basket linked to individual patient records. The paper transitions into discussing how the StarPanel approach acted as a springboard for two additional projects that use highly trained knowledge management librarians with informatics expertise to integrate evidence into both order sets and a patient portal, MyHealth@Vanderbilt.

  12. A System Dynamics Model for Integrated Decision Making ...

    Science.gov (United States)

    EPA’s Sustainable and Healthy Communities Research Program (SHC) is conducting transdisciplinary research to inform and empower decision-makers. EPA tools and approaches are being developed to enable communities to effectively weigh and integrate human health, socioeconomic, environmental, and ecological factors into their decisions to promote community sustainability. To help achieve this goal, EPA researchers have developed systems approaches to account for the linkages among resources, assets, and outcomes managed by a community. System dynamics (SD) is a member of the family of systems approaches and provides a framework for dynamic modeling that can assist with assessing and understanding complex issues across multiple dimensions. To test the utility of such tools when applied to a real-world situation, the EPA has developed a prototype SD model for community sustainability using the proposed Durham-Orange Light Rail Project (D-O LRP) as a case study.The EPA D-O LRP SD modeling team chose the proposed D-O LRP to demonstrate that an integrated modeling approach could represent the multitude of related cross-sectoral decisions that would be made and the cascading impacts that could result from a light rail transit system connecting Durham and Chapel Hill, NC. In keeping with the SHC vision described above, the proposal for the light rail is a starting point solution for the more intractable problems of population growth, unsustainable land use, environmenta

  13. Simulation Tools for Electrical Machines Modelling: Teaching and ...

    African Journals Online (AJOL)

    Simulation tools are used both for research and teaching to allow a good comprehension of the systems under study before practical implementation. This paper illustrates the way MATLAB is used to model non-linearities in a synchronous machine. The machine is modeled in the rotor reference frame with currents as state ...

  14. Tool Efficiency Analysis model research in SEMI industry

    Directory of Open Access Journals (Sweden)

    Lei Ma

    2018-01-01

    Full Text Available One of the key goals in the SEMI industry is to improve equipment throughput and ensure that equipment production efficiency is maximized. Based on SEMI standards for semiconductor equipment control, this paper defines the transition rules between different tool states and presents a TEA system model that analyzes tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was verified successfully; it obtained the parameter values used to measure equipment performance, as well as advice for improvement.
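
    The specific state definitions and transition rules in the paper are not reproduced here; as a hedged illustration of a finite-state-machine model of tool states, the Python sketch below tracks time spent in each state and derives a simple utilization figure (the state names follow the SEMI E10 categories as an assumption).

    ```python
    # Minimal finite-state-machine sketch for tracking equipment states.
    from collections import defaultdict

    STATES = {"PRODUCTIVE", "STANDBY", "ENGINEERING",
              "SCHEDULED_DOWNTIME", "UNSCHEDULED_DOWNTIME", "NON_SCHEDULED"}

    class ToolStateMachine:
        def __init__(self, initial="STANDBY"):
            self.state = initial
            self.time_in_state = defaultdict(float)

        def log(self, state, hours):
            """Record that the tool spent `hours` in `state`, then make it current."""
            if state not in STATES:
                raise ValueError(f"unknown state: {state}")
            self.time_in_state[state] += hours
            self.state = state

        def utilization(self):
            """Fraction of tracked time spent in the PRODUCTIVE state."""
            total = sum(self.time_in_state.values())
            return self.time_in_state["PRODUCTIVE"] / total if total else 0.0

    tool = ToolStateMachine()
    for state, hours in [("PRODUCTIVE", 5.0), ("UNSCHEDULED_DOWNTIME", 1.0),
                         ("PRODUCTIVE", 6.0), ("STANDBY", 2.0)]:
        tool.log(state, hours)
    # After logging events, the machine yields performance parameters such as utilization.
    print(f"utilization = {tool.utilization():.2f}")
    ```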

  15. COGMIR: A computer model for knowledge integration

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Z.X.

    1988-01-01

    This dissertation explores some aspects of knowledge integration, namely, accumulation of scientific knowledge and performing analogical reasoning on the acquired knowledge. Knowledge to be integrated is conveyed by paragraph-like pieces referred to as documents. By incorporating some results from cognitive science, the Deutsch-Kraft model of information retrieval is extended to a model for knowledge engineering, which integrates acquired knowledge and performs intelligent retrieval. The resulting computer model is termed COGMIR, which stands for a COGnitive Model for Intelligent Retrieval. A scheme, named query invoked memory reorganization, is used in COGMIR for knowledge integration. Unlike some other schemes which realize knowledge integration through subjective understanding by representing new knowledge in terms of existing knowledge, the proposed scheme suggests at storage time only recording the possible connection of knowledge acquired from different documents. The actual binding of the knowledge acquired from different documents is deferred to query time. There is only one way to store knowledge and numerous ways to utilize the knowledge. Each document can be represented both as a whole and by its meaning. In addition, since facts are constructed from the documents, document retrieval and fact retrieval are treated in a unified way. When the requested knowledge is not available, query invoked memory reorganization can generate suggestions based on available knowledge through analogical reasoning. This is done by revising the algorithms developed for document retrieval and fact retrieval, and by incorporating Gentner's structure mapping theory. Analogical reasoning is treated as a natural extension of intelligent retrieval, so that two previously separate research areas are combined. A case study is provided. All the components are implemented as list structures similar to relational databases.

  16. Integrated Human Futures Modeling in Egypt

    Energy Technology Data Exchange (ETDEWEB)

    Passell, Howard D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aamir, Munaf Syed [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bernard, Michael Lewis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beyeler, Walter E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Fellner, Karen Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hayden, Nancy Kay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jeffers, Robert Fredric [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Keller, Elizabeth James Kistin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Malczynski, Leonard A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mitchell, Michael David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silver, Emily [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tidwell, Vincent C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Villa, Daniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Engelke, Peter [Atlantic Council, Washington, D.C. (United States); Burrow, Mat [Atlantic Council, Washington, D.C. (United States); Keith, Bruce [United States Military Academy, West Point, NY (United States)

    2016-01-01

    The Integrated Human Futures Project provides a set of analytical and quantitative modeling and simulation tools that help explore the links among human social, economic, and ecological conditions, human resilience, conflict, and peace, and allows users to simulate tradeoffs and consequences associated with different future development and mitigation scenarios. In the current study, we integrate five distinct modeling platforms to simulate the potential risk of social unrest in Egypt resulting from the Grand Ethiopian Renaissance Dam (GERD) on the Blue Nile in Ethiopia. The five platforms simulate hydrology, agriculture, economy, human ecology, and human psychology/behavior, and show how impacts derived from development initiatives in one sector (e.g., hydrology) might ripple through to affect other sectors and how development and security concerns may be triggered across the region. This approach evaluates potential consequences, intended and unintended, associated with strategic policy actions that span the development-security nexus at the national, regional, and international levels. Model results are not intended to provide explicit predictions, but rather to provide system-level insight for policy makers into the dynamics among these interacting sectors, and to demonstrate an approach to evaluating short- and long-term policy trade-offs across different policy domains and stakeholders. The GERD project is critical to government-planned development efforts in Ethiopia but is expected to reduce downstream freshwater availability in the Nile Basin, fueling fears of negative social and economic impacts that could threaten stability and security in Egypt. We tested these hypotheses and came to the following preliminary conclusions. First, the GERD will have an important short-term impact on water availability, food production, and hydropower production in Egypt, depending on the short- term reservoir fill rate. Second, the GERD will have a very small impact on

  17. The Cryosphere Model Comparison Tool (CmCt): Ice Sheet Model Validation and Comparison Tool for Greenland and Antarctica

    Science.gov (United States)

    Simon, E.; Nowicki, S.; Neumann, T.; Tyahla, L.; Saba, J. L.; Guerber, J. R.; Bonin, J. A.; DiMarzio, J. P.

    2017-12-01

    The Cryosphere model Comparison tool (CmCt) is a web-based ice sheet model validation tool that is being developed by NASA to facilitate direct comparison between observational data and various ice sheet models. The CmCt allows the user to take advantage of several decades worth of observations from Greenland and Antarctica. Currently, the CmCt can be used to compare ice sheet models provided by the user with remotely sensed satellite data from ICESat (Ice, Cloud, and land Elevation Satellite) laser altimetry, the GRACE (Gravity Recovery and Climate Experiment) satellite, and radar altimetry (ERS-1, ERS-2, and Envisat). One or more models can be uploaded through the CmCt website and compared with observational data, or compared to each other or to other models. The CmCt calculates statistics on the differences between the model and observations, and other quantitative and qualitative metrics, which can be used to evaluate the different model simulations against the observations. The qualitative metrics consist of a range of visual outputs and the quantitative metrics consist of several whole-ice-sheet scalar values that can be used to assign an overall score to a particular simulation. The comparison results from CmCt are useful in quantifying improvements within a specific model (or within a class of models) as a result of differences in model dynamics (e.g., shallow vs. higher-order dynamics approximations), model physics (e.g., representations of ice sheet rheological or basal processes), or model resolution (mesh resolution and/or changes in the spatial resolution of input datasets). The framework and metrics could also be used as a model-to-model intercomparison tool, simply by swapping in outputs from another model as the observational datasets. Future versions of the tool will include comparisons with other datasets that are of interest to the modeling community, such as ice velocity, ice thickness, and surface mass balance.
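
    The statistics actually computed by the CmCt are not reproduced here; the Python sketch below merely illustrates the kind of whole-ice-sheet scalar scores (mean bias, root-mean-square difference) that can be derived from gridded model and observed elevations.

    ```python
    import numpy as np

    def comparison_metrics(model_elev, obs_elev):
        """Whole-ice-sheet scalar scores from gridded model and observed elevations.

        Illustrative only: the specific CmCt statistics are not reproduced here.
        NaNs mark cells without observations.
        """
        diff = model_elev - obs_elev
        valid = ~np.isnan(diff)
        bias = np.nanmean(diff)                        # mean model-minus-observation
        rmse = np.sqrt(np.nanmean(diff[valid] ** 2))   # root-mean-square difference
        return {"bias_m": float(bias), "rmse_m": float(rmse), "n_cells": int(valid.sum())}

    # Toy 2x2 grids standing in for ICESat-style elevations (metres).
    model = np.array([[1500.0, 1520.0], [1600.0, np.nan]])
    obs   = np.array([[1498.0, 1525.0], [1590.0, np.nan]])
    print(comparison_metrics(model, obs))
    ```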

  18. Progress in integrated energy-economy-environment model system development

    International Nuclear Information System (INIS)

    Yasukawa, Shigeru; Mankin, Shuichi; Sato, Osamu; Tadokoro, Yoshihiro; Nakano, Yasuyuki; Nagano, Takao

    1987-11-01

    The Integrated Energy-Economy-Environment Model System has been developed to provide analytical tools for system analysis and technology assessment in the field of nuclear research and development. This model system consists of the following four model groups. The first model block installs 5 models and serves to analyze and generate long-term scenarios of economy-energy-environment evolution. The second model block installs 2 models and serves to analyze structural transition phenomena in energy-economy-environment interactions. The third model block installs 2 models and handles the power reactor installation strategy problem and long-term fuel cycle analysis. The fourth model block installs 5 models and codes and treats cost-benefit-risk analyses and assessments. This report mainly describes the progress made and outlines applications of the model system in the years since the first report on the research and development of the model system (JAERI-M 84 - 139). (author)

  19. Human Ageing Genomic Resources: Integrated databases and tools for the biology and genetics of ageing

    Science.gov (United States)

    Tacutu, Robi; Craig, Thomas; Budovsky, Arie; Wuttke, Daniel; Lehmann, Gilad; Taranukha, Dmitri; Costa, Joana; Fraifeld, Vadim E.; de Magalhães, João Pedro

    2013-01-01

    The Human Ageing Genomic Resources (HAGR, http://genomics.senescence.info) is a freely available online collection of research databases and tools for the biology and genetics of ageing. HAGR features now several databases with high-quality manually curated data: (i) GenAge, a database of genes associated with ageing in humans and model organisms; (ii) AnAge, an extensive collection of longevity records and complementary traits for >4000 vertebrate species; and (iii) GenDR, a newly incorporated database, containing both gene mutations that interfere with dietary restriction-mediated lifespan extension and consistent gene expression changes induced by dietary restriction. Since its creation about 10 years ago, major efforts have been undertaken to maintain the quality of data in HAGR, while further continuing to develop, improve and extend it. This article briefly describes the content of HAGR and details the major updates since its previous publications, in terms of both structure and content. The completely redesigned interface, more intuitive and more integrative of HAGR resources, is also presented. Altogether, we hope that through its improvements, the current version of HAGR will continue to provide users with the most comprehensive and accessible resources available today in the field of biogerontology. PMID:23193293

  20. COMSY - A software tool for PLIM + PLEX with integrated risk-informed approaches

    International Nuclear Information System (INIS)

    Zander, A.; Nopper, H.; Roessner, R.

    2004-01-01

    The majority of mechanical components and structures in a thermal power plant are designed to experience a service life which is far above the intended design life. In most cases, only a small percentage of mechanical components are subject to significant degradation which may affect the integrity or the function of the component. If plant life extension (PLEX) is considered as an option, a plant-specific PLIM strategy needs to be developed. One of the most important tasks of such a PLIM strategy is to identify those components which (i) are relevant for the safety and/or availability of the plant and (ii) experience elevated degradation due to their operating and design conditions. For these components special life management strategies need to be established to reliably monitor their condition. FRAMATOME ANP GmbH has developed the software tool COMSY, which is designed to efficiently support a plant-wide lifetime management strategy for static mechanical components, providing the basis for plant life extension (PLEX) activities. The objective is the economical and safe operation of power plants over their design lifetime - and beyond. The tool provides the capability to establish program-guided technical documentation of the plant by utilizing a virtual plant data model. The software integrates engineering analysis functions and comprehensive material libraries to perform a lifetime analysis for various degradation mechanisms typically experienced in power plants (e.g. flow-accelerated corrosion, intergranular stress corrosion cracking, strain-induced cracking, material fatigue, cavitation erosion, droplet impingement erosion, pitting, etc.). A risk-based prioritization serves to focus inspection activities on safety- or availability-relevant locations, where a degradation potential exists. Trending functions support the comparison of the as-measured condition with the predicted progress of degradation while making allowance for measurement tolerances. The

  1. Toward an Integrative Model of Global Business Strategy

    DEFF Research Database (Denmark)

    Li, Xin

    fragmentation-integration-fragmentation-integration upward spiral. In response to the call for integrative approach to strategic management research, we propose an integrative model of global business strategy that aims at integrating not only strategy and IB but also the different paradigms within the strategy...... field. We also discuss the merit and limitation of our model....

  2. Clarity versus complexity: land-use modeling as a practical tool for decision-makers

    Science.gov (United States)

    Sohl, Terry L.; Claggett, Peter

    2013-01-01

    The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches and the issues caused by the complicated nature of models, and provides suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the link between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.

  3. Visualization of RNA structure models within the Integrative Genomics Viewer.

    Science.gov (United States)

    Busan, Steven; Weeks, Kevin M

    2017-07-01

    Analyses of the interrelationships between RNA structure and function are increasingly important components of genomic studies. The SHAPE-MaP strategy enables accurate RNA structure probing and realistic structure modeling of kilobase-length noncoding RNAs and mRNAs. Existing tools for visualizing RNA structure models are not suitable for efficient analysis of long, structurally heterogeneous RNAs. In addition, structure models are often advantageously interpreted in the context of other experimental data and gene annotation information, for which few tools currently exist. We have developed a module within the widely used and well supported open-source Integrative Genomics Viewer (IGV) that allows visualization of SHAPE and other chemical probing data, including raw reactivities, data-driven structural entropies, and data-constrained base-pair secondary structure models, in context with linear genomic data tracks. We illustrate the usefulness of visualizing RNA structure in the IGV by exploring structure models for a large viral RNA genome, comparing bacterial mRNA structure in cells with its structure under cell- and protein-free conditions, and comparing a noncoding RNA structure modeled using SHAPE data with a base-pairing model inferred through sequence covariation analysis. © 2017 Busan and Weeks; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  4. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer......-format and COM-objects, are incorporated to allow the export and import of mathematical models; 5) a user interface that provides the work-flow and data-flow to guide the user through the different modelling tasks....

  5. Lightweight approach to model traceability in a CASE tool

    Science.gov (United States)

    Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita

    2017-07-01

    A term "model-driven" is not at all a new buzzword within the ranks of system development community. Nevertheless, the ever increasing complexity of model-driven approaches keeps fueling all kinds of discussions around this paradigm and pushes researchers forward to research and develop new and more effective ways to system development. With the increasing complexity, model traceability, and model management as a whole, becomes indispensable activities of model-driven system development process. The main goal of this paper is to present a conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.

  6. NASA Space Radiation Program Integrative Risk Model Toolkit

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with opportunities for hands-on demonstrations. Brief descriptions of each tool are: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties for beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  7. An integrated knowledge-based and optimization tool for the sustainable selection of wastewater treatment process concepts

    DEFF Research Database (Denmark)

    Castillo, A.; Cheali, Peam; Gómez, V.

    2016-01-01

    The increasing demand on wastewater treatment plants (WWTPs) has involved an interest in improving the alternative treatment selection process. In this study, an integrated framework including an intelligent knowledge-based system and superstructure-based optimization has been developed and applied...... to a real case study. Hence, a multi-criteria analysis together with mathematical models is applied to generate a ranked short-list of feasible treatments for three different scenarios. Finally, the uncertainty analysis performed allows for increasing the quality and robustness of the decisions considering...... benefit and synergy is achieved when both tools are integrated because expert knowledge and expertise are considered together with mathematical models to select the most appropriate treatment alternative...

  8. OISI dynamic end-to-end modeling tool

    Science.gov (United States)

    Kersten, Michael; Weidler, Alexander; Wilhelm, Rainer; Johann, Ulrich A.; Szerdahelyi, Laszlo

    2000-07-01

    The OISI Dynamic end-to-end modeling tool is tailored to end-to-end modeling and dynamic simulation of Earth- and space-based actively controlled optical instruments such as e.g. optical stellar interferometers. `End-to-end modeling' is meant to denote the feature that the overall model comprises besides optical sub-models also structural, sensor, actuator, controller and disturbance sub-models influencing the optical transmission, so that the system- level instrument performance due to disturbances and active optics can be simulated. This tool has been developed to support performance analysis and prediction as well as control loop design and fine-tuning for OISI, Germany's preparatory program for optical/infrared spaceborne interferometry initiated in 1994 by Dornier Satellitensysteme GmbH in Friedrichshafen.

  9. Integrating data from the Investigational Medicinal Product Dossier/investigator's brochure. A new tool for translational integration of preclinical effects.

    Science.gov (United States)

    van Gerven, Joop; Cohen, Adam

    2018-01-30

    The first administration of a new compound in humans is an important milestone. A major source of information for the researcher is the investigator's brochure (IB). Such a document has a size of several hundred pages. The IB should enable investigators or regulators to independently assess the risk-benefit of the proposed trial, but its size and complexity make this difficult. This article offers a practical tool for the integration and subsequent communication of the complex information from the IB or other relevant data sources. This paper is accompanied by an accessible software tool to construct a single page colour-coded overview of preclinical and clinical data. © 2018 The British Pharmacological Society.

  10. Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model

    Science.gov (United States)

    Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance

    2014-01-01

    Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...

  11. Testing Predictive Models of Technology Integration in Mexico and the United States

    Science.gov (United States)

    Velazquez, Cesareo Morales

    2008-01-01

    Data from Mexico City, Mexico (N = 978) and from Texas, USA (N = 932) were used to test the predictive validity of the teacher professional development component of the Will, Skill, Tool Model of Technology Integration in a cross-cultural context. Structural equation modeling (SEM) was used to test the model. Analyses of these data yielded…

  12. Integrating satellite imagery with simulation modeling to improve burn severity mapping

    Science.gov (United States)

    Eva C. Karau; Pamela G. Sikkink; Robert E. Keane; Gregory K. Dillon

    2014-01-01

    Both satellite imagery and spatial fire effects models are valuable tools for generating burn severity maps that are useful to fire scientists and resource managers. The purpose of this study was to test a new mapping approach that integrates imagery and modeling to create more accurate burn severity maps. We developed and assessed a statistical model that combines the...

  13. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, H; Tan, J; Kavanaugh, J; Dolly, S; Gay, H; Thorstad, W; Anastasio, M; Altman, M; Mutic, S; Li, H [Washington University School of Medicine, Saint Louis, MO (United States)

    2014-06-15

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time consuming. A new integrated software tool using supervised contour pattern recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using the object-oriented programming language C# and application programming interfaces, e.g. the Visualization Toolkit (VTK). The C# language served as the tool design basis. The Accord.Net scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models of critical RT structures in real time with 360° visualization. Principal component analysis (PCA) was used for system self-updating of geometry variations of normal structures, based on physician-approved RT contours as a training dataset. The in-house supervised PCA-based contour recognition method was used to automatically evaluate contour normality/abnormality. The function for reporting the contour evaluation results was implemented using C# and the Windows Forms Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various-modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding
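
    As a rough illustration of PCA-based normality scoring of contours, the sketch below flags a contour whose PCA reconstruction error exceeds a training-set percentile. It assumes contours are encoded as fixed-length feature vectors, uses random toy data, and is written in Python with NumPy rather than the paper's C#/Accord.NET implementation.

```python
import numpy as np

# Illustrative PCA-based contour normality scoring (not the paper's code):
# contours are assumed to be encoded as fixed-length feature vectors.

def fit_pca(X, n_components):
    """Fit PCA on training contours; X has shape (n_samples, n_features)."""
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # rows of Vt = principal axes
    return mean, Vt[:n_components]

def reconstruction_error(x, mean, components):
    """Distance between a contour feature vector and its PCA reconstruction."""
    xc = x - mean
    proj = components.T @ (components @ xc)
    return np.linalg.norm(xc - proj)

# Toy usage: 50 physician-approved contours, each as a 200-dim feature vector.
rng = np.random.default_rng(0)
train = rng.normal(size=(50, 200))
mean, comps = fit_pca(train, n_components=10)

# Flag contours whose reconstruction error exceeds the 95th training percentile.
train_errors = [reconstruction_error(x, mean, comps) for x in train]
threshold = np.percentile(train_errors, 95)
new_contour = rng.normal(size=200)
print("abnormal:", reconstruction_error(new_contour, mean, comps) > threshold)
```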

  14. LEARNING TOOLS INTEROPERABILITY – A NEW STANDARD FOR INTEGRATION OF DISTANCE LEARNING PLATFORMS

    Directory of Open Access Journals (Sweden)

    Oleksandr A. Shcherbyna

    2015-06-01

    Full Text Available In educational information technology there is a recurring need to reuse electronic educational resources and to transfer them from one virtual learning environment to another. Previously, standardized sets of files, for example SCORM packages, were used for this purpose. This article reviews the new Learning Tools Interoperability (LTI) standard, which allows users of one environment to access resources hosted in another environment. This makes it possible to integrate them into a single distributed learning environment that is created and shared. The article gives examples of the practical use of the LTI standard in the Moodle learning management system using the External tool and LTI provider plugins.
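
    For orientation, an LTI 1.1 basic launch is a form POST whose parameters are signed with OAuth 1.0a HMAC-SHA1. The sketch below builds and signs such a launch with only the Python standard library; the consumer key, secret and launch URL are placeholders, and a real consumer such as Moodle's External tool performs this signing internally.

```python
import base64, hashlib, hmac, time, uuid
from urllib.parse import quote, urlencode

# Minimal sketch of an LTI 1.1 basic launch signed with OAuth 1.0a HMAC-SHA1.
# Key, secret and URL below are placeholders, not real endpoints.

def percent_encode(s):
    return quote(str(s), safe="~-._")

def sign_launch(url, params, consumer_secret):
    # OAuth 1.0a signature base string: METHOD & encoded URL & encoded sorted params.
    items = sorted((percent_encode(k), percent_encode(v)) for k, v in params.items())
    param_str = "&".join(f"{k}={v}" for k, v in items)
    base = "&".join(["POST", percent_encode(url), percent_encode(param_str)])
    key = percent_encode(consumer_secret) + "&"   # no token secret in LTI 1.1
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

launch_url = "https://provider.example.org/lti/launch"   # placeholder
params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "course42-activity7",
    "user_id": "student-001",
    "roles": "Learner",
    "oauth_consumer_key": "moodle-demo-key",
    "oauth_nonce": uuid.uuid4().hex,
    "oauth_timestamp": str(int(time.time())),
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
}
params["oauth_signature"] = sign_launch(launch_url, params, "moodle-demo-secret")
print(urlencode(params))   # form fields POSTed by the consumer to the provider
```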

  15. The Venetian Ghetto: Semantic Modelling for an Integrated Analysis

    Directory of Open Access Journals (Sweden)

    Alessandra Ferrighi

    2017-12-01

    Full Text Available In the digital era, historians are embracing information technology as a research tool. New technologies offer investigation, interpretation, synthesis and communication tools that are more effective than traditional study methods, as they guarantee a multidisciplinary approach and the integration of analyses. Among the available technologies, the best suited for the study of urban phenomena are databases (DB), Geographic Information Systems (GIS), Building Information Modelling (BIM) and multimedia tools (video, apps) for the dissemination of results. The case study described here concerns the analysis of the part of Venice that changed its appearance from 1516 onwards with the creation of the Jewish Ghetto. This was an event that would have repercussions throughout Europe, changing the course of history. Our research confirms that the exclusive use of one of the systems mentioned above (DB, GIS, BIM) makes it possible to manage the complexity of the subject matter only partially. Consequently, it became necessary to analyse the possible interactions between such tools, so as to create a link between an alphanumeric DB and a geographical DB. The combined use of GIS and BIM, which provide 4D time management of objects, turned out to be able to manage information and geometry in an effective and scalable way, providing a starting point for in-depth mapping of the historical analysis. Software products for digital modelling have changed in nature over time, going from simple viewing tools to simulation tools. The reconstruction of the time phases of the three Ghettos (Nuovo, Vecchio, and Nuovissimo) and their visualisation through digital narratives of the history of that specific area of the city, for instance through videos, is making it possible for an increasing number of scholars and the general public to access the results of the study.

  16. Contribution to the study of conformal theories and integrable models

    International Nuclear Information System (INIS)

    Sochen, N.

    1992-05-01

    The purpose of this thesis is the study of 2-D physics. The main tool is conformal field theory with Kac-Moody and W algebras. This theory describes 2-D models that have translation, rotation and dilatation symmetries at their critical point. Extended conformal theories describe models that have a larger symmetry than conformal symmetry. After a review of conformal theory methods, the author carries out a detailed study of the form of singular vectors in the sl(2) affine algebra. With this important form, correlation functions can be calculated. The classical W algebra is studied and the relations between the classical and quantum W algebras are specified. The bosonization method is presented and the sl(2)/sl(2) topological model is studied. Partition function bosonization of different models is described. A programme of rational theory classification is described, linking rational conformal theories and integrable spin models, and interesting relations between the Boltzmann weights of different models have been found. With these relations, the integrability of models is proved by a direct calculation of their Boltzmann weights.
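
    As background, the symmetry algebras mentioned here are standard: a 2-D conformal theory carries the Virasoro algebra with central charge c, and an affine (Kac-Moody) algebra such as sl(2) at level k has current modes obeying the relations below. These are textbook definitions, not results of the thesis:

$$ [L_m, L_n] = (m - n)\, L_{m+n} + \frac{c}{12}\, m (m^2 - 1)\, \delta_{m+n,0} $$

$$ [J^a_m, J^b_n] = i f^{ab}{}_{c}\, J^c_{m+n} + k\, m\, \delta^{ab}\, \delta_{m+n,0} $$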

  17. Model-based sensorimotor integration for multi-joint control: development of a virtual arm model.

    Science.gov (United States)

    Song, D; Lan, N; Loeb, G E; Gordon, J

    2008-06-01

    An integrated, sensorimotor virtual arm (VA) model has been developed and validated for simulation studies of the control of human arm movements. Realistic anatomical features of the shoulder, elbow and forearm joints were captured with a graphic modeling environment, SIMM. The model included 15 musculotendon elements acting at the shoulder, elbow and forearm. Muscle actions on joints were evaluated using SIMM-generated moment arms that were matched to experimentally measured profiles. The Virtual Muscle (VM) model contained an appropriate admixture of slow- and fast-twitch fibers with realistic physiological properties for force production. A realistic spindle model was embedded in each VM with inputs of fascicle length, gamma static (gamma(stat)) and dynamic (gamma(dyn)) controls, and outputs of primary (I(a)) and secondary (II) afferents. A piecewise linear model of the Golgi tendon organ (GTO) represented the ensemble sampling (I(b)) of the total muscle force at the tendon. All model components were integrated into a Simulink block using a special software tool. The complete VA model was validated with open-loop simulation at discrete hand positions within the full range of alpha and gamma drives to extrafusal and intrafusal muscle fibers. The model behaviors were consistent with a wide variety of physiological phenomena. Spindle afferents were effectively modulated by fusimotor drives and hand positions of the arm. These simulations validated the VA model as a computational tool for studying arm movement control. The VA model is available to researchers at http://pt.usc.edu/cel.
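
    As an illustration of the kind of component described above, the sketch below implements a generic piecewise-linear Golgi tendon organ mapping total tendon force to an Ib firing rate; the breakpoints and gains are made-up placeholders, not the parameters of the published VA model.

```python
import numpy as np

# Generic piecewise-linear Golgi tendon organ (GTO) model: ensemble Ib firing
# rate as a function of total muscle-tendon force. Breakpoints and gains are
# illustrative placeholders, not the published VA model parameters.

def gto_ib_rate(force_n,
                threshold_n=2.0,     # force below which Ib output is ~0 [N]
                low_gain=8.0,        # imp/s per N in the low-force segment
                breakpoint_n=20.0,   # force at which the gain changes [N]
                high_gain=2.0,       # imp/s per N beyond the breakpoint
                max_rate=200.0):     # afferent saturation [imp/s]
    f = np.asarray(force_n, dtype=float)
    seg1 = np.clip(f - threshold_n, 0.0, breakpoint_n - threshold_n) * low_gain
    seg2 = np.clip(f - breakpoint_n, 0.0, None) * high_gain
    return np.minimum(seg1 + seg2, max_rate)

# Usage: Ib rate over a ramp of tendon force from 0 to 100 N.
forces = np.linspace(0.0, 100.0, 6)
print(dict(zip(forces.round(1), gto_ib_rate(forces).round(1))))
```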

  18. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the development of the individual's subject features. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logicality and regularity of the development of the process; discreteness (stageability) indicates the qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Two steps for singling out a particular stage and the algorithm for developing an integrative model for it are offered. The suggested conclusions might be of use for further theoretical research, analyses of educational practices and realistic prediction of pedagogical phenomena.

  19. Mechanisms for integration of information models across related domains

    Science.gov (United States)

    Atkinson, Rob

    2010-05-01

    It is well recognised that there are opportunities and challenges in cross-disciplinary data integration. A significant barrier, however, is creating a conceptual model of the combined domains and the area of integration. For example, a groundwater domain application may require information from several related domains: geology, hydrology, water policy, etc. Each domain may have its own data holdings and conceptual models, but these will share various common concepts (e.g. the concept of an aquifer). These areas of semantic overlap present significant challenges: firstly to choose a single representation (model) of a concept that appears in multiple disparate models, then to harmonise these other models with the single representation. In addition, models may exist at different levels of abstraction depending on how closely aligned they are with a particular implementation. This makes it hard for modellers in one domain to introduce elements from another domain without either introducing a specific style of implementation, or conversely dealing with a set of abstract patterns that are hard to integrate with existing implementations. Models are easier to integrate if they are broken down into small units, with common concepts implemented using common models from well-known and predictably managed shared libraries. This vision, however, requires the development of a set of mechanisms (tools and procedures) for implementing and exploiting libraries of model components. These mechanisms need to handle publication, discovery, subscription, versioning and implementation of models in different forms. In this presentation a coherent suite of such mechanisms is proposed, using a scenario based on the re-use of geoscience models. This approach forms the basis of a comprehensive strategy to empower domain modellers to create more interoperable systems. The strategy addresses a range of concerns and practices, and includes methodologies, an accessible toolkit, improvements to available

  20. Site descriptive modelling - strategy for integrated evaluation

    International Nuclear Information System (INIS)

    Andersson, Johan

    2003-02-01

    The current document establishes the strategy to be used for achieving sufficient integration between disciplines in producing Site Descriptive Models during the Site Investigation stage. The Site Descriptive Model should be a multidisciplinary interpretation of geology, rock mechanics, thermal properties, hydrogeology, hydrogeochemistry, transport properties and ecosystems, using site investigation data from deep boreholes and from the surface as input. The modelling comprises the following iterative steps: evaluation of primary data, descriptive and quantitative modelling (in 3D), and overall confidence evaluation. Data are first evaluated within each discipline and then the evaluations are checked between the disciplines. Three-dimensional modelling (i.e. estimating the distribution of parameter values in space and its uncertainty) is made in a sequence, where the geometrical framework is taken from the geological model and in turn used by the rock mechanics, thermal and hydrogeological modelling, etc. The three-dimensional description should present the parameters with their spatial variability over a relevant and specified scale, with the uncertainty included in this description. Different alternative descriptions may be required. After the individual discipline modelling and uncertainty assessment, a phase of overall confidence evaluation follows. Relevant members of the different modelling teams assess the suggested uncertainties and evaluate the feedback. These discussions should assess overall confidence by checking that all relevant data are used, checking that information in past model versions is considered, checking that the different kinds of uncertainty are addressed, checking whether suggested alternatives make sense and whether there is potential for additional alternatives, and by discussing, if appropriate, how additional measurements (i.e. more data) would affect confidence. The findings as well as the modelling results are to be documented in a Site Description

  1. Integrated Reporting as a Tool for Communicating with Stakeholders - Advantages and Disadvantages

    Science.gov (United States)

    Matuszyk, Iwona; Rymkiewicz, Bartosz

    2018-03-01

    Financial and non-financial reporting has, from the beginning of its existence, been the primary source of communication between the company and a wide range of stakeholders. Over the decades it has adapted to the needs of a rapidly changing business and social environment. Currently, the latest link in the evolution of organizational reporting, integrated reporting, assumes the integration of and mutual connectivity between financial and non-financial data. The main interest in the concept of integrated reporting comes from the value it contributes to the organization. Undoubtedly, the concept of integrated reporting is a milestone in the evolution of organizational reporting. It is, however, important to consider whether it adequately addresses the information needs of a wide range of stakeholders, and whether it is a universal tool for communication between the company and its stakeholders. The aim of the paper is to discuss the advantages and disadvantages of the concept of integrated reporting as a tool for communication with stakeholders and to outline further directions for its development. The article uses research methods such as literature analysis, content analysis of corporate publications and comparative analysis.

  2. Modeling the dielectric logging tool at high frequency

    International Nuclear Information System (INIS)

    Chew, W.C.

    1987-01-01

    The high-frequency dielectric logging tool has been used widely in electromagnetic well logging because, by measuring the dielectric constant at high frequencies (1 GHz), the water saturation of rocks can be determined without measuring the water salinity in the rocks. As such, it can be used to delineate fresh-water-bearing zones, as the dielectric constant of fresh water is much higher than that of oil while they may have the same resistivity. The authors present a computer model, through electromagnetic field analysis, of the response of such a measurement tool in a well-logging environment. As the measurement is performed at high frequency, usually with a small separation between the transmitter and receivers, some small geological features can be resolved by such a tool. The authors use the computer model to study the behavior of the tool across geological bed boundaries and across thin geological beds. Such a study can be very useful in understanding the limitations on the resolution of the tool. Furthermore, the standoff effect and the depth of investigation of the tool can be studied. This could delineate the range of usefulness of the measurement
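
    One common way such measurements are related to water saturation is a mixing law such as CRIM (complex refractive index method). The sketch below inverts CRIM for water saturation; the values and the choice of CRIM itself are a generic illustration, not the specific petrophysical model used with this logging tool.

```python
# Generic CRIM (complex refractive index method) sketch relating a measured
# dielectric constant to water saturation. Values are illustrative only.

def crim_saturation(eps_measured, porosity,
                    eps_matrix=4.8,      # e.g. a sandstone matrix
                    eps_water=80.0,      # fresh water at ~1 GHz
                    eps_hydrocarbon=2.2):
    """Invert sqrt(eps) = (1-phi)*sqrt(eps_ma) + phi*Sw*sqrt(eps_w)
    + phi*(1-Sw)*sqrt(eps_hc) for the water saturation Sw."""
    sqrt = lambda x: x ** 0.5
    numerator = (sqrt(eps_measured)
                 - (1.0 - porosity) * sqrt(eps_matrix)
                 - porosity * sqrt(eps_hydrocarbon))
    denominator = porosity * (sqrt(eps_water) - sqrt(eps_hydrocarbon))
    return min(max(numerator / denominator, 0.0), 1.0)

# Usage: a 25%-porosity rock with a measured dielectric constant of 12.
print(round(crim_saturation(eps_measured=12.0, porosity=0.25), 2))
```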

  3. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  4. Decision support tool to evaluate alternative policies regulating wind integration into autonomous energy systems

    International Nuclear Information System (INIS)

    Zouros, N.; Contaxis, G.C.; Kabouris, J.

    2005-01-01

    Integration of wind power into autonomous electricity systems strongly depends on the specific technical characteristics of these systems; the regulations applied should take physical system constraints into account. The introduction of market rules makes the issue even more complicated, since the interests of the market participants often conflict with each other. In this paper, an integrated tool for the comparative assessment of alternative regulatory policies is presented, along with a methodology for decision-making based on the analysis of alternative scenarios. The social welfare concept is followed instead of traditional Least Cost Planning
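
    To illustrate the social-welfare criterion mentioned above, the sketch below solves a toy single-period dispatch that maximizes demand benefit minus generation cost on a small autonomous system with a wind availability limit. The structure and all numbers are illustrative assumptions, not the paper's model.

```python
import numpy as np
from scipy.optimize import linprog

# Toy single-period welfare-maximizing dispatch for an autonomous system.
# Decision variables: [g_diesel, g_wind, d_served]; all numbers are illustrative.
diesel_cost, wind_cost = 120.0, 5.0        # EUR/MWh
demand_benefit = 300.0                     # EUR/MWh of served demand
diesel_max, wind_avail, demand_max = 8.0, 5.0, 10.0   # MW

# linprog minimizes, so minimize (costs - benefit).
c = np.array([diesel_cost, wind_cost, -demand_benefit])
A_eq = np.array([[1.0, 1.0, -1.0]])        # power balance: generation = served demand
b_eq = np.array([0.0])
bounds = [(0, diesel_max), (0, wind_avail), (0, demand_max)]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
g_diesel, g_wind, d_served = res.x
print(f"diesel={g_diesel:.1f} MW, wind={g_wind:.1f} MW, served={d_served:.1f} MW,"
      f" welfare={-res.fun:.0f} EUR")
```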

  5. An integrated approach using high time-resolved tools to study the origin of aerosols

    International Nuclear Information System (INIS)

    Di Gilio, A.; Gennaro, G. de; Dambruoso, P.; Ventrella, G.

    2015-01-01

    Long-range transport of natural and/or anthropogenic particles can contribute significantly to PM10 and PM2.5 concentrations, and some European cities often fail to comply with PM daily limit values due to the additional impact of particles from remote sources. For this reason, reliable methodologies to identify long-range transport (LRT) events would be useful to better understand air pollution phenomena and support proper decision-making. This study explores the potential of an integrated, high time-resolved monitoring approach for the identification and characterization of local, regional and long-range transport events of high PM. In particular, a goal of this work was also the identification of time-limited events. For this purpose, a high time-resolved monitoring campaign was carried out at an urban background site in Bari (southern Italy) for about 20 days (1st–20th October 2011). The integration of the collected data, such as hourly measurements of inorganic ions in PM2.5 and of their gaseous precursors and of the natural radioactivity, together with the analyses of aerosol maps and hourly back-trajectories (BT), provided useful information for the identification and chemical characterization of local sources and trans-boundary intrusions. Non-sea-salt (nss) sulfate levels were found to increase when air masses came from northeastern Europe and higher dispersive conditions of the atmosphere were detected. In contrast, higher nitrate and lower nss-sulfate concentrations were registered during air mass stagnation and were attributed to the local traffic source. In some cases, combinations of local and trans-boundary sources were observed. Finally, statistical investigations such as principal component analysis (PCA) applied to hourly ion concentrations, and the cluster analyses, Potential Source Contribution Function (PSCF) and Concentration Weighted Trajectory (CWT) models computed on hourly back-trajectories, made it possible to complete a cognitive framework

  6. An integrated approach using high time-resolved tools to study the origin of aerosols

    Energy Technology Data Exchange (ETDEWEB)

    Di Gilio, A. [Chemistry Department, University of Bari, via Orabona, 4, 70126 Bari (Italy); ARPA PUGLIA, Corso Trieste, 27, 70126 Bari (Italy); Gennaro, G. de, E-mail: gianluigi.degennaro@uniba.it [Chemistry Department, University of Bari, via Orabona, 4, 70126 Bari (Italy); ARPA PUGLIA, Corso Trieste, 27, 70126 Bari (Italy); Dambruoso, P. [Chemistry Department, University of Bari, via Orabona, 4, 70126 Bari (Italy); ARPA PUGLIA, Corso Trieste, 27, 70126 Bari (Italy); Ventrella, G. [Chemistry Department, University of Bari, via Orabona, 4, 70126 Bari (Italy)

    2015-10-15

    Long-range transport of natural and/or anthropogenic particles can contribute significantly to PM10 and PM2.5 concentrations, and some European cities often fail to comply with PM daily limit values due to the additional impact of particles from remote sources. For this reason, reliable methodologies to identify long-range transport (LRT) events would be useful to better understand air pollution phenomena and support proper decision-making. This study explores the potential of an integrated, high time-resolved monitoring approach for the identification and characterization of local, regional and long-range transport events of high PM. In particular, a goal of this work was also the identification of time-limited events. For this purpose, a high time-resolved monitoring campaign was carried out at an urban background site in Bari (southern Italy) for about 20 days (1st–20th October 2011). The integration of the collected data, such as hourly measurements of inorganic ions in PM2.5 and of their gaseous precursors and of the natural radioactivity, together with the analyses of aerosol maps and hourly back trajectories (BT), provided useful information for the identification and chemical characterization of local sources and trans-boundary intrusions. Non-sea-salt (nss) sulfate levels were found to increase when air masses came from northeastern Europe and higher dispersive conditions of the atmosphere were detected. In contrast, higher nitrate and lower nss-sulfate concentrations were registered during air mass stagnation and were attributed to the local traffic source. In some cases, combinations of local and trans-boundary sources were observed. Finally, statistical investigations such as principal component analysis (PCA) applied to hourly ion concentrations, and the cluster analyses, Potential Source Contribution Function (PSCF) and Concentration Weighted Trajectory (CWT) models computed on hourly back-trajectories, made it possible to complete a cognitive
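
    Both records above mention the Potential Source Contribution Function. As background, PSCF for a grid cell (i, j) is conventionally the ratio of "polluted" back-trajectory endpoints to all endpoints falling in that cell; the sketch below is a generic formulation with toy data, not the study's exact configuration (grid, criterion value or weighting).

```python
import numpy as np

# Generic Potential Source Contribution Function (PSCF): for each grid cell,
# PSCF_ij = m_ij / n_ij, where n_ij counts all back-trajectory endpoints in the
# cell and m_ij counts endpoints of trajectories whose receptor concentration
# exceeded a chosen criterion (here the 75th percentile).

def pscf(endpoints_lat, endpoints_lon, traj_conc, grid_res=1.0, min_endpoints=5):
    conc_criterion = np.percentile(traj_conc, 75)
    polluted = traj_conc > conc_criterion
    cells = np.stack([np.floor(endpoints_lat / grid_res),
                      np.floor(endpoints_lon / grid_res)], axis=1)
    result = {}
    for cell in {tuple(c) for c in cells}:
        in_cell = np.all(cells == cell, axis=1)
        n_ij = in_cell.sum()
        if n_ij >= min_endpoints:              # skip poorly sampled cells
            m_ij = np.logical_and(in_cell, polluted).sum()
            result[cell] = m_ij / n_ij
    return result

# Toy usage: 1000 endpoints, each carrying its trajectory's receptor concentration.
rng = np.random.default_rng(1)
lat, lon = rng.uniform(35, 55, 1000), rng.uniform(10, 30, 1000)
conc = rng.lognormal(mean=2.0, sigma=0.5, size=1000)
print(len(pscf(lat, lon, conc)), "cells with a PSCF value")
```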

  7. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger-scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  8. Analytical Modelling Of Milling For Tool Design And Selection

    International Nuclear Information System (INIS)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-01-01

    This paper presents an efficient analytical model which allows the simulation of a wide range of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant-lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools
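
    For orientation, a widely used simplification of such force predictions is the mechanistic law relating the elemental cutting force to the instantaneous chip thickness, summed over axial slices of a helical flute. The sketch below uses made-up coefficients and is a generic mechanistic illustration, not the paper's thermomechanical oblique-cutting model.

```python
import numpy as np

# Mechanistic end-milling force sketch (not the paper's thermomechanical model):
# tangential force per axial slice dF_t = K_t * h(phi) * dz, with instantaneous
# chip thickness h(phi) = f_z * sin(phi). Coefficients are illustrative.

def tangential_force(phi_deg, feed_per_tooth=0.1, depth_of_cut=4.0,
                     helix_angle_deg=30.0, tool_radius=5.0, k_t=1800.0,
                     n_slices=50):
    """Sum dF_t over axial slices of one flute at spindle angle phi_deg (mm, N/mm^2 -> N)."""
    dz = depth_of_cut / n_slices
    z = (np.arange(n_slices) + 0.5) * dz
    # The helix delays the engagement angle of higher slices of the flute.
    lag = np.degrees(z * np.tan(np.radians(helix_angle_deg)) / tool_radius)
    local_phi = np.radians(phi_deg - lag)
    engaged = np.sin(local_phi) > 0            # slice in the material (slotting, 0-180 deg)
    h = feed_per_tooth * np.sin(local_phi) * engaged
    return float(np.sum(k_t * h * dz))

# Usage: force on a single flute at a few spindle angles.
for phi in (30, 90, 150):
    print(phi, "deg ->", round(tangential_force(phi), 1), "N")
```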

  9. The integrated model of innovative processes management in foreign countries

    Directory of Open Access Journals (Sweden)

    M. T. Kurametova

    2017-01-01

    Full Text Available The formation of an innovative economy must correspond to the promising directions of scientific, technical and social progress. To ensure sustainable innovative development of the national economy, it is necessary not only to develop our own tools and mechanisms that are characteristic of the domestic management model, but also to make rational use of foreign experience in this field. Analysis of international experience in the use of various tools, mechanisms and management structures for the creation of high-tech and knowledge-based enterprises showed that the integrated nature of innovative development and modernization of the economy is the most sound methodological approach for a phased, systemic transition to new technological structures; when developing tools and mechanisms for the innovative development of the economy, one should take into account the actual state of the material and technical base and the existing industrial structure of production, and consider the real possibilities of using different types of resources. The greatest innovation activity is shown by those countries in which the national integrated system effectively provides favorable conditions for the development and introduction of innovations in various spheres of life. International experience in the use of forms of governance can be considered as a mobile system of relations with the real sector of the economy. The article presents the experience of foreign countries and examples of adapting integrated models of innovation-process management for Kazakhstan in order to create high-tech enterprises whose innovative products can be competitive in the world market. The author highlights the role of JSC “Kazakhtelecom”, which, with its status of national operator, provides a wide range of public services, including long-distance and international telecommunication over telecommunication networks in general

  10. Integrated Modelling in CRUCIAL Science Education

    Science.gov (United States)

    Mahura, Alexander; Nuterman, Roman; Mukhamedzhanova, Elena; Nerobelov, Georgiy; Sedeeva, Margarita; Suhodskiy, Alexander; Mostamandy, Suleiman; Smyshlyaev, Sergey

    2017-04-01

    The NordForsk CRUCIAL project (2016-2017), "Critical steps in understanding land surface - atmosphere interactions: from improved knowledge to socioeconomic solutions", part of the Pan-Eurasian EXperiment (PEEX; https://www.atm.helsinki.fi/peex) programme activities, seeks deeper collaboration between the Nordic and Russian science communities. In particular, following collaboration between the Danish and Russian partners, several topics were selected for joint research, focused on evaluating: (1) the impact of urbanization processes on changes in urban weather and climate at urban, subregional and regional scales, contributing to assessment studies for population and environment; (2) the effects of various feedback mechanisms on aerosol and cloud formation and radiative forcing at urban-regional scales, for better prediction of extreme weather events and contribution to early warning systems; (3) environmental contamination from continuous emissions and industrial accidents, for better assessment and decision-making for sustainable social and economic development; and (4) the climatology of the atmospheric boundary layer at northern latitudes, to improve the understanding of processes, revise parameterizations and improve weather forecasting. These research topics are addressed using the online integrated Enviro-HIRLAM (Environment - High Resolution Limited Area Model) model within students' research projects: (1) "Online integrated high-resolution modelling of Saint-Petersburg metropolitan area influence on weather and air pollution forecasting"; (2) "Modeling of aerosol impact on regional-urban scales: case study of Saint-Petersburg metropolitan area"; (3) "Regional modeling and GIS evaluation of environmental pollution from Kola Peninsula sources"; and (4) "Climatology of the High-Latitude Planetary Boundary Layer". The results achieved by the students' projects and the planned young-scientist research training on online integrated modelling (June 2017) will be presented and

  11. Using the IEA ETSAP modelling tools for Denmark

    DEFF Research Database (Denmark)

    Grohnheit, Poul Erik

    signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, "Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems" for the period 2005 to 2007. The main activity is semi......-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project "NEEDS - New Energy Externalities Developments for Sustainability". ETSAP is contributing to a part of NEEDS that develops......, Environment and Health (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model...

  12. Integrating microbial diversity in soil carbon dynamic models parameters

    Science.gov (United States)

    Louis, Benjamin; Menasseri-Aubry, Safya; Leterme, Philippe; Maron, Pierre-Alain; Viaud, Valérie

    2015-04-01

    sampling time in order to follow the dynamics of residue and soil organic matter mineralization. The diversity, structure and composition of the microbial communities were characterized before incubation. The dynamics of carbon fluxes through CO2 emissions were modelled with a simple model. Using statistical tools, relations between the parameters of the model and microbial diversity indexes and/or pedological characteristics were developed and integrated into the model. First results show that global diversity has an impact on the model's parameters. Moreover, larger fungal diversity seems to lead to larger parameters representing decomposition rates and/or carbon-use efficiencies than bacterial diversity does. Classically, pedological factors such as soil pH and texture must also be taken into account.
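
    A "simple model" for such incubation data is often a first-order (single- or multi-pool) mineralization model. As a hedged illustration, not necessarily the exact model used in the study, with C_k the carbon stock of pool k, k_k its decomposition rate and e a carbon-use efficiency, the CO2 flux can be written as:

$$ \frac{dC_k}{dt} = -k_k\, C_k, \qquad F_{\mathrm{CO_2}}(t) = (1 - e) \sum_k k_k\, C_k(t) $$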

  13. Integrating declarative knowledge programming styles and tools for building expert systems

    Energy Technology Data Exchange (ETDEWEB)

    Barbuceanu, M; Trausan-Matu, S; Molnar, B

    1987-01-01

    The XRL system reported in this paper is an integrated knowledge programming environment whose major research theme is the investigation of declarative knowledge programming styles and features, and of the way they can be effectively integrated and used to support AI programming. This investigation is carried out in the context of the structured-object representation paradigm, which provides the glue keeping the XRL components together. The paper describes several declarative programming styles and associated support tools available in XRL. These include an instantiation system supporting a generalized view of the ubiquitous frame instantiation process; a description-based programming system providing a novel declarative programming style, which embeds a mathematically oriented description language in the structured-object environment, together with a transformational interpreter for using it; a semantics-oriented programming framework which offers a specific semantic-construct-based approach supporting maintenance and evolution; and a self-description and self-generation tool which applies the latter approach to XRL itself. 29 refs., 16 figs.

  14. INSIGHT: an integrated scoping analysis tool for in-core fuel management of PWR

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Noda, Hidefumi; Ito, Nobuaki; Maruyama, Taiji.

    1997-01-01

    An integrated software tool for scoping analysis of in-core fuel management, INSIGHT, has been developed to automate scoping analysis and to improve the fuel cycle cost using advanced optimization techniques. INSIGHT is an interactive software tool executed on UNIX-based workstations equipped with the X Window System. INSIGHT incorporates the GALLOP loading pattern (LP) optimization module, which utilizes hybrid genetic algorithms, the PATMAKER interactive LP design module, the MCA multicycle analysis module, an integrated database, and other utilities. Two benchmark problems were analyzed to confirm the key capabilities of INSIGHT: LP optimization and multicycle analysis. The first was a single-cycle LP optimization problem that included various constraints. The second was a multicycle LP optimization problem that included the assembly burnup limitation at rod cluster control (RCC) positions. The results for these problems showed the feasibility of INSIGHT for practical scoping analysis, which consists mostly of LP generation and multicycle analysis. (author)
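
    To make the genetic-algorithm idea concrete, here is a minimal permutation-based GA skeleton for a loading-pattern search. The fitness function is a stand-in placeholder: a real module such as GALLOP would evaluate each pattern with a core simulator and penalize constraint violations (e.g. the burnup limit at RCC positions), and its hybrid scheme is not reproduced here.

```python
import random

# Minimal permutation-based GA skeleton for loading-pattern (LP) optimization.
# The fitness below is a placeholder for a core-simulator evaluation.

N_POSITIONS = 20                            # core positions (illustrative)
ASSEMBLIES = list(range(N_POSITIONS))       # assembly identifiers

def fitness(pattern):
    # Placeholder objective: prefer low-index assemblies near low-index positions.
    return -sum(pos * asm for pos, asm in enumerate(pattern))

def crossover(p1, p2):
    # Order crossover (OX): keep a slice of p1, fill the rest in p2's order.
    a, b = sorted(random.sample(range(N_POSITIONS), 2))
    child = [None] * N_POSITIONS
    child[a:b] = p1[a:b]
    fill = [x for x in p2 if x not in child[a:b]]
    child[:a], child[b:] = fill[:a], fill[a:]
    return child

def mutate(pattern, rate=0.1):
    if random.random() < rate:              # swap two assemblies
        i, j = random.sample(range(N_POSITIONS), 2)
        pattern[i], pattern[j] = pattern[j], pattern[i]
    return pattern

population = [random.sample(ASSEMBLIES, N_POSITIONS) for _ in range(30)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]               # elitist selection
    children = [mutate(crossover(*random.sample(parents, 2))) for _ in range(20)]
    population = parents + children

print("best fitness:", fitness(max(population, key=fitness)))
```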

  15. Designing tools for oil exploration using nuclear modeling

    Directory of Open Access Journals (Sweden)

    Mauborgne Marie-Laure

    2017-01-01

    Full Text Available When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  16. Designing tools for oil exploration using nuclear modeling

    Science.gov (United States)

    Mauborgne, Marie-Laure; Allioli, Françoise; Manclossi, Mauro; Nicoletti, Luisa; Stoller, Chris; Evans, Mike

    2017-09-01

    When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  17. The 'cube' meta-model for the information system of large health sector organizations--a (platform neutral) mapping tool to integrate information system development with changing business functions and organizational development.

    Science.gov (United States)

    Balkányi, László

    2002-01-01

    To develop information systems (IS) in the changing environment of the health sector, a simple but thorough model that avoids the techno-jargon of informatics may be useful for top management. A platform-neutral, extensible, transparent conceptual model should be established. Limitations of current methods lead to a simple but comprehensive mapping in the form of a three-dimensional cube. The three 'orthogonal' views are (a) organizational functionality, (b) organizational structures and (c) information technology. Each of the cube's sides is described according to its nature. This approach makes it possible to define any kind of IS component as a certain point/layer/domain of the cube, and also enables management to label all IS components independently from any supplier(s) and/or any specific platform. The model handles changes in organizational structure, business functionality and the serving information system independently from each other. Practical applications extend to (a) planning complex new ISs, (b) guiding the development of multi-vendor, multi-site ISs, (c) supporting large-scale public procurement procedures and the contracting and implementation phases by establishing a platform-neutral reference, and (d) keeping an exhaustive inventory of an existing large-scale system that handles non-tangible aspects of the IS.
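
    As a hedged illustration of the idea (the axis names and values below are invented examples, not the paper's taxonomy), an IS component can be registered as a point in the three-axis "cube" so that a change along one axis leaves the other two untouched:

```python
from dataclasses import dataclass

# Illustrative only: registering an IS component as a point in the "cube"
# (business function x organizational unit x technology layer).

@dataclass(frozen=True)
class CubePoint:
    business_function: str      # e.g. "patient admission"
    organizational_unit: str    # e.g. "regional hospital"
    technology_layer: str       # e.g. "application server"

inventory = {
    CubePoint("patient admission", "regional hospital", "application server"):
        "ADT module v3 (vendor A)",
    CubePoint("lab reporting", "central laboratory", "database"):
        "LIS results store (vendor B)",
}

# Changing one axis (e.g. an organizational merger) leaves the other two intact.
for point, component in inventory.items():
    print(point.business_function, "/", point.organizational_unit, "->", component)
```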

  18. Non-communicable diseases and HIV care and treatment: models of integrated service delivery.

    Science.gov (United States)

    Duffy, Malia; Ojikutu, Bisola; Andrian, Soa; Sohng, Elaine; Minior, Thomas; Hirschhorn, Lisa R

    2017-08-01

    Non-communicable diseases (NCD) are a growing cause of morbidity in low-income countries, including among people living with human immunodeficiency virus (HIV). Integration of NCD and HIV services can build upon experience with chronic care models from HIV programmes. We describe models of NCD and HIV integration, challenges and lessons learned. A literature review of published articles on integrated NCD and HIV programs in low-income countries was conducted, and key informant interviews were held with leaders of identified integrated NCD and HIV programs. Information was synthesised to identify models of NCD and HIV service delivery integration. Three models of integration were identified, as follows: NCD services integrated into centres originally providing HIV care; HIV care integrated into primary health care (PHC) already offering NCD services; and simultaneous introduction of integrated HIV and NCD services. Major challenges identified included the NCD supply chain, human resources, referral systems, patient education, stigma, patient records, and monitoring and evaluation. The range of HIV and NCD services varied widely within and across models. Regardless of the model of integration, leveraging experience from HIV care models and adapting existing systems and tools is a feasible method to provide efficient care and treatment for the growing numbers of patients with NCDs. Operational research should be conducted to further study how successful models of HIV and NCD integration can be expanded in scope and scaled up by managers and policymakers seeking to address all the chronic care needs of their patients. © 2017 John Wiley & Sons Ltd.

  19. Integrated Model for E-Learning Acceptance

    Science.gov (United States)

    Ramadiani; Rodziah, A.; Hasan, S. M.; Rusli, A.; Noraini, C.

    2016-01-01

    E-learning is not going to work if the system is not used in accordance with user needs. The user interface is very important for encouraging use of the application. Many theories have discussed user interface usability evaluation and technology acceptance separately; relating interface usability evaluation to user acceptance could enhance the e-learning process. Therefore, an evaluation model for e-learning interface acceptance is important to investigate. The aim of this study is to propose an integrated e-learning user interface acceptance evaluation model. This model combines several theories of e-learning interface measurement, such as user learning style, usability evaluation, and user benefit. We formulated these constructs into questionnaires, which were administered to 125 English Language School (ELS) students. The statistical analysis used structural equation modelling in LISREL v8.80 and MANOVA.

  20. MODELS OF TECHNOLOGY ADOPTION: AN INTEGRATIVE APPROACH

    Directory of Open Access Journals (Sweden)

    Andrei OGREZEANU

    2015-06-01

    Full Text Available The interdisciplinary study of information technology adoption has developed rapidly over the last 30 years. Various theoretical models have been developed and applied, such as the Technology Acceptance Model (TAM), Innovation Diffusion Theory (IDT), the Theory of Planned Behavior (TPB), etc. The result of these many years of research is thousands of contributions to the field, which, however, remain highly fragmented. This paper develops a theoretical model of technology adoption by integrating major theories in the field: primarily IDT, TAM, and TPB. To do so while avoiding clutter, an approach that goes back to basics in the development of independent-variable types is proposed, emphasizing (1) the logic of classification, and (2) the psychological mechanisms behind variable types. Once developed, these types are populated with variables originating in empirical research. Conclusions are drawn on which types are underpopulated and present potential for future research. I end with a set of methodological recommendations for future application of the model.