WorldWideScience

Sample records for modeling framework version

  1. Model-based version management system framework

    International Nuclear Information System (INIS)

    Mehmood, W.

    2016-01-01

    In this paper we present a model-based version management system. A Version Management System (VMS), a branch of software configuration management (SCM), aims to provide a controlling mechanism for the evolution of software artifacts created during the software development process. Controlling this evolution requires many activities, such as construction and creation of versions, identification of differences between versions, conflict detection and merging. Traditional VMS systems are file-based and treat software systems as sets of text files. File-based VMS systems are not adequate for performing software configuration management activities, such as version control, on software artifacts produced in earlier phases of the software life cycle. New challenges of model differencing, merging, and evolution control arise when models are used as the central artifact. The goal of this work is to present a generic model-based VMS framework which can be used to overcome the problems of traditional file-based VMS systems and provide model versioning services. (author)
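
    The abstract stops short of implementation detail; as a rough, hypothetical illustration of the model-level differencing and conflict detection such a framework performs (the dictionary representation and function names below are invented for the example, not taken from the paper), a sketch in Python:

    ```python
    # Minimal sketch of model-level differencing and three-way conflict detection.
    # Models are represented as {element_id: {property: value}}; this representation
    # is hypothetical and only illustrates versioning models rather than text files.

    def diff(base, revised):
        """Return per-element changes between two model versions."""
        changes = {}
        for eid in set(base) | set(revised):
            if eid not in base:
                changes[eid] = ("added", revised[eid])
            elif eid not in revised:
                changes[eid] = ("deleted", None)
            elif base[eid] != revised[eid]:
                changes[eid] = ("modified", revised[eid])
        return changes

    def conflicts(base, left, right):
        """Elements changed differently in two branches derived from the same base."""
        d_left, d_right = diff(base, left), diff(base, right)
        return {eid for eid in d_left.keys() & d_right.keys()
                if d_left[eid] != d_right[eid]}

    base = {"ClassA": {"name": "A", "abstract": False}}
    left = {"ClassA": {"name": "A", "abstract": True}}
    right = {"ClassA": {"name": "A2", "abstract": False}}
    print(conflicts(base, left, right))  # {'ClassA'}
    ```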

  2. A framework for expanding aqueous chemistry in the Community Multiscale Air Quality (CMAQ) model version 5.1

    Science.gov (United States)

    Fahey, Kathleen M.; Carlton, Annmarie G.; Pye, Havala O. T.; Baek, Jaemeen; Hutzell, William T.; Stanier, Charles O.; Baker, Kirk R.; Wyat Appel, K.; Jaoui, Mohammed; Offenberg, John H.

    2017-04-01

    This paper describes the development and implementation of an extendable aqueous-phase chemistry option (AQCHEM - KMT(I)) for the Community Multiscale Air Quality (CMAQ) modeling system, version 5.1. Here, the Kinetic PreProcessor (KPP), version 2.2.3, is used to generate a Rosenbrock solver (Rodas3) to integrate the stiff system of ordinary differential equations (ODEs) that describe the mass transfer, chemical kinetics, and scavenging processes of CMAQ clouds. CMAQ's standard cloud chemistry module (AQCHEM) is structurally limited to the treatment of a simple chemical mechanism. This work advances our ability to test and implement more sophisticated aqueous chemical mechanisms in CMAQ and further investigate the impacts of microphysical parameters on cloud chemistry. Box model cloud chemistry simulations were performed to choose efficient solver and tolerance settings, evaluate the implementation of the KPP solver, and assess the direct impacts of alternative solver and kinetic mass transfer on predicted concentrations for a range of scenarios. Month-long CMAQ simulations for winter and summer periods over the US reveal the changes in model predictions due to these cloud module updates within the full chemical transport model. While monthly average CMAQ predictions are not drastically altered between AQCHEM and AQCHEM - KMT, hourly concentration differences can be significant. With added in-cloud secondary organic aerosol (SOA) formation from biogenic epoxides (AQCHEM - KMTI), normalized mean error and bias statistics are slightly improved for 2-methyltetrols and 2-methylglyceric acid at the Research Triangle Park measurement site in North Carolina during the Southern Oxidant and Aerosol Study (SOAS) period. The added in-cloud chemistry leads to a monthly average increase of 11-18 % in cloud SOA at the surface in the eastern United States for June 2013.
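
    The box-model solver experiments described above revolve around integrating a stiff kinetic ODE system with an implicit method under chosen tolerances. The sketch below shows that pattern on a toy three-species system; it is a stand-in for the KPP-generated Rodas3 code, not the CMAQ implementation, and the species, rate constants and tolerances are illustrative assumptions.

    ```python
    # Toy stiff aqueous-kinetics system integrated with an implicit solver.
    # Stand-in for the KPP-generated Rosenbrock (Rodas3) integration described in
    # the abstract; species, rate constants and tolerances are illustrative only.
    import numpy as np
    from scipy.integrate import solve_ivp

    k_fast, k_slow = 1.0e4, 1.0e-2   # widely separated rates make the system stiff

    def rhs(t, y):
        a, b, c = y
        return np.array([
            -k_fast * a,                 # fast loss of A
            k_fast * a - k_slow * b,     # B produced quickly, consumed slowly
            k_slow * b,                  # slow formation of C
        ])

    sol = solve_ivp(rhs, (0.0, 3600.0), [1.0, 0.0, 0.0],
                    method="Radau",      # implicit method suited to stiff ODEs
                    rtol=1e-6, atol=1e-12)
    print(sol.y[:, -1])                  # concentrations after one hour of cloud time
    ```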

  3. WeBCMD: A cross-platform interface for the BCMD modelling framework [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Joshua Russell-Buckland

    2017-07-01

    Multimodal monitoring of the brain generates a great quantity of data, providing the potential for great insight into both healthy and injured cerebral dynamics. In particular, near-infrared spectroscopy can be used to measure various physiological variables of interest, such as haemoglobin oxygenation and the redox state of cytochrome-c-oxidase, alongside systemic signals, such as blood pressure. Interpreting these measurements is a complex endeavour, and much work has been done to develop mathematical models that can help to provide understanding of the underlying processes that contribute to the overall dynamics. BCMD is a software framework that was developed to run such models. However, obtaining, installing and running this software is no simple task. Here we present WeBCMD, an online environment that attempts to make the process simpler and much more accessible. By leveraging modern web technologies, an extensible and cross-platform package has been created that can also be accessed remotely from the cloud. WeBCMD is available as a Docker image and an online service.

  4. The Gaia Framework: Version Support In Web Based Open Hypermedia

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kejser, Thomas

    2004-01-01

    The GAIA framework prototype, described herein, explores the possibilities and problems that arise when combining versioning and open hypermedia paradigms. It will be argued that, by adding versioning as a separate service in the hypermedia architecture, it is possible to build consistent ... versioning field, and GAIA is compared with previous attempts at defining hypermedia versioning frameworks. GAIA is capable of multi-level versioning and versioning of structures, and supports freezing mechanisms for both documents and hyperstructure. The experiences from GAIA provide an input to new reference ...

  5. The Gaia Framework: Version Support In Web Based Open Hypermedia

    DEFF Research Database (Denmark)

    Kejser, Thomas; Grønbæk, Kaj

    2003-01-01

    The GAIA framework prototype, described herein, explores the possibilities and problems that arise when combining versioning and open hypermedia paradigms. It will be argued that, by adding versioning as a separate service in the hypermedia architecture, it is possible to build consistent ... versioning field, and GAIA is compared with previous attempts at defining hypermedia versioning frameworks. GAIA is capable of multi-level versioning and versioning of structures, and supports freezing mechanisms for both documents and hyperstructure. The experiences from GAIA provide an input to new reference ...

  6. An integrated assessment modeling framework for uncertainty studies in global and regional climate change: the MIT IGSM-CAM (version 1.0)

    Science.gov (United States)

    Monier, E.; Scott, J. R.; Sokolov, A. P.; Forest, C. E.; Schlosser, C. A.

    2013-12-01

    This paper describes a computationally efficient framework for uncertainty studies in global and regional climate change. In this framework, the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity to a human activity model, is linked to the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM). Since the MIT IGSM-CAM framework (version 1.0) incorporates a human activity model, it is possible to analyze uncertainties in emissions resulting from both uncertainties in the underlying socio-economic characteristics of the economic model and in the choice of climate-related policies. Another major feature is the flexibility to vary key climate parameters controlling the climate system response to changes in greenhouse gases and aerosols concentrations, e.g., climate sensitivity, ocean heat uptake rate, and strength of the aerosol forcing. The IGSM-CAM is not only able to realistically simulate the present-day mean climate and the observed trends at the global and continental scale, but it also simulates ENSO variability with realistic time scales, seasonality and patterns of SST anomalies, albeit with stronger magnitudes than observed. The IGSM-CAM shares the same general strengths and limitations as the Coupled Model Intercomparison Project Phase 3 (CMIP3) models in simulating present-day annual mean surface temperature and precipitation. Over land, the IGSM-CAM shows similar biases to the NCAR Community Climate System Model (CCSM) version 3, which shares the same atmospheric model. This study also presents 21st century simulations based on two emissions scenarios (unconstrained scenario and stabilization scenario at 660 ppm CO2-equivalent) similar to, respectively, the Representative Concentration Pathways RCP8.5 and RCP4.5 scenarios, and three sets of climate parameters. Results of the simulations with the chosen

  7. Geologic Framework Model (GFM2000)

    International Nuclear Information System (INIS)

    T. Vogt

    2004-01-01

    The purpose of this report is to document the geologic framework model, version GFM2000, with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, and the differences between GFM2000 and previous versions. The version number of this model reflects the year during which the model was constructed. This model supersedes the previous model version, documented in Geologic Framework Model (GFM 3.1) (CRWMS M and O 2000 [DIRS 138860]). The geologic framework model represents a three-dimensional interpretation of the geology surrounding the location of the monitored geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain. The geologic framework model encompasses and is limited to an area of 65 square miles (168 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the geologic framework model (shown in Figure 1-1) were chosen to encompass the exploratory boreholes and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The upper surface of the model is made up of the surface topography and the depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The geologic framework model was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. The intended use of the geologic framework model is to provide a geologic framework over the area of interest consistent with the level of detail needed for hydrologic flow and radionuclide transport modeling through the UZ and for repository design. The model is limited by the availability of data and the relative amount of geologic complexity found in an area. The geologic framework model is inherently limited by scale and content. The grid spacing used in

  8. Geologic Framework Model (GFM2000)

    Energy Technology Data Exchange (ETDEWEB)

    T. Vogt

    2004-08-26

    The purpose of this report is to document the geologic framework model, version GFM2000, with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, and the differences between GFM2000 and previous versions. The version number of this model reflects the year during which the model was constructed. This model supersedes the previous model version, documented in Geologic Framework Model (GFM 3.1) (CRWMS M&O 2000 [DIRS 138860]). The geologic framework model represents a three-dimensional interpretation of the geology surrounding the location of the monitored geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain. The geologic framework model encompasses and is limited to an area of 65 square miles (168 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the geologic framework model (shown in Figure 1-1) were chosen to encompass the exploratory boreholes and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The upper surface of the model is made up of the surface topography and the depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The geologic framework model was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. The intended use of the geologic framework model is to provide a geologic framework over the area of interest consistent with the level of detail needed for hydrologic flow and radionuclide transport modeling through the UZ and for repository design. The model is limited by the availability of data and the relative amount of geologic complexity found in an area. The geologic framework model is inherently limited by scale and content. The grid spacing used in the

  9. Coastal Modelling Environment version 1.0: a framework for integrating landform-specific component models in order to simulate decadal to centennial morphological changes on complex coasts

    Directory of Open Access Journals (Sweden)

    A. Payo

    2017-07-01

    The ability to model morphological changes on complex, multi-landform coasts over decadal to centennial timescales is essential for sustainable coastal management worldwide. One approach involves coupling of landform-specific simulation models (e.g. cliffs, beaches, dunes and estuaries) that have been independently developed. An alternative, novel approach explored in this paper is to capture the essential characteristics of the landform-specific models using a common spatial representation within an appropriate software framework. This avoids the problems of the model-coupling approach that result from between-model differences in the conceptualizations of geometries, volumes and locations of sediment. In the proposed framework, the Coastal Modelling Environment (CoastalME), change in coastal morphology is represented by means of dynamically linked raster and geometrical objects. A grid of raster cells provides the data structure for representing quasi-3-D spatial heterogeneity and sediment conservation. Other geometrical objects (lines, areas and volumes) that are consistent with, and derived from, the raster structure represent a library of coastal elements (e.g. shoreline, beach profiles and estuary volumes) as required by different landform-specific models. As a proof-of-concept, we illustrate the capabilities of an initial version of CoastalME by integrating a cliff–beach model and two wave propagation approaches. We verify that CoastalME can reproduce behaviours of the component landform-specific models. Additionally, the integration of these component models within the CoastalME framework reveals behaviours that emerge from the interaction of landforms, which have not previously been captured, such as the influence of the regional bathymetry on the local alongshore sediment-transport gradient and the effect of coastal change on an undefended coastal segment and on sediment bypassing of coastal structures.
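
    As a loose illustration of the raster-plus-derived-geometry idea described above (class names, fields and the erosion rule below are invented for the example and are not CoastalME's API), a Python sketch:

    ```python
    # Sketch of the "raster cells plus derived geometry" idea: a quasi-3-D grid
    # stores elevations per cell, and a shoreline polyline is derived on demand.
    # Class and field names are illustrative assumptions, not CoastalME's.
    import numpy as np

    class CoastGrid:
        def __init__(self, elevation, cell_size=10.0, sea_level=0.0):
            self.elev = np.asarray(elevation, dtype=float)  # elevation per cell (m)
            self.cell_size = cell_size
            self.sea_level = sea_level

        def shoreline(self):
            """Derive a shoreline polyline: for each row, the first cell above sea level."""
            points = []
            for i, row in enumerate(self.elev):
                dry = np.where(row > self.sea_level)[0]
                if dry.size:
                    points.append((i * self.cell_size, dry[0] * self.cell_size))
            return points

        def erode(self, row, volume):
            """Remove a sediment volume from the most seaward dry cell of one row (mass conserved)."""
            j = int(np.argmax(self.elev[row] > self.sea_level))
            self.elev[row, j] -= volume / (self.cell_size ** 2)

    grid = CoastGrid([[-2.0, 1.0, 3.0],
                      [-2.0, -0.5, 2.0]])
    print(grid.shoreline())   # [(0.0, 10.0), (10.0, 20.0)]
    ```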

  10. The Unified Extensional Versioning Model

    DEFF Research Database (Denmark)

    Asklund, U.; Bendix, Lars Gotfred; Christensen, H. B.

    1999-01-01

    Versioning of components in a system is a well-researched field where various adequate techniques have already been established. In this paper, we look at how versioning can be extended to also cover the structural aspects of a system. There exist two basic techniques for versioning - intensional...

  11. Versions of the Waste Reduction Model (WARM)

    Science.gov (United States)

    This page provides a brief chronology of changes made to EPA’s Waste Reduction Model (WARM), organized by WARM version number. The page includes brief summaries of changes and updates since the previous version.

  12. Data for GMD article "A framework for expanding aqueous chemistry in the Community Multiscale Air Quality (CMAQ) model version 5.1"

    Data.gov (United States)

    U.S. Environmental Protection Agency — These data were used to generate the figures included in the following manuscript: Fahey, et al. (2017) "A framework for expanding aqueous chemistry in the Community...

  13. Geologic Framework Model Analysis Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1), with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  14. Geologic Framework Model Analysis Model Report

    International Nuclear Information System (INIS)

    Clayton, R.

    2000-01-01

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1), with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M and O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and

  15. Solar Advisor Model User Guide for Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Gilman, P.; Blair, N.; Mehos, M.; Christensen, C.; Janzou, S.; Cameron, C.

    2008-08-01

    The Solar Advisor Model (SAM) provides a consistent framework for analyzing and comparing power system costs and performance across the range of solar technologies and markets, from photovoltaic systems for residential and commercial markets to concentrating solar power and large photovoltaic systems for utility markets. This manual describes Version 2.0 of the software, which can model photovoltaic and concentrating solar power technologies for electric applications for several markets. The current version of the Solar Advisor Model does not model solar heating and lighting technologies.

  16. Crystallization Kinetics within a Generic Modeling Framework

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; von Solms, Nicolas; Gernaey, Krist V.

    2014-01-01

    A new and extended version of a generic modeling framework for analysis and design of crystallization operations is presented. The new features of this framework are described, with focus on development, implementation, identification, and analysis of crystallization kinetic models. Issues related to the modeling of various kinetic phenomena like nucleation, growth, agglomeration, and breakage are discussed in terms of model forms, model parameters, their availability and/or estimation, and their selection and application for specific crystallization operational scenarios under study. The advantages of employing a well-structured model library for storage, use/reuse, and analysis of the kinetic models are highlighted. Examples illustrating the application of the modeling framework for kinetic model discrimination related to simulation of specific crystallization scenarios and for kinetic model parameter ...
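
    As a rough sketch of what a well-structured library of kinetic constitutive models might look like (the power-law forms and parameter values below are generic textbook choices, not the models or numbers used in the paper):

    ```python
    # Tiny constitutive-model "library" in the spirit of the framework described
    # above: alternative kinetic expressions are stored, selected and evaluated for
    # a given supersaturation S. Forms and defaults are illustrative assumptions.

    def nucleation_power_law(S, k_b=1.0e8, b=2.0):
        """Nucleation rate B [#/m^3/s] as a power law in relative supersaturation S-1."""
        return k_b * max(S - 1.0, 0.0) ** b

    def growth_power_law(S, k_g=5.0e-7, g=1.5):
        """Crystal growth rate G [m/s] as a power law in relative supersaturation S-1."""
        return k_g * max(S - 1.0, 0.0) ** g

    KINETIC_LIBRARY = {
        ("nucleation", "power_law"): nucleation_power_law,
        ("growth", "power_law"): growth_power_law,
    }

    def evaluate(phenomenon, form, S, **params):
        """Look up a stored kinetic model and evaluate it with given parameters."""
        return KINETIC_LIBRARY[(phenomenon, form)](S, **params)

    print(evaluate("growth", "power_law", S=1.2))           # default parameters
    print(evaluate("nucleation", "power_law", S=1.2, b=3))  # overridden exponent
    ```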

  17. Simpevarp - site descriptive model version 0

    International Nuclear Information System (INIS)

    2002-11-01

    During 2002, SKB is starting detailed investigations at two potential sites for a deep repository in the Precambrian rocks of the Fennoscandian Shield. The present report concerns one of those sites, Simpevarp, which lies in the municipality of Oskarshamn, on the southeast coast of Sweden, about 250 kilometres south of Stockholm. The site description will have two main components: a written synthesis of the site, summarising the current state of knowledge, as documented in the databases containing the primary data from the site investigations, and one or several site descriptive models, in which the collected information is interpreted and presented in a form which can be used in numerical models for rock engineering, environmental impact and long-term safety assessments. SKB maintains two main databases at the present time, a site characterisation database called SICADA and a geographic information system called SKB GIS. The site descriptive model will be developed and presented with the aid of the SKB GIS capabilities, and with SKB's Rock Visualisation System (RVS), which is also linked to SICADA. The version 0 model forms an important framework for subsequent model versions, which are developed successively, as new information from the site investigations becomes available. Version 0 is developed out of the information available at the start of the site investigation. In the case of Simpevarp, this is essentially the information which was compiled for the Oskarshamn feasibility study, which led to the choice of that area as a favourable object for further study, together with information collected since its completion. This information, with the exception of the extensive database from the nearby Aespoe Hard Rock Laboratory, is mainly 2D in nature (surface data), and is general and regional, rather than site-specific, in content. Against this background, the present report consists of the following components: an overview of the present content of the databases

  18. NetMOD Version 2.0 Mathematical Framework

    Energy Technology Data Exchange (ETDEWEB)

    Merchant, Bion J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Young, Christopher J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chael, Eric P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    NetMOD (Network Monitoring for Optimal Detection) is a Java-based software package for conducting simulation of seismic, hydroacoustic and infrasonic networks. Network simulations have long been used to study network resilience to station outages and to determine where additional stations are needed to reduce monitoring thresholds. NetMOD makes use of geophysical models to determine the source characteristics, signal attenuation along the path between the source and station, and the performance and noise properties of the station. These geophysical models are combined to simulate the relative amplitudes of signal and noise that are observed at each of the stations. From these signal-to-noise ratios (SNR), the probabilities of signal detection at each station and event detection across the network of stations can be computed given a detection threshold. The purpose of this document is to clearly and comprehensively present the mathematical framework used by NetMOD, the software package developed by Sandia National Laboratories to assess the monitoring capability of ground-based sensor networks. Many of the NetMOD equations used for simulations are inherited from the NetSim network capability assessment package developed in the late 1980s by SAIC (Sereno et al., 1990).
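
    The final step summarised above, converting predicted station SNRs into station and network detection probabilities for a given threshold, can be illustrated as follows; the lognormal detection model, the sigma value and the three-station requirement are simplifying assumptions made for this example, not NetMOD's actual equations.

    ```python
    # Illustrative station/network detection-probability calculation of the kind
    # described above. The Gaussian log10-SNR model and 3-station requirement are
    # assumptions for illustration, not NetMOD's formulation.
    import numpy as np
    from scipy.stats import norm

    def station_detection_prob(snr, threshold_snr=2.0, sigma_log10=0.3):
        """P(detect) at one station, assuming log10(SNR) is Gaussian about its predicted value."""
        return norm.cdf((np.log10(snr) - np.log10(threshold_snr)) / sigma_log10)

    def network_detection_prob(p_stations, min_stations=3):
        """P(at least min_stations detect), assuming independent stations (Poisson-binomial DP)."""
        dist = np.zeros(len(p_stations) + 1)
        dist[0] = 1.0
        for p in p_stations:
            dist[1:] = dist[1:] * (1 - p) + dist[:-1] * p   # add one station's outcome
            dist[0] *= (1 - p)
        return dist[min_stations:].sum()

    snrs = [5.0, 1.5, 3.2, 0.8, 2.5]             # predicted signal-to-noise ratios at 5 stations
    p = [station_detection_prob(s) for s in snrs]
    print(network_detection_prob(p))             # probability the network detects the event
    ```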

  19. Simpevarp - site descriptive model version 0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-11-01

    During 2002, SKB is starting detailed investigations at two potential sites for a deep repository in the Precambrian rocks of the Fennoscandian Shield. The present report concerns one of those sites, Simpevarp, which lies in the municipality of Oskarshamn, on the southeast coast of Sweden, about 250 kilometres south of Stockholm. The site description will have two main components: a written synthesis of the site, summarising the current state of knowledge, as documented in the databases containing the primary data from the site investigations, and one or several site descriptive models, in which the collected information is interpreted and presented in a form which can be used in numerical models for rock engineering, environmental impact and long-term safety assessments. SKB maintains two main databases at the present time, a site characterisation database called SICADA and a geographic information system called SKB GIS. The site descriptive model will be developed and presented with the aid of the SKB GIS capabilities, and with SKB's Rock Visualisation System (RVS), which is also linked to SICADA. The version 0 model forms an important framework for subsequent model versions, which are developed successively, as new information from the site investigations becomes available. Version 0 is developed out of the information available at the start of the site investigation. In the case of Simpevarp, this is essentially the information which was compiled for the Oskarshamn feasibility study, which led to the choice of that area as a favourable object for further study, together with information collected since its completion. This information, with the exception of the extensive database from the nearby Aespoe Hard Rock Laboratory, is mainly 2D in nature (surface data), and is general and regional, rather than site-specific, in content. Against this background, the present report consists of the following components: an overview of the present content of the databases

  20. Web application development with Laravel PHP Framework version 4

    OpenAIRE

    Armel, Jamal

    2014-01-01

    The purpose of this thesis work was to learn a new PHP framework and use it efficiently to build an eCommerce web application for a small start-up freelancing company that will let potential customers browse products by category and place orders securely. To fulfil this set of requirements, a system consisting of a web application with a backend was designed and implemented using built-in Laravel features such as Composer, Eloquent, Blade and Artisan, together with a WAMP stack. The web application wa...

  1. Modeling report of DYMOND code (DUPIC version)

    International Nuclear Information System (INIS)

    Park, Joo Hwan; Yacout, Abdellatif M.

    2003-04-01

    The DYMOND code employs the ITHINK dynamic modeling platform to assess 100-year dynamic evolution scenarios for postulated global nuclear energy parks. DYMOND was originally developed by ANL (Argonne National Laboratory) to perform fuel cycle analysis of LWR once-through and mixed LWR-FBR plants. Since extensive application of the DYMOND code has been requested, the first version of DYMOND has been modified to accommodate the DUPIC, MSR and RTF fuel cycles. The DYMOND code is composed of three parts, the source language platform, input supply and output, although these parts are not clearly distinguished. This report describes all the equations modeled in the modified DYMOND code (referred to as the DYMOND-DUPIC version). It is divided into five parts. Part A models the reactor history, including the amounts of requested fuel and spent fuel. Part B describes the fuel cycle model, covering fuel flow from the beginning to the end of the fuel cycle. Part C models reprocessing, including recovery of burned uranium, plutonium, minor actinides and fission products, as well as the amounts of spent fuel in storage and disposal. Part D models other fuel cycles, considering the thorium fuel cycle for the MSR and RTF reactors. Part E models economics, giving cost information such as uranium mining cost, reactor operating cost and fuel cost.

  2. Modeling report of DYMOND code (DUPIC version)

    Energy Technology Data Exchange (ETDEWEB)

    Park, Joo Hwan [KAERI, Taejon (Korea, Republic of); Yacout, Abdellatif M [Argonne National Laboratory, Ilinois (United States)

    2003-04-01

    The DYMOND code employs the ITHINK dynamic modeling platform to assess 100-year dynamic evolution scenarios for postulated global nuclear energy parks. DYMOND was originally developed by ANL (Argonne National Laboratory) to perform fuel cycle analysis of LWR once-through and mixed LWR-FBR plants. Since extensive application of the DYMOND code has been requested, the first version of DYMOND has been modified to accommodate the DUPIC, MSR and RTF fuel cycles. The DYMOND code is composed of three parts, the source language platform, input supply and output, although these parts are not clearly distinguished. This report describes all the equations modeled in the modified DYMOND code (referred to as the DYMOND-DUPIC version). It is divided into five parts. Part A models the reactor history, including the amounts of requested fuel and spent fuel. Part B describes the fuel cycle model, covering fuel flow from the beginning to the end of the fuel cycle. Part C models reprocessing, including recovery of burned uranium, plutonium, minor actinides and fission products, as well as the amounts of spent fuel in storage and disposal. Part D models other fuel cycles, considering the thorium fuel cycle for the MSR and RTF reactors. Part E models economics, giving cost information such as uranium mining cost, reactor operating cost and fuel cost.

  3. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    Models are playing important roles in design and analysis of chemicals based products and the processes that manufacture them. Computer-aided methods and tools have the potential to reduce the number of experiments, which can be expensive and time consuming, and there is a benefit of working...... development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines....... In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse...

  4. CMAQ Model Evaluation Framework

    Science.gov (United States)

    CMAQ is tested to establish the modeling system’s credibility in predicting pollutants such as ozone and particulate matter. Evaluation of CMAQ has been designed to assess the model’s performance for specific time periods and for specific uses.

  5. Forsmark - site descriptive model version 0

    International Nuclear Information System (INIS)

    2002-10-01

    During 2002, the Swedish Nuclear Fuel and Waste Management Company (SKB) is starting investigations at two potential sites for a deep repository in the Precambrian basement of the Fennoscandian Shield. The present report concerns one of those sites, Forsmark, which lies in the municipality of Oesthammar, on the east coast of Sweden, about 150 kilometres north of Stockholm. The site description should present all collected data and interpreted parameters of importance for the overall scientific understanding of the site, for the technical design and environmental impact assessment of the deep repository, and for the assessment of long-term safety. The site description will have two main components: a written synthesis of the site, summarising the current state of knowledge, as documented in the databases containing the primary data from the site investigations, and one or several site descriptive models, in which the collected information is interpreted and presented in a form which can be used in numerical models for rock engineering, environmental impact and long-term safety assessments. The site descriptive models are devised and stepwise updated as the site investigations proceed. The point of departure for this process is the regional site descriptive model, version 0, which is the subject of the present report. Version 0 is developed out of the information available at the start of the site investigation. This information, with the exception of data from tunnels and drill holes at the sites of the Forsmark nuclear reactors and the underground low-middle active radioactive waste storage facility, SFR, is mainly 2D in nature (surface data), and is general and regional, rather than site-specific, in content. For this reason, the Forsmark site descriptive model, version 0, as detailed in the present report, has been developed at a regional scale. It covers a rectangular area, 15 km in a southwest-northeast and 11 km in a northwest-southeast direction, around the

  6. A Framework for Video Modeling

    NARCIS (Netherlands)

    Petkovic, M.; Jonker, Willem

    In recent years, research in video databases has increased greatly, but relatively little work has been done in the area of semantic content-based retrieval. In this paper, we present a framework for video modelling with emphasis on semantic content of video data. The video data model presented

  7. Forsmark - site descriptive model version 0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-10-01

    During 2002, the Swedish Nuclear Fuel and Waste Management Company (SKB) is starting investigations at two potential sites for a deep repository in the Precambrian basement of the Fennoscandian Shield. The present report concerns one of those sites, Forsmark, which lies in the municipality of Oesthammar, on the east coast of Sweden, about 150 kilometres north of Stockholm. The site description should present all collected data and interpreted parameters of importance for the overall scientific understanding of the site, for the technical design and environmental impact assessment of the deep repository, and for the assessment of long-term safety. The site description will have two main components: a written synthesis of the site, summarising the current state of knowledge, as documented in the databases containing the primary data from the site investigations, and one or several site descriptive models, in which the collected information is interpreted and presented in a form which can be used in numerical models for rock engineering, environmental impact and long-term safety assessments. The site descriptive models are devised and stepwise updated as the site investigations proceed. The point of departure for this process is the regional site descriptive model, version 0, which is the subject of the present report. Version 0 is developed out of the information available at the start of the site investigation. This information, with the exception of data from tunnels and drill holes at the sites of the Forsmark nuclear reactors and the underground low-middle active radioactive waste storage facility, SFR, is mainly 2D in nature (surface data), and is general and regional, rather than site-specific, in content. For this reason, the Forsmark site descriptive model, version 0, as detailed in the present report, has been developed at a regional scale. It covers a rectangular area, 15 km in a southwest-northeast and 11 km in a northwest-southeast direction, around the

  8. A framework for Controlled Human Infection Model (CHIM) studies in Malawi: Report of a Wellcome Trust workshop on CHIM in Low Income Countries held in Blantyre, Malawi [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Stephen B Gordon

    2017-08-01

    Controlled human infection model (CHIM) studies have pivotal importance in vaccine development, being useful for proof of concept, pathogenesis, down-selection and immunogenicity studies. To date, however, they have seldom been carried out in low and middle income countries (LMIC), which is where the greatest burden of vaccine preventable illness is found. This workshop discussed the benefits and barriers to CHIM studies in Malawi. Benefits include improved vaccine effectiveness and host country capacity development in clinical, laboratory and governance domains. Barriers include acceptability, safety and regulatory issues. The report suggests a framework by which ethical, laboratory, scientific and governance issues may be addressed by investigators considering or planning CHIM in LMIC.

  9. A new version of code Java for 3D simulation of the CCA model

    Science.gov (United States)

    Zhang, Kebo; Xiong, Hailing; Li, Chao

    2016-07-01

    In this paper we present a new version of the program for the CCA model. In order to benefit from the advantages of the latest technologies, we migrated the running environment from JDK1.6 to JDK1.7. The old program was also restructured into a new framework, which improves its extensibility.

  10. Version control of pathway models using XML patches.

    Science.gov (United States)

    Saffrey, Peter; Orton, Richard

    2009-03-17

    Computational modelling has become an important tool in understanding biological systems such as signalling pathways. With an increase in the size and complexity of models comes a need for techniques to manage model versions and their relationship to one another. Model version control for pathway models shares some of the features of software version control but has a number of differences that warrant a specific solution. We present a model version control method, along with a prototype implementation, based on XML patches. We show its application to the EGF/RAS/RAF pathway. Our method allows quick and convenient storage of a wide range of model variations and enables a thorough explanation of these variations. Trying to produce these results without such methods results in slow and cumbersome development that is prone to frustration and human error.
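
    As a rough illustration of version control via XML patches (the SBML-like element names and the patch format below are invented for the example, not taken from the paper or its prototype), the following sketch records attribute-level differences between two versions of a pathway model and replays them:

    ```python
    # Naive XML "patch" between two versions of a pathway model: record attribute
    # changes keyed by element id, then apply them to recover the new version.
    # The element names and patch format are illustrative assumptions.
    import xml.etree.ElementTree as ET

    V1 = '<model><parameter id="k_raf" value="0.1"/><parameter id="k_egf" value="2.0"/></model>'
    V2 = '<model><parameter id="k_raf" value="0.25"/><parameter id="k_egf" value="2.0"/></model>'

    def make_patch(old_xml, new_xml):
        """Collect the attribute sets of elements that changed between versions."""
        old = {e.get("id"): dict(e.attrib) for e in ET.fromstring(old_xml)}
        new = {e.get("id"): dict(e.attrib) for e in ET.fromstring(new_xml)}
        return {eid: new[eid] for eid in new if old.get(eid) != new[eid]}

    def apply_patch(old_xml, patch):
        """Apply recorded attribute changes to the old version."""
        root = ET.fromstring(old_xml)
        for elem in root:
            for key, value in patch.get(elem.get("id"), {}).items():
                elem.set(key, value)
        return ET.tostring(root, encoding="unicode")

    patch = make_patch(V1, V2)
    print(patch)               # {'k_raf': {'id': 'k_raf', 'value': '0.25'}}
    print(apply_patch(V1, patch))  # reconstructed version 2 of the model
    ```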

  11. Crystallization Kinetics within a Generic Modelling Framework

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; von Solms, Nicolas; Gernaey, Krist

    2013-01-01

    An existing generic modelling framework has been expanded with tools for kinetic model analysis. The analysis of kinetics is carried out within the framework where kinetic constitutive models are collected, analysed and utilized for the simulation of crystallization operations. A modelling...... procedure is proposed to gain the information of crystallization operation kinetic model analysis and utilize this for faster evaluation of crystallization operations....

  12. GeoFramework: A Modeling Framework for Solid Earth Geophysics

    Science.gov (United States)

    Gurnis, M.; Aivazis, M.; Tromp, J.; Tan, E.; Thoutireddy, P.; Liu, Q.; Choi, E.; Dicaprio, C.; Chen, M.; Simons, M.; Quenette, S.; Appelbe, B.; Aagaard, B.; Williams, C.; Lavier, L.; Moresi, L.; Law, H.

    2003-12-01

    As data sets in geophysics become larger and of greater relevance to other earth science disciplines, and as earth science becomes more interdisciplinary in general, modeling tools are being driven in new directions. There is now a greater need to link modeling codes to one another, link modeling codes to multiple datasets, and to make modeling software available to non-modeling specialists. Coupled with rapid progress in computer hardware (including the computational speed afforded by massively parallel computers), progress in numerical algorithms, and the introduction of software frameworks, these lofty goals of merging software in geophysics are now possible. The GeoFramework project, a collaboration between computer scientists and geoscientists, is a response to these needs and opportunities. GeoFramework is based on and extends Pyre, a Python-based modeling framework, recently developed to link solid (Lagrangian) and fluid (Eulerian) models, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. The utility and generality of Pyre as a general purpose framework in science is now being recognized. Besides its use in engineering and geophysics, it is also being used in particle physics and astronomy. Geology and geophysics impose their own unique requirements on software frameworks which are not generally available in existing frameworks and so there is a need for research in this area. One of the special requirements is the way Lagrangian and Eulerian codes will need to be linked in time and space within a plate tectonics context. GeoFramework has grown beyond its initial goal of linking a limited number of existing codes together. The following codes are now being reengineered within the context of Pyre: Tecton, 3-D FE Visco-elastic code for lithospheric relaxation; CitComS, a code for spherical mantle convection; SpecFEM3D, a SEM code for global and regional seismic waves; eqsim, a FE code for dynamic

  13. Model Adequacy Analysis of Matching Record Versions in Nosql Databases

    Directory of Open Access Journals (Sweden)

    E. V. Tsviashchenko

    2015-01-01

    The article investigates a model of matching record versions. The goal of this work is to analyse the adequacy of the model. The model allows estimating the distribution of a user's processing time for record versions and the distribution of the record-version count. The second variant of the model was used, in which the time for a client to process record versions depends explicitly on the number of updates performed by other users between the sequential updates performed by the current client. To assess the model adequacy, a real experiment was conducted on a cloud cluster. The cluster contains 10 virtual nodes provided by DigitalOcean. Ubuntu Server 14.04 was used as the operating system (OS). The NoSQL system Riak was chosen for the experiments. Riak versions 2.0 and later provide the "dotted version vectors" (DVV) option, which is an extension of the classic vector clock. Their use guarantees that the number of versions simultaneously stored in the DB will not exceed the number of clients operating on a record in parallel. This is very important when conducting experiments. The application was developed using the Java library provided by Riak. The processes run directly on the nodes. Two records were used in the experiment: Z, the record whose versions are handled by clients, and RZ, a service record which contains record-update counters. The application algorithm can be briefly described as follows: every client reads the versions of record Z, processes its updates using the RZ record counters, and saves the processed record in the database while old versions are deleted from the DB. Then the client rereads the RZ record and increments the update counters for the other clients. After that, the client rereads the Z record, saves the necessary statistics, and outputs the results of processing. In the case of a conflict arising from simultaneous updates of the RZ record, the client obtains all versions of that
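
    A minimal sketch of the client loop summarised above, written against a small in-memory stand-in for the key-value store (the merge rule and the get_versions/put/delete_old_versions methods are hypothetical, not the Riak Java API used in the article):

    ```python
    # Sketch of the experiment's client loop. The key-value client, its methods and
    # the sibling-merge rule are hypothetical stand-ins, not the Riak Java library.

    class InMemoryKV:
        """Tiny in-memory stand-in for the key-value store (not Riak)."""
        def __init__(self):
            self.store = {"Z": [{"updates": 0}], "RZ": {"pending": [0, 0]}}
        def get_versions(self, key):
            return list(self.store[key])
        def get(self, key):
            return self.store[key]
        def put(self, key, value):
            self.store[key].append(value) if key == "Z" else self.store.update({key: value})
        def delete_old_versions(self, key, keep):
            self.store[key] = [keep]

    def merge(versions):
        """Resolve siblings by keeping the version with the highest update count."""
        return max(versions, key=lambda v: v["updates"])

    def client_step(kv, client_id, n_clients):
        z_versions = kv.get_versions("Z")          # 1. read all stored versions of Z
        rz = kv.get("RZ")                          #    and the per-client counters
        record = merge(z_versions)
        record["updates"] += 1
        kv.put("Z", record)                        # 2. save the processed record
        kv.delete_old_versions("Z", keep=record)   #    and discard old versions
        for other in range(n_clients):             # 3. announce the update to others
            if other != client_id:
                rz["pending"][other] += 1
        kv.put("RZ", rz)
        return {"siblings_seen": len(z_versions)}  # 4. statistics for the analysis

    kv = InMemoryKV()
    print(client_step(kv, client_id=0, n_clients=2))  # {'siblings_seen': 1}
    ```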

  14. Towards New Empirical Versions of Financial and Accounting Models Corrected for Measurement Errors

    OpenAIRE

    Francois-Éric Racicot; Raymond Théoret; Alain Coen

    2006-01-01

    In this paper, we propose a new empirical version of the Fama and French model based on the Hausman (1978) specification test and aimed at discarding measurement errors in the variables. The proposed empirical framework is general enough to be used for correcting other financial and accounting models of measurement errors. Removing measurement errors is important at many levels, such as information disclosure, corporate governance and the protection of investors.
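
    For readers unfamiliar with the Hausman (1978) specification test mentioned above, the sketch below computes the standard contrast between an OLS estimate and a just-identified instrumental-variables estimate on simulated data; the data-generating process is invented for illustration and is unrelated to the paper's Fama and French application.

    ```python
    # Hausman-type specification test contrasting OLS with a just-identified IV
    # estimator. The simulated data (one endogenous regressor, one instrument)
    # and the common error-variance choice are illustrative assumptions.
    import numpy as np
    from scipy.stats import chi2

    rng = np.random.default_rng(0)
    n = 2000
    z = rng.normal(size=n)                        # instrument
    u = rng.normal(size=n)                        # structural error
    x = 0.8 * z + 0.5 * u + rng.normal(size=n)    # regressor correlated with the error
    y = 1.0 + 2.0 * x + u

    X = np.column_stack([np.ones(n), x])
    Z = np.column_stack([np.ones(n), z])

    b_ols = np.linalg.solve(X.T @ X, X.T @ y)     # biased under endogeneity
    b_iv = np.linalg.solve(Z.T @ X, Z.T @ y)      # consistent IV estimate

    s2 = np.mean((y - X @ b_iv) ** 2)             # common error-variance estimate
    V_ols = s2 * np.linalg.inv(X.T @ X)
    V_iv = s2 * np.linalg.inv(Z.T @ X) @ (Z.T @ Z) @ np.linalg.inv(X.T @ Z)

    d = b_iv[1] - b_ols[1]                        # contrast on the slope coefficient
    H = d ** 2 / (V_iv[1, 1] - V_ols[1, 1])
    print(H, 1 - chi2.cdf(H, df=1))               # large H / tiny p-value flags endogeneity
    ```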

  15. An Integrated Framework to Specify Domain-Specific Modeling Languages

    DEFF Research Database (Denmark)

    Zarrin, Bahram; Baumeister, Hubert

    2018-01-01

    In this paper, we propose an integrated framework that can be used by DSL designers to implement their desired graphical domain-specific languages. This framework relies on Microsoft DSL Tools, a meta-modeling framework to build graphical domain-specific languages, and an extension of ForSpec, a logic-based specification language. The drawback of MS DSL Tools is that it does not provide a formal and rigorous approach for semantics specifications. In this framework, we use Microsoft DSL Tools to define the metamodel and graphical notations of DSLs, and an extended version of ForSpec as a formal language to define their semantics. Integrating these technologies under the umbrella of the Microsoft Visual Studio IDE allows DSL designers to utilize a single development environment for developing their desired domain-specific languages.

  16. The ONKALO area model. Version 1

    International Nuclear Information System (INIS)

    Kemppainen, K.; Ahokas, T.; Ahokas, H.; Paulamaeki, S.; Paananen, M.; Gehoer, S.; Front, K.

    2007-11-01

    The geological model of the ONKALO area consists of three submodels: the lithological model, the brittle deformation model and the alteration model. The lithological model gives properties of definite rock units that can be defined on the basis of the migmatite structures, textures and modal compositions. The brittle deformation model describes the results of brittle deformation, to which geophysical and hydrogeological results are added. The alteration model describes the occurrence of different alteration types and their possible effects. The rocks of Olkiluoto can be divided into two major classes: (1) supracrustal high-grade metamorphic rocks including various migmatitic gneisses, tonalitic-granodioritic-granitic gneisses, mica gneisses, quartz gneisses and mafic gneisses, and (2) igneous rocks including pegmatitic granites and diabase dykes. The migmatitic gneisses can further be divided into three subgroups in terms of the type of migmatite structure: veined gneisses, stromatic gneisses and diatexitic gneisses. On the basis of refolding and crosscutting relationships, the metamorphic supracrustal rocks have been subject to polyphase ductile deformation comprising five stages. In 3D modelling of the lithological units, an assumption has been made, on the basis of measurements in outcrops, investigation trenches and drill cores, that the pervasive, composite foliation produced as a result of polyphase ductile deformation has a rather constant attitude in the ONKALO area. Consequently, the strike and dip of the foliation have been used as a tool through which the lithologies have been correlated between the drillholes and from the surface to the drillholes. The bedrock in the Olkiluoto site has been subject to extensive hydrothermal alteration, which has taken place at reasonably low temperature conditions, the estimated temperature interval being from slightly over 300 deg C to less than 100 deg C. Two types of alteration can be observed: (1) pervasive (disseminated

  17. Micro dosimetry model. An extended version

    International Nuclear Information System (INIS)

    Vroegindewey, C.

    1994-07-01

    In an earlier study a relatively simple mathematical model was constructed to simulate the energy transfer on a cellular scale and thus gain insight into the fundamental processes of BNCT. Based on this work, a more realistic micro dosimetry model has been developed. The new facets of the model are: the treatment of proton recoil, the calculation of the distribution of energy depositions, and the determination of the number of particles crossing the target nucleus subdivided by place of origin. Besides these extensions, new stopping power tables for the emitted particles are generated and biased Monte Carlo techniques are used to reduce computer time. (orig.)

  18. Land-Use Portfolio Modeler, Version 1.0

    Science.gov (United States)

    Taketa, Richard; Hong, Makiko

    2010-01-01

    -on-investment. The portfolio model, now known as the Land-Use Portfolio Model (LUPM), provided the framework for the development of the Land-Use Portfolio Modeler, Version 1.0 software (LUPM v1.0). The software provides a geographic information system (GIS)-based modeling tool for evaluating alternative risk-reduction mitigation strategies for specific natural-hazard events. The modeler uses information about a specific natural-hazard event and the features exposed to that event within the targeted study region to derive a measure of a given mitigation strategy's effectiveness. Harnessing the spatial capabilities of a GIS enables the tool to provide a rich, interactive mapping environment in which users can create, analyze, visualize, and compare different

  19. A penalized framework for distributed lag non-linear models.

    Science.gov (United States)

    Gasparrini, Antonio; Scheipl, Fabian; Armstrong, Ben; Kenward, Michael G

    2017-09-01

    Distributed lag non-linear models (DLNMs) are a modelling tool for describing potentially non-linear and delayed dependencies. Here, we illustrate an extension of the DLNM framework through the use of penalized splines within generalized additive models (GAM). This extension offers built-in model selection procedures and the possibility of accommodating assumptions on the shape of the lag structure through specific penalties. In addition, this framework includes, as special cases, simpler models previously proposed for linear relationships (DLMs). Alternative versions of penalized DLNMs are compared with each other and with the standard unpenalized version in a simulation study. Results show that this penalized extension to the DLNM class provides greater flexibility and improved inferential properties. The framework exploits recent theoretical developments of GAMs and is implemented using efficient routines within freely available software. Real-data applications are illustrated through two reproducible examples in time series and survival analysis.
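
    For context, the cross-basis at the core of the DLNM framework is usually written as below; the notation follows the standard exposition and is a sketch, not an equation reproduced from this paper.

    ```latex
    % Cross-basis of a distributed lag non-linear model (standard formulation): the
    % exposure-lag-response surface f(x, l) is expanded in spline bases for the
    % predictor (r_j) and the lag (c_k), and the lagged contributions are summed.
    \[
      s(x_t) \;=\; \sum_{\ell=0}^{L} f\!\left(x_{t-\ell}, \ell\right)
              \;=\; \sum_{\ell=0}^{L} \sum_{j=1}^{J} \sum_{k=1}^{K}
                    r_j\!\left(x_{t-\ell}\right)\, c_k(\ell)\, \beta_{jk}.
    \]
    % In the penalized version discussed above, the coefficients \beta_{jk} receive
    % quadratic penalties along the predictor and lag dimensions within a GAM fit.
    ```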

  20. Cytoview: Development of a cell modelling framework

    Indian Academy of Sciences (India)

    2007-07-06

    The different issues that have been addressed are ontologies, feature description and model building. The framework describes dotted representations and tree data structures to integrate diverse pieces of data and parametric models enabling size, shape and location descriptions. The framework serves ...

  1. Smart Grid Interoperability Maturity Model Beta Version

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Drummond, R.; Giroti, Tony; Houseman, Doug; Knight, Mark; Levinson, Alex; Longcore, Wayne; Lowe, Randy; Mater, J.; Oliver, Terry V.; Slack, Phil; Tolk, Andreas; Montgomery, Austin

    2011-12-02

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  2. A framework for sustainable interorganizational business model

    OpenAIRE

    Neupane, Ganesh Prasad; Haugland, Sven A.

    2016-01-01

    Drawing on literature on business model innovations and sustainability, this paper develops a framework for sustainable interorganizational business models. The aim of the framework is to enhance the sustainability of firms’ business models by enabling firms to create future value by taking into account environmental, social and economic factors. The paper discusses two themes: (1) application of the term sustainability to business model innovation, and (2) implications of integrating sustain...

  3. IDC Use Case Model Survey Version 1.1.

    Energy Technology Data Exchange (ETDEWEB)

    Harris, James Mark [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Carr, Dorthe B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    This document contains the brief descriptions for the actors and use cases contained in the IDC Use Case Model. Revisions: V1.0 (12/2014, SNL IDC Reengineering Project Team): initial delivery, authorized by M. Harris; V1.1 (2/2015, SNL IDC Reengineering Project Team): Iteration I2 review comments, authorized by M. Harris.

  4. IDC Use Case Model Survey Version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Carr, Dorthe B.; Harris, James M.

    2014-12-01

    This document contains the brief descriptions for the actors and use cases contained in the IDC Use Case Model Survey. Revisions: V1.0 (12/2014, IDC Re-engineering Project Team): initial delivery, authorized by M. Harris.

  5. A conceptual model specification language (CMSL Version 2)

    NARCIS (Netherlands)

    Wieringa, Roelf J.

    1992-01-01

    Version 2 of a language (CMSL) to specify conceptual models is defined. CMSL consists of two parts, the value specification language VSL and the object specification language OSL. There is a formal semantics and an inference system for CMSL, but research on this still continues. A method for

  6. A Unified Framework for Systematic Model Improvement

    DEFF Research Database (Denmark)

    Kristensen, Niels Rode; Madsen, Henrik; Jørgensen, Sten Bay

    2003-01-01

    A unified framework for improving the quality of continuous time models of dynamic systems based on experimental data is presented. The framework is based on an interplay between stochastic differential equation (SDE) modelling, statistical tests and multivariate nonparametric regression. This co......-batch bioreactor, where it is illustrated how an incorrectly modelled biomass growth rate can be pinpointed and an estimate provided of the functional relation needed to properly describe it....

  7. Frameworks for understanding and describing business models

    DEFF Research Database (Denmark)

    Nielsen, Christian; Roslender, Robin

    2014-01-01

    This chapter provides in a chronological fashion an introduction to six frameworks that one can apply to describing, understanding and also potentially innovating business models. These six frameworks have been chosen carefully as they represent six very different perspectives on business models...... and in this manner “complement” each other. There are a multitude of varying frameworks that could be chosen from and we urge the reader to search and trial these for themselves. The six chosen models (year of release in parenthesis) are: • Service-Profit Chain (1994) • Strategic Systems Auditing (1997) • Strategy...... Maps (2001) • Intellectual Capital Statements (2003) • Chesbrough’s framework for Open Business Models (2006) • Business Model Canvas (2008)...

  8. Fiscal impacts model documentation. Version 1.0

    International Nuclear Information System (INIS)

    Beck, S.L.; Scott, M.J.

    1986-05-01

    The Fiscal Impacts (FI) Model, Version 1.0 was developed under Pacific Northwest Laboratory's Monitored Retrievable Storage (MRS) Program to aid in development of the MRS Reference Site Environmental Document (PNL 5476). It computes estimates of 182 fiscal items for state and local government jurisdictions, using input data from the US Census Bureau's 1981 Survey of Governments and local population forecasts. The model can be adapted for any county or group of counties in the United States

  9. Model-based DSL frameworks

    NARCIS (Netherlands)

    Ivanov, Ivan; Bézivin, J.; Jouault, F.; Valduriez, P.

    2006-01-01

    More than five years ago, the OMG proposed the Model Driven Architecture (MDA™) approach to deal with the separation of platform dependent and independent aspects in information systems. Since then, the initial idea of MDA evolved and Model Driven Engineering (MDE) is being increasingly promoted to

  10. A useful framework for optimal replacement models

    International Nuclear Information System (INIS)

    Aven, Terje; Dekker, Rommert

    1997-01-01

    In this note we present a general framework for optimization of replacement times. It covers a number of models, including various age and block replacement models, and allows a uniform analysis for all these models. A relation to the marginal cost concept is described

  11. Efficient Modelling and Generation of Markov Automata (extended version)

    NARCIS (Netherlands)

    Timmer, Mark; Katoen, Joost P.; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    2012-01-01

    This paper introduces a framework for the efficient modelling and generation of Markov automata. It consists of (1) the data-rich process-algebraic language MAPA, allowing concise modelling of systems with nondeterminism, probability and Markovian timing; (2) a restricted form of the language, the

  12. ONKALO rock mechanics model (RMM). Version 2.3

    Energy Technology Data Exchange (ETDEWEB)

    Haekkinen, T.; Merjama, S.; Moenkkoenen, H. [WSP Finland, Helsinki (Finland)

    2014-07-15

    The Rock Mechanics Model of the ONKALO rock volume includes the most important rock mechanics features and parameters at the Olkiluoto site. The main objective of the model is to be a tool to predict rock properties, rock quality and hence provide an estimate for the rock stability of the potential repository at Olkiluoto. The model includes a database of rock mechanics raw data and a block model in which the rock mechanics parameters are estimated through block volumes based on spatial rock mechanics raw data. In this version 2.3, special emphasis was placed on refining the estimation of the block model. The model was divided into rock mechanics domains which were used as constraints during the block model estimation. During the modelling process, a display profile and toolbar were developed for the GEOVIA Surpac software to improve visualisation and access to the rock mechanics data for the Olkiluoto area. (orig.)

  13. Graphical Model Debugger Framework for Embedded Systems

    DEFF Research Database (Denmark)

    Zeng, Kebin

    2010-01-01

    Model Driven Software Development has offered a faster way to design and implement embedded real-time software by moving the design to a model level, and by transforming models to code. However, the testing of embedded systems has remained at the code level. This paper presents a Graphical Model...... Debugger Framework, providing an auxiliary avenue of analysis of system models at runtime by executing generated code and updating models synchronously, which allows embedded developers to focus on the model level. With the model debugger, embedded developers can graphically test their design model...

  14. A framework for API solubility modelling

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul; Crafts, Peter

    . In addition, most of the models are not predictive and require experimental data for the calculation of the needed parameters. This work aims at developing an efficient framework for the solubility modelling of Active Pharmaceutical Ingredients (API) in water and organic solvents. With this framework......-SAFT) are used for solubility calculations when the needed interaction parameters or experimental data are available. The CI-UNIFAC is instead used when the previous models lack interaction parameters or when solubility data are not available. A new GC+ model for API solvent selection based...... on the hydrophobicity, hydrophilicity and polarity information of the API and solvent is also developed, for performing fast solvent selection and screening. Eventually, all the previous developments are integrated in a framework for their efficient and integrated use. Two case studies are presented: the first...

  15. A mixed model framework for teratology studies.

    Science.gov (United States)

    Braeken, Johan; Tuerlinckx, Francis

    2009-10-01

    A mixed model framework is presented to model the characteristic multivariate binary anomaly data as provided in some teratology studies. The key features of the model are the incorporation of covariate effects, a flexible random effects distribution by means of a finite mixture, and the application of copula functions to better account for the relation structure of the anomalies. The framework is motivated by data of the Boston Anticonvulsant Teratogenesis study and offers an integrated approach to investigate substantive questions, concerning general and anomaly-specific exposure effects of covariates, interrelations between anomalies, and objective diagnostic measurement.
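
    One way to write such a specification, shown here only as a hedged sketch (assuming a logit link, a finite mixture of normals for the random effect, and a generic copula family; the authors' exact formulation may differ), is

    \[
    \operatorname{logit}\,\Pr(Y_{ij} = 1 \mid b_i, x_i) = x_i^{\top}\beta_j + b_i,
    \qquad
    b_i \sim \sum_{m=1}^{M} \pi_m\, \mathcal{N}(\mu_m, \sigma_m^2),
    \]
    \[
    \Pr(Y_{i1} \le y_1, \dots, Y_{iJ} \le y_J \mid b_i)
      = C\big(F_{i1}(y_1 \mid b_i), \dots, F_{iJ}(y_J \mid b_i)\big),
    \]

    where \(Y_{ij}\) indicates anomaly \(j\) for subject \(i\), \(x_i\) carries the exposure covariates, the finite mixture supplies the flexible random-effects distribution, and the copula \(C\) accounts for the relation structure among the anomalies.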

  16. Latest NASA Instrument Cost Model (NICM): Version VI

    Science.gov (United States)

    Mrozinski, Joe; Habib-Agahi, Hamid; Fox, George; Ball, Gary

    2014-01-01

    The NASA Instrument Cost Model, NICM, is a suite of tools which allow for probabilistic cost estimation of NASA's space-flight instruments at both the system and subsystem level. NICM also includes the ability to perform cost by analogy as well as joint confidence level (JCL) analysis. The latest version of NICM, Version VI, was released in Spring 2014. This paper will focus on the new features released with NICM VI, which include: 1) The NICM-E cost estimating relationship, which is applicable for instruments flying on Explorer-like class missions; 2) The new cluster analysis ability which, alongside the results of the parametric cost estimation for the user's instrument, also provides a visualization of the user's instrument's similarity to previously flown instruments; and 3) New cost estimating relationships for in-situ instruments.

  17. Driver Performance Model: 1. Conceptual Framework

    National Research Council Canada - National Science Library

    Heimerl, Joseph

    2001-01-01

    ...'. At the present time, no such comprehensive model exists. This report discusses a conceptual framework designed to encompass the relationships, conditions, and constraints related to direct, indirect, and remote modes of driving and thus provides a guide or 'road map' for the construction and creation of a comprehensive driver performance model.

  18. Solid Waste Projection Model: Database (Version 1.3)

    International Nuclear Information System (INIS)

    Blackburn, C.L.

    1991-11-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC). The SWPM system provides a modeling and analysis environment that supports decisions in the process of evaluating various solid waste management alternatives. This document, one of a series describing the SWPM system, contains detailed information regarding the software and data structures utilized in developing the SWPM Version 1.3 Database. This document is intended for use by experienced database specialists and supports database maintenance, utility development, and database enhancement

  19. Some Remarks on Stochastic Versions of the Ramsey Growth Model

    Czech Academy of Sciences Publication Activity Database

    Sladký, Karel

    2012-01-01

    Vol. 19, No. 29 (2012), pp. 139-152. ISSN 1212-074X. R&D Projects: GA ČR GAP402/10/1610; GA ČR GAP402/10/0956; GA ČR GAP402/11/0150. Institutional support: RVO:67985556. Keywords: Economic dynamics * Ramsey growth model with disturbance * stochastic dynamic programming * multistage stochastic programs. Subject RIV: BB - Applied Statistics, Operational Research. http://library.utia.cas.cz/separaty/2013/E/sladky-some remarks on stochastic versions of the ramsey growth model.pdf

  20. H2A Production Model, Version 2 User Guide

    Energy Technology Data Exchange (ETDEWEB)

    Steward, D.; Ramsden, T.; Zuboy, J.

    2008-09-01

    The H2A Production Model analyzes the technical and economic aspects of central and forecourt hydrogen production technologies. Using a standard discounted cash flow rate of return methodology, it determines the minimum hydrogen selling price, including a specified after-tax internal rate of return from the production technology. Users have the option of accepting default technology input values--such as capital costs, operating costs, and capacity factor--from established H2A production technology cases or entering custom values. Users can also modify the model's financial inputs. This new version of the H2A Production Model features enhanced usability and functionality. Input fields are consolidated and simplified. New capabilities include performing sensitivity analyses and scaling analyses to various plant sizes. This User Guide helps users already familiar with the basic tenets of H2A hydrogen production cost analysis get started using the new version of the model. It introduces the basic elements of the model then describes the function and use of each of its worksheets.
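
    As a hedged illustration of the discounted cash flow logic described above (not the H2A spreadsheet itself), the sketch below solves for a minimum selling price that returns a specified after-tax internal rate of return. All function names and numbers are hypothetical, and simplifications such as ignoring depreciation and construction schedules are deliberate.

```python
# Simplified sketch: find the hydrogen price at which the plant's NPV,
# discounted at the target after-tax IRR, is approximately zero.

def npv(price, capital_cost, annual_kg, om_cost, tax_rate, irr, years):
    """Net present value of a hydrogen plant at a given selling price ($/kg)."""
    cash_flows = [-capital_cost]  # year-0 capital outlay
    for _ in range(years):
        pre_tax = price * annual_kg - om_cost
        cash_flows.append(pre_tax * (1.0 - tax_rate))
    return sum(cf / (1.0 + irr) ** t for t, cf in enumerate(cash_flows))

def minimum_selling_price(capital_cost, annual_kg, om_cost,
                          tax_rate=0.39, irr=0.10, years=20):
    """Bisection on price until NPV at the target IRR is ~zero."""
    lo, hi = 0.0, 100.0  # $/kg search bracket
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if npv(mid, capital_cost, annual_kg, om_cost, tax_rate, irr, years) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Example with hypothetical inputs: $600M capital, 60 million kg/yr, $120M/yr O&M
print(round(minimum_selling_price(6.0e8, 6.0e7, 1.2e8), 2), "$/kg")
```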

  1. Integrated Farm System Model Version 4.3 and Dairy Gas Emissions Model Version 3.3 Software development and distribution

    Science.gov (United States)

    Modeling routines of the Integrated Farm System Model (IFSM version 4.2) and Dairy Gas Emission Model (DairyGEM version 3.2), two whole-farm simulation models developed and maintained by USDA-ARS, were revised with new components for: (1) simulation of ammonia (NH3) and greenhouse gas emissions gene...

  2. QMM – A Quarterly Macroeconomic Model of the Icelandic Economy. Version 2.0

    DEFF Research Database (Denmark)

    Ólafsson, Tjörvi

    This paper documents and describes Version 2.0 of the Quarterly Macroeconomic Model of the Central Bank of Iceland (QMM). QMM and the underlying quarterly database have been under construction since 2001 at the Research and Forecasting Division of the Economics Department at the Bank and was first...... implemented in the forecasting round for the Monetary Bulletin 2006/1 in March 2006. QMM is used by the Bank for forecasting and various policy simulations and therefore plays a key role as an organisational framework for viewing the medium-term future when formulating monetary policy at the Bank. This paper...

  3. Computational models in physics teaching: a framework

    Directory of Open Access Journals (Sweden)

    Marco Antonio Moreira

    2012-08-01

    The purpose of the present paper is to present a theoretical framework to promote and assist meaningful physics learning through computational models. Our proposal is based on the use of a tool, the AVM diagram, to design educational activities involving modeling and computer simulations. The idea is to provide a starting point for the construction and implementation of didactical approaches grounded in a coherent epistemological view about scientific modeling.

  4. Building crop models within different crop modelling frameworks

    NARCIS (Netherlands)

    Adam, M.Y.O.; Corbeels, M.; Leffelaar, P.A.; Keulen, van H.; Wery, J.; Ewert, F.

    2012-01-01

    Modular frameworks for crop modelling have evolved through simultaneous progress in crop science and software development but differences among these frameworks exist which are not well understood, resulting in potential misuse for crop modelling. In this paper we review differences and similarities

  5. Next generation framework for aquatic modeling of the Earth System

    Science.gov (United States)

    Fekete, B. M.; Wollheim, W. M.; Wisser, D.; Vörösmarty, C. J.

    2009-03-01

    Earth System model development is becoming an increasingly complex task. As scientists attempt to represent the physical and bio-geochemical processes and various feedback mechanisms in unprecedented detail, the models themselves are becoming increasingly complex. At the same time, the complexity of the surrounding IT infrastructure is growing as well. Earth System models must manage a vast amount of data in heterogeneous computing environments. Numerous development efforts are under way to ease that burden and offer model development platforms that reduce IT challenges and allow scientists to focus on their science. While these new modeling frameworks (e.g. FMS, ESMF, CCA, OpenMI) do provide solutions to many IT challenges (performing input/output, managing space and time, establishing model coupling, etc.), they are still considerably complex and often have steep learning curves. The Next generation Framework for Aquatic Modeling of the Earth System (NextFrAMES, a revised version of FrAMES) has numerous similarities to those developed by other teams, but represents a novel model development paradigm. NextFrAMES is built around a modeling XML that lets modelers express the overall model structure and provides an API for dynamically linked plugins to represent the processes. The model XML is executed by the NextFrAMES run-time engine that parses the model definition, loads the module plugins, performs the model I/O and executes the model calculations. NextFrAMES has a minimalistic view of spatial domains and treats every domain (regardless of its layout such as grid, network tree, individual points, polygons, etc.) as a vector of objects. NextFrAMES performs computations on multiple domains, and interactions between different spatial domains are carried out through couplers. NextFrAMES allows processes to operate at different frequencies by providing rudimentary aggregation and disaggregation facilities. NextFrAMES was designed primarily for
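
    To make the XML-driven, plugin-based execution idea above concrete, here is a hypothetical sketch in Python. The element names, attributes, and plugin interface are invented for illustration; they are not the actual NextFrAMES schema or API.

```python
# Toy engine: parse a model definition, resolve process plugins, run a time loop.
import xml.etree.ElementTree as ET

MODEL_XML = """
<model timestep="86400">
  <domain name="river_network" type="vector"/>
  <process plugin="runoff"   domain="river_network" frequency="1"/>
  <process plugin="nitrogen" domain="river_network" frequency="4"/>
</model>
"""

# Stand-ins for dynamically linked process plugins; each exposes step(domain, dt).
PLUGINS = {
    "runoff":   lambda domain, dt: print(f"runoff step on {domain}, dt={dt}s"),
    "nitrogen": lambda domain, dt: print(f"nitrogen step on {domain}, dt={dt}s"),
}

def run(xml_text, steps=4):
    """Parse the model XML, bind plugins to domains, and execute the time loop."""
    root = ET.fromstring(xml_text)
    dt = float(root.attrib["timestep"])
    procs = [(PLUGINS[p.attrib["plugin"]], p.attrib["domain"], int(p.attrib["frequency"]))
             for p in root.findall("process")]
    for step in range(steps):
        for plugin, domain, freq in procs:
            if step % freq == 0:  # processes may run every 'freq' base time steps
                plugin(domain, dt)

if __name__ == "__main__":
    run(MODEL_XML)
```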

  6. Spatial Modeling for Resources Framework (SMRF)

    Science.gov (United States)

    Spatial Modeling for Resources Framework (SMRF) was developed by Dr. Scott Havens at the USDA Agricultural Research Service (ARS) in Boise, ID. SMRF was designed to increase the flexibility of taking measured weather data and distributing the point measurements across a watershed. SMRF was developed...

  7. Break model comparison in different RELAP5 versions

    International Nuclear Information System (INIS)

    Parzer, I.

    2003-01-01

    The presented work focuses on break flow prediction in the RELAP5/MOD3 code, which is crucial for predicting core uncovering and heatup during Small Break Loss-of-Coolant Accidents (SB LOCA). The code prediction has been compared to the IAEA-SPE-4 experiments conducted on the PMK-2 integral test facility in Hungary. The simulations have been performed with the MOD3.2.2 Beta, MOD3.2.2 Gamma, MOD3.3 Beta and MOD3.3 frozen code versions. In the present work we have compared the Ransom-Trapp and Henry-Fauske break model predictions. Additionally, each model's predictions have been compared between its use as the main modeling option and its use as an alternative code option, the so-called 'secret developmental options' on input card no. 1. (author)

  8. GLEAM version 3: Global Land Evaporation Datasets and Model

    Science.gov (United States)

    Martens, B.; Miralles, D. G.; Lievens, H.; van der Schalie, R.; de Jeu, R.; Fernandez-Prieto, D.; Verhoest, N.

    2015-12-01

    Terrestrial evaporation links energy, water and carbon cycles over land and is therefore a key variable of the climate system. However, the global-scale magnitude and variability of the flux, and the sensitivity of the underlying physical process to changes in environmental factors, are still poorly understood due to limitations in in situ measurements. As a result, several methods have arisen to estimate global patterns of land evaporation from satellite observations. However, these algorithms generally differ in their approach to model evaporation, resulting in large differences in their estimates. One of these methods is GLEAM, the Global Land Evaporation: the Amsterdam Methodology. GLEAM estimates terrestrial evaporation based on daily satellite observations of meteorological variables, vegetation characteristics and soil moisture. Since the publication of the first version of the algorithm (2011), the model has been widely applied to analyse trends in the water cycle and land-atmospheric feedbacks during extreme hydrometeorological events. A third version of the GLEAM global datasets is foreseen by the end of 2015. Given the relevance of having a continuous and reliable record of global-scale evaporation estimates for climate and hydrological research, the establishment of an online data portal to make these data available to the public is also foreseen. In this new release of the GLEAM datasets, different components of the model have been updated, with the most significant change being the revision of the data assimilation algorithm. In this presentation, we will highlight the most important changes to the methodology and present three new GLEAM datasets and their validation against in situ observations and an alternative dataset of terrestrial evaporation (ERA-Land). Results of the validation exercise indicate that the magnitude and the spatiotemporal variability of the modelled evaporation agree reasonably well with the estimates of ERA-Land and the in situ

  9. A model-based risk management framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune

    2002-08-15

    The ongoing research activity addresses these issues through two co-operative activities. The first is the IST-funded research project CORAS, where Institutt for energiteknikk takes part as the partner responsible for the Risk Analysis work package. The main objective of the CORAS project is to develop a framework to support risk assessment of security-critical systems. The second, called the Halden Open Dependability Demonstrator (HODD), is established in cooperation between Oestfold University College, local companies and HRP. The objective of HODD is to provide an open-source test bed for testing, teaching and learning about risk analysis methods, risk analysis tools, and fault tolerance techniques. The Inverted Pendulum Control System (IPCON), whose main task is to keep a pendulum balanced and controlled, is the first system that has been established. In order to make a risk assessment, one needs to know what a system does, or is intended to do. Furthermore, the risk assessment requires correct descriptions of the system, its context and all relevant features. A basic assumption is that a precise model of this knowledge, based on formal or semi-formal descriptions, such as UML, will facilitate a systematic risk assessment. It is also necessary to have a framework to integrate the different risk assessment methods. The experiences so far support this hypothesis. This report presents CORAS and the CORAS model-based risk management framework, including a preliminary guideline for model-based risk assessment. The CORAS framework for model-based risk analysis offers a structured and systematic approach to identify and assess security issues of ICT systems. From the initial assessment of IPCON, we also believe that the framework is applicable in a safety context. Further work on IPCON, as well as the experiences from the CORAS trials, will provide insight and feedback for further improvements. (Author)

  10. Solid Waste Projection Model: Database (Version 1.4)

    International Nuclear Information System (INIS)

    Blackburn, C.; Cillan, T.

    1993-09-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC). The SWPM system provides a modeling and analysis environment that supports decisions in the process of evaluating various solid waste management alternatives. This document, one of a series describing the SWPM system, contains detailed information regarding the software and data structures utilized in developing the SWPM Version 1.4 Database. This document is intended for use by experienced database specialists and supports database maintenance, utility development, and database enhancement. Those interested in using the SWPM database should refer to the SWPM Database User's Guide. This document is available from the PNL Task M Project Manager (D. L. Stiles, 509-372-4358), the PNL Task L Project Manager (L. L. Armacost, 509-372-4304), the WHC Restoration Projects Section Manager (509-372-1443), or the WHC Waste Characterization Manager (509-372-1193)

  11. An evaluation framework for participatory modelling

    Science.gov (United States)

    Krueger, T.; Inman, A.; Chilvers, J.

    2012-04-01

    Strong arguments for participatory modelling in hydrology can be made on substantive, instrumental and normative grounds. These arguments have led to increasingly diverse groups of stakeholders (here anyone affecting or affected by an issue) getting involved in hydrological research and the management of water resources. In fact, participation has become a requirement of many research grants, programs, plans and policies. However, evidence of beneficial outcomes of participation as suggested by the arguments is difficult to generate and therefore rare. This is because outcomes are diverse, distributed, often tacit, and take time to emerge. In this paper we develop an evaluation framework for participatory modelling focussed on learning outcomes. Learning encompasses many of the potential benefits of participation, such as better models through diversity of knowledge and scrutiny, stakeholder empowerment, greater trust in models and ownership of subsequent decisions, individual moral development, reflexivity, relationships, social capital, institutional change, resilience and sustainability. Based on the theories of experiential, transformative and social learning, complemented by practitioner experience our framework examines if, when and how learning has occurred. Special emphasis is placed on the role of models as learning catalysts. We map the distribution of learning between stakeholders, scientists (as a subgroup of stakeholders) and models. And we analyse what type of learning has occurred: instrumental learning (broadly cognitive enhancement) and/or communicative learning (change in interpreting meanings, intentions and values associated with actions and activities; group dynamics). We demonstrate how our framework can be translated into a questionnaire-based survey conducted with stakeholders and scientists at key stages of the participatory process, and show preliminary insights from applying the framework within a rural pollution management situation in

  12. A framework for benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-10-01

    Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performances and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data–model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties

  13. Talking Cure Models: A Framework of Analysis

    Directory of Open Access Journals (Sweden)

    Christopher Marx

    2017-09-01

    Psychotherapy is commonly described as a “talking cure,” a treatment method that operates through linguistic action and interaction. The operative specifics of therapeutic language use, however, are insufficiently understood, mainly due to a multitude of disparate approaches that advance different notions of what “talking” means and what “cure” implies in the respective context. Accordingly, a clarification of the basic theoretical structure of “talking cure models,” i.e., models that describe therapeutic processes with a focus on language use, is a desideratum of language-oriented psychotherapy research. Against this background the present paper suggests a theoretical framework of analysis which distinguishes four basic components of “talking cure models”: (1) a foundational theory (which suggests how linguistic activity can affect and transform human experience), (2) an experiential problem state (which defines the problem or pathology of the patient), (3) a curative linguistic activity (which defines linguistic activities that are supposed to effectuate a curative transformation of the experiential problem state), and (4) a change mechanism (which defines the processes and effects involved in such transformations). The purpose of the framework is to establish a terminological foundation that allows for systematically reconstructing basic properties and operative mechanisms of “talking cure models.” To demonstrate the applicability and utility of the framework, five distinct “talking cure models” which spell out the details of curative “talking” processes in terms of (1) catharsis, (2) symbolization, (3) narrative, (4) metaphor, and (5) neurocognitive inhibition are introduced and discussed in terms of the framework components. In summary, we hope that our framework will prove useful for the objective of clarifying the theoretical underpinnings of language-oriented psychotherapy research and help to establish a more

  14. AGAMA: Action-based galaxy modeling framework

    Science.gov (United States)

    Vasiliev, Eugene

    2018-05-01

    The AGAMA library models galaxies. It computes gravitational potential and forces, performs orbit integration and analysis, and can convert between position/velocity and action/angle coordinates. It offers a framework for finding best-fit parameters of a model from data and self-consistent multi-component galaxy models, and contains useful auxiliary utilities such as various mathematical routines. The core of the library is written in C++, and there are Python and Fortran interfaces. AGAMA may be used as a plugin for the stellar-dynamical software packages galpy (ascl:1411.008), AMUSE (ascl:1107.007), and NEMO (ascl:1010.051).

  15. An entropic framework for modeling economies

    Science.gov (United States)

    Caticha, Ariel; Golan, Amos

    2014-08-01

    We develop an information-theoretic framework for economic modeling. This framework is based on principles of entropic inference that are designed for reasoning on the basis of incomplete information. We take the point of view of an external observer who has access to limited information about broad macroscopic economic features. We view this framework as complementary to more traditional methods. The economy is modeled as a collection of agents about whom we make no assumptions of rationality (in the sense of maximizing utility or profit). States of statistical equilibrium are introduced as those macrostates that maximize entropy subject to the relevant information codified into constraints. The basic assumption is that this information refers to supply and demand and is expressed in the form of the expected values of certain quantities (such as inputs, resources, goods, production functions, utility functions and budgets). The notion of economic entropy is introduced. It provides a measure of the uniformity of the distribution of goods and resources. It captures both the welfare state of the economy as well as the characteristics of the market (say, monopolistic, concentrated or competitive). Prices, which turn out to be the Lagrange multipliers, are endogenously generated by the economy. Further studies include the equilibrium between two economies and the conditions for stability. As an example, the case of the nonlinear economy that arises from linear production and utility functions is treated in some detail.
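
    A schematic version of the maximum-entropy construction described above, written with generic constraint functions (the authors' specific supply and demand constraints may differ), is

    \[
    \max_{p}\; S[p] = -\sum_i p_i \ln p_i
    \quad\text{subject to}\quad
    \sum_i p_i = 1, \qquad \sum_i p_i\, f_k(x_i) = F_k, \quad k = 1,\dots,K,
    \]
    \[
    p_i = \frac{1}{Z(\lambda)} \exp\!\Big(-\sum_{k} \lambda_k f_k(x_i)\Big),
    \qquad
    Z(\lambda) = \sum_i \exp\!\Big(-\sum_{k} \lambda_k f_k(x_i)\Big),
    \]

    where the expected values \(F_k\) encode the available information on inputs, resources, goods and budgets, and the Lagrange multipliers \(\lambda_k\) play the role of the endogenously generated prices referred to in the abstract.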

  16. Computational modeling of Metal-Organic Frameworks

    Science.gov (United States)

    Sung, Jeffrey Chuen-Fai

    In this work, the metal-organic frameworks MIL-53(Cr), DMOF-2,3-NH2Cl, DMOF-2,5-NH2Cl, and HKUST-1 were modeled using molecular mechanics and electronic structure. The effect of electronic polarization on the adsorption of water in MIL-53(Cr) was studied using molecular dynamics simulations of water-loaded MIL-53 systems with both polarizable and non-polarizable force fields. Molecular dynamics simulations of the full systems and DFT calculations on representative framework clusters were utilized to study the difference in nitrogen adsorption between DMOF-2,3-NH2Cl and DMOF-2,5-NH2Cl. Finally, the control of proton conduction in HKUST-1 by complexation of molecules to the Cu open metal site was investigated using the MS-EVB methodology.

  17. Flexible Modeling of Epidemics with an Empirical Bayes Framework

    Science.gov (United States)

    Brooks, Logan C.; Farrow, David C.; Hyun, Sangwon; Tibshirani, Ryan J.; Rosenfeld, Roni

    2015-01-01

    Seasonal influenza epidemics cause consistent, considerable, widespread loss annually in terms of economic burden, morbidity, and mortality. With access to accurate and reliable forecasts of a current or upcoming influenza epidemic’s behavior, policy makers can design and implement more effective countermeasures. This past year, the Centers for Disease Control and Prevention hosted the “Predict the Influenza Season Challenge”, with the task of predicting key epidemiological measures for the 2013–2014 U.S. influenza season with the help of digital surveillance data. We developed a framework for in-season forecasts of epidemics using a semiparametric Empirical Bayes framework, and applied it to predict the weekly percentage of outpatient doctors visits for influenza-like illness, and the season onset, duration, peak time, and peak height, with and without using Google Flu Trends data. Previous work on epidemic modeling has focused on developing mechanistic models of disease behavior and applying time series tools to explain historical data. However, tailoring these models to certain types of surveillance data can be challenging, and overly complex models with many parameters can compromise forecasting ability. Our approach instead produces possibilities for the epidemic curve of the season of interest using modified versions of data from previous seasons, allowing for reasonable variations in the timing, pace, and intensity of the seasonal epidemics, as well as noise in observations. Since the framework does not make strict domain-specific assumptions, it can easily be applied to some other diseases with seasonal epidemics. This method produces a complete posterior distribution over epidemic curves, rather than, for example, solely point predictions of forecasting targets. We report prospective influenza-like-illness forecasts made for the 2013–2014 U.S. influenza season, and compare the framework’s cross-validated prediction error on historical data to
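
    A minimal sketch of the empirical Bayes idea described above: build a prior over epidemic curves by transforming past seasons (shifting timing, rescaling intensity, adding noise), then weight each candidate by its fit to the weeks observed so far. Function and variable names are illustrative only and are not the authors' code.

```python
import numpy as np

def candidate_curves(past_seasons, n_candidates=1000, rng=None):
    """past_seasons: list of equal-length 1-D arrays of weekly %ILI values."""
    rng = rng or np.random.default_rng(0)
    weeks = np.arange(len(past_seasons[0]))
    candidates = []
    for _ in range(n_candidates):
        base = past_seasons[rng.integers(len(past_seasons))]
        shift = rng.integers(-3, 4)              # peak-timing shift in weeks
        scale = rng.lognormal(0.0, 0.2)          # peak-height rescaling
        noise = rng.normal(0.0, 0.1, len(base))  # observation noise
        curve = scale * np.interp(weeks, weeks + shift, base) + noise
        candidates.append(np.clip(curve, 0.0, None))
    return np.array(candidates)

def posterior_weights(candidates, observed, sigma=0.5):
    """Weight candidates by a Gaussian likelihood of the partially observed season."""
    t = len(observed)
    sq_err = ((candidates[:, :t] - observed) ** 2).sum(axis=1)
    w = np.exp(-0.5 * (sq_err - sq_err.min()) / sigma**2)
    return w / w.sum()

# Usage: weighted quantiles of the candidate curves beyond week t give forecasts
# of the remaining season, including peak height and peak week.
```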

  18. Implementation of a PETN failure model using ARIA's general chemistry framework

    Energy Technology Data Exchange (ETDEWEB)

    Hobbs, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-01-01

    We previously developed a PETN thermal decomposition model that accurately predicts thermal ignition and detonator failure [1]. This model was originally developed for CALORE [2] and required several complex user subroutines. Recently, a simplified version of the PETN decomposition model was implemented into ARIA [3] using a general chemistry framework without need for user subroutines. Detonator failure was also predicted with this new model using ENCORE. The model was simplified by 1) basing the model on moles rather than mass, 2) simplifying the thermal conductivity model, and 3) implementing ARIA’s new phase change model. This memo briefly describes the model, implementation, and validation.

  19. Computer-aided modeling framework – a generic modeling template

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    and test models systematically, efficiently and reliably. In this way, development of products and processes can be made faster, cheaper and more efficient. In this contribution, as part of the framework, a generic modeling template for the systematic derivation of problem specific models is presented....... The application of the modeling template is highlighted with a case study related to the modeling of a catalytic membrane reactor coupling dehydrogenation of ethylbenzene with hydrogenation of nitrobenzene...

  20. COSMO: a conceptual framework for service modelling and refinement

    NARCIS (Netherlands)

    Quartel, Dick; Steen, Maarten W.A.; Pokraev, S.; van Sinderen, Marten J.

    This paper presents a conceptual framework for service modelling and refinement, called the COSMO (COnceptual Service MOdelling) framework. This framework provides concepts to model and reason about services, and to support operations, such as composition and discovery, which are performed on them

  1. Disposal Systems Evaluation Framework (DSEF) Version 1.0 - Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Sutton, Mark [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Blink, James A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fratoni, Massimiliano [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Greenberg, Harris R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Halsey, William G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wolery, Thomas J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-06-03

    The Disposal Systems Evaluation Framework (DSEF) is being developed at Lawrence Livermore National Laboratory to formalize the development and documentation of repository conceptual design options for each waste form and environment combination. This report summarizes current status and plans for the remainder of FY11 and for FY12. This progress report defines the architecture and interface parameters of the DSEF Excel workbook, which contains worksheets that link to each other to provide input and document output from external codes such that concise comparisons between fuel cycles, disposal environments, repository designs and engineered barrier system materials can be performed. Collaborations between other Used Fuel Disposition Campaign work packages and US Department of Energy / Nuclear Energy campaigns are clearly identified. File naming and configuration management is recommended to allow automated abstraction of data from multiple DSEF runs.

  2. BehavePlus fire modeling system, version 5.0: Variables

    Science.gov (United States)

    Patricia L. Andrews

    2009-01-01

    This publication has been revised to reflect updates to version 4.0 of the BehavePlus software. It was originally published as the BehavePlus fire modeling system, version 4.0: Variables in July, 2008. The BehavePlus fire modeling system is a computer program based on mathematical models that describe wildland fire behavior and effects and the...

  3. A Framework for Developing the Structure of Public Health Economic Models.

    Science.gov (United States)

    Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P

    2016-01-01

    A conceptual modeling framework is a methodology that assists modelers through the process of developing a model structure. Public health interventions tend to operate in dynamically complex systems. Modeling public health interventions requires broader considerations than clinical ones. Inappropriately simple models may lead to poor validity and credibility, resulting in suboptimal allocation of resources. This article presents the first conceptual modeling framework for public health economic evaluation. The framework presented here was informed by literature reviews of the key challenges in public health economic modeling and existing conceptual modeling frameworks; qualitative research to understand the experiences of modelers when developing public health economic models; and piloting a draft version of the framework. The conceptual modeling framework comprises four key principles of good practice and a proposed methodology. The key principles are that 1) a systems approach to modeling should be taken; 2) a documented understanding of the problem is imperative before and alongside developing and justifying the model structure; 3) strong communication with stakeholders and members of the team throughout model development is essential; and 4) a systematic consideration of the determinants of health is central to identifying the key impacts of public health interventions. The methodology consists of four phases: phase A, aligning the framework with the decision-making process; phase B, identifying relevant stakeholders; phase C, understanding the problem; and phase D, developing and justifying the model structure. Key areas for further research involve evaluation of the framework in diverse case studies and the development of methods for modeling individual and social behavior. This approach could improve the quality of Public Health economic models, supporting efficient allocation of scarce resources. Copyright © 2016 International Society for Pharmacoeconomics

  4. Business Model Innovation: An Integrative Conceptual Framework

    Directory of Open Access Journals (Sweden)

    Bernd Wirtz

    2017-01-01

    Purpose: The point of departure of this exploratory study is the gap between the increasing importance of business model innovation (BMI) in science and management and the limited conceptual assistance available. Therefore, the study identifies and explores scattered BMI insights and deduces them into an integrative framework to enhance our understanding about this phenomenon and to present a helpful guidance for researchers and practitioners. Design/Methodology/Approach: The study identifies BMI insights through a literature-based investigation and consolidates them into an integrative BMI framework that presents the key elements and dimensions of BMI as well as their presumed relationships. Findings: The study enhances our understanding about the key elements and dimensions of BMI, presents further conceptual insights into the BMI phenomenon, supplies implications for science and management, and may serve as a helpful guidance for future research. Practical Implications: The presented framework provides managers with a tool to identify critical BMI issues and can serve as a conceptual BMI guideline. Research limitations: Given the vast amount of academic journals, it is unlikely that every applicable scientific publication is included in the analysis. The illustrative examples are descriptive in nature, and thus do not provide empirical validity. Several implications for future research are provided. Originality/Value: The study’s main contribution lies in the unifying approach of the dispersed BMI knowledge. Since our understanding of BMI is still limited, this study should provide the necessary insights and conceptual assistance to further develop the concept and guide its practical application.

  5. NETPATH-WIN: an interactive user version of the mass-balance model, NETPATH

    Science.gov (United States)

    El-Kadi, A. I.; Plummer, Niel; Aggarwal, P.

    2011-01-01

    NETPATH-WIN is an interactive user version of NETPATH, an inverse geochemical modeling code used to find mass-balance reaction models that are consistent with the observed chemical and isotopic composition of waters from aquatic systems. NETPATH-WIN was constructed to migrate NETPATH applications into the Microsoft WINDOWS® environment. The new version facilitates model utilization by eliminating difficulties in data preparation and results analysis of the DOS version of NETPATH, while preserving all of the capabilities of the original version. Through example applications, the note describes some of the features of NETPATH-WIN as applied to adjustment of radiocarbon data for geochemical reactions in groundwater systems.

  6. Computerized transportation model for the NRC Physical Protection Project. Versions I and II

    International Nuclear Information System (INIS)

    Anderson, G.M.

    1978-01-01

    Details on two versions of a computerized model for the transportation system of the NRC Physical Protection Project are presented. The Version I model permits scheduling of all types of transport units associated with a truck fleet, including truck trailers, truck tractors, escort vehicles and crews. A fixed-fleet itinerary construction process is used in which iterations on fleet size are required until the service requirements are satisfied. The Version II model adds an aircraft mode capability and provides for a more efficient non-fixed-fleet itinerary generation process. Test results using both versions are included

  7. A Procurement Performance Model for Construction Frameworks

    Directory of Open Access Journals (Sweden)

    Terence Y M Lam

    2015-07-01

    Collaborative construction frameworks have been developed in the United Kingdom (UK) to create longer term relationships between clients and suppliers in order to improve project outcomes. Research undertaken into highways maintenance set within a major county council has confirmed that such collaborative procurement methods can improve time, cost and quality of construction projects. Building upon this and examining the same single case, this research aims to develop a performance model through identification of performance drivers in the whole project delivery process including pre and post contract phases. An a priori performance model based on operational and sociological constructs was proposed and then checked by a pilot study. Factor analysis and central tendency statistics from the questionnaires as well as content analysis from the interview transcripts were conducted. It was confirmed that long term relationships, financial and non-financial incentives and stronger communication are the sociological behaviour factors driving performance. The interviews also established that key performance indicators (KPIs) can be used as an operational measure to improve performance. With the a posteriori performance model, client project managers can effectively collaboratively manage contractor performance through procurement measures including use of longer term and KPIs for the contract so that the expected project outcomes can be achieved. The findings also make a significant contribution to construction framework procurement theory by identifying the interrelated sociological and operational performance drivers. This study is set predominantly in the field of highways civil engineering. It is suggested that building-based projects or other projects that share characteristics are grouped together and used for further research of the phenomena discovered.

  8. Institutional Transformation Version 2.5 Modeling and Planning.

    Energy Technology Data Exchange (ETDEWEB)

    Villa, Daniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mizner, Jack H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Passell, Howard D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gallegos, Gerald R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Peplinski, William John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vetter, Douglas W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Christopher A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Malczynski, Leonard A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Addison, Marlin [Arizona State Univ., Mesa, AZ (United States); Schaffer, Matthew A. [Bridgers and Paxton Engineering Firm, Albuquerque, NM (United States); Higgins, Matthew W. [Vibrantcy, Albuquerque, NM (United States)

    2017-02-01

    ., representing 80% of the energy consumption at SNL. SNL has been able to leverage this model to estimate energy savings potential of many competing ECMs. The results helped high level decision makers to create energy reduction goals for SNL. These resources also have multiple applications for use of the models as individual buildings. In addition to the building module, a solar module built in Powersim Studio (r) allows planners to evaluate the potential photovoltaic (PV) energy generation potential for flat plate PV, concentrating solar PV, and concentration solar thermal technologies at multiple sites across SNL's New Mexico campus. Development of the IX modeling framework was a unique collaborative effort among planners and engineers in SNL's facilities division; scientists and computer modelers in SNL's research and development division; faculty from Arizona State University; and energy modelers from Bridger and Paxton Consulting Engineers Incorporated.

  9. Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model

    Science.gov (United States)

    Berman, Jeanette; Smyth, Robyn

    2015-01-01

    This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…

  10. A Framework for the Specification of Acquisition Models

    National Research Council Canada - National Science Library

    Meyers, B

    2001-01-01

    .... The timing properties associated with the items receive special treatment. The value of a framework is that one can develop specifications of various acquisition models, such as waterfall, spiral, or incremental, as instances of that framework...

  11. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with a successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures and also in other application domains such as the nuclear field can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  12. Evaluation of a new CNRM-CM6 model version for seasonal climate predictions

    Science.gov (United States)

    Volpi, Danila; Ardilouze, Constantin; Batté, Lauriane; Dorel, Laurant; Guérémy, Jean-François; Déqué, Michel

    2017-04-01

    This work presents the quality assessment of a new version of the Météo-France coupled climate prediction system, which has been developed in the EU COPERNICUS Climate Change Services framework to carry out seasonal forecasts. The system is based on the CNRM-CM6 model, with Arpege-Surfex 6.2.2 as atmosphere/land component and Nemo 3.2 as ocean component, which directly embeds the sea-ice component Gelato 6.0. In order to have a robust diagnostic, the experiment is composed of 60 ensemble members generated with stochastic dynamic perturbations. The experiment has been performed over a 37-year re-forecast period from 1979 to 2015, with two start dates per year, on May 1 and November 1. The evaluation of the predictive skill of the model is shown under two perspectives: on the one hand, the ability of the model to faithfully respond to positive or negative ENSO, NAO and QBO events, independently of the predictability of these events. Such assessment is carried out through a composite analysis, and shows that the model succeeds in reproducing the main patterns for 2-meter temperature, precipitation and geopotential height at 500 hPa during the winter season. On the other hand, the model predictive skill of the same events (positive and negative ENSO, NAO and QBO) is evaluated.

  13. A Smallholder Socio-hydrological Modelling Framework

    Science.gov (United States)

    Pande, S.; Savenije, H.; Rathore, P.

    2014-12-01

    Small holders are farmers who own less than 2 ha of farmland. They often have low productivity and thus remain at subsistence level. The fact that nearly 80% of Indian farmers are smallholders, who own merely a third of the total farmland and belong to the poorest quartile, but produce nearly 40% of the country's food grains, underlines the importance of understanding the socio-hydrology of a small holder. We present a framework to understand the socio-hydrological system dynamics of a small holder. It couples the dynamics of 6 main variables that are most relevant at the scale of a small holder: local storage (soil moisture and other water storage), capital, knowledge, livestock production, soil fertility and grass biomass production. The model incorporates rule-based adaptation mechanisms (for example: adjusting expenditures on food and fertilizers, selling livestock, etc.) of small holders when they face adverse socio-hydrological conditions, such as low annual rainfall, higher intra-annual variability in rainfall or variability in agricultural prices. It allows us to study the sustainability of small holder farming systems under various settings. We apply the framework to understand the socio-hydrology of small holders in Aurangabad, Maharashtra, India. This district has witnessed suicides of many sugarcane farmers who could not extricate themselves from the debt trap. These farmers lack irrigation and are susceptible to fluctuating sugar prices and intra-annual hydroclimatic variability. This presentation discusses two aspects in particular: whether government interventions to absolve the debt of farmers are enough, and what is the value of investing in local storages that can buffer intra-annual variability in rainfall and strengthening the safety-nets either by creating opportunities for alternative sources of income or by crop diversification.
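
    The following is a schematic sketch of the kind of coupled, rule-based annual update described above. All rules, parameter names, and thresholds are invented placeholders, not the authors' model.

```python
def step(state, rainfall, crop_price, params):
    """Advance storage, capital, knowledge, livestock, fertility and grass by one season."""
    s = dict(state)
    water = min(s["storage"] + rainfall, params["storage_capacity"])
    harvest = params["yield_per_mm"] * water * s["fertility"]
    income = crop_price * harvest + params["livestock_income"] * s["livestock"]
    expenses = params["food_cost"] + params["fertilizer_cost"]

    # Rule-based adaptation when income does not cover expenses.
    if income < expenses and s["livestock"] > 0:
        s["livestock"] -= 1                    # sell livestock to cover the gap
        income += params["livestock_price"]
    if income < expenses:
        expenses -= params["fertilizer_cost"]  # cut fertilizer expenditure...
        s["fertility"] *= 0.95                 # ...at the cost of soil fertility

    s["capital"] = s["capital"] + income - expenses
    s["storage"] = max(water - params["crop_water_use"], 0.0)
    s["grass"] = params["grass_per_mm"] * rainfall
    s["knowledge"] += params["learning_rate"]
    return s

# Iterating step() over annual rainfall and price series yields trajectories of
# the six state variables under different hydroclimatic and market scenarios.
```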

  14. ANLECIS-1: Version of ANLECIS Program for Calculations with the Asymetric Rotational Model

    International Nuclear Information System (INIS)

    Lopez Mendez, R.; Garcia Moruarte, F.

    1986-01-01

    A new modified version of the ANLECIS Code is reported. This version allows simultaneous fitting of the cross section of the direct process by the asymmetric rotational model, and the cross section of the compound nucleus process by the Hauser-Feshbach formalism with the modern statistical corrections. The calculations based on this version show a dependence of the compound nucleus cross section on the asymmetry parameter γ. (author). 19 refs

  15. Systematic identification of crystallization kinetics within a generic modelling framework

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Meisler, Kresten Troelstrup; Gernaey, Krist

    2012-01-01

    A systematic development of constitutive models within a generic modelling framework has been developed for use in design, analysis and simulation of crystallization operations. The framework contains a tool for model identification connected with a generic crystallizer modelling tool-box, a tool...

  16. CENTURY: Modeling Ecosystem Responses to Climate Change, Version 4 (VEMAP 1995)

    Data.gov (United States)

    National Aeronautics and Space Administration — The CENTURY model, Version 4, is a general model of plant-soil nutrient cycling that is being used to simulate carbon and nutrient dynamics for different...

  17. CENTURY: Modeling Ecosystem Responses to Climate Change, Version 4 (VEMAP 1995)

    Data.gov (United States)

    National Aeronautics and Space Administration — The CENTURY model, Version 4, is a general model of plant-soil nutrient cycling that is being used to simulate carbon and nutrient dynamics for different types of...

  18. A Constrained and Versioned Data Model for TEAM Data

    Science.gov (United States)

    Andelman, S.; Baru, C.; Chandra, S.; Fegraus, E.; Lin, K.

    2009-04-01

    The objective of the Tropical Ecology Assessment and Monitoring Network (www.teamnetwork.org) is "To generate real time data for monitoring long-term trends in tropical biodiversity through a global network of TEAM sites (i.e. field stations in tropical forests), providing an early warning system on the status of biodiversity to effectively guide conservation action". To achieve this, the TEAM Network operates by collecting data via standardized protocols at TEAM Sites. The standardized TEAM protocols include the Climate, Vegetation and Terrestrial Vertebrate Protocols. Some sites also implement additional protocols. There are currently 7 TEAM Sites with plans to grow the network to 15 by June 30, 2009 and 50 TEAM Sites by the end of 2010. At each TEAM Site, data is gathered as defined by the protocols and according to a predefined sampling schedule. The TEAM data is organized and stored in a database based on the TEAM spatio-temporal data model. This data model is at the core of the TEAM Information System - it consumes and executes spatio-temporal queries, and analytical functions that are performed on TEAM data, and defines the object data types, relationships and operations that maintain database integrity. The TEAM data model contains object types including types for observation objects (e.g. bird, butterfly and trees), sampling unit, person, role, protocol, site and the relationship of these object types. Each observation data record is a set of attribute values of an observation object and is always associated with a sampling unit, an observation timestamp or time interval, a versioned protocol and data collectors. The operations on the TEAM data model can be classified as read operations, insert operations and update operations. Following are some typical operations: The operation get(site, protocol, [sampling unit block, sampling unit,] start time, end time) returns all data records using the specified protocol and collected at the specified site, block
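
    The read operation sketched at the end of the abstract, get(site, protocol, [sampling unit block, sampling unit,] start time, end time), can be pictured as a simple filter over observation records. The Python rendering below is purely hypothetical: the record fields and function signature are assumptions for illustration, not the TEAM Information System API.

        # Hypothetical rendering of the read operation described in the abstract;
        # record fields and signature are assumptions, not the TEAM API.
        from dataclasses import dataclass, field
        from datetime import datetime
        from typing import List, Optional

        @dataclass
        class ObservationRecord:
            site: str
            protocol: str
            sampling_unit: str
            timestamp: datetime
            attributes: dict = field(default_factory=dict)

        def get(records: List[ObservationRecord], site: str, protocol: str,
                start: datetime, end: datetime,
                sampling_unit: Optional[str] = None) -> List[ObservationRecord]:
            """Return all records for a site/protocol collected within [start, end]."""
            return [r for r in records
                    if r.site == site and r.protocol == protocol
                    and start <= r.timestamp <= end
                    and (sampling_unit is None or r.sampling_unit == sampling_unit)]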

  19. A hybrid version of swan for fast and efficient practical wave modelling

    NARCIS (Netherlands)

    M. Genseberger (Menno); J. Donners

    2016-01-01

    In the Netherlands, for coastal and inland water applications, wave modelling with SWAN has become a main ingredient. However, computational times are relatively high. Therefore we investigated the parallel efficiency of the current MPI and OpenMP versions of SWAN. The MPI version is

  20. Business process model repositories : framework and survey

    NARCIS (Netherlands)

    Yan, Z.; Dijkman, R.M.; Grefen, P.W.P.J.

    2009-01-01

    Large organizations often run hundreds or even thousands of business processes. Managing such large collections of business processes is a challenging task. Intelligent software can assist in that task by providing common repository functions such as storage, search and version management. They can

  1. A framework for business process model repositories

    NARCIS (Netherlands)

    Yan, Z.; Grefen, P.W.P.J.; Muehlen, zur M.; Su, J.

    2010-01-01

    Large organizations often run hundreds or even thousands of business processes. Managing such large collections of business processes is a challenging task. Intelligent software can assist in that task by providing common repository functions such as storage, search and version management. They can

  2. A preliminary three-dimensional geological framework model for Yucca Mountain

    International Nuclear Information System (INIS)

    Stirewalt, G.L.; Henderson, D.B.

    1995-01-01

    A preliminary three-dimensional geological framework model has been developed for the potential high-level radioactive waste disposal site at Yucca Mountain. The model is based on field data and was constructed using EarthVision (Version 2.0) software. It provides the basic geological framework in which variations in geological parameters and features in and adjacent to the repository block can be illustrated and analyzed. With further refinement and modification of the model through incorporation of additional data, it can be used by Nuclear Regulatory Commission (NRC) staff to determine whether representation of subsurface geological features in Department of Energy models is reasonable. Consequently, NRC staff will be able to use the model during pre-licensing and licensing phases to assess models for analyses of site suitability, design considerations, and repository performance

  3. A Systems Engineering Capability Maturity Model, Version 1.1,

    Science.gov (United States)

    1995-11-01

    Fragments of the abstract define a process as a sequence of actions to be taken to perform a given task [SECMM], as a set of activities (ISO 12207), and as a set of practices. One of the design goals of the SE-CMM effort was to capture the salient concepts from emerging standards and initiatives (e.g., ISO 9001, ISO SPICE). The record also gives the version history for the SE-CMM (version designator, content, change notes), beginning with Release 1: architecture rationale, Process Areas, and an ISO (SPICE) BPG 0.05 summary.

  4. Business model framework applications in health care: A systematic review.

    Science.gov (United States)

    Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl

    2017-11-01

    It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.

  5. Faculty Perceptions about Teaching Online: Exploring the Literature Using the Technology Acceptance Model as an Organizing Framework

    Science.gov (United States)

    Wingo, Nancy Pope; Ivankova, Nataliya V.; Moss, Jacqueline A.

    2017-01-01

    Academic leaders can better implement institutional strategic plans to promote online programs if they understand faculty perceptions about teaching online. An extended version of a model for technology acceptance, or TAM2 (Venkatesh & Davis, 2000), provided a framework for surveying and organizing the research literature about factors that…

  6. Tier I Rice Model - Version 1.0 - Guidance for Estimating Pesticide Concentrations in Rice Paddies

    Science.gov (United States)

    Describes a Tier I Rice Model (Version 1.0) for estimating surface water exposure from the use of pesticides in rice paddies. The concentration calculated can be used for aquatic ecological risk and drinking water exposure assessments.

  7. LAMMPS Framework for Dynamic Bonding and an Application Modeling DNA

    DEFF Research Database (Denmark)

    Svaneborg, Carsten

    2012-01-01

    and bond types. When breaking bonds, all angular and dihedral interactions involving broken bonds are removed. The framework allows chemical reactions to be modeled, and we use it to simulate a simplistic, coarse-grained DNA model. The resulting DNA dynamics illustrates the power of the present framework....

  8. A qualitative evaluation approach for energy system modelling frameworks

    DEFF Research Database (Denmark)

    Wiese, Frauke; Hilpert, Simon; Kaldemeyer, Cord

    2018-01-01

    properties define how useful it is with regard to the existing challenges. For energy system models, evaluation methods exist, but we argue that many decisions about properties are made at the model generator or framework level. Thus, this paper presents a qualitative approach to evaluate frameworks...

  9. A Framework for Formal Modeling and Analysis of Organizations

    NARCIS (Netherlands)

    Jonker, C.M.; Sharpanskykh, O.; Treur, J.; P., Yolum

    2007-01-01

    A new, formal, role-based framework for modeling and analyzing both real-world and artificial organizations is introduced. It exploits static and dynamic properties of the organizational model and includes the (frequently ignored) environment. The transition is described from a generic framework of

  10. Estimating Parameters for the PVsyst Version 6 Photovoltaic Module Performance Model

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    We present an algorithm to determine parameters for the photovoltaic module performance model encoded in the software package PVsyst(TM) version 6. Our method operates on current-voltage (I-V) curves measured over a range of irradiance and temperature conditions. We describe the method and illustrate its steps using data for a 36-cell crystalline silicon module. We qualitatively compare our method with one other technique for estimating parameters for the PVsyst(TM) version 6 model.

  11. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-09-18

    The PEST++ Version 1 object-oriented parameter estimation code is here extended to Version 3 to incorporate additional algorithms and tools to further improve support for large and complex environmental modeling problems. PEST++ Version 3 includes the Gauss-Marquardt-Levenberg (GML) algorithm for nonlinear parameter estimation, Tikhonov regularization, integrated linear-based uncertainty quantification, options of integrated TCP/IP based parallel run management or external independent run management by use of a Version 2 update of the GENIE Version 1 software code, and utilities for global sensitivity analyses. The Version 3 code design is consistent with PEST++ Version 1 and continues to be designed to lower the barriers of entry for users as well as developers while providing efficient and optimized algorithms capable of accommodating large, highly parameterized inverse problems. As such, this effort continues the original focus of (1) implementing the most popular and powerful features of the PEST software suite in a fashion that is easy for novice or experienced modelers to use and (2) developing a software framework that is easy to extend.
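
    For orientation only, the damped Gauss-Marquardt-Levenberg update that underlies codes of this kind has a standard textbook form; the NumPy sketch below shows that generic step with a simple Tikhonov-style penalty folded in. It is not code from PEST++, and the weighting choices are assumptions.

        # Generic damped Gauss-Marquardt-Levenberg step with a simple
        # Tikhonov-style penalty; illustrative only, not PEST++ source code.
        import numpy as np

        def gml_step(params, residuals, jacobian, lam=1.0, tik_weight=0.01, prior=None):
            """One update: p_new = p + (J'J + (lam + w) I)^-1 (J'r - w (p - p_prior))."""
            prior = params if prior is None else prior
            lhs = jacobian.T @ jacobian + (lam + tik_weight) * np.eye(params.size)
            rhs = jacobian.T @ residuals - tik_weight * (params - prior)
            return params + np.linalg.solve(lhs, rhs)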

  12. Population Balance Models: A useful complementary modelling framework for future WWTP modelling

    DEFF Research Database (Denmark)

    Nopens, Ingmar; Torfs, Elena; Ducoste, Joel

    2014-01-01

    Population Balance Models (PBMs) represent a powerful modelling framework for the description of the dynamics of properties that are characterised by statistical distributions. This has been demonstrated in many chemical engineering applications. Modelling efforts of several current and future unit...

  13. Population balance models: a useful complementary modelling framework for future WWTP modelling

    DEFF Research Database (Denmark)

    Nopens, Ingmar; Torfs, Elena; Ducoste, Joel

    2015-01-01

    Population balance models (PBMs) represent a powerful modelling framework for the description of the dynamics of properties that are characterised by distributions. This distribution of properties under transient conditions has been demonstrated in many chemical engineering applications. Modelling...

  14. A Simulation and Modeling Framework for Space Situational Awareness

    International Nuclear Information System (INIS)

    Olivier, S.S.

    2008-01-01

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated

  15. A Flexible Atmospheric Modeling Framework for the CESM

    Energy Technology Data Exchange (ETDEWEB)

    Randall, David [Colorado State University; Heikes, Ross [Colorado State University; Konor, Celal [Colorado State University

    2014-11-12

    We have created two global dynamical cores based on the unified system of equations and Z-grid staggering on an icosahedral grid, which are collectively called UZIM (Unified Z-grid Icosahedral Model). The z-coordinate version (UZIM-height) can be run in hydrostatic and nonhydrostatic modes. The sigma-coordinate version (UZIM-sigma) runs only in hydrostatic mode. The super-parameterization has been included as a physics option in both models. The UZIM versions with the super-parameterization are called SUZI. With SUZI-height, we have completed aquaplanet runs. With SUZI-sigma, we are making aquaplanet runs and realistic climate simulations. SUZI-sigma includes realistic topography and a SiB3 model to parameterize land-surface processes.

  16. Integrated Biosphere Simulator Model (IBIS), Version 2.5

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Biosphere Simulator (or IBIS) is designed to be a comprehensive model of the terrestrial biosphere. The model represents a wide range of...

  17. Integrated Biosphere Simulator Model (IBIS), Version 2.5

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Biosphere Simulator (or IBIS) is designed to be a comprehensive model of the terrestrial biosphere. The model represents a wide range of processes,...

  18. Hydrogeologic Framework Model for the Saturated Zone Site Scale flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    T. Miller

    2004-11-15

    The purpose of this report is to document the 19-unit, hydrogeologic framework model (19-layer version, output of this report) (HFM-19) with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results in accordance with AP-SIII.10Q, Models. The HFM-19 is developed as a conceptual model of the geometric extent of the hydrogeologic units at Yucca Mountain and is intended specifically for use in the development of the ''Saturated Zone Site-Scale Flow Model'' (BSC 2004 [DIRS 170037]). Primary inputs to this model report include the GFM 3.1 (DTN: MO9901MWDGFM31.000 [DIRS 103769]), borehole lithologic logs, geologic maps, geologic cross sections, water level data, topographic information, and geophysical data as discussed in Section 4.1. Figure 1-1 shows the information flow among all of the saturated zone (SZ) reports and the relationship of this conceptual model in that flow. The HFM-19 is a three-dimensional (3-D) representation of the hydrogeologic units surrounding the location of the Yucca Mountain geologic repository for spent nuclear fuel and high-level radioactive waste. The HFM-19 represents the hydrogeologic setting for the Yucca Mountain area that covers about 1,350 km2 and includes a saturated thickness of about 2.75 km. The boundaries of the conceptual model were primarily chosen to be coincident with grid cells in the Death Valley regional groundwater flow model (DTN: GS960808312144.003 [DIRS 105121]) such that the base of the site-scale SZ flow model is consistent with the base of the regional model (2,750 meters below a smoothed version of the potentiometric surface), encompasses the exploratory boreholes, and provides a framework over the area of interest for groundwater flow and radionuclide transport modeling. In depth, the model domain extends from land surface to the base of the regional groundwater flow model (D'Agnese et al. 1997 [DIRS 100131], p 2). For the site

  19. Hydrogeologic Framework Model for the Saturated Zone Site Scale flow and Transport Model

    International Nuclear Information System (INIS)

    Miller, T.

    2004-01-01

    The purpose of this report is to document the 19-unit, hydrogeologic framework model (19-layer version, output of this report) (HFM-19) with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results in accordance with AP-SIII.10Q, Models. The HFM-19 is developed as a conceptual model of the geometric extent of the hydrogeologic units at Yucca Mountain and is intended specifically for use in the development of the ''Saturated Zone Site-Scale Flow Model'' (BSC 2004 [DIRS 170037]). Primary inputs to this model report include the GFM 3.1 (DTN: MO9901MWDGFM31.000 [DIRS 103769]), borehole lithologic logs, geologic maps, geologic cross sections, water level data, topographic information, and geophysical data as discussed in Section 4.1. Figure 1-1 shows the information flow among all of the saturated zone (SZ) reports and the relationship of this conceptual model in that flow. The HFM-19 is a three-dimensional (3-D) representation of the hydrogeologic units surrounding the location of the Yucca Mountain geologic repository for spent nuclear fuel and high-level radioactive waste. The HFM-19 represents the hydrogeologic setting for the Yucca Mountain area that covers about 1,350 km2 and includes a saturated thickness of about 2.75 km. The boundaries of the conceptual model were primarily chosen to be coincident with grid cells in the Death Valley regional groundwater flow model (DTN: GS960808312144.003 [DIRS 105121]) such that the base of the site-scale SZ flow model is consistent with the base of the regional model (2,750 meters below a smoothed version of the potentiometric surface), encompasses the exploratory boreholes, and provides a framework over the area of interest for groundwater flow and radionuclide transport modeling. In depth, the model domain extends from land surface to the base of the regional groundwater flow model (D'Agnese et al. 1997 [DIRS 100131], p 2). For the site-scale SZ flow model, the HFM

  20. Prediction models for successful external cephalic version: a systematic review.

    Science.gov (United States)

    Velzel, Joost; de Hundt, Marcella; Mulder, Frederique M; Molkenboer, Jan F M; Van der Post, Joris A M; Mol, Ben W; Kok, Marjolein

    2015-12-01

    To provide an overview of existing prediction models for successful ECV, and to assess their quality, development and performance. We searched MEDLINE, EMBASE and the Cochrane Library to identify all articles reporting on prediction models for successful ECV published from inception to January 2015. We extracted information on study design, sample size, model-building strategies and validation. We evaluated the phases of model development and summarized their performance in terms of discrimination, calibration and clinical usefulness. We collected different predictor variables together with their defined significance, in order to identify important predictor variables for successful ECV. We identified eight articles reporting on seven prediction models. All models were subjected to internal validation. Only one model was also validated in an external cohort. Two prediction models had a low overall risk of bias, of which only one showed promising predictive performance at internal validation. This model also completed the phase of external validation. For none of the models was their impact on clinical practice evaluated. The most important predictor variables for successful ECV described in the selected articles were parity, placental location, breech engagement and the fetal head being palpable. One model was assessed for discrimination and calibration with internal (AUC 0.71) and external validation (AUC 0.64), while two other models were assessed with discrimination and calibration, respectively. We found one prediction model for breech presentation that was validated in an external cohort and had acceptable predictive performance. This model should be used to counsel women considering ECV. Copyright © 2015. Published by Elsevier Ireland Ltd.

  1. AN INTEGRATED MODELING FRAMEWORK FOR CARBON MANAGEMENT TECHNOLOGIES

    Energy Technology Data Exchange (ETDEWEB)

    Anand B. Rao; Edward S. Rubin; Michael B. Berkenpas

    2004-03-01

    CO2 capture and storage (CCS) is gaining widespread interest as a potential method to control greenhouse gas emissions from fossil fuel sources, especially electric power plants. Commercial applications of CO2 separation and capture technologies are found in a number of industrial process operations worldwide. Many of these capture technologies also are applicable to fossil fuel power plants, although applications to large-scale power generation remain to be demonstrated. This report describes the development of a generalized modeling framework to assess alternative CO2 capture and storage options in the context of multi-pollutant control requirements for fossil fuel power plants. The focus of the report is on post-combustion CO2 capture using amine-based absorption systems at pulverized coal-fired plants, which are the most prevalent technology used for power generation today. The modeling framework builds on the previously developed Integrated Environmental Control Model (IECM). The expanded version with carbon sequestration is designated as IECM-cs. The expanded modeling capability also includes natural gas combined cycle (NGCC) power plants and integrated coal gasification combined cycle (IGCC) systems as well as pulverized coal (PC) plants. This report presents details of the performance and cost models developed for an amine-based CO2 capture system, representing the baseline of current commercial technology. The key uncertainties and variability in process design, performance and cost parameters which influence the overall cost of carbon mitigation also are characterized. The new performance and cost models for CO2 capture systems have been integrated into the IECM-cs, along with models to estimate CO2 transport and storage costs. The CO2 control system also interacts with other emission control technologies such as flue gas desulfurization (FGD) systems for SO2 control. The integrated model is applied to

  2. Conceptualising Business Models: Definitions, Frameworks and Classifications

    OpenAIRE

    Erwin Fielt

    2013-01-01

    The business model concept is gaining traction in different disciplines but is still criticized for being fuzzy and vague and lacking consensus on its definition and compositional elements. In this paper we set out to advance our understanding of the business model concept by addressing three areas of foundational research: business model definitions, business model elements, and business model archetypes. We define a business model as a representation of the value logic of an organization in...

  3. The MiniBIOS model (version 1A4) at the RIVM

    NARCIS (Netherlands)

    Uijt de Haag PAM; Laheij GMH

    1993-01-01

    This report is the user's guide of the MiniBIOS model, version 1A4. The model is operational at the Laboratory of Radiation Research of the RIVM. MiniBIOS is a simulation model for calculating the transport of radionuclides in the biosphere and the consequential radiation dose to humans. The

  4. Integrated Baseline System (IBS) Version 2.0: Models guide

    Energy Technology Data Exchange (ETDEWEB)

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Models Guide summarizes the IBS use of several computer models for predicting the results of emergency situations. These include models for predicting dispersion/doses of airborne contaminants, traffic evacuation, explosion effects, heat radiation from a fire, and siren sound transmission. The guide references additional technical documentation on the models when such documentation is available from other sources. The audience for this manual is chiefly emergency management planners and analysts, but also data managers and system managers.

  5. A Testing and Implementation Framework (TIF) for Climate Adaptation Innovations : Initial Version of the TIF - Deliverable 5.1

    NARCIS (Netherlands)

    Sebastian, A.G.; Lendering, K.T.; van Loon-Steensma, J.M.; Paprotny, D.; Bellamy, Rob; Willems, Patrick; van Loenhout, Joris; Colaço, Conceição; Dias, Susana; Nunes, Leónia; Rego, Francisco; Koundouri, Phoebe; Xepapadeas, Petros; Vassilopoulos, Achilleas; Wiktor, Paweł; Wysocka-Golec, Justyna

    2017-01-01

    Currently there is no internationally accepted framework for assessing the readiness of innovations that reduce disaster risk. To fill this gap, BRIGAID is developing a standard, comprehensive Testing and Implementation Framework (TIF). The TIF is designed to provide innovators with a framework for

  6. An Agent-Based Modeling Framework for Simulating Human Exposure to Environmental Stresses in Urban Areas

    Directory of Open Access Journals (Sweden)

    Liang Emlyn Yang

    2018-04-01

    Several approaches have been used to assess potential human exposure to environmental stresses and to achieve optimal results under various conditions, for example for different scales, groups of people, or points in time. A thorough literature review in this paper identifies the research gap regarding modeling approaches for assessing human exposure to environmental stressors, and it indicates that microsimulation tools, in which each person is simulated individually and continuously, are becoming increasingly important in human exposure assessments of urban environments. The paper further describes an agent-based model (ABM) framework that can dynamically simulate human exposure levels, along with their daily activities, in urban areas that are characterized by environmental stresses such as air pollution and heat stress. Within the framework, decision-making processes can be included for each individual based on rule-based behavior in order to achieve goals under changing environmental conditions. The ideas described in this paper are implemented in the free and open source NetLogo platform. A basic modeling scenario of the ABM framework in Hamburg, Germany, demonstrates its utility for various urban environments and individual activity patterns, as well as its portability to other models, programs, and frameworks. The prototype model can potentially be extended to support environmental incident management by exploring the daily routines of different groups of citizens and comparing the effectiveness of different strategies. Further research is needed to fully develop an operational version of the model.
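
    A toy version of the rule-based exposure logic described above can be written in a few lines of plain Python (the paper's prototype is implemented in NetLogo); the grid, agents and avoidance rule below are invented purely for illustration.

        # Toy agent-based exposure sketch in plain Python; the grid, agents and
        # behaviour rule are illustrative assumptions, not the paper's model.
        import random

        GRID = 20
        pollution = [[random.random() for _ in range(GRID)] for _ in range(GRID)]

        class Citizen:
            def __init__(self, x, y):
                self.x, self.y, self.exposure = x, y, 0.0

            def step(self):
                # Rule-based behaviour: move to the cleanest neighbouring cell.
                moves = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
                cells = [((self.x + dx) % GRID, (self.y + dy) % GRID) for dx, dy in moves]
                self.x, self.y = min(cells, key=lambda c: pollution[c[0]][c[1]])
                self.exposure += pollution[self.x][self.y]

        agents = [Citizen(random.randrange(GRID), random.randrange(GRID)) for _ in range(50)]
        for _ in range(24):                     # one simulated day, hourly steps
            for a in agents:
                a.step()
        print("mean daily exposure:", sum(a.exposure for a in agents) / len(agents))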

  7. Generic Model Predictive Control Framework for Advanced Driver Assistance Systems

    NARCIS (Netherlands)

    Wang, M.

    2014-01-01

    This thesis deals with a model predictive control framework for control design of Advanced Driver Assistance Systems, where car-following tasks are under control. The framework is applied to design several autonomous and cooperative controllers and to examine the controller properties at the

  8. A Modeling Framework for Conventional and Heat Integrated Distillation Columns

    DEFF Research Database (Denmark)

    Bisgaard, Thomas; Huusom, Jakob Kjøbsted; Abildskov, Jens

    2013-01-01

    In this paper, a generic, modular model framework for describing fluid separation by distillation is presented. At present, the framework is able to describe a conventional distillation column and a heat-integrated distillation column, but due to a modular structure the database can be further...

  9. A DSM-based framework for integrated function modelling

    DEFF Research Database (Denmark)

    Eisenbart, Boris; Gericke, Kilian; Blessing, Lucienne T. M.

    2017-01-01

    an integrated function modelling framework, which specifically aims at relating between the different function modelling perspectives prominently addressed in different disciplines. It uses interlinked matrices based on the concept of DSM and MDM in order to facilitate cross-disciplinary modelling and analysis...... of the functionality of a system. The article further presents the application of the framework based on a product example. Finally, an empirical study in industry is presented. Therein, feedback on the potential of the proposed framework to support interdisciplinary design practice as well as on areas of further...

  10. Microsoft Repository Version 2 and the Open Information Model.

    Science.gov (United States)

    Bernstein, Philip A.; Bergstraesser, Thomas; Carlson, Jason; Pal, Shankar; Sanders, Paul; Shutt, David

    1999-01-01

    Describes the programming interface and implementation of the repository engine and the Open Information Model for Microsoft Repository, an object-oriented meta-data management facility that ships in Microsoft Visual Studio and Microsoft SQL Server. Discusses Microsoft's component object model, object manipulation, queries, and information…

  11. Prediction models for successful external cephalic version: a systematic review

    NARCIS (Netherlands)

    Velzel, Joost; de Hundt, Marcella; Mulder, Frederique M.; Molkenboer, Jan F. M.; van der Post, Joris A. M.; Mol, Ben W.; Kok, Marjolein

    2015-01-01

    To provide an overview of existing prediction models for successful ECV, and to assess their quality, development and performance. We searched MEDLINE, EMBASE and the Cochrane Library to identify all articles reporting on prediction models for successful ECV published from inception to January 2015.

  12. STORM WATER MANAGEMENT MODEL USER'S MANUAL VERSION 5.0

    Science.gov (United States)

    The EPA Storm Water Management Model (SWMM) is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. SWMM was first developed in 1971 and has undergone several major upgrade...

  13. Integrated Baseline System (IBS) Version 1.03: Models guide

    Energy Technology Data Exchange (ETDEWEB)

    1993-01-01

    The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the models guide for the IBS and explains how to use the emergency-related computer models. It provides information for the experienced system user and is the primary reference for the computer modeling software supplied with the system. It is designed for emergency managers and planners, and others familiar with the concepts of computer modeling. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available. The IBS manuals reference such documentation where necessary.

  14. Conceptualising Business Models: Definitions, Frameworks and Classifications

    Directory of Open Access Journals (Sweden)

    Erwin Fielt

    2013-12-01

    The business model concept is gaining traction in different disciplines but is still criticized for being fuzzy and vague and lacking consensus on its definition and compositional elements. In this paper we set out to advance our understanding of the business model concept by addressing three areas of foundational research: business model definitions, business model elements, and business model archetypes. We define a business model as a representation of the value logic of an organization in terms of how it creates and captures customer value. This abstract and generic definition is made more specific and operational by the compositional elements, which need to address the customer, value proposition, organizational architecture (firm and network level) and economics dimensions. Business model archetypes complement the definition and elements by providing a more concrete and empirical understanding of the business model concept. The main contributions of this paper are (1) explicitly including the customer value concept in the business model definition and focussing on value creation, (2) presenting four core dimensions that business model elements need to cover, (3) arguing for flexibility by adapting and extending business model elements to cater for different purposes and contexts (e.g. technology, innovation, strategy), (4) stressing a more systematic approach to business model archetypes by using business model elements for their description, and (5) suggesting the use of business model archetype research for the empirical exploration and testing of business model elements and their relationships.

  15. Programming Entity Framework

    CERN Document Server

    Lerman, Julia

    2010-01-01

    Get a thorough introduction to ADO.NET Entity Framework 4 -- Microsoft's core framework for modeling and interacting with data in .NET applications. The second edition of this acclaimed guide provides a hands-on tour of the framework's latest version in Visual Studio 2010 and .NET Framework 4. Not only will you learn how to use EF4 in a variety of applications, you'll also gain a deep understanding of its architecture and APIs. Written by Julia Lerman, the leading independent authority on the framework, Programming Entity Framework covers it all -- from the Entity Data Model and Object Service

  16. Flipped version of the supersymmetric strongly coupled preon model

    Energy Technology Data Exchange (ETDEWEB)

    Fajfer, S. (Institut za Fiziku, University of Sarajevo, Sarajevo, (Yugoslavia)); Milekovic, M.; Tadic, D. (Zavod za Teorijsku Fiziku, Prirodoslovno-Matematicki Fakultet, University of Zagreb, Croatia, (Yugoslavia))

    1989-12-01

    In the supersymmetric SU(5) (SUSY SU(5)) composite model (which was described in an earlier paper) the fermion mass terms can be easily constructed. The SUSY SU(5)⊗U(1), i.e., flipped, composite model possesses a completely analogous composite-particle spectrum. However, in that model one cannot construct a renormalizable superpotential which would generate fermion mass terms. This contrasts with the standard noncomposite grand unified theories (GUTs), in which both the Georgi-Glashow electrical charge embedding and its flipped counterpart lead to renormalizable theories.

  17. Radarsat Antarctic Mapping Project Digital Elevation Model, Version 2

    Data.gov (United States)

    National Aeronautics and Space Administration — The high-resolution Radarsat Antarctic Mapping Project (RAMP) Digital Elevation Model (DEM) combines topographic data from a variety of sources to provide consistent...

  18. U.S. Coastal Relief Model - Southern California Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NGDC's U.S. Coastal Relief Model (CRM) provides a comprehensive view of the U.S. coastal zone integrating offshore bathymetry with land topography into a seamless...

  19. ONKALO rock mechanics model (RMM) - Version 2.0

    International Nuclear Information System (INIS)

    Moenkkoenen, H.; Hakala, M.; Paananen, M.; Laine, E.

    2012-02-01

    The Rock Mechanics Model of the ONKALO rock volume is a description of the significant features and parameters related to rock mechanics. The main objective is to develop a tool to predict the rock properties, quality and hence the potential for stress failure which can then be used for continuing design of the ONKALO and the repository. This is the second implementation of the Rock Mechanics Model and it includes sub-models of the intact rock strength, in situ stress, thermal properties, rock mass quality and properties of the brittle deformation zones. Because of the varying quantities of available data for the different parameters, the types of presentations also vary: some data sets can be presented in the style of a 3D block model but, in other cases, a single distribution represents the whole rock volume hosting the ONKALO. (orig.)

  20. Geological model of the ONKALO area version 0

    International Nuclear Information System (INIS)

    Paananen, M.; Paulamaeki, S.; Gehoer, S.; Kaerki, A.

    2006-03-01

    The geological model of the ONKALO area is composed of four submodels: ductile deformation model, lithological model, brittle deformation model and alteration model. The ductile deformation model describes and models the products of polyphase ductile deformation, which facilitates the definition of the dimensions and geometrical properties of the individual lithological units determined in the lithological model. The lithological model describes the properties of rock units that can be defined on the basis of the migmatite structures, textures and modal compositions. The brittle deformation model describes the products of multiple phases of brittle deformation, and the alteration model describes the types, occurrence and effects of the hydrothermal alteration. On the basis of refolding and crosscutting relationships, the metamorphic supracrustal rocks have been subject to five stages of ductile deformation. This resulted in a pervasive, composite foliation which shows a rather constant attitude in the ONKALO area. Based on observations in outcrops, investigation trenches and drill cores, 3D modelling of the lithological units is carried out assuming that the contacts are quasiconcordant. Under this assumption, the strike and dip of the foliation has been used as a tool to correlate the lithologies between the drillholes, and from surface and tunnel outcrops to the drillholes. The rocks at Olkiluoto can be divided into two major groups: (1) supracrustal high-grade metamorphic rocks, including various migmatitic gneisses, homogeneous tonalitic-granodioritic-granitic gneisses, mica gneisses, quartzitic gneisses and mafic gneisses, and (2) igneous rocks, including pegmatitic granites and diabase dykes. The migmatitic gneisses can further be divided into three subgroups in terms of the type of migmatite

  1. The Oak Ridge Competitive Electricity Dispatch (ORCED) Model Version 9

    Energy Technology Data Exchange (ETDEWEB)

    Hadley, Stanton W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Baek, Young Sun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-11-01

    The Oak Ridge Competitive Electricity Dispatch (ORCED) model dispatches power plants in a region to meet the electricity demands for any single given year up to 2030. It uses publicly available sources of data describing electric power units such as the National Energy Modeling System and hourly demands from utility submittals to the Federal Energy Regulatory Commission that are projected to a future year. The model simulates a single region of the country for a given year, matching generation to demands and predefined net exports from the region, assuming no transmission constraints within the region. ORCED can calculate a number of key financial and operating parameters for generating units and regional market outputs including average and marginal prices, air emissions, and generation adequacy. By running the model with and without changes such as generation plants, fuel prices, emission costs, plug-in hybrid electric vehicles, distributed generation, or demand response, the marginal impact of these changes can be found.
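
    The core merit-order logic of a dispatch model of this kind can be illustrated in a few lines; the plant list and demand values below are invented for the example and have nothing to do with ORCED's input data.

        # Toy merit-order dispatch, illustrative only (not the ORCED model).
        plants = [                      # (name, capacity MW, marginal cost $/MWh), assumed
            ("nuclear", 1000, 10.0),
            ("coal",     800, 25.0),
            ("gas_cc",   600, 40.0),
            ("gas_ct",   300, 90.0),
        ]

        def dispatch(demand_mw):
            """Stack plants by marginal cost; return per-plant output and the marginal price."""
            remaining, output, price = demand_mw, {}, 0.0
            for name, cap, cost in sorted(plants, key=lambda p: p[2]):
                gen = min(cap, max(remaining, 0.0))
                output[name] = gen
                if gen > 0:
                    price = cost          # price set by the last dispatched unit
                remaining -= gen
            return output, price

        for hour_demand in (1200, 1900, 2500):
            out, p = dispatch(hour_demand)
            print(hour_demand, out, "marginal price:", p)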

  2. Due Regard Encounter Model Version 1.0

    Science.gov (United States)

    2013-08-19

    Note that no existing model covers encounters between two IFR aircraft in oceanic airspace. The reason for this is that one cannot observe encounters...encounters between instrument flight rules (IFR) and non-IFR traffic beyond 12 NM. Table 1 (encounter model categories) tabulates the aircraft of interest, by location and flight rule, against the intruder aircraft categories (IFR, VFR, noncooperative conventional, noncooperative unconventional):

        Location   Flight rule | IFR  VFR  Noncoop. conventional  Noncoop. unconventional
        CONUS      IFR         |  C    C           U                       X
        CONUS      VFR         |  C    U           U                       X
        Offshore   IFR         |  C    C           U                       X
        Offshore   VFR         |  C    U           ...

  3. Geological model of the Olkiluoto site Version O

    International Nuclear Information System (INIS)

    Paulamaeki, S.; Paananen, M.; Gehoer, S.

    2006-05-01

    The geological model of the Olkiluoto site consists of four submodels: the lithological model, the ductile deformation model, the brittle deformation model and the alteration model. The lithological model gives the properties of definite rock units that can be defined on the basis of the migmatite structures, textures and modal compositions. The ductile deformation model describes and models the products of polyphase ductile deformation, which makes it possible to define the dimensions and geometrical properties of the individual lithological units determined in the lithological model. The brittle deformation model describes the products of multiple phases of brittle deformation. The alteration model describes the types, occurrence and effects of the hydrothermal alteration. The rocks of Olkiluoto can be divided into two major classes: (1) supracrustal high-grade metamorphic rocks, including various migmatitic gneisses, tonalitic-granodioritic-granitic gneisses, mica gneisses, quartz gneisses and mafic gneisses, and (2) igneous rocks, including pegmatitic granites and diabase dykes. The migmatitic gneisses can further be divided into three subgroups in terms of the type of migmatite structure: veined gneisses, stromatic gneisses and diatexitic gneisses. On the basis of refolding and crosscutting relationships, the metamorphic supracrustal rocks have been subject to polyphase ductile deformation, including five stages. In the 3D modelling of the lithological units, an assumption has been made, on the basis of measurements in outcrops, investigation trenches and drill cores, that the pervasive, composite foliation produced as a result of polyphase ductile deformation has a rather constant attitude in the ONKALO area. Consequently, the strike and dip of the foliation has been used as a tool through which the lithologies have been correlated between the drillholes and from the surface to the drillholes. The bedrock at the Olkiluoto site has been subject to extensive hydrothermal alteration

  4. Theoretical Models, Assessment Frameworks and Test Construction.

    Science.gov (United States)

    Chalhoub-Deville, Micheline

    1997-01-01

    Reviews the usefulness of proficiency models influencing second language testing. Findings indicate that several factors contribute to the lack of congruence between models and test construction and make a case for distinguishing between theoretical models. Underscores the significance of an empirical, contextualized and structured approach to the…

  5. POSITIVE LEADERSHIP MODELS: THEORETICAL FRAMEWORK AND RESEARCH

    Directory of Open Access Journals (Sweden)

    Javier Blanch, Francisco Gil

    2016-09-01

    The objective of this article is twofold; firstly, we establish the theoretical boundaries of positive leadership and the reasons for its emergence. It is related to the new paradigm of positive psychology that has recently been shaping the scope of organizational knowledge. This conceptual framework has triggered the development of the various forms of positive leadership (i.e. transformational, servant, spiritual, authentic, and positive). Although the construct does not seem to be univocally defined, these different types of leadership overlap and share a significant affinity. Secondly, we review the empirical evidence that shows the impact of positive leadership in organizations and we highlight the positive relationship between these forms of leadership and key positive organizational variables. Lastly, we analyse future research areas in order to further develop this concept.

  6. Mars Global Reference Atmospheric Model 2010 Version: Users Guide

    Science.gov (United States)

    Justh, H. L.

    2014-01-01

    This Technical Memorandum (TM) presents the Mars Global Reference Atmospheric Model 2010 (Mars-GRAM 2010) and its new features. Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Applications include systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. Additionally, this TM includes instructions on obtaining the Mars-GRAM source code and data files as well as running Mars-GRAM. It also contains sample Mars-GRAM input and output files and an example of how to incorporate Mars-GRAM as an atmospheric subroutine in a trajectory code.

  7. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
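
    Koopman spectral properties are commonly approximated from snapshot data with dynamic mode decomposition (DMD); the sketch below shows that standard computation as general background and is not claimed to be the method or code used in the cited work.

        # Standard exact-DMD approximation of Koopman eigenvalues/modes from data;
        # generic background only, not code from the cited paper.
        import numpy as np

        def dmd(snapshots, rank=None):
            """snapshots: array (n_features, n_times); returns eigenvalues and modes."""
            X, Y = snapshots[:, :-1], snapshots[:, 1:]
            U, s, Vh = np.linalg.svd(X, full_matrices=False)
            r = rank or len(s)
            U, s, Vh = U[:, :r], s[:r], Vh[:r, :]
            A_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)
            eigvals, W = np.linalg.eig(A_tilde)
            modes = Y @ Vh.conj().T @ np.diag(1.0 / s) @ W
            return eigvals, modes

        if __name__ == "__main__":
            t = np.linspace(0, 10, 200)
            data = np.vstack([np.sin(t), np.cos(t), np.sin(2 * t)])  # toy signals
            lam, _ = dmd(data, rank=3)
            print("DMD eigenvalues:", np.round(lam, 3))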

  8. Red Storm usage model :Version 1.12.

    Energy Technology Data Exchange (ETDEWEB)

    Jefferson, Karen L.; Sturtevant, Judith E.

    2005-12-01

    Red Storm is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Sandia National Laboratories (SNL). The Red Storm Usage Model (RSUM) documents the capabilities and the environment provided for the FY05 Tri-Lab Level II Limited Availability Red Storm User Environment Milestone and the FY05 SNL Level II Limited Availability Red Storm Platform Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and SNL. Additionally, the Red Storm Usage Model maps the provided capabilities to the Tri-Lab ASC Computing Environment (ACE) requirements. The ACE requirements reflect the high performance computing requirements for the ASC community and have been updated in FY05 to reflect the community's needs. For each section of the RSUM, Appendix I maps the ACE requirements to the Limited Availability User Environment capabilities and includes a description of ACE requirements met and those requirements that are not met in that particular section. The Red Storm Usage Model, along with the ACE mappings, has been issued and vetted throughout the Tri-Lab community.

  9. Zig-zag version of the Frenkel-Kontorova model

    DEFF Research Database (Denmark)

    Christiansen, Peter Leth; Savin, A.V.; Zolotaryuk, Alexander

    1996-01-01

    We study a generalization of the Frenkel-Kontorova model which describes a zig-zag chain of particles coupled by both the first- and second-neighbor harmonic forces and subjected to a planar substrate with a commensurate potential relief. The particles are supposed to have two degrees of freedom...

  10. A conceptual framework for measuring airline business model convergence

    OpenAIRE

    Daft, Jost; Albers, Sascha

    2012-01-01

    This paper develops a measurement framework that synthesizes the airline and strategy literature to identify relevant dimensions and elements of airline business models. The applicability of this framework for describing airline strategies and structures and, based on this conceptualization, for assessing the potential convergence of airline business models over time is then illustrated using a small sample of five German passenger airlines. For this sample, the perception of a rapprochement ...

  11. Modelling framework for groundwater flow at Sellafield

    International Nuclear Information System (INIS)

    Hooper, A.J.; Billington, D.E.; Herbert, A.W.

    1995-01-01

    The principal objective of Nirex is to develop a single deep geological repository for the safe disposal of low- and intermediate-level radioactive waste. In safety assessment, use is made of a variety of conceptual models that form the basis for modelling of the pathways by which radionuclides might return to the environment. In this paper, the development of a conceptual model for groundwater flow and transport through fractured rock on the various scales of interest is discussed. The approach is illustrated by considering how some aspects of the conceptual model are developed in particular numerical models. These representations of the conceptual model use fracture network geometries based on realistic rock properties. (author). refs., figs., tabs

  12. The "KILDER" air pollution modelling system, version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Gram, F.

    1996-12-31

    This report describes the KILDER Air Pollution Modelling System, which is a system of small PC programs for the calculation of long-term emission, dispersion, concentration and exposure from different source categories. The system consists of three parts: (1) the dispersion models POI-KILD and ARE-KILD for point and area sources, respectively, (2) the meteorological programs WINDFREC, STABFREC and METFREC, and (3) supporting programs for calculating emissions and exposure and for operating with binary data fields. The file structure is based on binary files with data fields. The data fields are matrices with different types of values and may be read into the computer or be calculated in other programs. 19 refs., 22 figs., 3 tabs.

  13. Implementation of a parallel version of a regional climate model

    Energy Technology Data Exchange (ETDEWEB)

    Gerstengarbe, F.W. [ed.; Kuecken, M. [Potsdam-Institut fuer Klimafolgenforschung (PIK), Potsdam (Germany); Schaettler, U. [Deutscher Wetterdienst, Offenbach am Main (Germany). Geschaeftsbereich Forschung und Entwicklung

    1997-10-01

    A regional climate model developed by the Max Planck Institute for Meteorology and the German Climate Computing Centre in Hamburg, based on the 'Europa' and 'Deutschland' models of the German Weather Service, has been parallelized and implemented on the IBM RS/6000 SP computer system of the Potsdam Institute for Climate Impact Research, including parallel input/output processing, the explicit Eulerian time-step, the semi-implicit corrections, the normal-mode initialization and the physical parameterizations of the German Weather Service. The implementation utilizes Fortran 90 and the Message Passing Interface. The parallelization strategy used is a 2D domain decomposition. This report describes the parallelization strategy, the parallel I/O organization, the influence of different domain decomposition approaches for static and dynamic load imbalances and first numerical results. (orig.)
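
    The 2D domain decomposition mentioned above can be sketched with MPI's Cartesian topology support; the report's code is Fortran 90 + MPI, so the mpi4py version below, with an assumed global grid size, is only an illustration of the idea.

        # Minimal 2D domain decomposition with mpi4py (illustrative; the report's
        # implementation is Fortran 90 + MPI). Run with e.g.: mpiexec -n 4 python decomp2d.py
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        dims = MPI.Compute_dims(comm.Get_size(), 2)       # processor grid, e.g. 2 x 2
        cart = comm.Create_cart(dims, periods=[False, False], reorder=True)
        px, py = cart.Get_coords(cart.Get_rank())

        NX, NY = 96, 64                                   # global grid size, assumed
        nx, ny = NX // dims[0], NY // dims[1]             # local block owned by this rank
        x0, y0 = px * nx, py * ny
        west, east = cart.Shift(0, 1)                     # neighbours for halo exchange
        south, north = cart.Shift(1, 1)
        print(f"rank {comm.Get_rank()}: block [{x0}:{x0+nx}, {y0}:{y0+ny}], "
              f"neighbours W={west} E={east} S={south} N={north}")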

  14. External Validation of a Prediction Model for Successful External Cephalic Version

    NARCIS (Netherlands)

    de Hundt, Marcella; Vlemmix, Floortje; Kok, Marjolein; van der Steeg, Jan W.; Bais, Joke M.; Mol, Ben W.; van der Post, Joris A.

    2012-01-01

    We sought external validation of a prediction model for the probability of a successful external cephalic version (ECV). We evaluated the performance of the prediction model with calibration and discrimination. For clinical practice, we developed a score chart to calculate the probability of a

  15. Regularized integrable version of the one-dimensional quantum sine-Gordon model

    International Nuclear Information System (INIS)

    Japaridze, G.I.; Nersesyan, A.A.; Wiegmann, P.B.

    1983-01-01

    The authors derive a regularized exactly solvable version of the one-dimensional quantum sine-Gordon model proceeding from the exact solution of the U(1)-symmetric Thirring model. The ground state and the excitation spectrum are obtained in the region ν² < 8π. (Auth.)

  16. Connected Equipment Maturity Model Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Butzbaugh, Joshua B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mayhorn, Ebony T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sullivan, Greg [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Whalen, Scott A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-05-01

    The Connected Equipment Maturity Model (CEMM) evaluates the high-level functionality and characteristics that enable equipment to provide the four categories of energy-related services through communication with other entities (e.g., equipment, third parties, utilities, and users). The CEMM will help the U.S. Department of Energy, industry, energy efficiency organizations, and research institutions benchmark the current state of connected equipment and identify capabilities that may be attained to reach a more advanced, future state.

  17. System cost model user's manual, version 1.2

    International Nuclear Information System (INIS)

    Shropshire, D.

    1995-06-01

    The System Cost Model (SCM) was developed by Lockheed Martin Idaho Technologies in Idaho Falls, Idaho and MK-Environmental Services in San Francisco, California to support the Baseline Environmental Management Report sensitivity analysis for the U.S. Department of Energy (DOE). The SCM serves the needs of the entire DOE complex for treatment, storage, and disposal (TSD) of mixed low-level, low-level, and transuranic waste. The model can be used to evaluate total complex costs based on various configuration options or to evaluate site-specific options. The site-specific cost estimates are based on generic assumptions such as waste loads and densities, treatment processing schemes, existing facilities capacities and functions, storage and disposal requirements, schedules, and cost factors. The SCM allows customization of the data for detailed site-specific estimates. There are approximately forty TSD module designs that have been further customized to account for design differences for nonalpha, alpha, remote-handled, and transuranic wastes. The SCM generates cost profiles based on the model default parameters or customized user-defined input and also generates costs for transporting waste from generators to TSD sites

  18. Geological Model of the Olkiluoto Site. Version 2.0

    International Nuclear Information System (INIS)

    Aaltonen, I.

    2010-10-01

    The rocks of Olkiluoto can be divided into two major classes: 1) supracrustal high-grade metamorphic rocks including various migmatitic gneisses, tonalitic-granodioriticgranitic gneisses, mica gneisses, quartz gneisses and mafic gneisses, and 2) igneous rocks including pegmatitic granites and diabase dykes. The migmatitic gneisses can further be divided into three subgroups in terms of the type of migmatite structure: veined gneisses, stromatic gneisses and diatexitic gneisses. On the basis of refolding and crosscutting relationships, the metamorphic supracrustal rocks have been subjected to polyphased ductile deformation, consisting of five stages, the D2 being locally the most intensive phase, producing thrust-related folding, strong migmatisation and pervasive foliation. In 3D modelling of the lithological units, an assumption has been made, on the basis of measurements in the outcrops, investigation trenches and drill cores, that the pervasive, composite foliation produced as a result of polyphase ductile deformation has a rather constant attitude in the ONKALO area. Consequently, the strike and dip of the foliation has been used as a tool, through which the lithologies have been correlated between the drillholes and from the surface to the drillholes. In addition, the largest ductile deformation zones and tectonic units are described in 3D model. The bedrock at the Olkiluoto site has been subjected to extensive hydrothermal alteration, which has taken place at reasonably low temperature conditions, the estimated temperature interval being from slightly over 300 deg C to less than 100 deg C. Two types of alteration can be observed: firstly, pervasive alteration and secondly fracturecontrolled alteration. Clay mineralisation and sulphidisation are the most prominent alteration events in the site area. Sulphides are located in the uppermost part of the model volume following roughly the foliation and lithological trend. Kaolinite is also mainly located in the

  19. A magnetic version of the Smilansky-Solomyak model

    Czech Academy of Sciences Publication Activity Database

    Barseghyan, Diana; Exner, Pavel

    2017-01-01

    Roč. 50, č. 48 (2017), č. článku 485203. ISSN 1751-8113 R&D Projects: GA ČR GA17-01706S Institutional support: RVO:61389005 Keywords : Smilansky-Solomyak model * spectral transition * homogeneous magnetic field * discrete spectrum * essential spectrum Subject RIV: BE - Theoretical Physics OBOR OECD: Atomic, molecular and chemical physics (physics of atoms and molecules including collision, interaction with radiation, magnetic resonances, Mössbauer effect) Impact factor: 1.857, year: 2016

  20. Multilevel Models: Conceptual Framework and Applicability

    Directory of Open Access Journals (Sweden)

    Roxana-Otilia-Sonia Hrițcu

    2015-10-01

    Full Text Available Individuals and the social or organizational groups they belong to can be viewed as a hierarchical system situated on different levels. Individuals are situated on the first level of the hierarchy and they are nested together on the higher levels. Individuals interact with the social groups they belong to and are influenced by these groups. Traditional methods that study the relationships between data, like simple regression, do not take into account the hierarchical structure of the data and the effects of group membership and, hence, results may be invalidated. Unlike standard regression modelling, the multilevel approach takes into account the individuals as well as the groups to which they belong. To take advantage of multilevel analysis it is important that we recognize the multilevel characteristics of the data. In this article we outline the characteristics of multilevel data and we describe the models that work with such data. We introduce the basic multilevel model, the two-level model: students can be nested within classes, or individuals within countries, and the general two-level model can be extended very easily to several levels. Multilevel analysis has begun to be extensively used in many research areas. We present the most frequent study areas where multilevel models are used, such as sociological studies, education, psychological research, health studies, demography, epidemiology, biology, environmental studies and entrepreneurship. We support the idea that since hierarchies exist everywhere, multilevel data should be recognized and analyzed properly by using multilevel modelling.
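
    As an informal illustration of the two-level case (not taken from the article), the following Python sketch fits a random-intercept model to synthetic "students nested in classes" data using statsmodels; the data-generating parameters are arbitrary.

    # Illustrative sketch only: a two-level random-intercept model (students nested
    # in classes) fitted with statsmodels MixedLM on synthetic data.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n_classes, n_students = 30, 25
    class_id = np.repeat(np.arange(n_classes), n_students)
    class_effect = rng.normal(0.0, 2.0, n_classes)[class_id]   # level-2 variation
    x = rng.normal(size=class_id.size)                         # level-1 predictor
    y = 1.0 + 0.5 * x + class_effect + rng.normal(size=class_id.size)

    df = pd.DataFrame({"y": y, "x": x, "class_id": class_id})

    # Random intercept for each class; fixed effect for x.
    model = smf.mixedlm("y ~ x", df, groups=df["class_id"])
    result = model.fit()
    print(result.summary())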

  1. An Ising model for metal-organic frameworks

    Science.gov (United States)

    Höft, Nicolas; Horbach, Jürgen; Martín-Mayor, Victor; Seoane, Beatriz

    2017-08-01

    We present a three-dimensional Ising model where lines of equal spins are frozen such that they form an ordered framework structure. The frame spins impose an external field on the rest of the spins (active spins). We demonstrate that this "porous Ising model" can be seen as a minimal model for condensation transitions of gas molecules in metal-organic frameworks. Using Monte Carlo simulation techniques, we compare the phase behavior of a porous Ising model with that of a particle-based model for the condensation of methane (CH4) in the isoreticular metal-organic framework IRMOF-16. For both models, we find a line of first-order phase transitions that end in a critical point. We show that the critical behavior in both cases belongs to the 3D Ising universality class, in contrast to other phase transitions in confinement such as capillary condensation.
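
    The following Python fragment is a minimal Metropolis Monte Carlo sketch of the "porous Ising" idea, with frozen lines of +1 frame spins and flips restricted to the active spins; the lattice size, coupling and temperature are illustrative and are not the values used in the paper.

    # Minimal Metropolis sketch of a "porous" Ising model: spins on a 3D lattice,
    # with lines of frozen +1 "frame" spins that act like a fixed framework.
    import numpy as np

    rng = np.random.default_rng(1)
    L, J, T = 16, 1.0, 3.0
    spins = rng.choice([-1, 1], size=(L, L, L))

    # Freeze +1 spins along straight lines in z to mimic an ordered framework.
    frozen = np.zeros((L, L, L), dtype=bool)
    frozen[::4, ::4, :] = True
    spins[frozen] = 1

    def neighbour_sum(s, i, j, k):
        return (s[(i+1) % L, j, k] + s[(i-1) % L, j, k] +
                s[i, (j+1) % L, k] + s[i, (j-1) % L, k] +
                s[i, j, (k+1) % L] + s[i, j, (k-1) % L])

    def sweep(s):
        for _ in range(L ** 3):
            i, j, k = rng.integers(0, L, size=3)
            if frozen[i, j, k]:
                continue                      # frame spins never flip
            dE = 2.0 * J * s[i, j, k] * neighbour_sum(s, i, j, k)
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i, j, k] *= -1

    for step in range(200):
        sweep(spins)
    print("magnetisation of active spins:", spins[~frozen].mean())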

  2. Mediation Analysis in a Latent Growth Curve Modeling Framework

    Science.gov (United States)

    von Soest, Tilmann; Hagtvet, Knut A.

    2011-01-01

    This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…

  3. Theories and Frameworks for Online Education: Seeking an Integrated Model

    Science.gov (United States)

    Picciano, Anthony G.

    2017-01-01

    This article examines theoretical frameworks and models that focus on the pedagogical aspects of online education. After a review of learning theory as applied to online education, a proposal for an integrated "Multimodal Model for Online Education" is provided based on pedagogical purpose. The model attempts to integrate the work of…

  4. PUMA Version 6 Multiplatform with Facilities to be coupled with other Simulation Models

    International Nuclear Information System (INIS)

    Grant, Carlos

    2013-01-01

    PUMA is a code for nuclear reactor calculations used in all nuclear installations in Argentina for the simulation of fuel management, power cycles and transient events by means of spatial kinetic diffusion theory in 3D. The versions used up to now ran on the WINDOWS platform with very good results. Nowadays PUMA must work on different operating systems, LINUX among others, and must also provide facilities to be coupled with other models. For this reason this new version was reprogrammed in ADA, a language oriented to safe programming and available on any operating system. In former versions PUMA was executed through macro instructions written in LOGO. In this version it is also possible to use PYTHON, which in addition makes it possible to access internal PUMA data at execution time. The use of PYTHON provides an easy way to couple PUMA with other codes. The possibilities of this new version of PUMA are shown by means of examples of input data and process control using PYTHON and LOGO. The implementation of this methodology in other codes to be coupled with PUMA, for versions run on WINDOWS and LINUX, is discussed. (author)
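
    Purely as an illustration of the Python-based coupling idea, the sketch below drives a reactor code and a second model from a Python loop; the module and method names (puma_api, get_power_map, set_coolant_state, step) are hypothetical and are not PUMA's actual interface.

    # Hypothetical sketch only: the names below are NOT PUMA's real API; they merely
    # illustrate how a Python script could drive a simulation code and exchange data
    # with a coupled model at every time step.
    # import puma_api as puma            # hypothetical binding to the reactor code
    # import th_model                    # hypothetical coupled thermal-hydraulics code

    def run_coupled(puma, th_model, t_end, dt):
        t = 0.0
        while t < t_end:
            power = puma.get_power_map()           # read internal data at run time
            coolant = th_model.advance(power, dt)  # feed it to the other code
            puma.set_coolant_state(coolant)        # push feedback back
            puma.step(dt)                          # advance the neutronics solution
            t += dt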

  5. Real time natural object modeling framework

    International Nuclear Information System (INIS)

    Rana, H.A.; Shamsuddin, S.M.; Sunar, M.H.

    2008-01-01

    CG (Computer Graphics) is a key technology for producing visual content. Currently computer generated imagery techniques are being developed and applied, particularly in the field of virtual reality applications, film production, training and flight simulators, to provide total composition of realistic computer graphic images. Natural objects like clouds are an integral feature of the sky; without them, synthetic outdoor scenes seem unrealistic. Modeling and animating such objects is a difficult task. Most systems are difficult to use, as they require adjustment of numerous, complex parameters and are non-interactive. This paper presents an intuitive, interactive system to artistically model, animate, and render visually convincing clouds using modern graphics hardware. A high-level interface models clouds through the visual use of cubes. Clouds are rendered by making use of the hardware-accelerated OpenGL API. The resulting interactive design and rendering system produces perceptually convincing cloud models that can be used in any interactive system. (author)

  6. Cytoview: Development of a cell modelling framework

    Indian Academy of Sciences (India)

    PRAKASH KUMAR

    is an important aspect of cell modelling. ... Supercomputer Education and Research Centre and Bioinformatics Centre, Indian Institute ... Important aspects in each panel are listed. ... subsumption relationship, in which the child term is a more.

  7. Uniform California earthquake rupture forecast, version 3 (UCERF3): the time-independent model

    Science.gov (United States)

    Field, Edward H.; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David D.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin R.; Page, Morgan T.; Parsons, Thomas; Powers, Peter M.; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua; ,

    2013-01-01

    In this report we present the time-independent component of the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3), which provides authoritative estimates of the magnitude, location, and time-averaged frequency of potentially damaging earthquakes in California. The primary achievements have been to relax fault segmentation assumptions and to include multifault ruptures, both limitations of the previous model (UCERF2). The rates of all earthquakes are solved for simultaneously, and from a broader range of data, using a system-level "grand inversion" that is both conceptually simple and extensible. The inverse problem is large and underdetermined, so a range of models is sampled using an efficient simulated annealing algorithm. The approach is more derivative than prescriptive (for example, magnitude-frequency distributions are no longer assumed), so new analysis tools were developed for exploring solutions. Epistemic uncertainties were also accounted for using 1,440 alternative logic tree branches, necessitating access to supercomputers. The most influential uncertainties include alternative deformation models (fault slip rates), a new smoothed seismicity algorithm, alternative values for the total rate of M≥5 events, and different scaling relationships, virtually all of which are new. As a notable first, three deformation models are based on kinematically consistent inversions of geodetic and geologic data, also providing slip-rate constraints on faults previously excluded because of lack of geologic data. The grand inversion constitutes a system-level framework for testing hypotheses and balancing the influence of different experts. For example, we demonstrate serious challenges with the Gutenberg-Richter hypothesis for individual faults. UCERF3 is still an approximation of the system, however, and the range of models is limited (for example, constrained to stay close to UCERF2). Nevertheless, UCERF3 removes the apparent UCERF2 overprediction of
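
    The sketch below is not the UCERF3 code; it only illustrates how simulated annealing can sample a non-negative solution of an underdetermined linear system, loosely analogous to the "grand inversion" for rupture rates. The matrix, cooling schedule and step size are arbitrary.

    # Toy sketch (not the UCERF3 implementation): simulated annealing for an
    # underdetermined, non-negative least-squares problem A x = d.
    import numpy as np

    rng = np.random.default_rng(2)
    n_constraints, n_ruptures = 40, 120          # more unknowns than equations
    A = rng.random((n_constraints, n_ruptures))
    x_true = np.abs(rng.normal(size=n_ruptures))
    d = A @ x_true

    def energy(x):
        r = A @ x - d
        return float(r @ r)

    x = np.abs(rng.normal(size=n_ruptures))
    e = energy(x)
    T0, n_steps = 1.0, 20000
    for step in range(n_steps):
        T = T0 * (1.0 - step / n_steps) + 1e-4    # simple linear cooling schedule
        x_new = x.copy()
        idx = rng.integers(n_ruptures)
        x_new[idx] = max(0.0, x_new[idx] + rng.normal(scale=0.1))  # keep rates >= 0
        e_new = energy(x_new)
        if e_new < e or rng.random() < np.exp((e - e_new) / T):
            x, e = x_new, e_new
    print("final misfit:", e)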

  8. Fisher information framework for time series modeling

    Science.gov (United States)

    Venkatesan, R. C.; Plastino, A.

    2017-08-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time-independent Schrödinger-like equation in a vector setting. The inference of (i) the probability density function of the coefficients of the working hypothesis and (ii) the constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defines the working hypothesis solely in terms of the observed data. Prediction cases employing time series obtained from (i) the Mackey-Glass delay-differential equation, (ii) one ECG signal from the MIT-Beth Israel Deaconess Hospital (MIT-BIH) cardiac arrhythmia database, and (iii) one ECG signal from the Creighton University ventricular tachyarrhythmia database are presented; the ECG samples were obtained from the PhysioNet online repository. These examples demonstrate the efficiency of the prediction model.
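
    As a minimal illustration of the delay-embedding step only (the Fisher-information ansatz of the paper is not reproduced), the following Python sketch builds Takens delay vectors from a toy series and fits a least-squares one-step predictor.

    # Minimal sketch: Takens-style delay embedding of a scalar series plus an
    # ordinary least-squares one-step predictor.
    import numpy as np

    def delay_embed(x, dim, tau):
        """Rows are delay vectors [x(t), x(t-tau), ..., x(t-(dim-1)*tau)]."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[(dim - 1 - k) * tau: (dim - 1 - k) * tau + n]
                                for k in range(dim)])

    # Toy series: a noisy sine as a stand-in for e.g. an ECG or Mackey-Glass record.
    t = np.linspace(0, 40 * np.pi, 4000)
    x = np.sin(t) + 0.05 * np.random.default_rng(3).normal(size=t.size)

    dim, tau = 4, 10
    X = delay_embed(x, dim, tau)
    targets = x[(dim - 1) * tau + 1:]            # next value after each delay vector
    coeffs, *_ = np.linalg.lstsq(X[:-1], targets, rcond=None)
    pred = X[:-1] @ coeffs
    print("one-step RMS error:", float(np.sqrt(np.mean((pred - targets) ** 2))))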

  9. Geological model of the Olkiluoto site. Version 1.0

    International Nuclear Information System (INIS)

    Mattila, J.; Aaltonen, I.; Kemppainen, K.

    2008-01-01

    The rocks of Olkiluoto can be divided into two major classes: (1) supracrustal high-grade metamorphic rocks including various migmatitic gneisses, tonalitic-granodioriticgranitic gneisses, mica gneisses, quartz gneisses and mafic gneisses, and (2) igneous rocks including pegmatitic granites and diabase dykes. The migmatitic gneisses can further be divided into three subgroups in terms of the type of migmatite structure: veined gneisses, stromatic gneisses and diatexitic gneisses. On the basis of refolding and crosscutting relationships, the metamorphic supracrustal rocks have been subjected to polyphased ductile deformation, consisting of five stages, the D2 being locally the most intensive phase, producing thrust-related folding, strong migmatisation and pervasive foliation. In 3D modelling of the lithological units, an assumption has been made, on the basis of measurements in the outcrops, investigation trenches and drill cores, that the pervasive, composite foliation produced as a result of polyphase ductile deformation has a rather constant attitude in the ONKALO area. Consequently, the strike and dip of the foliation has been used as a tool, through which the lithologies have been correlated between the drillholes and from the surface to the drillholes. The bedrock at the Olkiluoto site has been subjected to extensive hydrothermal alteration, which has taken place at reasonably low temperature conditions, the estimated temperature interval being from slightly over 300 deg C to less than 100 deg C. Two types of alteration can be observed: (1) pervasive (disseminated) alteration and (2) fracture-controlled (veinlet) alteration. Kaolinisation and sulphidisation are the most prominent alteration events in the site area. Sulphides are located in the uppermost part of the model volume following roughly the lithological trend (slightly dipping to the SE). Kaolinite is also located in the uppermost part, but the orientation is opposite to the main lithological trend

  10. A Conceptual Framework of Business Model Emerging Resilience

    OpenAIRE

    Goumagias, Nik; Fernandes, Kiran; Cabras, Ignazio; Li, Feng; Shao, Jianhao; Devlin, Sam; Hodge, Victoria Jane; Cowling, Peter Ivan; Kudenko, Daniel

    2016-01-01

    In this paper we introduce an environmentally driven conceptual framework of Business Model change. Business models acquired substantial momentum in academic literature during the past decade. Several studies focused on what exactly constitutes a Business Model (role model, recipe, architecture etc.) triggering a theoretical debate about the Business Model’s components and their corresponding dynamics and relationships. In this paper, we argue that for Business Models as cognitive structures,...

  11. A general modeling framework for describing spatially structured population dynamics

    Science.gov (United States)

    Sample, Christine; Fryxell, John; Bieri, Joanna; Federico, Paula; Earl, Julia; Wiederholt, Ruscena; Mattsson, Brady; Flockhart, Tyler; Nicol, Sam; Diffendorfer, James E.; Thogmartin, Wayne E.; Erickson, Richard A.; Norris, D. Ryan

    2017-01-01

    Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework sufficiently novel in its flexibility to capture a wide variety of spatiotemporal processes including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population is modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns. By working within a common framework, there is less chance
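
    A minimal discrete-time sketch of the idea, with illustrative node names, growth rates and movement fractions (none of them from the paper), is given below: node abundances are updated by local demography and then redistributed along weighted, directed edges.

    # Minimal sketch of a network-based population model: nodes carry abundances,
    # each discrete time step applies local growth and then moves individuals along
    # directed edges in proportion to the edge weights.
    import numpy as np

    nodes = ["breeding", "stopover", "wintering"]
    N = np.array([1000.0, 0.0, 0.0])          # initial abundance per node
    r = np.array([1.2, 1.0, 0.98])            # per-step growth/survival per node

    # move[i, j] = fraction of node i's individuals that move to node j each step
    move = np.array([[0.2, 0.8, 0.0],
                     [0.0, 0.1, 0.9],
                     [0.7, 0.0, 0.3]])

    for t in range(12):
        N = N * r                              # within-node demography
        N = move.T @ N                         # redistribution along the network
        print(f"t={t + 1:2d}  " + "  ".join(f"{name}={n:8.1f}" for name, n in zip(nodes, N)))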

  12. Response Surface Modeling Tool Suite, Version 1.x

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-05

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
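
    The sketch below mirrors the general workflow only (Latin Hypercube design, expensive model evaluation, Gaussian-process surrogate) using SciPy and scikit-learn; the stand-in drag function, parameter bounds and sample size are illustrative, and the released tool itself is not reproduced.

    # Sketch of the general workflow (not the released tool): build a Latin
    # Hypercube design over two input parameters, evaluate an expensive model
    # (a cheap analytic stand-in here), and fit a Gaussian-process surrogate.
    import numpy as np
    from scipy.stats import qmc
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def expensive_model(speed, temperature):
        # Stand-in for a Test Particle Monte Carlo drag-coefficient run.
        return 2.2 + 0.3 * np.sin(speed) + 0.001 * temperature

    sampler = qmc.LatinHypercube(d=2, seed=4)
    unit = sampler.random(n=100)                          # 100 ensemble members
    X = qmc.scale(unit, l_bounds=[1.0, 100.0], u_bounds=[10.0, 1500.0])
    y = expensive_model(X[:, 0], X[:, 1])

    surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=[1.0, 100.0]),
                                         normalize_y=True).fit(X, y)
    mean, std = surrogate.predict([[5.0, 800.0]], return_std=True)
    print(f"surrogate prediction: {mean[0]:.3f} +/- {std[0]:.3f}")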

  13. A proposed best practice model validation framework for banks

    Directory of Open Access Journals (Sweden)

    Pieter J. (Riaan) de Jongh

    2017-06-01

    Full Text Available Background: With the increasing use of complex quantitative models in applications throughout the financial world, model risk has become a major concern. The credit crisis of 2008–2009 provoked added concern about the use of models in finance. Measuring and managing model risk has subsequently come under scrutiny from regulators, supervisors, banks and other financial institutions. Regulatory guidance indicates that meticulous monitoring of all phases of model development and implementation is required to mitigate this risk. Considerable resources must be mobilised for this purpose. The exercise must embrace model development, assembly, implementation, validation and effective governance. Setting: Model validation practices are generally patchy, disparate and sometimes contradictory, and although the Basel Accord and some regulatory authorities have attempted to establish guiding principles, no definite set of global standards exists. Aim: Assessing the available literature for the best validation practices. Methods: This comprehensive literature study provided a background to the complexities of effective model management and focussed on model validation as a component of model risk management. Results: We propose a coherent ‘best practice’ framework for model validation. Scorecard tools are also presented to evaluate if the proposed best practice model validation framework has been adequately assembled and implemented. Conclusion: The proposed best practice model validation framework is designed to assist firms in the construction of an effective, robust and fully compliant model validation programme and comprises three principal elements: model validation governance, policy and process.

  14. Theoretical Tinnitus framework: A Neurofunctional Model

    Directory of Open Access Journals (Sweden)

    Iman Ghodratitoostani

    2016-08-01

    Full Text Available Subjective tinnitus is the conscious (attended) awareness perception of sound in the absence of an external source and can be classified as an auditory phantom perception. The current tinnitus development models depend on the role of external events congruently paired with the causal physical events that precipitate the phantom perception. We propose a novel Neurofunctional tinnitus model to indicate that the conscious perception of phantom sound is essential in activating the cognitive-emotional value. The cognitive-emotional value plays a crucial role in governing attention allocation as well as developing annoyance within tinnitus clinical distress. Structurally, the Neurofunctional tinnitus model includes the peripheral auditory system, the thalamus, the limbic system, brain stem, basal ganglia, striatum and the auditory along with prefrontal cortices. Functionally, we assume the model includes presence of continuous or intermittent abnormal signals at the peripheral auditory system or midbrain auditory paths. Depending on the availability of attentional resources, the signals may or may not be perceived. The cognitive valuation process strengthens the lateral-inhibition and noise canceling mechanisms in the mid-brain, which leads to the cessation of sound perception and renders the signal evaluation irrelevant. However, the sourceless sound is eventually perceived and can be cognitively interpreted as suspicious or an indication of a disease in which the cortical top-down processes weaken the noise canceling effects. This results in an increase in cognitive and emotional negative reactions such as depression and anxiety. The negative or positive cognitive-emotional feedbacks within the top-down approach may have no relation to the previous experience of the patients. They can also be associated with aversive stimuli similar to abnormal neural activity in generating the phantom sound. Cognitive and emotional reactions depend on general

  15. Theoretical Tinnitus Framework: A Neurofunctional Model.

    Science.gov (United States)

    Ghodratitoostani, Iman; Zana, Yossi; Delbem, Alexandre C B; Sani, Siamak S; Ekhtiari, Hamed; Sanchez, Tanit G

    2016-01-01

    Subjective tinnitus is the conscious (attended) awareness perception of sound in the absence of an external source and can be classified as an auditory phantom perception. Earlier literature establishes three distinct states of conscious perception as unattended, attended, and attended awareness conscious perception. The current tinnitus development models depend on the role of external events congruently paired with the causal physical events that precipitate the phantom perception. We propose a novel Neurofunctional Tinnitus Model to indicate that the conscious (attended) awareness perception of phantom sound is essential in activating the cognitive-emotional value. The cognitive-emotional value plays a crucial role in governing attention allocation as well as developing annoyance within tinnitus clinical distress. Structurally, the Neurofunctional Tinnitus Model includes the peripheral auditory system, the thalamus, the limbic system, brainstem, basal ganglia, striatum, and the auditory along with prefrontal cortices. Functionally, we assume the model includes presence of continuous or intermittent abnormal signals at the peripheral auditory system or midbrain auditory paths. Depending on the availability of attentional resources, the signals may or may not be perceived. The cognitive valuation process strengthens the lateral-inhibition and noise canceling mechanisms in the mid-brain, which leads to the cessation of sound perception and renders the signal evaluation irrelevant. However, the "sourceless" sound is eventually perceived and can be cognitively interpreted as suspicious or an indication of a disease in which the cortical top-down processes weaken the noise canceling effects. This results in an increase in cognitive and emotional negative reactions such as depression and anxiety. The negative or positive cognitive-emotional feedbacks within the top-down approach may have no relation to the previous experience of the patients. They can also be

  16. A Framework for PSS Business Models: Formalization and Application

    OpenAIRE

    Adrodegari, Federico; Saccani, Nicola; Kowalkowski, Christian

    2016-01-01

    In order to successfully move "from products to solutions", companies need to redesign their business model. Nevertheless, service oriented BMs in product-centric firms are under-investigated in the literature: very few works develop a scheme of analysis of such BMs. To provide a first step into closing this gap, we propose a new framework to describe service-oriented BMs, pointing out the main BM components and related PSS characteristics. Thus, the proposed framework aims to help companies ...

  17. Technical Note: Description and assessment of a nudged version of the new dynamics Unified Model

    Directory of Open Access Journals (Sweden)

    O. Morgenstern

    2008-03-01

    Full Text Available We present a "nudged" version of the Met Office general circulation model, the Unified Model. We constrain this global climate model using ERA-40 re-analysis data with the aim of reproducing the observed "weather" over a year from September 1999. Quantitative assessments are made of its performance, focusing on dynamical aspects of nudging and demonstrating that the "weather" is well simulated.
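
    A toy illustration of the nudging (Newtonian relaxation) idea is sketched below: an extra tendency (x_ref - x)/tau relaxes the model state toward a reference series. The "model" and "reference" used here are artificial; the actual Unified Model implementation is far more involved.

    # Toy sketch of Newtonian relaxation ("nudging"): an extra tendency
    # (x_ref - x) / tau pulls the model state toward a reference (re)analysis.
    import numpy as np

    dt, tau = 0.1, 2.0                      # time step and relaxation time scale
    steps = 500
    t = np.arange(steps) * dt
    x_ref = np.sin(0.3 * t)                 # stand-in for ERA-40-like analyses

    def model_tendency(x):
        return -0.05 * x                    # stand-in for the free model dynamics

    x = 1.5                                 # initial state far from the reference
    for n in range(steps - 1):
        dxdt = model_tendency(x) + (x_ref[n] - x) / tau   # nudging term
        x = x + dt * dxdt
    print("final model state:", round(x, 3), "reference:", round(float(x_ref[-1]), 3))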

  18. A community-based framework for aquatic ecosystem models

    DEFF Research Database (Denmark)

    Trolle, Didde; Hamilton, D. P.; Hipsey, M. R.

    2012-01-01

    Here, we communicate a point of departure in the development of aquatic ecosystem models, namely a new community-based framework, which supports an enhanced and transparent union between the collective expertise that exists in the communities of traditional ecologists and model developers. Through a literature survey, we document the growing importance of numerical aquatic ecosystem models while also noting the difficulties, up until now, of the aquatic scientific community to make significant advances in these models during the past two decades. Through a common forum for aquatic ecosystem modellers we aim to (i) advance collaboration within the aquatic ecosystem modelling community, (ii) enable increased use of models for research, policy and ecosystem-based management, (iii) facilitate a collective framework using common (standardised) code to ensure that model development is incremental, (iv) ...

  19. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    Science.gov (United States)

    Rider, W. J.

    2012-12-01

    The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high consequence applications. Perhaps the first of these frameworks was known as CSAU [Boyack] (code scaling, applicability and uncertainty), used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice, and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law that made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace the underground testing. The emphasis on modeling and simulation necessitated attention on the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar, and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are

  20. Programming Entity Framework

    CERN Document Server

    Lerman, Julia

    2009-01-01

    Programming Entity Framework is a thorough introduction to Microsoft's new core framework for modeling and interacting with data in .NET applications. This highly-acclaimed book not only gives experienced developers a hands-on tour of the Entity Framework and explains its use in a variety of applications, it also provides a deep understanding of its architecture and APIs -- knowledge that will be extremely valuable as you shift to the Entity Framework version in .NET Framework 4.0 and Visual Studio 2010. From the Entity Data Model (EDM) and Object Services to EntityClient and the Metadata Work

  1. Constructing rule-based models using the belief functions framework

    NARCIS (Netherlands)

    Almeida, R.J.; Denoeux, T.; Kaymak, U.; Greco, S.; Bouchon-Meunier, B.; Coletti, G.; Fedrizzi, M.; Matarazzo, B.; Yager, R.R.

    2012-01-01

    Abstract. We study a new approach to regression analysis. We propose a new rule-based regression model using the theoretical framework of belief functions. For this purpose we use the recently proposed Evidential c-means (ECM) to derive rule-based models solely from data. ECM allocates, for each

  2. Designing the Distributed Model Integration Framework – DMIF

    NARCIS (Netherlands)

    Belete, Getachew F.; Voinov, Alexey; Morales, Javier

    2017-01-01

    We describe and discuss the design and prototype of the Distributed Model Integration Framework (DMIF) that links models deployed on different hardware and software platforms. We used distributed computing and service-oriented development approaches to address the different aspects of

  3. A framework for quantifying net benefits of alternative prognostic models

    NARCIS (Netherlands)

    Rapsomaniki, E.; White, I.R.; Wood, A.M.; Thompson, S.G.; Feskens, E.J.M.; Kromhout, D.

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit)

  4. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  5. Industrial Sector Energy Efficiency Modeling (ISEEM) Framework Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Karali, Nihan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-12-12

    The goal of this study is to develop a new bottom-up industry-sector energy-modeling framework aimed at addressing least-cost regional and global carbon reduction strategies and at overcoming the limitations of existing models by allowing trading across regions and countries as an alternative.

  6. A community-based framework for aquatic ecosystem models

    NARCIS (Netherlands)

    Trolle, D.; Hamilton, D.P.; Hipsey, M.R.; Bolding, K.; Bruggeman, J.; Mooij, W.M.; Janse, J.H.; Nielsen, A.; Jeppesen, E.; Elliott, J.A.; Makler-Pick, V.; Petzoldt, T.; Rinke, K.; Flindt, M.R.; Arhonditsis, G.B.; Gal, G.; Bjerring, R.; Tominaga, K.; Hoen, 't J.; Downing, A.S.; Marques, D.M.; Fragoso, C.R.; Sondergaard, M.; Hanson, P.C.

    2012-01-01

    Here, we communicate a point of departure in the development of aquatic ecosystem models, namely a new community-based framework, which supports an enhanced and transparent union between the collective expertise that exists in the communities of traditional ecologists and model developers. Through a

  7. Site investigation SFR. Hydrogeological modelling of SFR. Model version 0.2

    Energy Technology Data Exchange (ETDEWEB)

    Oehman, Johan (Golder Associates AB (Sweden)); Follin, Sven (SF GeoLogic (Sweden))

    2010-01-15

    The Swedish Nuclear Fuel and Waste Management Company (SKB) has conducted site investigations for a planned extension of the existing final repository for short-lived radioactive waste (SFR). A hydrogeological model is developed in three model versions, which will be used for safety assessment and design analyses. This report presents a data analysis of the currently available hydrogeological data from the ongoing Site Investigation SFR (KFR27, KFR101, KFR102A, KFR102B, KFR103, KFR104, and KFR105). The purpose of this work is to develop a preliminary hydrogeological Discrete Fracture Network model (hydro-DFN) parameterisation that can be applied in regional-scale modelling. During this work, the Geologic model had not yet been updated for the new data set. Therefore, all analyses were made to the rock mass outside Possible Deformation Zones, according to Single Hole Interpretation. Owing to this circumstance, it was decided not to perform a complete hydro-DFN calibration at this stage. Instead focus was re-directed to preparatory test cases and conceptual questions with the aim to provide a sound strategy for developing the hydrogeological model SFR v. 1.0. The presented preliminary hydro-DFN consists of five fracture sets and three depth domains. A statistical/geometrical approach (connectivity analysis /Follin et al. 2005/) was performed to estimate the size (i.e. fracture radius) distribution of fractures that are interpreted as Open in geologic mapping of core data. Transmissivity relations were established based on an assumption of a correlation between the size and evaluated specific capacity of geologic features coupled to inflows measured by the Posiva Flow Log device (PFL-f data). The preliminary hydro-DFN was applied in flow simulations in order to test its performance and to explore the role of PFL-f data. Several insights were gained and a few model technical issues were raised. These are summarised in Table 5-1

  8. Site investigation SFR. Hydrogeological modelling of SFR. Model version 0.2

    International Nuclear Information System (INIS)

    Oehman, Johan; Follin, Sven

    2010-01-01

    The Swedish Nuclear Fuel and Waste Management Company (SKB) has conducted site investigations for a planned extension of the existing final repository for short-lived radioactive waste (SFR). A hydrogeological model is developed in three model versions, which will be used for safety assessment and design analyses. This report presents a data analysis of the currently available hydrogeological data from the ongoing Site Investigation SFR (KFR27, KFR101, KFR102A, KFR102B, KFR103, KFR104, and KFR105). The purpose of this work is to develop a preliminary hydrogeological Discrete Fracture Network model (hydro-DFN) parameterisation that can be applied in regional-scale modelling. During this work, the Geologic model had not yet been updated for the new data set. Therefore, all analyses were made to the rock mass outside Possible Deformation Zones, according to Single Hole Interpretation. Owing to this circumstance, it was decided not to perform a complete hydro-DFN calibration at this stage. Instead focus was re-directed to preparatory test cases and conceptual questions with the aim to provide a sound strategy for developing the hydrogeological model SFR v. 1.0. The presented preliminary hydro-DFN consists of five fracture sets and three depth domains. A statistical/geometrical approach (connectivity analysis /Follin et al. 2005/) was performed to estimate the size (i.e. fracture radius) distribution of fractures that are interpreted as Open in geologic mapping of core data. Transmissivity relations were established based on an assumption of a correlation between the size and evaluated specific capacity of geologic features coupled to inflows measured by the Posiva Flow Log device (PFL-f data). The preliminary hydro-DFN was applied in flow simulations in order to test its performance and to explore the role of PFL-f data. Several insights were gained and a few model technical issues were raised. These are summarised in Table 5-1

  9. User's guide to the Yucca Mountain Integrating Model (YMIM) Version 2.1

    International Nuclear Information System (INIS)

    Gansemer, J.; Lamont, A.

    1995-04-01

    The Yucca Mountain Integrating Model (YMIM) is an integrated model of the engineered barrier system. It contains models of the processes of waste container failure and nuclide release from the fuel rods. YMIM is driven by scenarios of container and rod temperature, near-field chemistry, and near-field hydrology provided by other modules. It is designed to be highly modular so that a model of an individual process can be easily modified or replaced without interfering with the models of other processes. This manual describes the process models and provides instructions for setting up and running YMIM Version 2.1

  10. A software engineering perspective on environmental modeling framework design: The object modeling system

    Science.gov (United States)

    The environmental modeling community has historically been concerned with the proliferation of models and the effort associated with collective model development tasks (e.g., code generation, data provisioning and transformation, etc.). Environmental modeling frameworks (EMFs) have been developed to...

  11. Use of ARM Data to address the Climate Change Further Development and Applications of A Multi-scale Modeling Framework

    Energy Technology Data Exchange (ETDEWEB)

    David A. Randall; Marat Khairoutdinov

    2007-12-14

    The Colorado State University (CSU) Multi-scale Modeling Framework (MMF) is a new type of general circulation model (GCM) that replaces the conventional parameterizations of convection, clouds and boundary layer with a cloud-resolving model (CRM) embedded into each grid column. The MMF that we have been working with is a “super-parameterized” version of the Community Atmosphere Model (CAM). As reported in the publications listed below, we have done extensive work with the model. We have explored the MMF’s performance in several studies, including an AMIP run and a CAPT test, and we have applied the MMF to an analysis of climate sensitivity.

  12. The Lagrangian particle dispersion model FLEXPART-WRF VERSION 3.1

    Energy Technology Data Exchange (ETDEWEB)

    Brioude, J.; Arnold, D.; Stohl, A.; Cassiani, M.; Morton, Don; Seibert, P.; Angevine, W. M.; Evan, S.; Dingwell, A.; Fast, Jerome D.; Easter, Richard C.; Pisso, I.; Bukhart, J.; Wotawa, G.

    2013-11-01

    The Lagrangian particle dispersion model FLEXPART was originally designed for calculating long-range and mesoscale dispersion of air pollutants from point sources, such as after an accident in a nuclear power plant. In the meantime FLEXPART has evolved into a comprehensive tool for atmospheric transport modeling and analysis at different scales. This multiscale need from the modeler community has encouraged new developments in FLEXPART. In this document, we present a version that works with the Weather Research and Forecasting (WRF) mesoscale meteorological model. Simple procedures on how to run FLEXPART-WRF are presented along with special options and features that differ from its predecessor versions. In addition, test case data, the source code and visualization tools are provided to the reader as supplementary material.

  13. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns constitutive parameters to all components of the finite element model.

  14. A Bayesian framework for parameter estimation in dynamical models.

    Directory of Open Access Journals (Sweden)

    Flávio Codeço Coelho

    Full Text Available Mathematical models in biology are powerful tools for the study and exploration of complex dynamics. Nevertheless, bringing theoretical results to an agreement with experimental observations involves acknowledging a great deal of uncertainty intrinsic to our theoretical representation of a real system. Proper handling of such uncertainties is key to the successful usage of models to predict experimental or field observations. This problem has been addressed over the years by many tools for model calibration and parameter estimation. In this article we present a general framework for uncertainty analysis and parameter estimation that is designed to handle uncertainties associated with the modeling of dynamic biological systems while remaining agnostic as to the type of model used. We apply the framework to fit an SIR-like influenza transmission model to 7 years of incidence data in three European countries: Belgium, the Netherlands and Portugal.
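
    As a hedged, simplified illustration of the approach (not the authors' code), the sketch below fits the transmission and recovery rates of a basic SIR model to synthetic noisy data with a random-walk Metropolis sampler; the priors, noise level and data are invented for the example.

    # Hedged sketch: random-walk Metropolis sampling of (beta, gamma) for a simple
    # SIR model against synthetic prevalence data.
    import numpy as np
    from scipy.integrate import odeint

    def sir(y, t, beta, gamma):
        S, I, R = y
        return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

    t = np.arange(0, 60, 1.0)
    y0 = [0.99, 0.01, 0.0]
    true_beta, true_gamma = 0.45, 0.15
    rng = np.random.default_rng(5)
    data = odeint(sir, y0, t, args=(true_beta, true_gamma))[:, 1]
    data = data + rng.normal(scale=0.01, size=data.size)      # noisy observations

    def log_post(theta):
        beta, gamma = theta
        if beta <= 0 or gamma <= 0:                            # flat positive prior
            return -np.inf
        model_I = odeint(sir, y0, t, args=(beta, gamma))[:, 1]
        return -0.5 * np.sum((model_I - data) ** 2) / 0.01 ** 2

    theta = np.array([0.3, 0.1])
    lp = log_post(theta)
    samples = []
    for _ in range(5000):
        prop = theta + rng.normal(scale=0.01, size=2)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    samples = np.array(samples[1000:])                         # discard burn-in
    print("posterior mean beta, gamma:", samples.mean(axis=0))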

  15. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation.

    Science.gov (United States)

    Mangado, Nerea; Ceresa, Mario; Duchateau, Nicolas; Kjer, Hans Martin; Vera, Sergio; Dejea Velardo, Hector; Mistrik, Pavel; Paulsen, Rasmus R; Fagertun, Jens; Noailly, Jérôme; Piella, Gemma; González Ballester, Miguel Ángel

    2016-08-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns constitutive parameters to all components of the finite element model. This model can then be used to study in silico the effects of the electrical stimulation of the cochlear implant. Results are shown on a total of 25 models of patients. In all cases, a final mesh suitable for finite element simulations was obtained, in an average time of 94 s. The framework has proven to be fast and robust, and is promising for a detailed prognosis of the cochlear implantation surgery.

  16. Modeling of ultrasonic processes utilizing a generic software framework

    Science.gov (United States)

    Bruns, P.; Twiefel, J.; Wallaschek, J.

    2017-06-01

    Modeling of ultrasonic processes is typically characterized by a high degree of complexity. Different domains and size scales must be regarded, so that it is rather difficult to build up a single detailed overall model. Developing partial models is a common approach to overcome this difficulty. In this paper a generic but simple software framework is presented which allows arbitrary partial models to be coupled through slave modules with well-defined interfaces and a master module for coordination. Two examples are given to present the developed framework. The first one is the parameterization of a load model for ultrasonically-induced cavitation. The piezoelectric oscillator, its mounting, and the process load are described individually by partial models. These partial models are then coupled using the framework. The load model is composed of spring-damper elements which are parameterized by experimental results. In the second example, the ideal mounting position for an oscillator utilized in ultrasonic-assisted machining of stone is determined. Partial models for the ultrasonic oscillator, its mounting, the simplified contact process, and the workpiece’s material characteristics are presented. For both applications input and output variables are defined to meet the requirements of the framework’s interface.
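
    The class and method names in the sketch below (Slave, Master, step) are illustrative, not those of the framework in the paper; the fragment only demonstrates the master/slave pattern in which each partial model sits behind a common interface and a master exchanges coupling variables every time step.

    # Generic sketch of the master/slave coupling idea with toy partial models.
    class Slave:
        def step(self, dt, inputs):
            """Advance the partial model by dt and return its outputs."""
            raise NotImplementedError

    class Oscillator(Slave):
        def __init__(self):
            self.velocity = 0.0
        def step(self, dt, inputs):
            load_force = inputs.get("load_force", 0.0)
            self.velocity += dt * (1.0 - 0.1 * load_force)     # toy dynamics
            return {"tip_velocity": self.velocity}

    class ProcessLoad(Slave):
        def step(self, dt, inputs):
            # toy spring-damper-like load depending on the oscillator velocity
            return {"load_force": 0.5 * inputs.get("tip_velocity", 0.0)}

    class Master:
        def __init__(self, slaves):
            self.slaves = slaves
        def run(self, dt, n_steps):
            shared = {}
            for _ in range(n_steps):
                for slave in self.slaves:                      # fixed call order
                    shared.update(slave.step(dt, shared))
            return shared

    print(Master([Oscillator(), ProcessLoad()]).run(dt=0.01, n_steps=100))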

  17. Prediction of hourly solar radiation with multi-model framework

    International Nuclear Information System (INIS)

    Wu, Ji; Chan, Chee Keong

    2013-01-01

    Highlights: • A novel approach to predict solar radiation through the use of clustering paradigms. • Development of prediction models based on the intrinsic pattern observed in each cluster. • Prediction based on proper clustering and selection of the model for the current time provides better results than other methods. • Experiments were conducted on actual solar radiation data obtained from a weather station in Singapore. - Abstract: In this paper, a novel multi-model prediction framework for prediction of solar radiation is proposed. The framework starts from the assumption that there are several patterns embedded in the solar radiation series. To extract the underlying patterns, the solar radiation series is first segmented into smaller subsequences, and the subsequences are further grouped into different clusters. For each cluster, an appropriate prediction model is trained. A procedure for pattern identification is then developed to identify the proper pattern that fits the current period. Based on this pattern, the corresponding prediction model is applied to obtain the prediction value. The prediction result of the proposed framework is then compared to other techniques. It is shown that the proposed framework provides superior performance as compared to others.
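
    A compact sketch of the cluster-then-predict idea is given below, using scikit-learn KMeans and per-cluster linear regressors on a synthetic series; the window length, cluster count and the series itself are arbitrary choices rather than the paper's settings.

    # Illustrative sketch: segment a series into windows, cluster the windows,
    # train one regressor per cluster, and route the current window to its
    # cluster's model at prediction time.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(6)
    t = np.arange(3000)
    series = np.clip(np.sin(2 * np.pi * t / 24) + 0.1 * rng.normal(size=t.size), 0, None)

    w = 12                                               # window length
    X = np.array([series[i:i + w] for i in range(len(series) - w)])
    y = series[w:]                                       # next value after each window

    km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
    models = {c: LinearRegression().fit(X[km.labels_ == c], y[km.labels_ == c])
              for c in range(4)}

    current = series[-w:].reshape(1, -1)                 # most recent window
    cluster = int(km.predict(current)[0])
    print("next-step forecast:", float(models[cluster].predict(current)[0]))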

  18. Generic modelling framework for economic analysis of battery systems

    DEFF Research Database (Denmark)

    You, Shi; Rasmussen, Claus Nygaard

    2011-01-01

    Deregulated electricity markets provide opportunities for Battery Systems (BS) to participate in energy arbitrage and ancillary services (regulation, operating reserves, contingency reserves, voltage regulation, power quality etc.). To evaluate the economic viability of BS with different business opportunities, a generic modelling framework is proposed to handle this task. This framework outlines a set of building blocks which are necessary for carrying out the economic analysis of various BS applications. Further, special focus is given to describing how to use the rainflow cycle counting algorithm for battery cycle life estimation, since the cycle life plays a central role in the economic analysis of BS. To illustrate the modelling framework, a case study using a Sodium Sulfur Battery (NAS) system with 5-minute regulating service is performed. The economic performances of two dispatch scenarios, a so...
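
    The fragment below sketches only the economic side: given a list of (depth-of-discharge, count) pairs assumed to come from a rainflow counter, a Wöhler-type cycle-life curve converts them into a fraction of battery life consumed and a degradation cost. The curve parameters and cost figures are illustrative, not NAS battery data.

    # Hedged sketch of how counted cycles can feed an economic estimate. Each
    # (depth_of_discharge, count) pair consumes count / N_fail(DoD) of the life.
    def cycles_to_failure(dod, n_100=4500.0, k=1.1):
        """Illustrative cycle-life curve: N(DoD) = N_100 * DoD**(-k)."""
        return n_100 * dod ** (-k)

    def degradation_cost(cycle_list, battery_cost):
        """cycle_list: iterable of (depth_of_discharge in 0..1, count)."""
        life_used = sum(count / cycles_to_failure(dod) for dod, count in cycle_list)
        return life_used * battery_cost, life_used

    # e.g. one day of regulation duty reduced to cycles by a rainflow counter
    daily_cycles = [(0.05, 120), (0.20, 10), (0.60, 1)]
    cost, used = degradation_cost(daily_cycles, battery_cost=250_000.0)
    print(f"fraction of life used today: {used:.4%}, degradation cost: ${cost:,.0f}")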

  19. Modeling framework for crew decisions during accident sequences

    International Nuclear Information System (INIS)

    Lukic, Y.D.; Worledge, D.H.; Hannaman, G.W.; Spurgin, A.J.

    1986-01-01

    The ability to model the average behavior of operating crews in the course of accident sequences is vital to learning how to prevent damage to power plants and to maintain safety. This paper summarizes the work carried out in support of a Human Reliability Model framework. This work develops the mathematical framework of the model and identifies the parameters which could be measured in some way, e.g., through simulator experience and/or small-scale tests. Selected illustrative examples of the numerical experiments carried out to understand the model's sensitivity to parameter variation are presented. These examples are discussed with the objective of deriving insights of a general nature regarding operation of the model, which may lead to an enhanced understanding of man/machine interactions.

  20. New framework for standardized notation in wastewater treatment modelling

    DEFF Research Database (Denmark)

    Corominas, L.; Rieger, L.; Takacs, I.

    2010-01-01

    Many unit process models are available in the field of wastewater treatment. All of these models use their own notation, causing problems for documentation, implementation and connection of different models (using different sets of state variables). The main goal of this paper is to propose a new...... is a framework that can be used in whole plant modelling, which consists of different fields such as activated sludge, anaerobic digestion, sidestream treatment, membrane bioreactors, metabolic approaches, fate of micropollutants and biofilm processes. The main objective of this consensus building paper...... notational framework which allows unique and systematic naming of state variables and parameters of biokinetic models in the wastewater treatment field. The symbols are based on one main letter that gives a general description of the state variable or parameter and several subscript levels that provide...

  1. A Liver-Centric Multiscale Modeling Framework for Xenobiotics.

    Directory of Open Access Journals (Sweden)

    James P Sluka

    Full Text Available We describe a multi-scale, liver-centric in silico modeling framework for acetaminophen pharmacology and metabolism. We focus on a computational model to characterize whole body uptake and clearance, liver transport and phase I and phase II metabolism. We do this by incorporating sub-models that span three scales: Physiologically Based Pharmacokinetic (PBPK) modeling of acetaminophen uptake and distribution at the whole body level, cell and blood flow modeling at the tissue/organ level and metabolism at the sub-cellular level. We have used standard modeling modalities at each of the three scales. In particular, we have used the Systems Biology Markup Language (SBML) to create both the whole-body and sub-cellular scales. Our modeling approach allows us to run the individual sub-models separately and allows us to easily exchange models at a particular scale without the need to extensively rework the sub-models at other scales. In addition, the use of SBML greatly facilitates the inclusion of biological annotations directly in the model code. The model was calibrated using human in vivo data for acetaminophen and its sulfate and glucuronate metabolites. We then carried out extensive parameter sensitivity studies including the pairwise interaction of parameters. We also simulated population variation of exposure and sensitivity to acetaminophen. Our modeling framework can be extended to the prediction of liver toxicity following acetaminophen overdose, or used as a general purpose pharmacokinetic model for xenobiotics.
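
    As a much-simplified illustration of the PBPK layer only (two compartments, first-order absorption and clearance), the sketch below integrates a gut-to-central model with SciPy; the rate constants and dose are illustrative and are not the calibrated acetaminophen values from the paper.

    # Minimal PBPK-style sketch: a gut compartment feeding a central (blood)
    # compartment with first-order liver clearance.
    import numpy as np
    from scipy.integrate import odeint

    k_abs, k_clear = 1.2, 0.3        # 1/h absorption and hepatic clearance rates
    dose_mg = 1000.0                 # oral acetaminophen-like dose

    def pbpk(y, t):
        gut, central = y
        return [-k_abs * gut,                       # absorption out of the gut
                k_abs * gut - k_clear * central]    # uptake minus liver clearance

    t = np.linspace(0, 24, 241)                     # hours
    gut, central = odeint(pbpk, [dose_mg, 0.0], t).T
    peak = t[np.argmax(central)]
    print(f"peak central-compartment amount {central.max():.1f} mg at {peak:.1f} h")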

  2. Description and evaluation of the Model for Ozone and Related chemical Tracers, version 4 (MOZART-4)

    Directory of Open Access Journals (Sweden)

    L. K. Emmons

    2010-01-01

    Full Text Available The Model for Ozone and Related chemical Tracers, version 4 (MOZART-4) is an offline global chemical transport model particularly suited for studies of the troposphere. The updates of the model from its previous version, MOZART-2, are described, including an expansion of the chemical mechanism to include more detailed hydrocarbon chemistry and bulk aerosols. Online calculations of a number of processes, such as dry deposition, emissions of isoprene and monoterpenes and photolysis frequencies, are now included. Results from an eight-year simulation (2000–2007) are presented and evaluated. The MOZART-4 source code and standard input files are available for download from the NCAR Community Data Portal (http://cdp.ucar.edu).

  3. A one-dimensional material transfer model for HECTR version 1.5

    International Nuclear Information System (INIS)

    Geller, A.S.; Wong, C.C.

    1991-08-01

    HECTR (Hydrogen Event Containment Transient Response) is a lumped-parameter computer code developed for calculating the pressure-temperature response to combustion in a nuclear power plant containment building. The code uses a control-volume approach and subscale models to simulate the mass, momentum, and energy transfer occurring in the containment during a loss-of-coolant accident (LOCA). This document describes one-dimensional subscale models for mass and momentum transfer, and the modifications to the code required to implement them. Two problems were analyzed: the first corresponding to a standard problem studied with previous HECTR versions, the second to experiments. The performance of the revised code relative to previous HECTR versions is discussed, as is the ability of the code to model the experiments. 8 refs., 5 figs., 3 tabs

  4. Integrating knowledge seeking into knowledge management models and frameworks

    Directory of Open Access Journals (Sweden)

    Francois Lottering

    2012-09-01

    Objectives: This article investigates the theoretical status of the knowledge-seeking process in extant KM models and frameworks. It also statistically describes knowledge seeking and knowledge sharing practices in a sample of South African companies. Using this data, it proposes a KM model based on knowledge seeking. Method: Knowledge seeking is traced in a number of KM models and frameworks with a specific focus on Han Lai and Margaret Graham’s adapted KM cycle model, which separates knowledge seeking from knowledge sharing. This empirical investigation used a questionnaire to examine knowledge seeking and knowledge sharing practices in a sample of South African companies. Results: This article critiqued and elaborated on the adapted KM cycle model of Lai and Graham. It identified some of the key features of knowledge seeking practices in the workplace. It showed that knowledge seeking and sharing are human-centric actions and that seeking knowledge uses trust and loyalty as its basis. It also showed that one cannot separate knowledge seeking from knowledge sharing. Conclusion: The knowledge seeking-based KM model elaborates on Lai and Graham’s model. It provides insight into how and where people seek and share knowledge in the workplace. The article concludes that it is necessary to cement the place of knowledge seeking in KM models as well as frameworks and suggests that organisations should apply its findings to improving their knowledge management strategies.

  5. The Hamburg Oceanic Carbon Cycle Circulation Model. Version 1. Version 'HAMOCC2s' for long time integrations

    Energy Technology Data Exchange (ETDEWEB)

    Heinze, C.; Maier-Reimer, E. [Max-Planck-Institut fuer Meteorologie, Hamburg (Germany)]

    1999-11-01

    The Hamburg Ocean Carbon Cycle Circulation Model (HAMOCC, configuration HAMOCC2s) predicts the atmospheric carbon dioxide partial pressure (as induced by oceanic processes), production rates of biogenic particulate matter, and geochemical tracer distributions in the water column as well as the bioturbated sediment. Besides the carbon cycle, this model version also includes the marine silicon cycle (silicic acid in the water column and the sediment pore waters, biological opal production, opal flux through the water column and opal sediment pore water interaction). The model is based on the grid and geometry of the LSG ocean general circulation model (see the corresponding manual, LSG=Large Scale Geostrophic) and uses a velocity field provided by the LSG-model in 'frozen' state. In contrast to the earlier version of the model (see Report No. 5), the present version includes a multi-layer sediment model of the bioturbated sediment zone, allowing for variable tracer inventories within the complete model system. (orig.)

  6. Composable Framework Support for Software-FMEA Through Model Execution

    Science.gov (United States)

    Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco

    2016-08-01

    Performing Failure Modes and Effect Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early design phase model execution, classic SW-FMEA approaches carry significant risks and are human effort-intensive even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.

  7. A Model-Driven Framework to Develop Personalized Health Monitoring

    Directory of Open Access Journals (Sweden)

    Algimantas Venčkauskas

    2016-07-01

    Full Text Available Both distributed healthcare systems and the Internet of Things (IoT) are currently hot topics. The latter is a new computing paradigm to enable advanced capabilities in engineering various applications, including those for healthcare. For such systems, the core social requirement is the privacy/security of the patient information along with the technical requirements (e.g., energy consumption) and capabilities for adaptability and personalization. Typically, the functionality of the systems is predefined by the patient’s data collected using sensor networks along with medical instrumentation; then, the data is transferred through the Internet for treatment and decision-making. Therefore, systems creation is indeed challenging. In this paper, we propose a model-driven framework to develop the IoT-based prototype and its reference architecture for personalized health monitoring (PHM) applications. The framework contains a multi-layered structure with feature-based modeling and feature model transformations at the top and the application software generation at the bottom. We have validated the framework using available tools and developed an experimental PHM to test some aspects of the functionality of the reference architecture in real time. The main contribution of the paper is the development of the model-driven computational framework with emphasis on the synergistic effect of security and energy issues.

  8. Model-based safety architecture framework for complex systems

    NARCIS (Netherlands)

    Schuitemaker, Katja; Rajabali Nejad, Mohammadreza; Braakhuis, J.G.; Podofillini, Luca; Sudret, Bruno; Stojadinovic, Bozidar; Zio, Enrico; Kröger, Wolfgang

    2015-01-01

    The shift to transparency and rising need of the general public for safety, together with the increasing complexity and interdisciplinarity of modern safety-critical Systems of Systems (SoS) have resulted in a Model-Based Safety Architecture Framework (MBSAF) for capturing and sharing architectural

  9. The Guided System Development Framework: Modeling and Verifying Communication Systems

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.; Nielson, Flemming

    2014-01-01

    the verified specification. The refinement process thus carries security properties from the model to the implementation. Our approach also supports verification of systems previously developed and deployed. Internally, the reasoning in our framework is based on the Beliefs and Knowledge tool, a verification...... tool based on belief logics and explicit attacker knowledge....

  10. A Graph Based Framework to Model Virus Integration Sites

    Directory of Open Access Journals (Sweden)

    Raffaele Fronza

    2016-01-01

    Here, we addressed the challenge to: 1) define the notion of CIS on graph models, 2) demonstrate that the structure of CIS falls into the category of scale-free networks and 3) show that our network approach analyzes CIS dynamically in an integrated systems biology framework using the Retroviral Transposon Tagged Cancer Gene Database (RTCGD) as a testing dataset.
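
    The record defines common insertion sites (CIS) on graph models and reports scale-free structure. The sketch below is only one simple way to set up such a graph with networkx (nodes are integration sites, edges join sites that fall within a genomic window) and to inspect its degree distribution; the coordinates and the 50 kb window are invented, not RTCGD data.

        import random
        import networkx as nx

        random.seed(1)
        # Toy integration-site positions on a single chromosome.
        sites = sorted(random.randint(0, 5_000_000) for _ in range(300))
        window = 50_000

        G = nx.Graph()
        G.add_nodes_from(range(len(sites)))
        for i, pos_i in enumerate(sites):
            for j in range(i + 1, len(sites)):
                if sites[j] - pos_i > window:
                    break
                G.add_edge(i, j)            # sites closer than the window are linked

        # Clusters of linked sites play the role of candidate CIS here.
        cis = [c for c in nx.connected_components(G) if len(c) >= 3]
        print(f"{len(cis)} candidate CIS clusters")
        print("degree histogram:", nx.degree_histogram(G)[:10])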

  11. Service business model framework and the service innovation scope

    NARCIS (Netherlands)

    van der Aa, W.; van der Rhee, B.; Victorino, L.

    2011-01-01

    In this paper we present a framework for service business models. We build on three streams of research. The first stream is the service management and marketing literature that focuses on the specific challenges of managing a service business. The second stream consists of research on e-business

  12. Model-Driven Policy Framework for Data Centers

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius; Kentis, Angelos Mimidis; Soler, José

    2016-01-01

    . Moreover, the lack of simple solutions for managing the configuration and behavior of the DC components makes the DC hard to configure and slow in adapting to changes in business needs. In this paper, we propose a model-driven framework for policy-based management for DCs, to simplify not only the service...

  13. A compositional modelling framework for exploring MPSoC systems

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer; Madsen, Jan

    2009-01-01

    This paper presents a novel compositional framework for system level performance estimation and exploration of Multi-Processor System On Chip (MPSoC) based systems. The main contributions are the definition of a compositional model which allows quantitative performance estimation to be carried out...

  14. A Liver-centric Multiscale Modeling Framework for Xenobiotics

    Science.gov (United States)

    We describe a multi-scale framework for modeling acetaminophen-induced liver toxicity. Acetaminophen is a widely used analgesic. Overdose of acetaminophen can result in liver injury via its biotransformation into a toxic product, which further induces massive necrosis. Our study foc...

  15. Data-Model and Inter-Model Comparisons of the GEM Outflow Events Using the Space Weather Modeling Framework

    Science.gov (United States)

    Welling, D. T.; Eccles, J. V.; Barakat, A. R.; Kistler, L. M.; Haaland, S.; Schunk, R. W.; Chappell, C. R.

    2015-12-01

    Two storm periods were selected by the Geospace Environment Modeling Ionospheric Outflow focus group for community collaborative study because of their high magnetospheric activity and extensive data coverage: the September 27 - October 4, 2002 corotating interaction region event and the October 22 - 29 coronal mass ejection event. During both events, the FAST, Polar, Cluster, and other missions made key observations, creating prime periods for data-model comparison. The GEM community has come together to simulate this period using many different methods in order to evaluate models, compare results, and expand our knowledge of ionospheric outflow and its effects on global dynamics. This paper presents Space Weather Modeling Framework (SWMF) simulations of these important periods compared against observations from the Polar TIDE, Cluster CODIF and EFW instruments. Emphasis will be given to the second event. Density and velocity of oxygen and hydrogen throughout the lobes, plasma sheet, and inner magnetosphere will be the focus of these comparisons. For these simulations, the SWMF couples the multifluid version of BATS-R-US MHD to a variety of ionospheric outflow models of varying complexity. The simplest is outflow arising from constant MHD inner boundary conditions. Two first-principles-based models are also leveraged: the Polar Wind Outflow Model (PWOM), a fluid treatment of outflow dynamics, and the Generalized Polar Wind (GPW) model, which combines fluid and particle-in-cell approaches. Each model is capable of capturing a different set of energization mechanisms, yielding different outflow results. The data-model comparisons will illustrate how well each approach captures reality and which energization mechanisms are most important. Inter-model comparisons will illustrate how the different outflow specifications affect the magnetosphere. Specifically, it is found that the GPW provides increased heavy ion outflow over a broader spatial range than the alternative

  16. A Framework for Understanding Physics Students' Computational Modeling Practices

    Science.gov (United States)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by

  17. Integrating advanced 3D Mapping into Improved Hydrogeologic Frameworks, a Future path for Groundwater Modeling? Results from Western Nebraska

    Science.gov (United States)

    Cannia, J. C.; Abraham, J. D.; Peterson, S. M.; Sibray, S. S.

    2012-12-01

    The U.S. Geological Survey and its partners have collaborated to provide an innovative, advanced 3-dimensional hydrogeologic framework which was used in a groundwater model designed to test water management scenarios. Principal aquifers for the area mostly consist of Quaternary alluvium and Tertiary-age fluvial sediments which are heavily used for irrigation, municipal and environmental uses. This strategy used airborne electromagnetic (AEM) surveys, validated through sensitivity analysis of geophysical and geological ground truth to provide new geologic interpretation to characterize the hydrogeologic framework in the area. The base of aquifer created through this work leads to new interpretations of saturated thickness and groundwater connectivity to the surface water system. The current version of the groundwater model which uses the advanced hydrogeologic framework shows a distinct change in flow path orientation, timing and amount of base flow to the streams of the area. Ongoing efforts for development of the hydrogeologic framework include subdivision of the aquifers into new hydrostratigraphic units based on analysis of geophysical and lithologic characteristics which will be incorporated into future groundwater models. The hydrostratigraphic units are further enhanced by Nuclear Magnetic Resonance (NMR) measurements to characterize aquifers. NMR measures the free water in the aquifer in situ allowing for a determination of hydraulic conductivity. NMR hydraulic conductivity values will be mapped to the hydrostratigraphic units, which in turn are incorporated into the latest versions of the groundwater model. The addition of innovative, advanced 3-dimensional hydrogeologic frameworks, which incorporate AEM and NMR, for groundwater modeling, has a definite advantage over traditional frameworks. These groundwater models represent the natural system at a level of reality not achievable by other methods, which lead to greater confidence in the

  18. Digital elevation models for site investigation programme in Oskarshamn. Site description version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Brydsten, Lars; Stroemgren, Maarten [Umeaa Univ. (Sweden). Dept. of Biology and Environmental Science]

    2005-06-01

    In the Oskarshamn area, a digital elevation model has been produced using elevation data from many elevation sources on both land and sea. Many elevation model users are only interested in elevation models over land, so the model has been designed in three versions: Version 1 describes land surface, lake water surface, and sea bottom. Version 2 describes land surface, sediment levels at lake bottoms, and sea bottoms. Version 3 describes land surface, sediment levels at lake bottoms, and sea surface. In cases where the different sources of data were not in point form (such as existing elevation models of land or depth lines from nautical charts), they have been converted to point values using GIS software. Because data from some sources often overlaps with data from other sources, several tests were conducted to determine if both sources of data or only one source would be included in the dataset used for the interpolation procedure. The tests resulted in the decision to use only the source judged to be of highest quality for most areas with overlapping data sources. All data were combined into a database of approximately 3.3 million points unevenly spread over an area of about 800 km². The large number of data points made it difficult to construct the model with a single interpolation procedure, so the area was divided into 28 sub-models that were processed one by one and finally merged together into one single model. The software ArcGis 8.3 and its extension Geostatistical Analysis were used for the interpolation. The Ordinary Kriging method was used for interpolation. This method allows both a cross validation and a validation before the interpolation is conducted. Cross validation with different Kriging parameters was performed and the model with the most reasonable statistics was chosen. Finally, a validation with the most appropriate Kriging parameters was performed in order to verify that the model fit unmeasured localities. Since both the

  19. Digital elevation models for site investigation programme in Oskarshamn. Site description version 1.2

    International Nuclear Information System (INIS)

    Brydsten, Lars; Stroemgren, Maarten

    2005-06-01

    In the Oskarshamn area, a digital elevation model has been produced using elevation data from many elevation sources on both land and sea. Many elevation model users are only interested in elevation models over land, so the model has been designed in three versions: Version 1 describes land surface, lake water surface, and sea bottom. Version 2 describes land surface, sediment levels at lake bottoms, and sea bottoms. Version 3 describes land surface, sediment levels at lake bottoms, and sea surface. In cases where the different sources of data were not in point form (such as existing elevation models of land or depth lines from nautical charts), they have been converted to point values using GIS software. Because data from some sources often overlaps with data from other sources, several tests were conducted to determine if both sources of data or only one source would be included in the dataset used for the interpolation procedure. The tests resulted in the decision to use only the source judged to be of highest quality for most areas with overlapping data sources. All data were combined into a database of approximately 3.3 million points unevenly spread over an area of about 800 km². The large number of data points made it difficult to construct the model with a single interpolation procedure, so the area was divided into 28 sub-models that were processed one by one and finally merged together into one single model. The software ArcGis 8.3 and its extension Geostatistical Analysis were used for the interpolation. The Ordinary Kriging method was used for interpolation. This method allows both a cross validation and a validation before the interpolation is conducted. Cross validation with different Kriging parameters was performed and the model with the most reasonable statistics was chosen. Finally, a validation with the most appropriate Kriging parameters was performed in order to verify that the model fit unmeasured localities. Since both the quality and the
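
    Both records above interpolate roughly 3.3 million elevation points with Ordinary Kriging in ArcGIS, splitting the area into 28 sub-models. The sketch below shows the same interpolation idea on a tiny synthetic data set using the pykrige package (a substitution for the ArcGIS Geostatistical extension used in the reports); all coordinates and elevations are made up.

        import numpy as np
        from pykrige.ok import OrdinaryKriging

        rng = np.random.default_rng(0)
        x = rng.uniform(0, 1000, 200)            # toy easting (m)
        y = rng.uniform(0, 1000, 200)            # toy northing (m)
        z = 10 + 0.01 * x - 0.005 * y + rng.normal(0, 0.5, 200)   # toy elevations (m)

        # Ordinary Kriging with a spherical variogram, evaluated on a regular grid.
        ok = OrdinaryKriging(x, y, z, variogram_model="spherical")
        gridx = np.linspace(0, 1000, 50)
        gridy = np.linspace(0, 1000, 50)
        z_hat, kriging_variance = ok.execute("grid", gridx, gridy)

        print("grid mean elevation:", float(z_hat.mean()))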

  20. Theoretical Models and Operational Frameworks in Public Health Ethics

    Science.gov (United States)

    Petrini, Carlo

    2010-01-01

    The article is divided into three sections: (i) an overview of the main ethical models in public health (theoretical foundations); (ii) a summary of several published frameworks for public health ethics (practical frameworks); and (iii) a few general remarks. Rather than maintaining the superiority of one position over the others, the main aim of the article is to summarize the basic approaches proposed thus far concerning the development of public health ethics by describing and comparing the various ideas in the literature. With this in mind, an extensive list of references is provided. PMID:20195441

  1. Theoretical Models and Operational Frameworks in Public Health Ethics

    Directory of Open Access Journals (Sweden)

    Carlo Petrini

    2010-01-01

    Full Text Available The article is divided into three sections: (i) an overview of the main ethical models in public health (theoretical foundations); (ii) a summary of several published frameworks for public health ethics (practical frameworks); and (iii) a few general remarks. Rather than maintaining the superiority of one position over the others, the main aim of the article is to summarize the basic approaches proposed thus far concerning the development of public health ethics by describing and comparing the various ideas in the literature. With this in mind, an extensive list of references is provided.

  2. Parametric design and analysis framework with integrated dynamic models

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    2014-01-01

    of building energy and indoor environment, are generally confined to late in the design process. Consequence based design is a framework intended for the early design stage. It involves interdisciplinary expertise that secures validity and quality assurance with a simulationist while sustaining autonomous...... control with the building designer. Consequence based design is defined by the specific use of integrated dynamic modeling, which includes the parametric capabilities of a scripting tool and building simulation features of a building performance simulation tool. The framework can lead to enhanced...

  3. A framework for quantifying net benefits of alternative prognostic models

    OpenAIRE

    Rapsomaniki, E.; White, I.R.; Wood, A.M.; Thompson, S.G.; Ford, I.

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measure...
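
    The abstract above proposes comparing prognostic models by the net benefit of the treatment decisions they support. As a hedged illustration, the sketch below computes the standard decision-analytic net benefit at a fixed risk threshold, which may differ from the authors' exact formulation; the simulated risks and outcomes are placeholders.

        import numpy as np

        def net_benefit(risk, outcome, threshold):
            """Net benefit per person of 'treat if predicted risk >= threshold'."""
            treat = risk >= threshold
            tp = np.sum(treat & (outcome == 1))
            fp = np.sum(treat & (outcome == 0))
            return (tp - fp * threshold / (1 - threshold)) / len(outcome)

        rng = np.random.default_rng(0)
        outcome = rng.binomial(1, 0.2, 1000)                                          # toy events
        risk_old = np.clip(0.20 + rng.normal(0, 0.10, 1000), 0, 1)                    # existing model
        risk_new = np.clip(0.10 + 0.30 * outcome + rng.normal(0, 0.10, 1000), 0, 1)   # better model

        pt = 0.2   # threshold implied by a predetermined clinical guideline
        change = net_benefit(risk_new, outcome, pt) - net_benefit(risk_old, outcome, pt)
        print("change in net benefit per person:", round(change, 4))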

  4. Thermal modelling. Preliminary site description. Forsmark area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Sundberg, Jan; Back, Paer-Erik; Bengtsson, Anna; Laendell, Maerta [Geo Innova AB, Linkoeping (Sweden)]

    2005-08-01

    This report presents the thermal site descriptive model for the Forsmark area, version 1.2. The main objective of this report is to present the thermal modelling work where data has been identified, quality controlled, evaluated and summarised in order to make an upscaling to lithological domain level possible. The thermal conductivity at canister scale has been modelled for two different lithological domains (RFM029 and RFM012, both dominated by granite to granodiorite (101057)). A main modelling approach has been used to determine the mean value of the thermal conductivity. Two alternative/complementary approaches have been used to evaluate the spatial variability of the thermal conductivity at domain level. The thermal modelling approaches are based on the lithological model for the Forsmark area, version 1.2 together with rock type models constituted from measured and calculated (from mineral composition) thermal conductivities. Results indicate that the mean of thermal conductivity is expected to exhibit a small variation between the different domains, 3.46 W/(m·K) for RFM012 to 3.55 W/(m·K) for RFM029. The spatial distribution of the thermal conductivity does not follow a simple model. Lower and upper 95% confidence limits are based on the modelling results, but have been rounded off to only two significant figures. Consequently, the lower limit is 2.9 W/(m·K), while the upper is 3.8 W/(m·K). This is applicable to both the investigated domains. The temperature dependence is rather small with a decrease in thermal conductivity of 10.0% per 100 deg C increase in temperature for the dominating rock type. There are a number of important uncertainties associated with these results. One of the uncertainties concerns the representative scale for the canister. Another important uncertainty is the methodological uncertainties associated with the upscaling of thermal conductivity from cm-scale to canister scale. In addition, the representativeness of rock samples is

  5. Advancing Integrated Systems Modelling Framework for Life Cycle Sustainability Assessment

    Directory of Open Access Journals (Sweden)

    Anthony Halog

    2011-02-01

    Full Text Available The need for an integrated methodological framework for sustainability assessment has been widely discussed and is urgent due to increasingly complex environmental system problems. These problems have impacts on ecosystems and human well-being which represent a threat to economic performance of countries and corporations. Integrated assessment crosses issues; spans spatial and temporal scales; looks forward and backward; and incorporates multi-stakeholder inputs. This study aims to develop an integrated methodology by capitalizing on the complementary strengths of different methods used by industrial ecologists and biophysical economists. The computational methodology proposed here is a systems-perspective, integrative, and holistic approach to sustainability assessment which attempts to link basic science and technology to policy formulation. The framework adopts life cycle thinking methods—LCA, LCC, and SLCA; stakeholders analysis supported by multi-criteria decision analysis (MCDA); and dynamic system modelling. Following the Pareto principle, the critical sustainability criteria, indicators and metrics (i.e., hotspots) can be identified and further modelled using system dynamics or agent based modelling and improved by data envelopment analysis (DEA) and sustainability network theory (SNT). The framework is being applied to development of biofuel supply chain networks. The framework can provide new ways of integrating knowledge across the divides between social and natural sciences as well as between critical and problem-solving research.

  6. Thermal site descriptive model. A strategy for the model development during site investigations - version 2

    Energy Technology Data Exchange (ETDEWEB)

    Back, Paer-Erik; Sundberg, Jan [Geo Innova AB (Sweden)]

    2007-09-15

    This report presents a strategy for describing, predicting and visualising the thermal aspects of the site descriptive model. The strategy is an updated version of an earlier strategy applied in all SDM versions during the initial site investigation phase at the Forsmark and Oskarshamn areas. The previous methodology for thermal modelling did not take the spatial correlation fully into account during simulation. The result was that the variability of thermal conductivity in the rock mass was not sufficiently well described. Experience from earlier thermal SDMs indicated that development of the methodology was required in order to describe the spatial distribution of thermal conductivity in the rock mass in a sufficiently reliable way, taking both variability within rock types and between rock types into account. A good description of the thermal conductivity distribution is especially important for the lower tail. This tail is important for the design of a repository because it affects the canister spacing. The presented approach is developed to be used for final SDM regarding thermal properties, primarily thermal conductivity. Specific objectives for the strategy of thermal stochastic modelling are: Description: statistical description of the thermal conductivity of a rock domain. Prediction: prediction of thermal conductivity in a specific rock volume. Visualisation: visualisation of the spatial distribution of thermal conductivity. The thermal site descriptive model should include the temperature distribution and thermal properties of the rock mass. The temperature is the result of the thermal processes in the repository area. Determination of thermal transport properties can be made using different methods, such as laboratory investigations, field measurements, modelling from mineralogical composition and distribution, modelling from density logging and modelling from temperature logging. The different types of data represent different scales, which has to be

  7. Thermal site descriptive model. A strategy for the model development during site investigations - version 2

    International Nuclear Information System (INIS)

    Back, Paer-Erik; Sundberg, Jan

    2007-09-01

    This report presents a strategy for describing, predicting and visualising the thermal aspects of the site descriptive model. The strategy is an updated version of an earlier strategy applied in all SDM versions during the initial site investigation phase at the Forsmark and Oskarshamn areas. The previous methodology for thermal modelling did not take the spatial correlation fully into account during simulation. The result was that the variability of thermal conductivity in the rock mass was not sufficiently well described. Experience from earlier thermal SDMs indicated that development of the methodology was required in order to describe the spatial distribution of thermal conductivity in the rock mass in a sufficiently reliable way, taking both variability within rock types and between rock types into account. A good description of the thermal conductivity distribution is especially important for the lower tail. This tail is important for the design of a repository because it affects the canister spacing. The presented approach is developed to be used for final SDM regarding thermal properties, primarily thermal conductivity. Specific objectives for the strategy of thermal stochastic modelling are: Description: statistical description of the thermal conductivity of a rock domain. Prediction: prediction of thermal conductivity in a specific rock volume. Visualisation: visualisation of the spatial distribution of thermal conductivity. The thermal site descriptive model should include the temperature distribution and thermal properties of the rock mass. The temperature is the result of the thermal processes in the repository area. Determination of thermal transport properties can be made using different methods, such as laboratory investigations, field measurements, modelling from mineralogical composition and distribution, modelling from density logging and modelling from temperature logging. The different types of data represent different scales, which has to be
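
    The two records above stress that the lower tail of the thermal conductivity distribution, and its change under upscaling from cm scale to canister scale, drives canister spacing. The toy Monte Carlo below only illustrates that scale effect; the lognormal parameters and the averaging rule are invented and are not site data or the report's methodology.

        import numpy as np

        rng = np.random.default_rng(0)
        mean_k, cv = 3.5, 0.10                   # W/(m K); illustrative mean and coefficient of variation
        sigma = np.sqrt(np.log(1 + cv**2))
        mu = np.log(mean_k) - 0.5 * sigma**2

        n_realisations, n_small = 100_000, 25    # e.g. 25 cm-scale values per canister-scale volume
        small = rng.lognormal(mu, sigma, (n_realisations, n_small))
        canister = small.mean(axis=1)            # crude upscaling by arithmetic averaging

        for name, k in (("cm scale", small.ravel()), ("canister scale", canister)):
            print(f"{name}: mean = {k.mean():.2f}, lower 2.5% tail = {np.percentile(k, 2.5):.2f} W/(m K)")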

  8. COMODI: an ontology to characterise differences in versions of computational models in biology.

    Science.gov (United States)

    Scharm, Martin; Waltemath, Dagmar; Mendes, Pedro; Wolkenhauer, Olaf

    2016-07-11

    Open model repositories provide ready-to-reuse computational models of biological systems. Models within those repositories evolve over time, leading to different model versions. Taken together, the underlying changes reflect a model's provenance and thus can give valuable insights into the studied biology. Currently, however, changes cannot be semantically interpreted. To improve this situation, we developed an ontology of terms describing changes in models. The ontology can be used by scientists and within software to characterise model updates at the level of single changes. When studying or reusing a model, these annotations help with determining the relevance of a change in a given context. We manually studied changes in selected models from BioModels and the Physiome Model Repository. Using the BiVeS tool for difference detection, we then performed an automatic analysis of changes in all models published in these repositories. The resulting set of concepts led us to define candidate terms for the ontology. In a final step, we aggregated and classified these terms and built the first version of the ontology. We present COMODI, an ontology needed because COmputational MOdels DIffer. It empowers users and software to describe changes in a model on the semantic level. COMODI also enables software to implement user-specific filter options for the display of model changes. Finally, COMODI is a step towards predicting how a change in a model influences the simulation results. COMODI, coupled with our algorithm for difference detection, ensures the transparency of a model's evolution, and it enhances the traceability of updates and error corrections. COMODI is encoded in OWL. It is openly available at http://comodi.sems.uni-rostock.de/ .

  9. Main modelling features of the ASTEC V2.1 major version

    International Nuclear Information System (INIS)

    Chatelard, P.; Belon, S.; Bosland, L.; Carénini, L.; Coindreau, O.; Cousin, F.; Marchetto, C.; Nowack, H.; Piar, L.; Chailan, L.

    2016-01-01

    Highlights: • Recent modelling improvements of the ASTEC European severe accident code are outlined. • Key new physical models now available in the ASTEC V2.1 major version are described. • ASTEC progress towards a multi-design reactor code is illustrated for BWR and PHWR. • ASTEC strong link with the on-going EC CESAM FP7 project is emphasized. • Main remaining modelling issues (to which IRSN efforts are now directed) are given. - Abstract: A new major version of the European severe accident integral code ASTEC, developed by IRSN with some GRS support, was delivered in November 2015 to the ASTEC worldwide community. Main modelling features of this V2.1 version are summarised in this paper. In particular, the in-vessel coupling technique between the reactor coolant system thermal-hydraulics module and the core degradation module has been strongly re-engineered to remove some well-known weaknesses of the former V2.0 series. The V2.1 version also includes new core degradation models specifically addressing BWR and PHWR reactor types, as well as several other physical modelling improvements, notably on reflooding of severely damaged cores, Zircaloy oxidation under air atmosphere, corium coolability during corium concrete interaction and source term evaluation. Moreover, this V2.1 version constitutes the backbone of the CESAM FP7 project, whose final objective is to further improve ASTEC for use in Severe Accident Management analysis of the Gen.II–III nuclear power plants presently under operation or foreseen in the near future in Europe. As part of this European project, IRSN efforts to continuously improve both code numerical robustness and computing performances at plant scale as well as users’ tools are being intensified. Besides, ASTEC will continue capitalising the whole knowledge on severe accidents phenomenology by progressively keeping physical models at the state of the art through regular feedback from the interpretation of the current and

  10. The development of a sustainable development model framework

    International Nuclear Information System (INIS)

    Hannoura, Alim P.; Cothren, Gianna M.; Khairy, Wael M.

    2006-01-01

    The emergence of the 'sustainable development' concept as a response to the mining of natural resources for the benefit of multinational corporations has advanced the cause of long-term environmental management. A sustainable development model (SDM) framework that is inclusive of the 'whole' natural environment is presented to illustrate the integration of the sustainable development of the 'whole' ecosystem. The ecosystem approach is an inclusive framework that covers the natural environment's relevant futures and constraints. These are dynamically interconnected and constitute the determinants of the resources development component of the SDM. The second component of the SDM framework is the resources development patterns, i.e., the use of land, water, and atmospheric resources. All of these patterns include practices that utilize environmental resources to achieve a predefined outcome producing waste and by-products that require disposal into the environment. The water quality management practices represent the third component of the framework. These practices are governed by standards, limitations and available disposal means subject to quantity and quality permits. These interconnected standards, practices and permits shape the resulting environmental quality of the ecosystem under consideration. A fourth component, environmental indicators, of the SDM framework provides a measure of the ecosystem productivity and status that may differ based on societal values and culture. The four components of the SDM are interwoven into an outcome assessment process to form the management and feedback models. The concept of Sustainable Development is expressed in the management model as an objective function subject to desired constraints imposing the required bounds for achieving ecosystem sustainability. The development of the objective function and constraints requires monetary values for ecosystem functions, resources development activities and environmental cost. The
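
    The abstract describes sustainability in the management model as an objective function subject to constraints. A generic sketch of that kind of formulation, with all symbols chosen here for illustration rather than taken from the SDM framework itself, is:

        \max_{x \in X} \; S(x) = \sum_{i} w_i \, I_i(x)
        \qquad \text{subject to} \qquad
        E_j(x) \le E_j^{\max} \quad (j = 1,\dots,m), \qquad C(x) \le B

    where the I_i(x) are environmental indicators with weights w_i, the E_j(x) are discharges bounded by permit or standard limits, and C(x) <= B caps the monetary cost of resources development activities.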

  11. GARUSO - Version 1.0. Uncertainty model for multipath ultrasonic transit time gas flow meters

    Energy Technology Data Exchange (ETDEWEB)

    Lunde, Per; Froeysa, Kjell-Eivind; Vestrheim, Magne

    1997-09-01

    This report describes an uncertainty model for ultrasonic transit time gas flow meters configured with parallel chords, and a PC program, GARUSO Version 1.0, implemented for calculation of the meter's relative expanded uncertainty. The program, which is based on the theoretical uncertainty model, is used to carry out a simplified and limited uncertainty analysis for a 12″ 4-path meter, where examples of input and output uncertainties are given. The model predicts a relative expanded uncertainty for the meter at a level which further justifies today's increasing tendency to use this type of instruments for fiscal metering of natural gas. 52 refs., 15 figs., 11 tabs.
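
    The report combines input uncertainties into the meter's relative expanded uncertainty. The sketch below is a generic GUM-style root-sum-square combination with coverage factor k = 2; the individual contributions listed are placeholders, not values from the GARUSO analysis.

        from math import sqrt

        # Relative standard uncertainties (1 sigma), dimensionless; placeholder values.
        contributions = {
            "transit time measurement":    0.0008,
            "path length / geometry":      0.0010,
            "installation (flow profile)": 0.0015,
            "flow computer / integration": 0.0005,
        }

        u_combined = sqrt(sum(u**2 for u in contributions.values()))
        U_expanded = 2.0 * u_combined      # coverage factor k = 2 (about 95 % confidence)

        print(f"combined relative standard uncertainty: {100 * u_combined:.3f} %")
        print(f"relative expanded uncertainty (k = 2):  {100 * U_expanded:.3f} %")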

  12. A multisensor evaluation of the asymmetric convective model, version 2, in southeast Texas.

    Science.gov (United States)

    Kolling, Jenna S; Pleim, Jonathan E; Jeffries, Harvey E; Vizuete, William

    2013-01-01

    There currently exist a number of planetary boundary layer (PBL) schemes that can represent the effects of turbulence in daytime convective conditions, although these schemes remain a large source of uncertainty in meteorology and air quality model simulations. This study evaluates a recently developed combined local and nonlocal closure PBL scheme, the Asymmetric Convective Model, version 2 (ACM2), against PBL observations taken from radar wind profilers, a ground-based lidar, and multiple daytime radiosonde balloon launches. These observations were compared against predictions of PBLs from the Weather Research and Forecasting (WRF) model version 3.1 with the ACM2 PBL scheme option, and the Fifth-Generation Meteorological Model (MM5) version 3.7.3 with the Eta PBL scheme option that is currently being used to develop ozone control strategies in southeast Texas. MM5 and WRF predictions during the regulatory modeling episode were evaluated on their ability to predict the rise and fall of the PBL during daytime convective conditions across southeastern Texas. The MM5 predicted PBLs consistently underpredicted observations, and were also less than the WRF PBL predictions. The analysis reveals that the MM5 predicted a slower rising and shallower PBL not representative of the daytime urban boundary layer. Alternatively, the WRF model predicted a more accurate PBL evolution improving the root mean square error (RMSE), both temporally and spatially. The WRF model also more accurately predicted vertical profiles of temperature and moisture in the lowest 3 km of the atmosphere. Inspection of median surface temperature and moisture time-series plots revealed higher predicted surface temperatures in WRF and more surface moisture in MM5. These could not be attributed to surface heat fluxes, and thus the differences in performance of the WRF and MM5 models are likely due to the PBL schemes. An accurate depiction of the diurnal evolution of the planetary boundary layer (PBL) is

  13. Incorporation of detailed eye model into polygon-mesh versions of ICRP-110 reference phantoms.

    Science.gov (United States)

    Nguyen, Thang Tat; Yeom, Yeon Soo; Kim, Han Sung; Wang, Zhao Jun; Han, Min Cheol; Kim, Chan Hyeong; Lee, Jai Ki; Zankl, Maria; Petoussi-Henss, Nina; Bolch, Wesley E; Lee, Choonsik; Chung, Beom Sun

    2015-11-21

    The dose coefficients for the eye lens reported in ICRP 2010 Publication 116 were calculated using both a stylized model and the ICRP-110 reference phantoms, according to the type of radiation, energy, and irradiation geometry. To maintain consistency of lens dose assessment, in the present study we incorporated the ICRP-116 detailed eye model into the converted polygon-mesh (PM) version of the ICRP-110 reference phantoms. After the incorporation, the dose coefficients for the eye lens were calculated and compared with those of the ICRP-116 data. The results showed generally a good agreement between the newly calculated lens dose coefficients and the values of ICRP 2010 Publication 116. Significant differences were found for some irradiation cases due mainly to the use of different types of phantoms. Considering that the PM version of the ICRP-110 reference phantoms preserve the original topology of the ICRP-110 reference phantoms, it is believed that the PM version phantoms, along with the detailed eye model, provide more reliable and consistent dose coefficients for the eye lens.

  14. Incremental testing of the Community Multiscale Air Quality (CMAQ) modeling system version 4.7

    Directory of Open Access Journals (Sweden)

    K. M. Foley

    2010-03-01

    Full Text Available This paper describes the scientific and structural updates to the latest release of the Community Multiscale Air Quality (CMAQ) modeling system version 4.7 (v4.7) and points the reader to additional resources for further details. The model updates were evaluated relative to observations and results from previous model versions in a series of simulations conducted to incrementally assess the effect of each change. The focus of this paper is on five major scientific upgrades: (a) updates to the heterogeneous N2O5 parameterization, (b) improvement in the treatment of secondary organic aerosol (SOA), (c) inclusion of dynamic mass transfer for coarse-mode aerosol, (d) revisions to the cloud model, and (e) new options for the calculation of photolysis rates. Incremental test simulations over the eastern United States during January and August 2006 are evaluated to assess the model response to each scientific improvement, providing explanations of differences in results between v4.7 and previously released CMAQ model versions. Particulate sulfate predictions are improved across all monitoring networks during both seasons due to cloud module updates. Numerous updates to the SOA module improve the simulation of seasonal variability and decrease the bias in organic carbon predictions at urban sites in the winter. Bias in the total mass of fine particulate matter (PM2.5) is dominated by overpredictions of unspeciated PM2.5 (PMother) in the winter and by underpredictions of carbon in the summer. The CMAQv4.7 model results show slightly worse performance for ozone predictions. However, changes to the meteorological inputs are found to have a much greater impact on ozone predictions compared to changes to the CMAQ modules described here. Model updates had little effect on existing biases in wet deposition predictions.

  15. An Ontology-Based Framework for Modeling User Behavior

    DEFF Research Database (Denmark)

    Razmerita, Liana

    2011-01-01

    and classifies its users according to their behavior. The user ontology is the backbone of OntobUMf and has been designed according to the Information Management System Learning Information Package (IMS LIP). The user ontology includes a Behavior concept that extends IMS LIP specification and defines...... characteristics of the users interacting with the system. Concrete examples of how OntobUMf is used in the context of a Knowledge Management (KM) System are provided. This paper discusses some of the implications of ontology-based user modeling for semantically enhanced KM and, in particular, for personal KM....... The results of this research may contribute to the development of other frameworks for modeling user behavior, other semantically enhanced user modeling frameworks, or other semantically enhanced information systems....

  16. NEW MODEL OF QUALITY ASSESSMENT IN PUBLIC ADMINISTRATION - UPGRADING THE COMMON ASSESSMENT FRAMEWORK (CAF)

    Directory of Open Access Journals (Sweden)

    Mirna Macur

    2017-01-01

    Full Text Available In our study, we developed a new model of quality assessment in public administration. The Common Assessment Framework (CAF) is frequently used in continental Europe for this purpose. Its use has many benefits; however, we believe its assessment logic is not adequate for public administration. The upgraded version of the CAF is conceptually different: instead of the analytical and linear CAF, we get an instrument that measures the organisation as a network of complex processes. The original and upgraded assessment approaches are presented in the paper and compared in the case of self-assessment of a selected public administration organisation. The two approaches produced different, sometimes contradictory results. The upgraded model proved to be logically more consistent and it produced a higher interpretation capacity.

  17. Open source data assimilation framework for hydrological modeling

    Science.gov (United States)

    Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik

    2013-04-01

    An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions and model results. The basic principle is to incorporate measurement information into a model with the aim of improving model results by error minimization. Great strides have been made to assimilate traditional in-situ measurements such as discharge, soil moisture, hydraulic head and snowpack into hydrologic models. More recently, remotely sensed data retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage and land surface temperature have been successfully assimilated in hydrological models. The assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (time & space) and undetermined system uncertainty. It is therefore useful to use a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. The basic design philosophy of OpenDA is to break down DA into a set of building blocks programmed in object oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get/set variables (or parameters) and free the model once DA is completed. An open-source interface for hydrological models exists that is capable of all these tasks: OpenMI. OpenMI is an open source standard interface already adopted by key hydrological model providers. It defines a universal approach to interact with hydrological models during simulation to exchange data during runtime, thus facilitating the interactions between models and data sources. The interface is flexible enough so that models can interact even if the model is coded in a different language, represent
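
    The abstract lists the building blocks a model must expose to the data assimilation layer: create a model instance, propagate it, get/set variables, and free it when assimilation is done. The sketch below is only a schematic rendering of that control flow with a trivial nudging update; it is not the OpenDA or OpenMI API.

        class ToyHydroModel:
            def __init__(self, storage=10.0, recession=0.05):
                self.storage = storage                 # model state (e.g. catchment storage)
                self.recession = recession

            def propagate(self, rainfall):             # advance one time step
                self.storage += rainfall - self.recession * self.storage

            def get_state(self):
                return self.storage

            def set_state(self, value):                # the DA layer writes the update back
                self.storage = value

            def finalize(self):
                pass

        def assimilate(model, observation, gain=0.5):  # simple nudging in place of a real filter
            forecast = model.get_state()
            model.set_state(forecast + gain * (observation - forecast))

        model = ToyHydroModel()
        rainfall = [0.0, 5.0, 2.0, 0.0, 1.0]
        observations = {2: 13.5, 4: 12.0}              # sparse observations by time step
        for t, rain in enumerate(rainfall):
            model.propagate(rain)
            if t in observations:
                assimilate(model, observations[t])
            print(t, round(model.get_state(), 2))
        model.finalize()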

  18. A framework for testing and comparing binaural models.

    Science.gov (United States)

    Dietz, Mathias; Lestang, Jean-Hugues; Majdak, Piotr; Stern, Richard M; Marquardt, Torsten; Ewert, Stephan D; Hartmann, William M; Goodman, Dan F M

    2018-03-01

    Auditory research has a rich history of combining experimental evidence with computational simulations of auditory processing in order to deepen our theoretical understanding of how sound is processed in the ears and in the brain. Despite significant progress in the amount of detail and breadth covered by auditory models, for many components of the auditory pathway there are still different model approaches that are often not equivalent but rather in conflict with each other. Similarly, some experimental studies yield conflicting results which has led to controversies. This can be best resolved by a systematic comparison of multiple experimental data sets and model approaches. Binaural processing is a prominent example of how the development of quantitative theories can advance our understanding of the phenomena, but there remain several unresolved questions for which competing model approaches exist. This article discusses a number of current unresolved or disputed issues in binaural modelling, as well as some of the significant challenges in comparing binaural models with each other and with the experimental data. We introduce an auditory model framework, which we believe can become a useful infrastructure for resolving some of the current controversies. It operates models over the same paradigms that are used experimentally. The core of the proposed framework is an interface that connects three components irrespective of their underlying programming language: The experiment software, an auditory pathway model, and task-dependent decision stages called artificial observers that provide the same output format as the test subject. Copyright © 2017 Elsevier B.V. All rights reserved.
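
    The article proposes an interface that connects three components: the experiment software, an auditory pathway model, and an artificial observer that answers in the same format as a test subject. The sketch below is purely illustrative of that split; the class and method names are invented and do not reproduce the framework's actual interface.

        import numpy as np

        class ToyBinauralModel:
            """Stand-in auditory pathway model: extracts an ITD-like cue by cross-correlation."""
            def process(self, left, right, fs):
                lags = np.arange(-40, 41)                      # about +/- 1 ms at fs = 40 kHz
                xcorr = [np.dot(left, np.roll(right, k)) for k in lags]
                return lags[int(np.argmax(xcorr))] / fs        # estimated interaural delay (s)

        class LateralizationObserver:
            """Artificial observer: same output format as a test subject."""
            def decide(self, cue):
                return "right" if cue > 0 else "left"

        def run_trial(itd_s, fs=40_000, dur=0.05):             # role of the experiment software
            t = np.arange(int(dur * fs)) / fs
            left = np.sin(2 * np.pi * 500 * t)
            right = np.sin(2 * np.pi * 500 * (t + itd_s))      # right channel leads for positive ITD
            cue = ToyBinauralModel().process(left, right, fs)
            return LateralizationObserver().decide(cue)

        for itd in (-500e-6, -100e-6, 100e-6, 500e-6):
            print(f"ITD {itd * 1e6:+.0f} us -> response: {run_trial(itd)}")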

  19. Modeling the structure of the attitudes and belief scale 2 using CFA and bifactor approaches: Toward the development of an abbreviated version.

    Science.gov (United States)

    Hyland, Philip; Shevlin, Mark; Adamson, Gary; Boduszek, Daniel

    2014-01-01

    The Attitudes and Belief Scale-2 (ABS-2: DiGiuseppe, Leaf, Exner, & Robin, 1988. The development of a measure of rational/irrational thinking. Paper presented at the World Congress of Behavior Therapy, Edinburgh, Scotland.) is a 72-item self-report measure of evaluative rational and irrational beliefs widely used in Rational Emotive Behavior Therapy research contexts. However, little psychometric evidence exists regarding the measure's underlying factor structure. Furthermore, given the length of the ABS-2, there is a need for an abbreviated version that can be administered when there are time demands on the researcher, such as in clinical settings. This study sought to examine a series of theoretical models hypothesized to represent the latent structure of the ABS-2 within an alternative models framework using traditional confirmatory factor analysis as well as a bifactor modeling approach. Furthermore, this study also sought to develop a psychometrically sound abbreviated version of the ABS-2. Three hundred and thirteen (N = 313) active emergency service personnel completed the ABS-2. Results indicated that for each model, the application of bifactor modeling procedures improved model fit statistics, and a novel eight-factor intercorrelated solution was identified as the best fitting model of the ABS-2. However, the observed fit indices failed to satisfy commonly accepted standards. A 24-item abbreviated version was thus constructed and an intercorrelated eight-factor solution yielded satisfactory model fit statistics. Current results support the use of a bifactor modeling approach to determining the factor structure of the ABS-2. Furthermore, results provide empirical support for the psychometric properties of the newly developed abbreviated version.

  20. The ACTIVE conceptual framework as a structural equation model

    Science.gov (United States)

    Gross, Alden L.; Payne, Brennan R.; Casanova, Ramon; Davoudzadeh, Pega; Dzierzewski, Joseph M.; Farias, Sarah; Giovannetti, Tania; Ip, Edward H.; Marsiske, Michael; Rebok, George W.; Schaie, K. Warner; Thomas, Kelsey; Willis, Sherry; Jones, Richard N.

    2018-01-01

    Background/Study Context: Conceptual frameworks are analytic models at a high level of abstraction. Their operationalization can inform randomized trial design and sample size considerations. Methods: The Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) conceptual framework was empirically tested using structural equation modeling (N=2,802). ACTIVE was guided by a conceptual framework for cognitive training in which proximal cognitive abilities (memory, inductive reasoning, speed of processing) mediate treatment-related improvement in primary outcomes (everyday problem-solving, difficulty with activities of daily living, everyday speed, driving difficulty), which in turn lead to improved secondary outcomes (health-related quality of life, health service utilization, mobility). Measurement models for each proximal, primary, and secondary outcome were developed and tested using baseline data. Each construct was then combined in one model to evaluate fit (RMSEA, CFI, normalized residuals of each indicator). To expand the conceptual model and potentially inform future trials, evidence of modification of structural model parameters was evaluated by age, years of education, sex, race, and self-rated health status. Results: Preconceived measurement models for memory, reasoning, speed of processing, everyday problem-solving, instrumental activities of daily living (IADL) difficulty, everyday speed, driving difficulty, and health-related quality of life each fit well to the data (all RMSEA .95). Fit of the full model was excellent (RMSEA = .038; CFI = .924). In contrast with previous findings from ACTIVE regarding who benefits from training, interaction testing revealed associations between proximal abilities and primary outcomes are stronger on average by nonwhite race, worse health, older age, and less education (p conceptual model. Findings suggest that the types of people who show intervention effects on cognitive performance potentially may be

  1. The ACTIVE conceptual framework as a structural equation model.

    Science.gov (United States)

    Gross, Alden L; Payne, Brennan R; Casanova, Ramon; Davoudzadeh, Pega; Dzierzewski, Joseph M; Farias, Sarah; Giovannetti, Tania; Ip, Edward H; Marsiske, Michael; Rebok, George W; Schaie, K Warner; Thomas, Kelsey; Willis, Sherry; Jones, Richard N

    2018-01-01

    Background/Study Context: Conceptual frameworks are analytic models at a high level of abstraction. Their operationalization can inform randomized trial design and sample size considerations. The Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) conceptual framework was empirically tested using structural equation modeling (N=2,802). ACTIVE was guided by a conceptual framework for cognitive training in which proximal cognitive abilities (memory, inductive reasoning, speed of processing) mediate treatment-related improvement in primary outcomes (everyday problem-solving, difficulty with activities of daily living, everyday speed, driving difficulty), which in turn lead to improved secondary outcomes (health-related quality of life, health service utilization, mobility). Measurement models for each proximal, primary, and secondary outcome were developed and tested using baseline data. Each construct was then combined in one model to evaluate fit (RMSEA, CFI, normalized residuals of each indicator). To expand the conceptual model and potentially inform future trials, evidence of modification of structural model parameters was evaluated by age, years of education, sex, race, and self-rated health status. Preconceived measurement models for memory, reasoning, speed of processing, everyday problem-solving, instrumental activities of daily living (IADL) difficulty, everyday speed, driving difficulty, and health-related quality of life each fit well to the data (all RMSEA .95). Fit of the full model was excellent (RMSEA = .038; CFI = .924). In contrast with previous findings from ACTIVE regarding who benefits from training, interaction testing revealed associations between proximal abilities and primary outcomes are stronger on average by nonwhite race, worse health, older age, and less education (p conceptual model. Findings suggest that the types of people who show intervention effects on cognitive performance potentially may be different from
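
    Both versions of this record test the ACTIVE conceptual framework as a structural equation model in which proximal abilities drive primary outcomes, which in turn drive secondary outcomes. The sketch below shows how such a proximal -> primary -> secondary structure could be written in lavaan-style syntax with the semopy package; the indicator names are placeholders and the actual ACTIVE data are not reproduced here.

        from semopy import Model

        # Measurement part (latent =~ indicators) and structural part (regressions).
        spec = """
        memory    =~ m1 + m2 + m3
        reasoning =~ r1 + r2 + r3
        speed     =~ s1 + s2 + s3
        everyday  =~ e1 + e2 + e3
        qol       =~ q1 + q2 + q3
        everyday ~ memory + reasoning + speed
        qol      ~ everyday
        """

        model = Model(spec)
        # With the (unavailable) trial data in a pandas DataFrame df:
        #   model.fit(df)
        #   import semopy; print(semopy.calc_stats(model))   # RMSEA, CFI, etc.
        print("latent variables specified:", spec.count("=~"))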

  2. A VGI data integration framework based on linked data model

    Science.gov (United States)

    Wan, Lin; Ren, Rongrong

    2015-12-01

    This paper addresses geographic data integration and sharing for multiple online VGI data sets. We propose a semantic-enabled framework for a cooperative application environment across online VGI sources, aimed at a target class of geospatial problems. Based on linked data technologies, one of the core components of the semantic web, we construct relationship links among geographic features distributed across diverse VGI platforms using linked data modeling methods, deploy these semantic-enabled entities on the web, and eventually form an interconnected geographic data network to support cooperative geospatial applications across multiple VGI data sources. The mapping and transformation from VGI sources to an RDF linked data model is presented to guarantee a uniform data representation model among the different online social geographic data sources. We propose a mixed strategy that combines spatial distance similarity and feature-name attribute similarity as the matching measure to compare and match geographic features across VGI data sets. Our work also focuses on applying Markov logic networks to interlink the same entities across different VGI-based linked data sets; in particular, the automatic generation of a co-reference object identification model from geographic linked data is discussed in detail. The result is a large geographic linked data network spanning loosely coupled VGI websites. Experimental results obtained with our framework and the evaluation of our method show that the framework is reasonable and practicable.
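
    The mixed matching measure described above can be sketched in a few lines: a weighted combination of spatial proximity and name-string similarity, compared against a threshold before emitting a link. The weights, distance cutoff, and feature records below are hypothetical placeholders, not values from the paper.

```python
import math
from difflib import SequenceMatcher

def haversine_km(lon1, lat1, lon2, lat2):
    """Great-circle distance between two lon/lat points in kilometres."""
    R = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def match_score(feat_a, feat_b, w_spatial=0.6, w_name=0.4, d_max_km=1.0):
    """Mixed similarity: spatial proximity plus name-attribute similarity."""
    d = haversine_km(feat_a["lon"], feat_a["lat"], feat_b["lon"], feat_b["lat"])
    spatial_sim = max(0.0, 1.0 - d / d_max_km)    # 1 when co-located, 0 beyond d_max_km
    name_sim = SequenceMatcher(None, feat_a["name"].lower(), feat_b["name"].lower()).ratio()
    return w_spatial * spatial_sim + w_name * name_sim

# Hypothetical features from two VGI sources
osm = {"name": "Central Park", "lon": -73.9654, "lat": 40.7829}
wiki = {"name": "Central Park (NYC)", "lon": -73.9665, "lat": 40.7823}
print(match_score(osm, wiki))   # above a chosen threshold -> emit an interlink (e.g. owl:sameAs)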

  3. A Structural Model Decomposition Framework for Systems Health Management

    Science.gov (United States)

    Roychoudhury, Indranil; Daigle, Matthew J.; Bregon, Anibal; Pulido, Belamino

    2013-01-01

    Systems health management (SHM) is an important set of technologies aimed at increasing system safety and reliability by detecting, isolating, and identifying faults; and predicting when the system reaches end of life (EOL), so that appropriate fault mitigation and recovery actions can be taken. Model-based SHM approaches typically make use of global, monolithic system models for online analysis, which results in a loss of scalability and efficiency for large-scale systems. Improvement in scalability and efficiency can be achieved by decomposing the system model into smaller local submodels and operating on these submodels instead. In this paper, the global system model is analyzed offline and structurally decomposed into local submodels. We define a common model decomposition framework for extracting submodels from the global model. This framework is then used to develop algorithms for solving model decomposition problems for the design of three separate SHM technologies, namely, estimation (which is useful for fault detection and identification), fault isolation, and EOL prediction. We solve these model decomposition problems using a three-tank system as a case study.
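
    The submodel-extraction idea can be illustrated as reachability over an equation-variable structure: starting from the variables a particular SHM task needs, collect the equations (and additional variables) they touch. This is only a toy sketch under that reading, not the authors' decomposition algorithm; the three-tank-style equation and variable names are hypothetical.

```python
from collections import deque

# Toy structural model: each equation lists the variables it relates.
structure = {
    "e1": {"h1", "qin", "q12"},
    "e2": {"h2", "q12", "q23"},
    "e3": {"h3", "q23", "qout"},
    "e4": {"q12", "h1", "h2"},
    "e5": {"q23", "h2", "h3"},
    "e6": {"p1", "p2"},          # unrelated subsystem, should not be pulled in
}

def extract_submodel(structure, variables_of_interest):
    """Return the equations and variables reachable from the variables of interest."""
    frontier = deque(variables_of_interest)
    seen_vars, sub_eqs = set(variables_of_interest), set()
    while frontier:
        v = frontier.popleft()
        for eq, vars_ in structure.items():
            if v in vars_ and eq not in sub_eqs:
                sub_eqs.add(eq)
                for w in vars_ - seen_vars:
                    seen_vars.add(w)
                    frontier.append(w)
    return sub_eqs, seen_vars

# Local submodel around tank 1: e6 (the unrelated equation) is excluded
print(extract_submodel(structure, {"h1"}))
```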

  4. A structural model decomposition framework for systems health management

    Science.gov (United States)

    Roychoudhury, I.; Daigle, M.; Bregon, A.; Pulido, B.

    Systems health management (SHM) is an important set of technologies aimed at increasing system safety and reliability by detecting, isolating, and identifying faults; and predicting when the system reaches end of life (EOL), so that appropriate fault mitigation and recovery actions can be taken. Model-based SHM approaches typically make use of global, monolithic system models for online analysis, which results in a loss of scalability and efficiency for large-scale systems. Improvement in scalability and efficiency can be achieved by decomposing the system model into smaller local submodels and operating on these submodels instead. In this paper, the global system model is analyzed offline and structurally decomposed into local submodels. We define a common model decomposition framework for extracting submodels from the global model. This framework is then used to develop algorithms for solving model decomposition problems for the design of three separate SHM technologies, namely, estimation (which is useful for fault detection and identification), fault isolation, and EOL prediction. We solve these model decomposition problems using a three-tank system as a case study.

  5. Statistical model of fractures and deformation zones. Preliminary site description, Laxemar subarea, version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Hermanson, Jan; Forssberg, Ola [Golder Associates AB, Stockholm (Sweden); Fox, Aaron; La Pointe, Paul [Golder Associates Inc., Redmond, WA (United States)

    2005-10-15

    The goal of this summary report is to document the data sources, software tools, experimental methods, assumptions, and model parameters in the discrete-fracture network (DFN) model for the local model volume in Laxemar, version 1.2. The model parameters presented herein are intended for use by other project modeling teams. Individual modeling teams may elect to simplify or use only a portion of the DFN model, depending on their needs. This model is not intended to be a flow model or a mechanical model; as such, only the geometrical characterization is presented. The derivations of the hydraulic or mechanical properties of the fractures or their subsurface connectivities are not within the scope of this report. This model represents analyses carried out on particular data sets. If additional data are obtained, or values for existing data are changed or excluded, the conclusions reached in this report, and the parameter values calculated, may change as well. The model volume is divided into two subareas: one located on the Simpevarp peninsula adjacent to the power plant (Simpevarp), and one further to the west (Laxemar). The DFN parameters described in this report were determined by analysis of data collected within the local model volume. As such, the final DFN model is only valid within this local model volume and the modeling subareas (Laxemar and Simpevarp) within it.

  6. Aerosol-cloud interactions in a multi-scale modeling framework

    Science.gov (United States)

    Lin, G.; Ghan, S. J.

    2017-12-01

    Atmospheric aerosols play an important role in changing the Earth's climate by scattering and absorbing solar and terrestrial radiation and by interacting with clouds. However, quantification of aerosol effects remains one of the most uncertain aspects of current and future climate projection. Much of the uncertainty results from the multi-scale nature of aerosol-cloud interactions, which is very challenging to represent in traditional global climate models (GCMs). In contrast, the multi-scale modeling framework (MMF) provides a viable solution: it explicitly resolves clouds and precipitation with a cloud-resolving model (CRM) embedded in each GCM grid column. In the MMF version of the Community Atmosphere Model version 5 (CAM5), aerosol processes are treated with a parameterization called the Explicit Clouds Parameterized Pollutants (ECPP), which uses cloud and precipitation statistics derived from the CRM to treat the cloud processing of aerosols on the GCM grid. However, this approach represents clouds on the CRM grid but aerosols on the GCM grid, which is inconsistent with the fact that aerosol-cloud interactions occur at the cloud scale. To overcome this limitation, we propose a new aerosol treatment in the MMF, Explicit Clouds Explicit Aerosols (ECEP), in which both clouds and aerosols are resolved explicitly on the CRM grid. We first applied the MMF with ECPP to the Accelerated Climate Modeling for Energy (ACME) model to obtain an MMF version of ACME, and we also developed an alternative version of ACME-MMF with ECEP. Based on these two models, we conducted two simulations: one with ECPP and the other with ECEP. Preliminary results showed that the ECEP simulations tend to predict higher aerosol concentrations than the ECPP simulations, because of more efficient vertical transport from the surface to the upper atmosphere and less efficient wet removal. We also found that the cloud droplet number concentrations are also different between the

  7. A computationally efficient description of heterogeneous freezing: A simplified version of the Soccer ball model

    Science.gov (United States)

    Niedermeier, Dennis; Ervens, Barbara; Clauss, Tina; Voigtländer, Jens; Wex, Heike; Hartmann, Susan; Stratmann, Frank

    2014-01-01

    In a recent study, the Soccer ball model (SBM) was introduced for modeling and/or parameterizing heterogeneous ice nucleation processes. The model applies classical nucleation theory. It allows for a consistent description of both apparently singular and stochastic ice nucleation behavior, by distributing contact angles over the nucleation sites of a particle population assuming a Gaussian probability density function. The original SBM utilizes the Monte Carlo technique, which hampers its usage in atmospheric models, as fairly time-consuming calculations must be performed to obtain statistically significant results. Thus, we have developed a simplified and computationally more efficient version of the SBM. We successfully used the new SBM to parameterize experimental nucleation data of, e.g., bacterial ice nucleation. Both SBMs give identical results; however, the new model is computationally less expensive as confirmed by cloud parcel simulations. Therefore, it is a suitable tool for describing heterogeneous ice nucleation processes in atmospheric models.
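
    The core of the approach can be sketched as an average of a classical-nucleation-theory-style rate over a Gaussian distribution of contact angles. The rate parameterization and every numerical value below are invented placeholders for illustration, not the published SBM coefficients.

```python
import numpy as np

def frozen_fraction(T, t, n_sites, mu_theta, sigma_theta, j0=1.0e10, b=2.0):
    """
    Frozen fraction after time t (s) at temperature T (K) for particles whose
    nucleation sites have Gaussian-distributed contact angles theta.
    j0 and b define a placeholder CNT-like rate; they are NOT the published
    Soccer ball model coefficients.
    """
    theta = np.linspace(1e-3, np.pi, 2000)                       # contact angle grid (rad)
    pdf = np.exp(-0.5 * ((theta - mu_theta) / sigma_theta) ** 2)
    pdf /= np.trapz(pdf, theta)                                   # Gaussian normalized on [0, pi]
    # CNT geometric factor reducing the energy barrier for heterogeneous nucleation
    f_theta = (2 + np.cos(theta)) * (1 - np.cos(theta)) ** 2 / 4.0
    delta_T = max(273.15 - T, 0.0)                                # supercooling (K)
    J = j0 * np.exp(-b * f_theta * 273.15 / max(delta_T, 1e-6))   # placeholder rate per site (1/s)
    p_unfrozen_one_site = np.exp(-J * t)
    # Average over the contact-angle distribution, then over n_sites independent sites
    p_unfrozen = np.trapz(pdf * p_unfrozen_one_site, theta) ** n_sites
    return 1.0 - p_unfrozen

print(frozen_fraction(T=248.15, t=10.0, n_sites=10, mu_theta=1.5, sigma_theta=0.2))
```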

  8. Integrating knowledge seeking into knowledge management models and frameworks

    Directory of Open Access Journals (Sweden)

    Francois Lottering

    2012-02-01

    Full Text Available Background: A striking feature of the knowledge management (KM literature is that the standard list of KM processes either subsumes or overlooks the process of knowledge seeking. Knowledge seeking is manifestly under-theorised, making the need to address this gap in KM theory and practice clear and urgent.Objectives: This article investigates the theoretical status of the knowledge-seeking process in extant KM models and frameworks. It also statistically describes knowledge seeking and knowledge sharing practices in a sample of South African companies. Using this data, it proposes a KM model based on knowledge seeking.Method: Knowledge seeking is traced in a number of KM models and frameworks with a specific focus on Han Lai and Margaret Graham’s adapted KM cycle model, which separates knowledge seeking from knowledge sharing. This empirical investigation used a questionnaire to examine knowledge seeking and knowledge sharing practices in a sample of South African companies.Results: This article critiqued and elaborated on the adapted KM cycle model of Lai and Graham. It identified some of the key features of knowledge seeking practices in the workplace. It showed that knowledge seeking and sharing are human-centric actions and that seeking knowledge uses trust and loyalty as its basis. It also showed that one cannot separate knowledge seeking from knowledge sharing.Conclusion: The knowledge seeking-based KM model elaborates on Lai and Graham’s model. It provides insight into how and where people seek and share knowledge in the workplace. The article concludes that it is necessary to cement the place of knowledge seeking in KM models as well as frameworks and suggests that organisations should apply its findings to improving their knowledge management strategies. 

  9. Community Land Model Version 3.0 (CLM3.0) Developer's Guide

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, FM

    2004-12-21

    This document describes the guidelines adopted for software development of the Community Land Model (CLM) and serves as a reference to the entire code base of the released version of the model. The version of the code described here is Version 3.0 which was released in the summer of 2004. This document, the Community Land Model Version 3.0 (CLM3.0) User's Guide (Vertenstein et al., 2004), the Technical Description of the Community Land Model (CLM) (Oleson et al., 2004), and the Community Land Model's Dynamic Global Vegetation Model (CLM-DGVM): Technical Description and User's Guide (Levis et al., 2004) provide the developer, user, or researcher with details of implementation, instructions for using the model, a scientific description of the model, and a scientific description of the Dynamic Global Vegetation Model integrated with CLM respectively. The CLM is a single column (snow-soil-vegetation) biogeophysical model of the land surface which can be run serially (on a laptop or personal computer) or in parallel (using distributed or shared memory processors or both) on both vector and scalar computer architectures. Written in Fortran 90, CLM can be run offline (i.e., run in isolation using stored atmospheric forcing data), coupled to an atmospheric model (e.g., the Community Atmosphere Model (CAM)), or coupled to a climate system model (e.g., the Community Climate System Model Version 3 (CCSM3)) through a flux coupler (e.g., Coupler 6 (CPL6)). When coupled, CLM exchanges fluxes of energy, water, and momentum with the atmosphere. The horizontal land surface heterogeneity is represented by a nested subgrid hierarchy composed of gridcells, landunits, columns, and plant functional types (PFTs). This hierarchical representation is reflected in the data structures used by the model code. Biophysical processes are simulated for each subgrid unit (landunit, column, and PFT) independently, and prognostic variables are maintained for each subgrid unit

  10. Modelling Framework and Assistive Device for Peripheral Intravenous Injections

    Science.gov (United States)

    Kam, Kin F.; Robinson, Martin P.; Gilbert, Mathew A.; Pelah, Adar

    2016-02-01

    Intravenous access for blood sampling or drug administration that requires peripheral venepuncture is perhaps the most common invasive procedure practiced in hospitals, clinics and general practice surgeries. We describe an idealised mathematical framework for modelling the dynamics of the peripheral venepuncture process. Basic assumptions of the model are confirmed through motion analysis of needle trajectories during venepuncture, taken from video recordings of a skilled practitioner injecting into a practice kit. The framework is also applied to the design and construction of a proposed device for accurate needle guidance during venepuncture administration, which was assessed as consistent and repeatable in application and does not lead to over-puncture. The study provides insights into the ubiquitous peripheral venepuncture process and may contribute to applications in training and in the design of new devices, including for use in robotic automation.

  11. A Modelling Framework for Conventional and Heat Integrated Distillation Columns

    DEFF Research Database (Denmark)

    Bisgaard, Thomas; Huusom, Jakob Kjøbsted; Abildskov, Jens

    2013-01-01

    Diabatic operation of distillation columns can lead to significant reductions in energy utilization and operation cost compared to conventional (adiabatic) distillation columns, at the expense of increased complexity of design and operation. The earliest diabatic distillation configuration dates back to the late 70s, and various different configurations have appeared since. However, at present, no full-scale diabatic distillation columns are operating in industry. Current studies related to alternative distillation configurations report very different figures for potential energy savings, which ... For separations of hydrocarbons, such as equimolar mixtures of benzene/toluene or propane/propene described by simple models, a generic, modular model framework is presented in this work. At present, the framework is able to describe a conventional distillation column and a mechanical vapor recompression column ...

  12. The modified version of the centre-of-mass correction to the bag model

    International Nuclear Information System (INIS)

    Bartelski, J.; Tatur, S.

    1986-01-01

    We propose the improvement of the recently considered version of the centre-of-mass correction to the bag model. We identify a nucleon bag with physical nucleon confined in an external fictitious spherical well potential with an additional external fictitious pressure characterized by the parameter b. The introduction of such a pressure restores the conservation of the canonical energy-momentum tensor, which was lost in the former model. We propose several methods to determine the numerical value of b. We calculate the Roper resonance mass as well as static electroweak parameters of a nucleon with centre-of-mass corrections taken into account. 7 refs., 1 tab. (author)

  13. A Systematic Modelling Framework for Phase Transfer Catalyst Systems

    DEFF Research Database (Denmark)

    Anantpinijwatna, Amata; Sales-Cruz, Mauricio; Hyung Kim, Sun

    2016-01-01

    Phase-transfer catalyst systems contain two liquid phases, with a catalyst (PTC) that transfers between the phases, driving product formation in one phase and being regenerated in the other. Typically, the reaction involves neutral species in an organic phase and regeneration involves ions ... The framework is applied to two cases in order to highlight the performance and issues of activity coefficient models for predicting design and operation, and the effects of employing different organic solvents ...

  14. Common and Innovative Visuals: A sparsity modeling framework for video.

    Science.gov (United States)

    Abdolhosseini Moghadam, Abdolreza; Kumar, Mrityunjay; Radha, Hayder

    2014-05-02

    Efficient video representation models are critical for many video analysis and processing tasks. In this paper, we present a framework based on the concept of finding the sparsest solution to model video frames. To model the spatio-temporal information, frames from one scene are decomposed into two components: (i) a common frame, which describes the visual information common to all the frames in the scene/segment, and (ii) a set of innovative frames, which depicts the dynamic behaviour of the scene. The proposed approach exploits and builds on recent results in the field of compressed sensing to jointly estimate the common frame and the innovative frames for each video segment. We refer to the proposed modeling framework by CIV (Common and Innovative Visuals). We show how the proposed model can be utilized to find scene change boundaries and extend CIV to videos from multiple scenes. Furthermore, the proposed model is robust to noise and can be used for various video processing applications without relying on motion estimation and detection or image segmentation. Results for object tracking, video editing (object removal, inpainting) and scene change detection are presented to demonstrate the efficiency and the performance of the proposed model.
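
    The decomposition idea can be illustrated crudely: estimate a common frame and keep only a sparse set of large residuals per frame as the innovations. The sketch below uses a temporal median and hard thresholding as a stand-in for the joint sparse estimation CIV actually performs via compressed sensing; the array sizes and the synthetic scene are arbitrary.

```python
import numpy as np

def common_and_innovations(frames, keep_fraction=0.05):
    """
    frames: array (n_frames, h, w). Returns a common frame plus sparse innovation
    frames. The common frame is the temporal median and innovations are the
    largest-magnitude residuals per frame (a crude stand-in for CIV's joint
    sparse optimization).
    """
    common = np.median(frames, axis=0)
    residuals = frames - common
    k = int(keep_fraction * residuals[0].size)        # residual pixels kept per frame
    innovations = np.zeros_like(residuals)
    for i, r in enumerate(residuals):
        idx = np.argpartition(np.abs(r).ravel(), -k)[-k:]
        innovations[i].flat[idx] = r.flat[idx]
    return common, innovations

# Synthetic scene: static background plus a small bright square moving across it
rng = np.random.default_rng(0)
background = rng.uniform(size=(64, 64))
frames = np.stack([background.copy() for _ in range(8)])
for i in range(8):
    frames[i, 10:14, 4 * i:4 * i + 4] += 1.0          # the "innovative" content
common, innov = common_and_innovations(frames)
print(np.abs(common - background).max(), (innov != 0).mean())
```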

  15. MESOI Version 2.0: an interactive mesoscale Lagrangian puff dispersion model with deposition and decay

    International Nuclear Information System (INIS)

    Ramsdell, J.V.; Athey, G.F.; Glantz, C.S.

    1983-11-01

    MESOI Version 2.0 is an interactive Lagrangian puff model for estimating the transport, diffusion, deposition and decay of effluents released to the atmosphere. The model is capable of treating simultaneous releases from as many as four release points, which may be elevated or at ground-level. The puffs are advected by a horizontal wind field that is defined in three dimensions. The wind field may be adjusted for expected topographic effects. The concentration distribution within the puffs is initially assumed to be Gaussian in the horizontal and vertical. However, the vertical concentration distribution is modified by assuming reflection at the ground and the top of the atmospheric mixing layer. Material is deposited on the surface using a source depletion, dry deposition model and a washout coefficient model. The model also treats the decay of a primary effluent species and the ingrowth and decay of a single daughter species using a first order decay process. This report is divided into two parts. The first part discusses the theoretical and mathematical bases upon which MESOI Version 2.0 is based. The second part contains the MESOI computer code. The programs were written in the ANSI standard FORTRAN 77 and were developed on a VAX 11/780 computer. 43 references, 14 figures, 13 tables
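
    The building block that MESOI advects, depletes, and decays is a Gaussian puff with ground reflection. A minimal sketch of that concentration kernel follows; reflection at the top of the mixing layer, deposition, and decay are omitted, and the release and dispersion parameters are illustrative rather than MESOI's.

```python
import numpy as np

def puff_concentration(x, y, z, puff_x, puff_y, H, Q, sx, sy, sz):
    """
    Concentration at (x, y, z) from a single Gaussian puff centred at
    (puff_x, puff_y, H) containing material Q, with dispersion parameters
    sx, sy, sz (m). Ground reflection is handled with an image puff.
    """
    norm = Q / ((2.0 * np.pi) ** 1.5 * sx * sy * sz)
    horiz = np.exp(-0.5 * ((x - puff_x) / sx) ** 2) * np.exp(-0.5 * ((y - puff_y) / sy) ** 2)
    vert = np.exp(-0.5 * ((z - H) / sz) ** 2) + np.exp(-0.5 * ((z + H) / sz) ** 2)
    return norm * horiz * vert

# Illustrative values: puff 500 m downwind of an elevated 50 m release, receptor at 1.5 m
print(puff_concentration(x=600.0, y=0.0, z=1.5,
                         puff_x=500.0, puff_y=0.0, H=50.0,
                         Q=1.0, sx=60.0, sy=60.0, sz=30.0))
```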

  16. A p-version embedded model for simulation of concrete temperature fields with cooling pipes

    Directory of Open Access Journals (Sweden)

    Sheng Qiang

    2015-07-01

    Full Text Available Pipe cooling is an effective method of mass concrete temperature control, but its accurate and convenient numerical simulation is still a cumbersome problem. An improved embedded model, considering the water temperature variation along the pipe, was proposed for simulating the temperature field of early-age concrete structures containing cooling pipes. The improved model was verified with an engineering example. Then, the p-version self-adaption algorithm for the improved embedded model was deduced, and the initial values and boundary conditions were examined. Comparison of some numerical samples shows that the proposed model can provide satisfactory precision and higher efficiency. The analysis efficiency can be doubled at the same precision, even for a large-scale element. The p-version algorithm can fit grids of different sizes for the temperature field simulation. The convenience of the proposed algorithm lies in the ability to locate more pipe segments in a single element without requiring as regular a mesh shape as the explicit model.

  17. Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.

    Science.gov (United States)

    El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher

    2018-01-01

    Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.
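
    The progressive loop such a framework supports, fit a model, inspect topic summaries, and refit with adjusted parameters after user feedback, can be sketched with an off-the-shelf LDA implementation standing in for whatever algorithms the framework wraps. The corpus and parameter choices are toy placeholders.

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "ice nucleation aerosol cloud droplet freezing",
    "aerosol cloud interaction climate model simulation",
    "snow melt runoff model calibration basin",
    "streamflow forecast basin snowpack water supply",
]

def fit_topics(n_topics):
    """Fit LDA and return the top three terms per topic as a crude topic summary."""
    vec = CountVectorizer()
    X = vec.fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0).fit(X)
    terms = vec.get_feature_names_out()
    return [[terms[i] for i in comp.argsort()[-3:][::-1]] for comp in lda.components_]

# Initial configuration, then a "user feedback" step that changes the topic count
print(fit_topics(2))
print(fit_topics(3))   # refit after the analyst asks for finer-grained topics
```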

  18. A statistical learning framework for groundwater nitrate models of the Central Valley, California, USA

    Science.gov (United States)

    Nolan, Bernard T.; Fienen, Michael N.; Lorenz, David L.

    2015-01-01

    We used a statistical learning framework to evaluate the ability of three machine-learning methods to predict nitrate concentration in shallow groundwater of the Central Valley, California: boosted regression trees (BRT), artificial neural networks (ANN), and Bayesian networks (BN). Machine learning methods can learn complex patterns in the data but because of overfitting may not generalize well to new data. The statistical learning framework involves cross-validation (CV) training and testing data and a separate hold-out data set for model evaluation, with the goal of optimizing predictive performance by controlling for model overfit. The order of prediction performance according to both CV testing R2 and that for the hold-out data set was BRT > BN > ANN. For each method we identified two models based on CV testing results: that with maximum testing R2 and a version with R2 within one standard error of the maximum (the 1SE model). The former yielded CV training R2 values of 0.94–1.0. Cross-validation testing R2 values indicate predictive performance, and these were 0.22–0.39 for the maximum R2 models and 0.19–0.36 for the 1SE models. Evaluation with hold-out data suggested that the 1SE BRT and ANN models predicted better for an independent data set compared with the maximum R2 versions, which is relevant to extrapolation by mapping. Scatterplots of predicted vs. observed hold-out data obtained for final models helped identify prediction bias, which was fairly pronounced for ANN and BN. Lastly, the models were compared with multiple linear regression (MLR) and a previous random forest regression (RFR) model. Whereas BRT results were comparable to RFR, MLR had low hold-out R2 (0.07) and explained less than half the variation in the training data. Spatial patterns of predictions by the final, 1SE BRT model agreed reasonably well with previously observed patterns of nitrate occurrence in groundwater of the Central Valley.
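
    The statistical learning workflow described above, cross-validation on a training set, a separate hold-out set, and selection of a simpler model within one standard error of the best cross-validated score, can be sketched as follows. Gradient boosting from scikit-learn stands in for BRT, and the data and tuning grid are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score, train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                                      # synthetic predictors
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=1.0, size=500)      # synthetic response

# Separate hold-out set for final evaluation; cross-validate on the rest
X_cv, X_hold, y_cv, y_hold = train_test_split(X, y, test_size=0.25, random_state=0)

candidates = [50, 100, 200, 400]                                   # number of trees to try
results = []
for n_trees in candidates:
    model = GradientBoostingRegressor(n_estimators=n_trees, random_state=0)
    scores = cross_val_score(model, X_cv, y_cv, cv=5, scoring="r2")
    results.append((n_trees, scores.mean(), scores.std(ddof=1) / np.sqrt(len(scores))))

best = max(results, key=lambda r: r[1])
# "1SE" rule: simplest model whose CV R^2 is within one standard error of the best
one_se = min((r for r in results if r[1] >= best[1] - best[2]), key=lambda r: r[0])

final = GradientBoostingRegressor(n_estimators=one_se[0], random_state=0).fit(X_cv, y_cv)
print("hold-out R^2:", final.score(X_hold, y_hold))
```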

  19. Designing Collaborative Developmental Standards by Refactoring of the Earth Science Models, Libraries, Workflows and Frameworks.

    Science.gov (United States)

    Mirvis, E.; Iredell, M.

    2015-12-01

    The operational (OPS) NOAA National Centers for Environmental Prediction (NCEP) suite traditionally consists of a large set of multi-scale HPC models, workflows, scripts, tools and utilities, which depend heavily on a variety of additional components. Namely, this suite utilizes a unique collection of more than 20 in-house developed shared libraries (NCEPLIBS), specific versions of third-party libraries (like netcdf, HDF, ESMF, jasper, xml etc.), and an HPC workflow tool within a dedicated (sometimes even vendor-customized) homogeneous HPC system environment. This domain- and site-specific setup, combined with NCEP's product-driven, large-scale real-time data operations, complicates collaborative development tremendously by reducing the chances of replicating the OPS environment anywhere else. The mission of NOAA/NCEP's Environmental Modeling Center (EMC) is to develop and improve numerical weather, climate, hydrological and ocean prediction through partnership with the research community. Recognizing these difficulties, EMC has lately taken an innovative approach to improve the flexibility of the HPC environment by building the elements and a foundation for an NCEP OPS functionally equivalent environment (FEE), which can be used to ease external interface constructs as well. Aiming to reduce the turnaround time of community code enhancements via the Research-to-Operations (R2O) cycle, EMC has developed and deployed several project subset standards that have already paved the road to NCEP OPS implementation standards. In this presentation we will discuss the EMC FEE for O2R requirements and approaches in collaborative standardization, including NCEPLIBS FEE and model code version control paired with the models' derived customized HPC modules and FEE footprints. We will share NCEP/EMC experience and potential in the refactoring of EMC development processes and legacy codes, and in securing model source code quality standards by using a combination of the Eclipse IDE, integrated with the

  20. An Active Lattice Model in a Bayesian Framework

    DEFF Research Database (Denmark)

    Carstensen, Jens Michael

    1996-01-01

    A Markov Random Field is used as a structural model of a deformable rectangular lattice. When used as a template prior in a Bayesian framework, this model is powerful for making inferences about lattice structures in images. The model assigns maximum probability to the perfect regular lattice by penalizing deviations in alignment and lattice node distance. The Markov random field represents prior knowledge about the lattice structure, and through an observation model that incorporates the visual appearance of the nodes, we can simulate realizations from the posterior distribution. A maximum a posteriori (MAP) estimate, found by simulated annealing, is used as the reconstructed lattice. The model was developed as a central part of an algorithm for automatic analysis of genetic experiments, positioned in a lattice structure by a robot. The algorithm has been successfully applied to many images ...
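
    A toy version of such a lattice prior is sketched below: an energy that penalizes deviations from a nominal node spacing and from straight rows and columns, of the kind a simulated-annealing MAP search would minimize together with an observation term. The grid size and penalty weights are illustrative only.

```python
import numpy as np

def lattice_energy(nodes, spacing=10.0, w_dist=1.0, w_align=1.0):
    """
    Prior energy of a deformable lattice. `nodes` has shape (rows, cols, 2).
    Penalizes neighbour distances that deviate from `spacing` and bending
    (misalignment) along rows and columns.
    """
    e = 0.0
    for d_ax in (0, 1):
        # distance terms between vertical (axis 0) and horizontal (axis 1) neighbours
        diff = np.diff(nodes, axis=d_ax)
        e += w_dist * np.sum((np.linalg.norm(diff, axis=-1) - spacing) ** 2)
        # alignment terms: second differences vanish on a perfectly regular lattice
        if nodes.shape[d_ax] > 2:
            e += w_align * np.sum(np.diff(nodes, n=2, axis=d_ax) ** 2)
    return e

# A perfect 5x5 lattice has lower energy than a jittered one
rows, cols = 5, 5
grid = np.stack(np.meshgrid(np.arange(cols), np.arange(rows)), axis=-1) * 10.0
jittered = grid + np.random.default_rng(1).normal(0.0, 1.0, grid.shape)
print(lattice_energy(grid), lattice_energy(jittered))
```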

  1. How much cryosphere model complexity is just right? Exploration using the conceptual cryosphere hydrology framework

    Directory of Open Access Journals (Sweden)

    T. M. Mosier

    2016-09-01

    Full Text Available Making meaningful projections of the impacts that possible future climates would have on water resources in mountain regions requires understanding how cryosphere hydrology model performance changes under altered climate conditions and when the model is applied to ungaged catchments. Further, if we are to develop better models, we must understand which specific process representations limit model performance. This article presents a modeling tool, named the Conceptual Cryosphere Hydrology Framework (CCHF), that enables implementing and evaluating a wide range of cryosphere modeling hypotheses. The CCHF represents cryosphere hydrology systems using a set of coupled process modules that allows easily interchanging individual module representations and includes analysis tools to evaluate model outputs. CCHF version 1 (Mosier, 2016) implements model formulations that require only precipitation and temperature as climate inputs – for example variations on simple degree-index (SDI) or enhanced temperature index (ETI) formulations – because these model structures are often applied in data-sparse mountain regions, and perform relatively well over short periods, but their calibration is known to change based on climate and geography. Using CCHF, we implement seven existing and novel models, including one existing SDI model, two existing ETI models, and four novel models that utilize a combination of existing and novel module representations. The novel module representations include a heat transfer formulation with net longwave radiation and a snowpack internal energy formulation that uses an approximation of the cold content. We assess the models for the Gulkana and Wolverine glaciated watersheds in Alaska, which have markedly different climates and contain long-term US Geological Survey benchmark glaciers. Overall we find that the best performing models are those that are more physically consistent and representative, but no single model performs
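
    The two simplest melt formulations the framework interchanges can be written in a few lines: a simple degree-index (SDI) model and an enhanced temperature-index (ETI) model that adds an absorbed-shortwave term. The coefficient values below are typical-order placeholders, not calibrated values for the Gulkana or Wolverine watersheds.

```python
def melt_sdi(temp_c, ddf=4.0):
    """Simple degree-index melt (mm w.e. per day); ddf in mm/(deg C day)."""
    return max(0.0, ddf * temp_c)

def melt_eti(temp_c, sw_in, albedo, mf=2.0, srf=0.01):
    """
    Enhanced temperature-index melt (mm w.e. per day): a temperature term plus
    an absorbed-shortwave term (sw_in in W/m^2).
    """
    if temp_c <= 0.0:
        return 0.0
    return mf * temp_c + srf * (1.0 - albedo) * sw_in

# Same warm, sunny day under the two interchangeable module choices
print(melt_sdi(5.0))                           # 20.0 mm w.e.
print(melt_eti(5.0, sw_in=250.0, albedo=0.6))  # 10.0 + 1.0 = 11.0 mm w.e.
```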

  2. Description of the new version 4.0 of the tritium model UFOTRI including user guide

    International Nuclear Information System (INIS)

    Raskob, W.

    1993-08-01

    In view of the future operation of fusion reactors, the release of tritium may play a dominant role during normal operation as well as after accidents. Because of its physical and chemical properties, which differ significantly from those of other radionuclides, the model UFOTRI has been developed for assessing the radiological consequences of accidental tritium releases. It describes the behaviour of tritium in the biosphere and calculates the radiological impact on individuals and the population due to direct exposure and the ingestion pathways. Processes such as the conversion of tritium gas into tritiated water (HTO) in the soil, re-emission after deposition, and the conversion of HTO into organically bound tritium are considered. The use of UFOTRI in its probabilistic mode shows the spectrum of the radiological impact together with the associated probability of occurrence. A first model version was established in 1991. As ongoing work investigating the main processes of tritium behaviour in the environment has produced new results, the model has been improved in several respects. The report describes the changes incorporated into the model since 1991. Additionally, it provides the updated user guide for handling the revised UFOTRI version, which will be distributed to interested organizations. (orig.) [de

  3. A conceptual and disease model framework for osteoporotic kyphosis.

    Science.gov (United States)

    Bayliss, M; Miltenburger, C; White, M; Alvares, L

    2013-09-01

    This paper presents a multi-method research project to develop a conceptual framework for measuring outcomes in studies of osteoporotic kyphosis. The research involved literature research and qualitative interviews among clinicians who treat patients with kyphosis and among patients with the condition. Kyphosis due to at least one vertebral compression fracture is prevalent among osteoporotic patients, resulting in well-documented symptoms and impact on functioning and well-being. A three-part study led to development of a conceptual measurement framework for comprehensive assessment of symptoms, impact, and treatment benefit for kyphosis. A literature-based disease model (DM) was developed and tested with physicians (n = 10) and patients (n = 10), and FDA guidelines were used to develop a final disease model and a conceptual framework. The DM included signs, symptoms, causes/triggers, exacerbations, and functional status associated with kyphosis. The DM was largely confirmed, but physicians and patients added several concepts related to impact on functioning, and some concepts were not confirmed and removed from the DM. This study confirms the need for more comprehensive assessment of health outcomes in kyphosis, as most current studies omit key concepts.

  4. Simulating the 2012 High Plains Drought Using Three Single Column Model Versions of the Community Earth System Model (SCM-CESM)

    Science.gov (United States)

    Medina, I. D.; Denning, S.

    2014-12-01

    The impact of changes in the frequency and severity of drought on fresh water sustainability is a great concern for many regions of the world. One such location is the High Plains, where the local economy is primarily driven by fresh water withdrawals from the Ogallala Aquifer, which accounts for approximately 30% of total irrigation withdrawals from all U.S. aquifers combined. Modeling studies that focus on the feedback mechanisms that control the climate and eco-hydrology during times of drought are limited in the sense that they use conventional General Circulation Models (GCMs) with grid length scales ranging from one hundred to several hundred kilometers. Additionally, these models utilize crude statistical parameterizations of cloud processes for estimating sub-grid fluxes of heat and moisture and have a poor representation of land surface heterogeneity. For this research, we focus on the 2012 High Plains drought, and will perform numerical simulations using three single column model versions of the Community Earth System Model (SCM-CESM) at multiple sites overlying the Ogallala Aquifer for the 2010-2012 period. In the first version of SCM-CESM, CESM will be used in standard mode (Community Atmospheric Model (CAM) coupled to a single instance of the Community Land Model (CLM)), secondly, CESM will be used in Super-Parameterized mode (SP-CESM), where a cloud resolving model (CRM consists of 32 atmospheric columns) replaces the standard CAM atmospheric parameterization and is coupled to a single instance of CLM, and thirdly, CESM is used in "Multi Instance" SP-CESM mode, where an instance of CLM is coupled to each CRM column of SP-CESM (32 CRM columns coupled to 32 instances of CLM). To assess the physical realism of the land-atmosphere feedbacks simulated at each site by all versions of SCM-CESM, differences in simulated energy and moisture fluxes will be computed between years for the 2010-2012 period, and will be compared to differences calculated using

  5. The ontology model of FrontCRM framework

    Science.gov (United States)

    Budiardjo, Eko K.; Perdana, Wira; Franshisca, Felicia

    2013-03-01

    Adoption and implementation of Customer Relationship Management (CRM) is not merely a technological installation; the emphasis is on the application of a customer-centric philosophy and culture as a whole. CRM must begin at the level of business strategy, the only level at which thorough organizational changes can be made. The change agenda can then be directed to departmental plans and supported by information technology. Work processes related to the CRM concept include marketing, sales, and services. FrontCRM is developed as a framework to guide the identification of business processes related to CRM, based on a strategic planning approach. This leads to the identification of processes and practices in every process area related to marketing, sales, and services. The ontology model presented in this paper serves as a tool to avoid misunderstanding of the framework, to define practices systematically within each process area, and to identify CRM software features related to those practices.

  6. Modeling Geomagnetic Variations using a Machine Learning Framework

    Science.gov (United States)

    Cheung, C. M. M.; Handmer, C.; Kosar, B.; Gerules, G.; Poduval, B.; Mackintosh, G.; Munoz-Jaramillo, A.; Bobra, M.; Hernandez, T.; McGranaghan, R. M.

    2017-12-01

    We present a framework for data-driven modeling of Heliophysics time series data. The Solar Terrestrial Interaction Neural net Generator (STING) is an open source python module built on top of state-of-the-art statistical learning frameworks (traditional machine learning methods as well as deep learning). To showcase the capability of STING, we deploy it for the problem of predicting the temporal variation of geomagnetic fields. The data used includes solar wind measurements from the OMNI database and geomagnetic field data taken by magnetometers at US Geological Survey observatories. We examine the predictive capability of different machine learning techniques (recurrent neural networks, support vector machines) for a range of forecasting times (minutes to 12 hours). STING is designed to be extensible to other types of data. We show how STING can be used on large sets of data from different sensors/observatories and adapted to tackle other problems in Heliophysics.
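
    A lagged-feature regression of the kind such a framework wraps can be sketched with scikit-learn: build input windows from a driver time series (e.g., solar wind measurements), predict the geomagnetic response some horizon ahead, and score on a held-out tail. All series, lags, and the choice of support vector regression here are illustrative placeholders.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
n = 2000
solar_wind = np.cumsum(rng.normal(size=n)) * 0.1                  # synthetic driver series
geomag = np.convolve(solar_wind, np.ones(10) / 10, mode="same") + rng.normal(scale=0.05, size=n)

def make_dataset(driver, target, n_lags=24, horizon=6):
    """Predict the target `horizon` steps ahead from the last `n_lags` driver samples."""
    X, y = [], []
    for t in range(n_lags, len(driver) - horizon):
        X.append(driver[t - n_lags:t])
        y.append(target[t + horizon])
    return np.array(X), np.array(y)

X, y = make_dataset(solar_wind, geomag)
split = int(0.8 * len(X))                                         # chronological train/test split
model = make_pipeline(StandardScaler(), SVR(C=10.0)).fit(X[:split], y[:split])
print("test R^2:", model.score(X[split:], y[split:]))
```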

  7. U.S. Nuclear Regulatory Commission Extremely Low Probability of Rupture pilot study: xLPR framework model user's guide

    International Nuclear Information System (INIS)

    Kalinich, Donald A.; Sallaberry, Cedric M.; Mattie, Patrick D.

    2010-01-01

    For the U.S. Nuclear Regulatory Commission (NRC) Extremely Low Probability of Rupture (xLPR) pilot study, Sandia National Laboratories (SNL) was tasked to develop and evaluate a probabilistic framework using a commercial software package for Version 1.0 of the xLPR Code. Version 1.0 of the xLPR code is focused on assessing the probability of rupture due to primary water stress corrosion cracking in dissimilar metal welds in pressurizer surge nozzles. Future versions of this framework will expand the capabilities to other cracking mechanisms and other piping systems for both pressurized water reactors and boiling water reactors. The goal of the pilot study project is to plan the xLPR framework transition from Version 1.0 to Version 2.0; hence the initial Version 1.0 framework and code development will be used to define the requirements for Version 2.0. The software documented in this report has been developed and tested solely for this purpose. This framework and demonstration problem will be used to evaluate the commercial software's capabilities and applicability for use in creating the final version of the xLPR framework. This report details the design, system requirements, and the steps necessary to use the commercial-code-based xLPR framework developed by SNL.

  8. Dynamic Computation of Change Operations in Version Management of Business Process Models

    Science.gov (United States)

    Küster, Jochen Malte; Gerth, Christian; Engels, Gregor

    Version management of business process models requires that changes can be resolved by applying change operations. In order to give a user maximal freedom concerning the application order of change operations, position parameters of change operations must be computed dynamically during change resolution. In such an approach, change operations with computed position parameters must be applicable on the model and dependencies and conflicts of change operations must be taken into account because otherwise invalid models can be constructed. In this paper, we study the concept of partially specified change operations where parameters are computed dynamically. We provide a formalization for partially specified change operations using graph transformation and provide a concept for their applicability. Based on this, we study potential dependencies and conflicts of change operations and show how these can be taken into account within change resolution. Using our approach, a user can resolve changes of business process models without being unnecessarily restricted to a certain order.

  9. Update of the Polar SWIFT model for polar stratospheric ozone loss (Polar SWIFT version 2)

    Science.gov (United States)

    Wohltmann, Ingo; Lehmann, Ralph; Rex, Markus

    2017-07-01

    The Polar SWIFT model is a fast scheme for calculating the chemistry of stratospheric ozone depletion in polar winter. It is intended for use in global climate models (GCMs) and Earth system models (ESMs) to enable the simulation of mutual interactions between the ozone layer and climate. To date, climate models often use prescribed ozone fields, since a full stratospheric chemistry scheme is computationally very expensive. Polar SWIFT is based on a set of coupled differential equations, which simulate the polar vortex-averaged mixing ratios of the key species involved in polar ozone depletion on a given vertical level. These species are O3, chemically active chlorine (ClOx), HCl, ClONO2 and HNO3. The only external input parameters that drive the model are the fraction of the polar vortex in sunlight and the fraction of the polar vortex below the temperatures necessary for the formation of polar stratospheric clouds. Here, we present an update of the Polar SWIFT model introducing several improvements over the original model formulation. In particular, the model is now trained on vortex-averaged reaction rates of the ATLAS Chemistry and Transport Model, which enables a detailed look at individual processes and an independent validation of the different parameterizations contained in the differential equations. The training of the original Polar SWIFT model was based on fitting complete model runs to satellite observations and did not allow for this. A revised formulation of the system of differential equations is developed, which closely fits vortex-averaged reaction rates from ATLAS that represent the main chemical processes influencing ozone. In addition, a parameterization for the HNO3 change by denitrification is included. The rates of change of the concentrations of the chemical species of the Polar SWIFT model are purely chemical rates of change in the new version, whereas in the original Polar SWIFT model, they included a transport effect caused by the
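
    The structure of such a scheme, a small set of coupled vortex-averaged ODEs driven by the sunlit fraction and the PSC-temperature fraction of the vortex, can be sketched with a standard ODE solver. The rate expressions and constants below are invented placeholders, not the fitted Polar SWIFT parameterizations.

```python
from scipy.integrate import solve_ivp

def rhs(t, y, f_sun, f_psc):
    """Toy vortex-averaged tendencies for O3, ClOx, HCl, ClONO2, HNO3 (arbitrary units)."""
    o3, clox, hcl, clono2, hno3 = y
    k_act = 0.05 * f_psc                 # heterogeneous chlorine activation on PSCs
    k_deact = 0.01 * f_sun               # deactivation of ClOx back into reservoirs
    activation = k_act * min(hcl, clono2)
    d_o3 = -0.02 * f_sun * clox * o3     # sunlight-driven catalytic ozone loss
    d_clox = activation - k_deact * clox
    d_hcl = -activation + 0.5 * k_deact * clox
    d_clono2 = -activation + 0.5 * k_deact * clox
    d_hno3 = -0.001 * f_psc * hno3       # crude denitrification term
    return [d_o3, d_clox, d_hcl, d_clono2, d_hno3]

y0 = [3.0, 0.0, 1.5, 1.0, 8.0]           # placeholder initial mixing ratios on one level
sol = solve_ivp(rhs, (0.0, 120.0), y0, args=(0.3, 0.6), max_step=1.0)
print(sol.y[:, -1])                      # state after 120 days of polar winter/spring
```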

  10. A Structural Model Decomposition Framework for Hybrid Systems Diagnosis

    Science.gov (United States)

    Daigle, Matthew; Bregon, Anibal; Roychoudhury, Indranil

    2015-01-01

    Nowadays, a large number of practical systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete modes of behavior, each defined by a set of continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task very challenging. In this work, we present a new modeling and diagnosis framework for hybrid systems. Models are composed from sets of user-defined components using a compositional modeling approach. Submodels for residual generation are then generated for a given mode, and reconfigured efficiently when the mode changes. Efficient reconfiguration is established by exploiting causality information within the hybrid system models. The submodels can then be used for fault diagnosis based on residual generation and analysis. We demonstrate the efficient causality reassignment, submodel reconfiguration, and residual generation for fault diagnosis using an electrical circuit case study.

  11. Generalized framework for context-specific metabolic model extraction methods

    Directory of Open Access Journals (Sweden)

    Semidán eRobaina Estévez

    2014-09-01

    Full Text Available Genome-scale metabolic models are increasingly applied to investigate the physiology not only of simple prokaryotes, but also of eukaryotes, such as plants, characterized by compartmentalized cells of multiple types. While genome-scale models aim at including the entirety of known metabolic reactions, mounting evidence has indicated that only a subset of these reactions is active in a given context, including developmental stage, cell type, or environment. As a result, several methods have been proposed to reconstruct context-specific models from existing genome-scale models by integrating various types of high-throughput data. Here we present a mathematical framework that puts all existing methods under one umbrella and provides the means to better understand their functioning, to highlight similarities and differences, and to help users select the most suitable method for a given application.

  12. Modelling multimedia teleservices with OSI upper layers framework: Short paper

    Science.gov (United States)

    Widya, I.; Vanrijssen, E.; Michiels, E.

    The paper presents the use of the concepts and modelling principles of the Open Systems Interconnection (OSI) upper layers structure in the modelling of multimedia teleservices, with emphasis on the revised Application Layer Structure (OSI/ALS). OSI/ALS is an object-based reference model intended to coordinate the development of application-oriented services and protocols in a consistent and modular way, enabling the rapid deployment and integrated use of these services. The paper further emphasizes the nesting structure defined in OSI/ALS, which allows the design of scalable and user-tailorable/controllable teleservices. OSI/ALS-consistent teleservices are, moreover, implementable on communication platforms of different capabilities. An analysis of distributed multimedia architectures found in the literature confirms the ability of the OSI/ALS framework to model the interworking functionalities of teleservices.

  13. A computational framework for modeling targets as complex adaptive systems

    Science.gov (United States)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.

  14. GOOSE Version 1.4: A powerful object-oriented simulation environment for developing reactor models

    International Nuclear Information System (INIS)

    Nypaver, D.J.; March-Leuba, C.; Abdalla, M.A.; Guimaraes, L.

    1992-01-01

    A prototype software package for a fully interactive Generalized Object-Oriented Simulation Environment (GOOSE) is being developed at Oak Ridge National Laboratory. Dynamic models are easily constructed and tested; fully interactive capabilities allow the user to alter model parameters and complexity without recompilation. This environment provides access to powerful tools such as numerical integration packages, graphical displays, and online help. In GOOSE, portability has been achieved by creating the environment in Objective-C, which is supported by a variety of platforms including UNIX and DOS. GOOSE Version 1.4 introduces new enhancements such as the capability of creating "initial," "dynamic," and "digital" methods. The object-oriented approach to simulation used in GOOSE combines the concept of modularity with the additional features of allowing precompilation, optimization, testing, and validation of individual modules. Once a library of classes has been defined and compiled, models can be built and modified without recompilation. GOOSE Version 1.4 is primarily command-line driven

  15. A business model design framework for viability : a business ecosystem approach

    NARCIS (Netherlands)

    D'Souza, Austin; Velthuijsen, Hugo; Wortmann, J.C.; Huitema, George

    2015-01-01

    Purpose: To facilitate the design of viable business models by proposing a novel business model design framework for viability. Design: A design science research method is adopted to develop a business model design framework for viability. The business model design framework for viability is

  16. A RETRAN-02 model of the Sizewell B PCSR design - the Winfrith one-loop model, version 3.0

    International Nuclear Information System (INIS)

    Kinnersly, S.R.

    1983-11-01

    A one-loop RETRAN-02 model of the Sizewell B Pre Construction Safety Report (PCSR) design, set up at Winfrith, is described and documented. The model is suitable for symmetrical pressurised transients. Comparison with data from the Sizewell B PCSR shows that the model is a good representation of that design. Known errors, limitations and deficiencies are described. The mode of storage and maintenance at Winfrith using PROMUS (Program Maintenance and Update System) is noted. It is recommended that users modify the standard data by adding replacement cards to the end so as to aid in identification, use and maintenance of local versions. (author)

  17. LQCD workflow execution framework: Models, provenance and fault-tolerance

    International Nuclear Information System (INIS)

    Piccoli, Luciano; Simone, James N; Kowalkowlski, James B; Dubey, Abhishek

    2010-01-01

    Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, specifically when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as participants. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete, data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities, called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first order predicate logic, enabling a dynamic management design that reduces manual administrative workload and increases cluster productivity.

  18. The Systems Biology Markup Language (SBML) Level 3 Package: Qualitative Models, Version 1, Release 1.

    Science.gov (United States)

    Chaouiya, Claudine; Keating, Sarah M; Berenguier, Duncan; Naldi, Aurélien; Thieffry, Denis; van Iersel, Martijn P; Le Novère, Nicolas; Helikar, Tomáš

    2015-09-04

    Quantitative methods for modelling biological networks require an in-depth knowledge of the biochemical reactions and their stoichiometric and kinetic parameters. In many practical cases, this knowledge is missing. This has led to the development of several qualitative modelling methods using information such as, for example, gene expression data coming from functional genomic experiments. The SBML Level 3 Version 1 Core specification does not provide a mechanism for explicitly encoding qualitative models, but it does provide a mechanism for SBML packages to extend the Core specification and add additional syntactical constructs. The SBML Qualitative Models package for SBML Level 3 adds features so that qualitative models can be directly and explicitly encoded. The approach taken in this package is essentially based on the definition of regulatory or influence graphs. The SBML Qualitative Models package defines the structure and syntax necessary to describe qualitative models that associate discrete levels of activities with entity pools and the transitions between states that describe the processes involved. This is particularly suited to logical models (Boolean or multi-valued) and some classes of Petri net models can be encoded with the approach.
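
    The class of logical models the package encodes, discrete activity levels for entity pools plus transitions given by regulatory functions, can be illustrated with a tiny Boolean network written in plain Python rather than SBML syntax; the entities and rules are hypothetical.

```python
from itertools import product

# Hypothetical regulatory graph: each entity's next level is a Boolean function
# of the current levels (0/1), analogous to transitions over entity pools.
rules = {
    "A": lambda s: s["C"],                  # A is activated by C
    "B": lambda s: s["A"] and not s["C"],   # B needs A and is repressed by C
    "C": lambda s: not s["B"],              # C is repressed by B
}

def step(state):
    """Synchronous update of all entity pools."""
    return {gene: int(rule(state)) for gene, rule in rules.items()}

def attractor(state, max_steps=50):
    """Iterate until a previously seen state recurs (steady state or cycle)."""
    seen = []
    while state not in seen and len(seen) < max_steps:
        seen.append(state)
        state = step(state)
    return seen[seen.index(state):] if state in seen else []

for initial in product((0, 1), repeat=3):
    s0 = dict(zip("ABC", initial))
    print(initial, "->", attractor(s0))
```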

  19. Spatial Modeling for Resources Framework (SMRF): A modular framework for developing spatial forcing data for snow modeling in mountain basins

    Science.gov (United States)

    Havens, Scott; Marks, Danny; Kormos, Patrick; Hedrick, Andrew

    2017-12-01

    In the Western US and many mountainous regions of the world, critical water resources and climate conditions are difficult to monitor because the observation network is generally very sparse. The critical resource from the mountain snowpack is water flowing into streams and reservoirs that will provide for irrigation, flood control, power generation, and ecosystem services. Water supply forecasting in a rapidly changing climate has become increasingly difficult because of non-stationary conditions. In response, operational water supply managers have begun to move from statistical techniques towards the use of physically based models. As we begin to transition physically based models from research to operational use, we must address the most difficult and time-consuming aspect of model initiation: the need for robust methods to develop and distribute the input forcing data. In this paper, we present a new open source framework, the Spatial Modeling for Resources Framework (SMRF), which automates and simplifies the common forcing data distribution methods. It is computationally efficient and can be implemented for both research and operational applications. We present an example of how SMRF is able to generate all of the forcing data required to run a physically based snow model at 50-100 m resolution over regions of 1000-7000 km2. The approach has been successfully applied in real time and historical applications for both the Boise River Basin in Idaho, USA and the Tuolumne River Basin in California, USA. These applications use meteorological station measurements and numerical weather prediction model outputs as input. SMRF has significantly streamlined the modeling workflow, decreased model setup time from weeks to days, and made near real-time application of a physically based snow model possible.
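
    One of the common distribution steps such a framework automates is interpolating station air temperature to a model grid with an elevation adjustment. The sketch below reduces station temperatures to a common datum with a lapse rate, interpolates by inverse-distance weighting, and re-adjusts to grid elevation; the stations, lapse rate, and grid are synthetic, and the method is a generic illustration rather than SMRF's specific implementation.

```python
import numpy as np

def distribute_temperature(stations, grid_x, grid_y, grid_elev,
                           lapse_rate=-0.0065, power=2.0):
    """
    stations: array of rows (x, y, elev, temp_c). Temperatures are first reduced
    to sea level with the lapse rate, interpolated by inverse-distance weighting,
    then re-adjusted to the grid-cell elevation.
    """
    sx, sy, selev, stemp = (stations[:, i] for i in range(4))
    t_sea = stemp - lapse_rate * selev                        # reduce to common datum
    dx = grid_x[..., None] - sx
    dy = grid_y[..., None] - sy
    w = 1.0 / np.maximum(np.hypot(dx, dy), 1.0) ** power      # IDW weights
    t_interp = (w * t_sea).sum(axis=-1) / w.sum(axis=-1)
    return t_interp + lapse_rate * grid_elev                  # back to grid elevation

# Three synthetic stations over a 100 m grid with a simple synthetic DEM
stations = np.array([[0.0, 0.0, 1500.0, -2.0],
                     [5000.0, 0.0, 2100.0, -6.0],
                     [0.0, 5000.0, 1800.0, -4.0]])
gx, gy = np.meshgrid(np.arange(0, 5000, 100.0), np.arange(0, 5000, 100.0))
gelev = 1500.0 + 0.01 * gx
print(distribute_temperature(stations, gx, gy, gelev).shape)
```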

  20. Exploring uncertainty and model predictive performance concepts via a modular snowmelt-runoff modeling framework

    Science.gov (United States)

    Tyler Jon Smith; Lucy Amanda Marshall

    2010-01-01

    Model selection is an extremely important aspect of many hydrologic modeling studies because of the complexity, variability, and uncertainty that surrounds the current understanding of watershed-scale systems. However, development and implementation of a complete precipitation-runoff modeling framework, from model selection to calibration and uncertainty analysis, are...

  1. An Integrated Framework Advancing Membrane Protein Modeling and Design.

    Directory of Open Access Journals (Sweden)

    Rebecca F Alford

    2015-09-01

    Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design.

  2. A Framework for Bioacoustic Vocalization Analysis Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Ebenezer Out-Nyarko

    2009-11-01

    Using Hidden Markov Models (HMMs) as a recognition framework for automatic classification of animal vocalizations has a number of benefits, including the ability to handle duration variability through nonlinear time alignment, the ability to incorporate complex language or recognition constraints, and easy extendibility to continuous recognition and detection domains. In this work, we apply HMMs to several different species and bioacoustic tasks using generalized spectral features that can be easily adjusted across species and HMM network topologies suited to each task. This experimental work includes a simple call type classification task using one HMM per vocalization for repertoire analysis of Asian elephants, a language-constrained song recognition task using syllable models as base units for ortolan bunting vocalizations, and a stress stimulus differentiation task in poultry vocalizations using a non-sequential model via a one-state HMM with Gaussian mixtures. Results show strong performance across all tasks and illustrate the flexibility of the HMM framework for a variety of species, vocalization types, and analysis tasks.
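
    The abstract does not name the HMM toolkit used; as a hedged illustration of the one-HMM-per-call-type classification scheme it describes, the sketch below uses the third-party hmmlearn package, with spectral feature extraction assumed to have been done elsewhere.

        import numpy as np
        from hmmlearn import hmm   # third-party HMM library, used here only for illustration

        def train_call_type_models(training_data, n_states=3):
            """Fit one Gaussian-emission HMM per call type.

            training_data maps a call-type label to a list of feature
            sequences, each of shape (n_frames, n_features)."""
            models = {}
            for call_type, sequences in training_data.items():
                X = np.vstack(sequences)
                lengths = [len(seq) for seq in sequences]
                model = hmm.GaussianHMM(n_components=n_states,
                                        covariance_type="diag", n_iter=20)
                model.fit(X, lengths)
                models[call_type] = model
            return models

        def classify(models, sequence):
            """Assign the label whose HMM gives the highest log-likelihood."""
            return max(models, key=lambda label: models[label].score(sequence))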

  3. A hybrid parallel framework for the cellular Potts model simulations

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yi [Los Alamos National Laboratory; He, Kejing [SOUTH CHINA UNIV; Dong, Shoubin [SOUTH CHINA UNIV

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximated, which cannot be used for large-scale, complex 3D simulations. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied the avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for the large-scale simulation (~10^8 sites) of complex collective behavior of numerous cells (~10^6).
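
    The parallel implementation itself is not reproduced in the abstract; the following minimal, serial Python sketch shows only the Metropolis-style Monte Carlo lattice update at the core of a cellular Potts simulation (adhesion energy only, hypothetical J and T parameters, no volume constraint and no MPI/OpenMP decomposition).

        import numpy as np

        NEIGHBOURS = [(-1, 0), (1, 0), (0, -1), (0, 1)]

        def cpm_sweep(lattice, J=1.0, T=1.0, rng=None):
            """One Monte Carlo sweep of a minimal 2D cellular Potts lattice.

            `lattice` holds integer cell IDs; J and T are hypothetical adhesion
            and temperature parameters. Periodic boundaries, adhesion energy only."""
            rng = rng or np.random.default_rng()
            ny, nx = lattice.shape
            for _ in range(lattice.size):
                y, x = int(rng.integers(ny)), int(rng.integers(nx))
                dy, dx = NEIGHBOURS[int(rng.integers(4))]
                source = lattice[(y + dy) % ny, (x + dx) % nx]
                target = lattice[y, x]
                if source == target:
                    continue
                # energy change from copying the neighbour's cell ID into (y, x)
                dH = 0.0
                for ddy, ddx in NEIGHBOURS:
                    nb = lattice[(y + ddy) % ny, (x + ddx) % nx]
                    dH += J * (int(nb != source) - int(nb != target))
                if dH <= 0 or rng.random() < np.exp(-dH / T):
                    lattice[y, x] = source
            return lattice

        # Two square "cells" on a small periodic lattice
        grid = np.zeros((32, 32), dtype=int)
        grid[8:16, 8:16] = 1
        grid[16:24, 16:24] = 2
        grid = cpm_sweep(grid)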

  4. A constitutive model for magnetostriction based on thermodynamic framework

    International Nuclear Information System (INIS)

    Ho, Kwangsoo

    2016-01-01

    This work presents a general framework for the continuum-based formulation of dissipative materials with magneto–mechanical coupling in the viewpoint of irreversible thermodynamics. The thermodynamically consistent model developed for the magnetic hysteresis is extended to include the magnetostrictive effect. The dissipative and hysteretic response of magnetostrictive materials is captured through the introduction of internal state variables. The evolution rate of magnetostrictive strain as well as magnetization is derived from thermodynamic and dissipative potentials in accordance with the general principles of thermodynamics. It is then demonstrated that the constitutive model is competent to describe the magneto-mechanical behavior by comparing simulation results with the experimental data reported in the literature. - Highlights: • A thermodynamically consistent model is proposed to describe the magneto-mechanical coupling effect. • Internal state variables are introduced to capture the dissipative material response. • The evolution rate of the magnetostrictive strain is derived through thermodynamic and dissipation potentials.

  5. A framework for quantifying net benefits of alternative prognostic models

    DEFF Research Database (Denmark)

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting.

  6. CIMS: A FRAMEWORK FOR INFRASTRUCTURE INTERDEPENDENCY MODELING AND ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Donald D. Dudenhoeffer; May R. Permann; Milos Manic

    2006-12-01

    Today’s society relies greatly upon an array of complex national and international infrastructure networks such as transportation, utilities, telecommunication, and even financial networks. While modeling and simulation tools have provided insight into the behavior of individual infrastructure networks, a far less understood area is that of the interrelationships among multiple infrastructure networks including the potential cascading effects that may result due to these interdependencies. This paper first describes infrastructure interdependencies as well as presenting a formalization of interdependency types. Next the paper describes a modeling and simulation framework called CIMS© and the work that is being conducted at the Idaho National Laboratory (INL) to model and simulate infrastructure interdependencies and the complex behaviors that can result.

  7. A Framework for Modeling Emerging Diseases to Inform Management.

    Science.gov (United States)

    Russell, Robin E; Katz, Rachel A; Richgels, Katherine L D; Walsh, Daniel P; Grant, Evan H C

    2017-01-01

    The rapid emergence and reemergence of zoonotic diseases requires the ability to rapidly evaluate and implement optimal management decisions. Actions to control or mitigate the effects of emerging pathogens are commonly delayed because of uncertainty in the estimates and the predicted outcomes of the control tactics. The development of models that describe the best-known information regarding the disease system at the early stages of disease emergence is an essential step for optimal decision-making. Models can predict the potential effects of the pathogen, provide guidance for assessing the likelihood of success of different proposed management actions, quantify the uncertainty surrounding the choice of the optimal decision, and highlight critical areas for immediate research. We demonstrate how to develop models that can be used as a part of a decision-making framework to determine the likelihood of success of different management actions given current knowledge.

  8. The Application of Architecture Frameworks to Modelling Exploration Operations Costs

    Science.gov (United States)

    Shishko, Robert

    2006-01-01

    Developments in architectural frameworks and system-of-systems thinking have provided useful constructs for systems engineering. DoDAF concepts, language, and formalisms, in particular, provide a natural way of conceptualizing an operations cost model applicable to NASA's space exploration vision. Not all DoDAF products have meaning or apply to a DoDAF inspired operations cost model, but this paper describes how such DoDAF concepts as nodes, systems, and operational activities relate to the development of a model to estimate exploration operations costs. The paper discusses the specific implementation to the Mission Operations Directorate (MOD) operational functions/activities currently being developed and presents an overview of how this powerful representation can apply to robotic space missions as well.

  9. Landfill Gas Energy Cost Model Version 3.0 (LFGcost-Web V3 ...

    Science.gov (United States)

    To help stakeholders estimate the costs of a landfill gas (LFG) energy project, in 2002, LMOP developed a cost tool (LFGcost). Since then, LMOP has routinely updated the tool to reflect changes in the LFG energy industry. Initially the model was designed for EPA to assist landfills in evaluating the economic and financial feasibility of LFG energy project development. In 2014, LMOP developed a public version of the model, LFGcost-Web (Version 3.0), to allow landfill and industry stakeholders to evaluate project feasibility on their own. LFGcost-Web can analyze costs for 12 energy recovery project types. These project costs can be estimated with or without the costs of a gas collection and control system (GCCS). The EPA used select equations from LFGcost-Web to estimate costs of the regulatory options in the 2015 proposed revisions to the MSW Landfills Standards of Performance (also known as New Source Performance Standards) and the Emission Guidelines (hereinafter referred to collectively as the Landfill Rules). More specifically, equations derived from LFGcost-Web were applied to each landfill expected to be impacted by the Landfill Rules to estimate annualized installed capital costs and annual O&M costs of a gas collection and control system. In addition, after applying the LFGcost-Web equations to the list of landfills expected to require a GCCS in year 2025 as a result of the proposed Landfill Rules, the regulatory analysis evaluated whether electr
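
    The LFGcost-Web equations are not given in the abstract; as a generic engineering-economics illustration of how an installed capital cost can be annualized and combined with O&M costs, the sketch below uses a standard capital recovery factor with hypothetical default values.

        def annualized_cost(capital_cost, annual_om, discount_rate=0.07, lifetime_yr=15):
            """Annualize an installed capital cost and add annual O&M.

            Generic capital-recovery-factor calculation; the actual LFGcost-Web
            equations are not reproduced here, and the default rate and
            lifetime are hypothetical placeholders."""
            i, n = discount_rate, lifetime_yr
            crf = i * (1.0 + i) ** n / ((1.0 + i) ** n - 1.0)   # capital recovery factor
            return capital_cost * crf + annual_om

        # e.g. a hypothetical $2.5M gas collection and control system with $150k/yr O&M
        total_annual_cost = annualized_cost(2.5e6, 1.5e5)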

  10. A Framework for Uplink Intercell Interference Modeling with Channel-Based Scheduling

    KAUST Repository

    Tabassum, Hina; Yilmaz, Ferkan; Dawy, Zaher; Alouini, Mohamed-Slim

    2012-01-01

    This paper presents a novel framework for modeling the uplink intercell interference(ICI) in a multiuser cellular network. The proposed framework assists in quantifying the impact of various fading channel models and state-of-the-art scheduling

  11. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (IBM PC VERSION)

    Science.gov (United States)

    Manning, R. M.

    1994-01-01

    The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal
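
    The LeRC-SLAM code and its internal rain-rate database are not reproduced here; as a hedged illustration of the log-normal fade statistics on which the ACTS model is based, the following fragment computes an exceedance probability with scipy, using entirely hypothetical parameters.

        from scipy import stats

        def fade_exceedance_probability(attenuation_db, median_db, sigma_ln):
            """P(attenuation > attenuation_db) under a log-normal fade model.

            median_db and sigma_ln (std. dev. of ln attenuation) are
            hypothetical placeholders; LeRC-SLAM derives the corresponding
            statistics from its internal rain-rate database."""
            dist = stats.lognorm(s=sigma_ln, scale=median_db)   # scale = exp(mu) = median
            return dist.sf(attenuation_db)

        # Probability that the link fade exceeds 10 dB, assuming a 3 dB median
        # attenuation and sigma_ln = 1.0
        p_fade = fade_exceedance_probability(10.0, median_db=3.0, sigma_ln=1.0)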

  12. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (MACINTOSH VERSION)

    Science.gov (United States)

    Manning, R. M.

    1994-01-01

    The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal

  13. Reconstructions of f(T) gravity from entropy-corrected holographic and new agegraphic dark energy models in power-law and logarithmic versions

    Energy Technology Data Exchange (ETDEWEB)

    Saha, Pameli; Debnath, Ujjal [Indian Institute of Engineering Science and Technology, Department of Mathematics, Howrah (India)

    2016-09-15

    Here, we examine the cosmological use of the most promising candidates of dark energy in the framework of f(T) gravity theory, where T represents the torsion scalar of teleparallel gravity. We reconstruct the different f(T) modified gravity models in the spatially flat Friedmann-Robertson-Walker universe according to entropy-corrected versions of the holographic and new agegraphic dark energy models with power-law and logarithmic corrections, which describe an accelerated expansion history of the universe. We conclude that the equation of state parameter of the entropy-corrected models can transit from the quintessence state to the phantom regime as indicated by recent observations, or can lie entirely in the phantom region. Also, using these models, we investigate the stability of the different cases with the help of the squared speed of sound. (orig.)

  14. Integrated Medical Model (IMM) Optimization Version 4.0 Functional Improvements

    Science.gov (United States)

    Arellano, John; Young, M.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Goodenow, D. A.; Myers, J. G.

    2016-01-01

    The IMM's ability to assess mission outcome risk levels relative to available resources provides a unique capability to guide the selection of optimal operational medical kit and vehicle resources. Post-processing optimization allows IMM to optimize essential resources to improve a specific model outcome such as maximization of the Crew Health Index (CHI), or minimization of the probability of evacuation (EVAC) or the loss of crew life (LOCL). Mass and/or volume constrain the optimized resource set. The IMM's probabilistic simulation uses input data on one hundred medical conditions to simulate medical events that may occur in spaceflight, the resources required to treat those events, and the resulting impact on the mission based on specific crew and mission characteristics. Because IMM version 4.0 provides for partial treatment of medical events, IMM Optimization 4.0 scores resources at the individual resource unit increment level, as opposed to the full condition-specific treatment set level used in version 3.0. This allows the inclusion of as many resources as possible in the event that an entire set of resources called out for treatment cannot satisfy the constraints. IMM Optimization version 4.0 adds capabilities that increase efficiency by creating multiple resource sets based on differing constraints and priorities (CHI, EVAC, or LOCL). It also provides resource sets that improve mission-related IMM v4.0 outputs, with better performance than the prior optimization. The new optimization offers much improved fidelity, increasing the utility of IMM 4.0 for decision support.
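
    The optimization algorithm inside IMM Optimization 4.0 is not described in the abstract; purely as a toy illustration of selecting resources under a mass constraint, the sketch below applies a greedy knapsack-style heuristic with hypothetical item names and scores.

        def select_resources(items, mass_limit_kg):
            """Greedy selection of medical-kit items under a mass constraint.

            items: list of (name, benefit_score, mass_kg) tuples. A toy
            knapsack-style heuristic for illustration only; it is not the
            algorithm used inside IMM Optimization 4.0."""
            chosen, remaining = [], mass_limit_kg
            for name, benefit, mass in sorted(items, key=lambda it: it[1] / it[2],
                                              reverse=True):
                if mass <= remaining:
                    chosen.append(name)
                    remaining -= mass
            return chosen

        kit = select_resources(
            [("analgesic pack", 8.0, 0.4), ("IV fluid set", 9.5, 2.1),
             ("splint set", 4.0, 1.2), ("suture kit", 6.5, 0.3)],
            mass_limit_kg=3.0)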

  15. Development of a distributed air pollutant dry deposition modeling framework

    International Nuclear Information System (INIS)

    Hirabayashi, Satoshi; Kroll, Charles N.; Nowak, David J.

    2012-01-01

    A distributed air pollutant dry deposition modeling system was developed with a geographic information system (GIS) to enhance the functionality of i-Tree Eco (i-Tree, 2011). With the developed system, temperature, leaf area index (LAI) and air pollutant concentration in a spatially distributed form can be estimated, and based on these and other input variables, dry deposition of carbon monoxide (CO), nitrogen dioxide (NO2), sulfur dioxide (SO2), and particulate matter less than 10 microns (PM10) to trees can be spatially quantified. Employing nationally available road network, traffic volume, air pollutant emission/measurement and meteorological data, the developed system provides a framework for the U.S. city managers to identify spatial patterns of urban forest and locate potential areas for future urban forest planting and protection to improve air quality. To exhibit the usability of the framework, a case study was performed for July and August of 2005 in Baltimore, MD. - Highlights: ► A distributed air pollutant dry deposition modeling system was developed. ► The developed system enhances the functionality of i-Tree Eco. ► The developed system employs nationally available input datasets. ► The developed system is transferable to any U.S. city. ► Future planting and protection spots were visually identified in a case study. - Employing nationally available datasets and a GIS, this study will provide urban forest managers in U.S. cities a framework to quantify and visualize urban forest structure and its air pollution removal effect.

  16. Reconfigurable Model Execution in the OpenMDAO Framework

    Science.gov (United States)

    Hwang, John T.

    2017-01-01

    NASA's OpenMDAO framework facilitates constructing complex models and computing their derivatives for multidisciplinary design optimization. Decomposing a model into components that follow a prescribed interface enables OpenMDAO to assemble multidisciplinary derivatives from the component derivatives using what amounts to the adjoint method, direct method, chain rule, global sensitivity equations, or any combination thereof, via the MAUD architecture. OpenMDAO also handles the distribution of processors among the disciplines by hierarchically grouping the components, and it automates the data transfer between components that are on different processors. These features have made OpenMDAO useful for applications in aircraft design, satellite design, wind turbine design, and aircraft engine design, among others. This paper presents new algorithms for OpenMDAO that enable reconfigurable model execution. This concept refers to dynamically changing, during execution, one or more of: the variable sizes, solution algorithm, parallel load balancing, or set of variables, i.e., adding and removing components, perhaps to switch to a higher-fidelity sub-model. Any component can reconfigure at any point, even when running in parallel with other components, and the reconfiguration algorithm presented here performs the synchronized updates to all other components that are affected. A reconfigurable software framework for multidisciplinary design optimization enables new adaptive solvers, adaptive parallelization, and new applications such as gradient-based optimization with overset flow solvers and adaptive mesh refinement. Benchmarking results demonstrate the time savings for reconfiguration compared to setting up the model again from scratch, which can be significant in large-scale problems. Additionally, the new reconfigurability feature is applied to a mission profile optimization problem for commercial aircraft where both the parametrization of the mission profile and the

  17. Validity study of the Beck Anxiety Inventory (Portuguese version by the Rasch Rating Scale model

    Directory of Open Access Journals (Sweden)

    Sónia Quintão

    2013-01-01

    Our objective was to conduct a validation study of the Portuguese version of the Beck Anxiety Inventory (BAI) by means of the Rasch Rating Scale Model, and then compare it with the most used scales of anxiety in Portugal. The sample consisted of 1,160 adults (427 men and 733 women), aged 18-82 years old (M=33.39; SD=11.85). Instruments were the Beck Anxiety Inventory, State-Trait Anxiety Inventory and Zung Self-Rating Anxiety Scale. It was found that the Beck Anxiety Inventory's system of four categories, the data-model fit, and person reliability were adequate. The measure can be considered unidimensional. Gender and age-related differences were not a threat to validity. The BAI correlated significantly with other anxiety measures. In conclusion, the BAI shows good psychometric quality.
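
    As a reminder of the model underlying the analysis (not the authors' estimation software), the following sketch evaluates category probabilities for a polytomous item under the standard Andrich rating scale formulation, with hypothetical person, item, and threshold parameters.

        import numpy as np

        def rating_scale_probabilities(theta, delta, thresholds):
            """Category probabilities under the Andrich rating scale model.

            theta: person location; delta: item difficulty; thresholds: the
            category thresholds tau_1..tau_M (tau_0 is fixed at 0). Textbook
            form only, not the estimation code used in the study."""
            tau = np.concatenate(([0.0], np.asarray(thresholds, dtype=float)))
            exponents = np.cumsum(theta - delta - tau)      # one entry per category 0..M
            stabilized = np.exp(exponents - exponents.max())
            return stabilized / stabilized.sum()

        # A 4-category (0-3) BAI-style item with hypothetical parameters
        probabilities = rating_scale_probabilities(theta=0.5, delta=-0.2,
                                                   thresholds=[-1.0, 0.1, 1.3])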

  18. New Source Term Model for the RESRAD-OFFSITE Code Version 3

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States); Gnanapragasam, Emmanuel [Argonne National Lab. (ANL), Argonne, IL (United States); Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Shih-Yew [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-06-01

    This report documents the new source term model developed and implemented in Version 3 of the RESRAD-OFFSITE code. This new source term model includes: (1) "first order release with transport" option, in which the release of the radionuclide is proportional to the inventory in the primary contamination and the user-specified leach rate is the proportionality constant, (2) "equilibrium desorption release" option, in which the user specifies the distribution coefficient which quantifies the partitioning of the radionuclide between the solid and aqueous phases, and (3) "uniform release" option, in which the radionuclides are released from a constant fraction of the initially contaminated material during each time interval and the user specifies the duration over which the radionuclides are released.
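
    As a schematic reading of the "first order release with transport" option described above (not the RESRAD-OFFSITE source code), the sketch below computes an inventory depleted by leaching and decay and the corresponding release rate.

        import numpy as np

        def first_order_release(initial_inventory, leach_rate, decay_constant, times):
            """Inventory and release rate for a first-order release source term.

            Release rate R(t) = leach_rate * A(t), with the inventory depleted
            by both leaching and radioactive decay:
                A(t) = A0 * exp(-(leach_rate + decay_constant) * t)
            Schematic only; not the RESRAD-OFFSITE implementation."""
            t = np.asarray(times, dtype=float)
            inventory = initial_inventory * np.exp(-(leach_rate + decay_constant) * t)
            release_rate = leach_rate * inventory
            return inventory, release_rate

        # e.g. 1 Ci of a nuclide with a 30-year half-life and a 0.05 per-year leach rate
        years = np.linspace(0.0, 100.0, 101)
        inventory, release = first_order_release(1.0, 0.05, np.log(2.0) / 30.0, years)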

  19. The SGHWR version of the Monte Carlo code W-MONTE. Part 1. The theoretical model

    International Nuclear Information System (INIS)

    Allen, F.R.

    1976-03-01

    W-MONTE provides a multi-group model of neutron transport in the exact geometry of a reactor lattice using Monte Carlo methods. It is currently restricted to uniform axial properties. Material data is normally obtained from a preliminary WIMS lattice calculation in the transport group structure. The SGHWR version has been required for analysis of zero energy experiments and special aspects of power reactor lattices, such as the unmoderated lattice region above the moderator when drained to dump height. Neutron transport is modelled for a uniform infinite lattice, simultaneously treating the cases of no leakage, radial leakage or axial leakage only, and the combined effects of radial and axial leakage. Multigroup neutron balance edits are incorporated for the separate effects of radial and axial leakage to facilitate the analysis of leakage and to provide effective diffusion theory parameters for core representation in reactor cores. (author)

  20. The Gtr-Model a Universal Framework for Quantum-Like Measurements

    Science.gov (United States)

    Aerts, Diederik; Bianchi, Massimiliano Sassoli De

    We present a very general geometrico-dynamical description of physical or more abstract entities, called the general tension-reduction (GTR) model, where not only states, but also measurement-interactions can be represented, and the associated outcome probabilities calculated. Underlying the model is the hypothesis that indeterminism manifests as a consequence of unavoidable fluctuations in the experimental context, in accordance with the hidden-measurements interpretation of quantum mechanics. When the structure of the state space is Hilbertian, and measurements are of the universal kind, i.e., are the result of an average over all possible ways of selecting an outcome, the GTR-model provides the same predictions as the Born rule, and therefore provides a natural completed version of quantum mechanics. However, when the structure of the state space is non-Hilbertian and/or not all possible ways of selecting an outcome are available to be actualized, the predictions of the model generally differ from the quantum ones, especially when sequential measurements are considered. Some paradigmatic examples will be discussed, taken from physics and human cognition. Particular attention will be given to some known psychological effects, like question order effects and response replicability, which we show are able to generate non-Hilbertian statistics. We also suggest a realistic interpretation of the GTR-model, when applied to human cognition and decision, which we think could become the generally adopted interpretative framework in quantum cognition research.

  1. Comparison of three ice cloud optical schemes in climate simulations with community atmospheric model version 5

    Science.gov (United States)

    Zhao, Wenjie; Peng, Yiran; Wang, Bin; Yi, Bingqi; Lin, Yanluan; Li, Jiangnan

    2018-05-01

    A newly implemented Baum-Yang scheme for simulating ice cloud optical properties is compared with existing schemes (Mitchell and Fu schemes) in a standalone radiative transfer model and in the global climate model (GCM) Community Atmospheric Model Version 5 (CAM5). This study systematically analyzes the effect of different ice cloud optical schemes on global radiation and climate by a series of simulations with a simplified standalone radiative transfer model, atmospheric GCM CAM5, and a comprehensive coupled climate model. Results from the standalone radiative model show that Baum-Yang scheme yields generally weaker effects of ice cloud on temperature profiles both in shortwave and longwave spectrum. CAM5 simulations indicate that Baum-Yang scheme in place of Mitchell/Fu scheme tends to cool the upper atmosphere and strengthen the thermodynamic instability in low- and mid-latitudes, which could intensify the Hadley circulation and dehydrate the subtropics. When CAM5 is coupled with a slab ocean model to include simplified air-sea interaction, reduced downward longwave flux to surface in Baum-Yang scheme mitigates ice-albedo feedback in the Arctic as well as water vapor and cloud feedbacks in low- and mid-latitudes, resulting in an overall temperature decrease by 3.0/1.4 °C globally compared with Mitchell/Fu schemes. Radiative effect and climate feedback of the three ice cloud optical schemes documented in this study can be referred for future improvements on ice cloud simulation in CAM5.

  2. Immersion freezing by natural dust based on a soccer ball model with the Community Atmospheric Model version 5: climate effects

    Science.gov (United States)

    Wang, Yong; Liu, Xiaohong

    2014-12-01

    We introduce a simplified version of the soccer ball model (SBM) developed by Niedermeier et al (2014 Geophys. Res. Lett. 41 736-741) into the Community Atmospheric Model version 5 (CAM5). It is the first time that SBM is used in an atmospheric model to parameterize the heterogeneous ice nucleation. The SBM, which was simplified for its suitable application in atmospheric models, uses the classical nucleation theory to describe the immersion/condensation freezing by dust in the mixed-phase cloud regime. Uncertain parameters (mean contact angle, standard deviation of contact angle probability distribution, and number of surface sites) in the SBM are constrained by fitting them to recent natural dust (Saharan dust) datasets. With the SBM in CAM5, we investigate the sensitivity of modeled cloud properties to the SBM parameters, and find significant seasonal and regional differences in the sensitivity among the three SBM parameters. Changes of mean contact angle and the number of surface sites lead to changes of cloud properties in Arctic in spring, which could be attributed to the transport of dust ice nuclei to this region. In winter, significant changes of cloud properties induced by these two parameters mainly occur in northern hemispheric mid-latitudes (e.g., East Asia). In comparison, no obvious changes of cloud properties caused by changes of standard deviation can be found in all the seasons. These results are valuable for understanding the heterogeneous ice nucleation behavior, and useful for guiding the future model developments.

  3. Immersion freezing by natural dust based on a soccer ball model with the Community Atmospheric Model version 5: climate effects

    International Nuclear Information System (INIS)

    Wang, Yong; Liu, Xiaohong

    2014-01-01

    We introduce a simplified version of the soccer ball model (SBM) developed by Niedermeier et al (2014 Geophys. Res. Lett. 41 736–741) into the Community Atmospheric Model version 5 (CAM5). It is the first time that SBM is used in an atmospheric model to parameterize the heterogeneous ice nucleation. The SBM, which was simplified for its suitable application in atmospheric models, uses the classical nucleation theory to describe the immersion/condensation freezing by dust in the mixed-phase cloud regime. Uncertain parameters (mean contact angle, standard deviation of contact angle probability distribution, and number of surface sites) in the SBM are constrained by fitting them to recent natural dust (Saharan dust) datasets. With the SBM in CAM5, we investigate the sensitivity of modeled cloud properties to the SBM parameters, and find significant seasonal and regional differences in the sensitivity among the three SBM parameters. Changes of mean contact angle and the number of surface sites lead to changes of cloud properties in Arctic in spring, which could be attributed to the transport of dust ice nuclei to this region. In winter, significant changes of cloud properties induced by these two parameters mainly occur in northern hemispheric mid-latitudes (e.g., East Asia). In comparison, no obvious changes of cloud properties caused by changes of standard deviation can be found in all the seasons. These results are valuable for understanding the heterogeneous ice nucleation behavior, and useful for guiding the future model developments. (letter)

  4. A python framework for environmental model uncertainty analysis

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a Python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
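
    A minimal FOSM (Schur-complement) workflow with pyEMU might look like the sketch below; the file names and forecast names are placeholders, and the call signatures are written from the pyEMU documentation as recalled, so they should be checked against the online repository.

        import pyemu

        # File and forecast names below are placeholders; the forecasts are assumed
        # to be listed as observations in the PEST control file.
        sc = pyemu.Schur(jco="model.jcb", pst="model.pst",
                         forecasts=["headwater_flux", "river_stage"])

        # Prior vs. posterior (FOSM) parameter uncertainty
        parameter_summary = sc.get_parameter_summary()

        # Prior vs. posterior forecast uncertainty
        forecast_summary = sc.get_forecast_summary()
        print(forecast_summary)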

  5. Incorporating remote sensing-based ET estimates into the Community Land Model version 4.5

    Directory of Open Access Journals (Sweden)

    D. Wang

    2017-07-01

    Land surface models bear substantial biases in simulating surface water and energy budgets despite the continuous development and improvement of model parameterizations. To reduce model biases, Parr et al. (2015) proposed a method incorporating satellite-based evapotranspiration (ET) products into land surface models. Here we apply this bias correction method to the Community Land Model version 4.5 (CLM4.5) and test its performance over the conterminous US (CONUS). We first calibrate a relationship between the observational ET from the Global Land Evaporation Amsterdam Model (GLEAM) product and the model ET from CLM4.5, and assume that this relationship holds beyond the calibration period. During the validation or application period, a simulation using the default CLM4.5 (CLM) is conducted first, and its output is combined with the calibrated observational-vs.-model ET relationship to derive a corrected ET; an experiment (CLMET) is then conducted in which the model-generated ET is overwritten with the corrected ET. Using the observations of ET, runoff, and soil moisture content as benchmarks, we demonstrate that CLMET greatly improves the hydrological simulations over most of the CONUS, and the improvement is stronger in the eastern CONUS than the western CONUS and is strongest over the Southeast CONUS. For any specific region, the degree of the improvement depends on whether the relationship between observational and model ET remains time-invariant (a fundamental hypothesis of the Parr et al. (2015) method) and whether water is the limiting factor in places where ET is underestimated. While the bias correction method improves hydrological estimates without improving the physical parameterization of land surface models, results from this study do provide guidance for physically based model development effort.
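
    As a hedged sketch of the calibrate-then-correct idea summarized above (not the actual CLM4.5/CLMET code, which operates inside the land model time loop and may use a different functional form), a simple linear calibration could look like this:

        import numpy as np

        def calibrate_et_relation(model_et, obs_et):
            """Fit obs_ET ~ a * model_ET + b over the calibration period."""
            a, b = np.polyfit(model_et, obs_et, deg=1)
            return a, b

        def correct_et(model_et, a, b):
            """Apply the calibrated relation to model ET from the application period."""
            return a * np.asarray(model_et, dtype=float) + b

        # Calibration period (hypothetical ET values, mm/day)
        a, b = calibrate_et_relation(model_et=np.array([2.1, 3.4, 4.0, 2.8]),
                                     obs_et=np.array([2.6, 3.9, 4.8, 3.1]))
        # Application period: the corrected ET would overwrite the model ET in CLMET
        corrected_et = correct_et([2.5, 3.0, 3.8], a, b)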

  6. A framework for quantifying net benefits of alternative prognostic models.

    Science.gov (United States)

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-30

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd.
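
    Purely as a toy numerical illustration of scoring prognostic models by the life years their treatment decisions yield (not the estimators or data of the paper), consider the following sketch with synthetic risks and hypothetical life-year estimates.

        import numpy as np

        def mean_life_years(risk_scores, ly_treated, ly_untreated, threshold):
            """Mean event-free life years when treatment is given above a risk threshold."""
            treat = np.asarray(risk_scores) >= threshold
            return float(np.where(treat, ly_treated, ly_untreated).mean())

        rng = np.random.default_rng(0)
        true_risk = rng.uniform(0.0, 0.4, size=1000)
        ly_untreated = 12.0 - 15.0 * true_risk              # hypothetical life-year estimates
        ly_treated = ly_untreated + 4.0 * true_risk - 0.1   # benefit grows with risk, small cost for all
        model_a = np.clip(true_risk + rng.normal(0.0, 0.10, 1000), 0.0, 1.0)   # noisier model
        model_b = np.clip(true_risk + rng.normal(0.0, 0.03, 1000), 0.0, 1.0)   # sharper model

        # The (toy) net benefit of model B over model A at a 10% treatment threshold
        delta_life_years = (mean_life_years(model_b, ly_treated, ly_untreated, 0.10)
                            - mean_life_years(model_a, ly_treated, ly_untreated, 0.10))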

  7. VALIDATION OF THE ASTER GLOBAL DIGITAL ELEVATION MODEL VERSION 3 OVER THE CONTERMINOUS UNITED STATES

    Directory of Open Access Journals (Sweden)

    D. Gesch

    2016-06-01

    The ASTER Global Digital Elevation Model Version 3 (GDEM v3) was evaluated over the conterminous United States in a manner similar to the validation conducted for the original GDEM Version 1 (v1) in 2009 and GDEM Version 2 (v2) in 2011. The absolute vertical accuracy of GDEM v3 was calculated by comparison with more than 23,000 independent reference geodetic ground control points from the U.S. National Geodetic Survey. The root mean square error (RMSE) measured for GDEM v3 is 8.52 meters. This compares with the RMSE of 8.68 meters for GDEM v2. Another important descriptor of vertical accuracy is the mean error, or bias, which indicates if a DEM has an overall vertical offset from true ground level. The GDEM v3 mean error of −1.20 meters reflects an overall negative bias in GDEM v3. The absolute vertical accuracy assessment results, both mean error and RMSE, were segmented by land cover type to provide insight into how GDEM v3 performs in various land surface conditions. While the RMSE varies little across cover types (6.92 to 9.25 meters), the mean error (bias) does appear to be affected by land cover type, ranging from −2.99 to +4.16 meters across 14 land cover classes. These results indicate that in areas where built or natural aboveground features are present, GDEM v3 is measuring elevations above the ground level, a condition noted in assessments of previous GDEM versions (v1 and v2) and an expected condition given the type of stereo-optical image data collected by ASTER. GDEM v3 was also evaluated by differencing with the Shuttle Radar Topography Mission (SRTM) dataset. In many forested areas, GDEM v3 has elevations that are higher in the canopy than SRTM. The overall validation effort also included an evaluation of the GDEM v3 water mask. In general, the number of distinct water polygons in GDEM v3 is much lower than the number in a reference land cover dataset, but the total areas compare much more closely.

  8. Validation of the ASTER Global Digital Elevation Model version 3 over the conterminous United States

    Science.gov (United States)

    Gesch, Dean B.; Oimoen, Michael J.; Danielson, Jeffrey J.; Meyer, David; Halounova, L; Šafář, V.; Jiang, J.; Olešovská, H.; Dvořáček, P.; Holland, D.; Seredovich, V.A.; Muller, J.P.; Pattabhi Rama Rao, E.; Veenendaal, B.; Mu, L.; Zlatanova, S.; Oberst, J.; Yang, C.P.; Ban, Y.; Stylianidis, S.; Voženílek, V.; Vondráková, A.; Gartner, G.; Remondino, F.; Doytsher, Y.; Percivall, George; Schreier, G.; Dowman, I.; Streilein, A.; Ernst, J.

    2016-01-01

    The ASTER Global Digital Elevation Model Version 3 (GDEM v3) was evaluated over the conterminous United States in a manner similar to the validation conducted for the original GDEM Version 1 (v1) in 2009 and GDEM Version 2 (v2) in 2011. The absolute vertical accuracy of GDEM v3 was calculated by comparison with more than 23,000 independent reference geodetic ground control points from the U.S. National Geodetic Survey. The root mean square error (RMSE) measured for GDEM v3 is 8.52 meters. This compares with the RMSE of 8.68 meters for GDEM v2. Another important descriptor of vertical accuracy is the mean error, or bias, which indicates if a DEM has an overall vertical offset from true ground level. The GDEM v3 mean error of −1.20 meters reflects an overall negative bias in GDEM v3. The absolute vertical accuracy assessment results, both mean error and RMSE, were segmented by land cover type to provide insight into how GDEM v3 performs in various land surface conditions. While the RMSE varies little across cover types (6.92 to 9.25 meters), the mean error (bias) does appear to be affected by land cover type, ranging from −2.99 to +4.16 meters across 14 land cover classes. These results indicate that in areas where built or natural aboveground features are present, GDEM v3 is measuring elevations above the ground level, a condition noted in assessments of previous GDEM versions (v1 and v2) and an expected condition given the type of stereo-optical image data collected by ASTER. GDEM v3 was also evaluated by differencing with the Shuttle Radar Topography Mission (SRTM) dataset. In many forested areas, GDEM v3 has elevations that are higher in the canopy than SRTM. The overall validation effort also included an evaluation of the GDEM v3 water mask. In general, the number of distinct water polygons in GDEM v3 is much lower than the number in a reference land cover dataset, but the total areas compare much more closely.
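
    The accuracy statistics quoted above follow directly from the elevation differences at the control points; a minimal sketch of the RMSE and mean-error (bias) computation, optionally segmented by land cover class, is given below (function names and sample values are illustrative only).

        import numpy as np

        def vertical_accuracy(dem_elev, gcp_elev):
            """RMSE and mean error (bias) of DEM elevations against control points."""
            diff = np.asarray(dem_elev, dtype=float) - np.asarray(gcp_elev, dtype=float)
            return {"rmse": float(np.sqrt(np.mean(diff ** 2))),
                    "mean_error": float(np.mean(diff))}

        def accuracy_by_land_cover(dem_elev, gcp_elev, land_cover):
            """Segment the same statistics by land cover class, as in the GDEM assessment."""
            dem, gcp, lc = map(np.asarray, (dem_elev, gcp_elev, land_cover))
            return {cls: vertical_accuracy(dem[lc == cls], gcp[lc == cls])
                    for cls in np.unique(lc)}

        # Hypothetical sample: DEM elevations at three control points (meters)
        accuracy = vertical_accuracy([102.3, 98.7, 250.1], [101.1, 99.0, 248.0])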

  9. Designing for Learning and Play - The Smiley Model as Framework

    DEFF Research Database (Denmark)

    Weitze, Charlotte Lærke

    2016-01-01

    This paper presents a framework for designing engaging learning experiences in games – the Smiley Model. In this Design-Based Research project, student-game-designers were learning inside a gamified learning design while designing and implementing learning goals from curriculum into the small digital games. The Smiley Model inspired and provided a scaffold or a heuristic for the overall gamified learning design, as well as for the students’ learning game design processes when creating small games turning the learning situation into an engaging experience. The audience for the experiments was adult upper secondary general students as well as 7th grade primary school students. The intention with this article is to inspire future learning designers who would like to experiment with integrating learning and play.

  10. A Model-driven Framework for Educational Game Design

    Directory of Open Access Journals (Sweden)

    Bill Roungas

    2016-09-01

    Educational games are a class of serious games whose main purpose is to teach some subject to their players. Despite the many existing design frameworks, these games are too often created in an ad-hoc manner, and typically without the use of a game design document (GDD). We argue that a reason for this phenomenon is that current ways to structure, create and update GDDs do not increase the value of the artifact in the design and development process. As a solution, we propose a model-driven, web-based knowledge management environment that supports game designers in the creation of a GDD that accounts for and relates educational and entertainment game elements. The foundation of our approach is our devised conceptual model for educational games, which also defines the structure of the design environment. We present promising results from an evaluation of our environment with eight experts in serious games.

  11. A Categorical Framework for Model Classification in the Geosciences

    Science.gov (United States)

    Hauhs, Michael; Trancón y Widemann, Baltasar; Lange, Holger

    2016-04-01

    Models have a mixed record of success in the geosciences. In meteorology, model development and implementation has been among the first and most successful examples of triggering computer technology in science. On the other hand, notorious problems such as the 'equifinality issue' in hydrology lead to a rather mixed reputation of models in other areas. The most successful models in geosciences are applications of dynamic systems theory to non-living systems or phenomena. Thus, we start from the hypothesis that the success of model applications relates to the influence of life on the phenomenon under study. We thus focus on the (formal) representation of life in models. The aim is to investigate whether disappointment in model performance is due to system properties such as heterogeneity and historicity of ecosystems, or rather reflects an abstraction and formalisation problem at a fundamental level. As a formal framework for this investigation, we use category theory as applied in computer science to specify behaviour at an interface. Its methods have been developed for translating and comparing formal structures among different application areas and seems highly suited for a classification of the current "model zoo" in the geosciences. The approach is rather abstract, with a high degree of generality but a low level of expressibility. Here, category theory will be employed to check the consistency of assumptions about life in different models. It will be shown that it is sufficient to distinguish just four logical cases to check for consistency of model content. All four cases can be formalised as variants of coalgebra-algebra homomorphisms. It can be demonstrated that transitions between the four variants affect the relevant observations (time series or spatial maps), the formalisms used (equations, decision trees) and the test criteria of success (prediction, classification) of the resulting model types. We will present examples from hydrology and ecology in

  12. A Modular Framework for Modeling Hardware Elements in Distributed Engine Control Systems

    Science.gov (United States)

    Zinnecker, Alicia M.; Culley, Dennis E.; Aretskin-Hariton, Eliot D.

    2015-01-01

    Progress toward the implementation of distributed engine control in an aerospace application may be accelerated through the development of a hardware-in-the-loop (HIL) system for testing new control architectures and hardware outside of a physical test cell environment. One component required in an HIL simulation system is a high-fidelity model of the control platform: sensors, actuators, and the control law. The control system developed for the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k) provides a verifiable baseline for development of a model for simulating a distributed control architecture. This distributed controller model will contain enhanced hardware models, capturing the dynamics of the transducer and the effects of data processing, and a model of the controller network. A multilevel framework is presented that establishes three sets of interfaces in the control platform: communication with the engine (through sensors and actuators), communication between hardware and controller (over a network), and the physical connections within individual pieces of hardware. This introduces modularity at each level of the model, encouraging collaboration in the development and testing of various control schemes or hardware designs. At the hardware level, this modularity is leveraged through the creation of a Simulink® library containing blocks for constructing smart transducer models complying with the IEEE 1451 specification. These hardware models were incorporated in a distributed version of the baseline C-MAPSS40k controller and simulations were run to compare the performance of the two models. The overall tracking ability differed only due to quantization effects in the feedback measurements in the distributed controller. Additionally, it was also found that the added complexity of the smart transducer models did not prevent real-time operation of the distributed controller model, a requirement of an HIL system.
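
    The Simulink smart transducer library itself is not reproduced in the abstract; as a hypothetical illustration of the quantization and transport-delay effects such hardware models capture, consider the following minimal sketch (it is not the C-MAPSS40k library).

        class SmartSensorModel:
            """Minimal smart-transducer sketch: quantization plus transport delay.

            Hypothetical illustration of the effects a distributed-control
            hardware model captures; it is not the C-MAPSS40k Simulink library."""

            def __init__(self, lsb, delay_samples=1):
                self.lsb = lsb                        # quantizer resolution (engineering units)
                self.buffer = [0.0] * delay_samples   # models network/processing latency

            def read(self, true_value):
                quantized = round(true_value / self.lsb) * self.lsb
                self.buffer.append(quantized)
                return self.buffer.pop(0)             # value delayed by `delay_samples` ticks

        # A pressure sensor with 0.05 psi resolution and a one-sample network delay
        sensor = SmartSensorModel(lsb=0.05, delay_samples=1)
        feedback = [sensor.read(p) for p in (14.696, 14.712, 14.733)]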

  13. 78 FR 32224 - Availability of Version 3.1.2 of the Connect America Fund Phase II Cost Model; Additional...

    Science.gov (United States)

    2013-05-29

    ... Version 3.1.2 of the Connect America Fund Phase II Cost Model; Additional Discussion Topics in Connect America Cost Model Virtual Workshop AGENCY: Federal Communications Commission. ACTION: Proposed rule... America Cost Model (CAM v3.1.2), which allows Commission staff and interested parties to calculate costs...

  14. The SCEC Unified Community Velocity Model (UCVM) Software Framework for Distributing and Querying Seismic Velocity Models

    Science.gov (United States)

    Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.

    2017-12-01

    Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications
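
    The UCVM programming interface is not reproduced in the abstract; as a hypothetical sketch of the kind of unified point-query dispatch it provides (regional 3D models consulted first, a 1D background model as fallback), consider the following fragment, whose names and structure are illustrative only.

        from dataclasses import dataclass
        from typing import Callable, Optional, Sequence

        @dataclass
        class Material:
            vp: float        # P-wave velocity (m/s)
            vs: float        # S-wave velocity (m/s)
            density: float   # density (kg/m^3)

        # A "model" is any callable returning Material for a point, or None when the
        # point lies outside its coverage. Names and structure are hypothetical.
        Model = Callable[[float, float, float], Optional[Material]]

        def query(models: Sequence[Model], background: Model,
                  lon: float, lat: float, depth: float) -> Optional[Material]:
            """Return properties from the first regional model covering the point,
            falling back to the 1D background model otherwise."""
            for model in models:
                material = model(lon, lat, depth)
                if material is not None:
                    return material
            return background(lon, lat, depth)

        def background_1d(lon, lat, depth):
            """Trivial 1D background: S-wave velocity increasing with depth."""
            vs = 1000.0 + 0.5 * depth
            return Material(vp=1.8 * vs, vs=vs, density=2500.0)

        properties = query([], background_1d, lon=-118.2, lat=34.05, depth=1000.0)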

  15. An Integrated Hydro-Economic Modelling Framework to Evaluate Water Allocation Strategies I: Model Development.

    NARCIS (Netherlands)

    George, B.; Malano, H.; Davidson, B.; Hellegers, P.; Bharati, L.; Sylvain, M.

    2011-01-01

    In this paper an integrated modelling framework for water resources planning and management that can be used to carry out an analysis of alternative policy scenarios for water allocation and use is described. The modelling approach is based on integrating a network allocation model (REALM) and a

  16. Formulation, construction and analysis of kinetic models of metabolism: A review of modelling frameworks

    DEFF Research Database (Denmark)

    Saa, Pedro A.; Nielsen, Lars K.

    2017-01-01

    Kinetic models are critical to predict the dynamic behaviour of metabolic networks. Mechanistic kinetic models for large networks remain uncommon due to the difficulty of fitting their parameters. Recent modelling frameworks promise new ways to overcome this obstacle while retaining predictive ca...

  17. Version 2.0 of the European Gas Model. Changes and their impact on the German gas sector

    International Nuclear Information System (INIS)

    Balmert, David; Petrov, Konstantin

    2015-01-01

    In January 2015 ACER, the European Agency for the Cooperation of Energy Regulators, presented an updated version of its target model for the inner-European natural gas market, also referred to as version 2.0 of the Gas Target Model. During 2014 the existing model, originally developed by the Council of European Energy Regulators (CEER) and launched in 2011, had been analysed, revised and updated in preparation of the new version. While it has few surprises to offer, the new Gas Target Model specifies and goes into greater detail on many elements of the original model. Some of the new content is highly relevant to the German gas sector, not least the deliberations on the current key issues, which are security of supply and the ability of the gas markets to function.

  18. Probabilistic Model for Integrated Assessment of the Behavior at the T.D.P. Version 2

    International Nuclear Information System (INIS)

    Hurtado, A.; Eguilior, S.; Recreo, F

    2015-01-01

    This report documents the completion of the first phase of the implementation of the ABACO2G (Bayes Application to Geological Storage of CO2) methodology and the final version of the ABACO2G probabilistic model for the injection phase, before its future validation in the experimental field of the Technology Development Plant in Hontomín (Burgos). The model, which determines the probabilistic risk component of a geological storage of CO2 using the formalism of Bayesian networks and Monte Carlo simulation, yields quantitative probability functions of the total CO2 storage system and of each of its subsystems (the storage subsystem and primary seal, the secondary containment subsystem, and the dispersion or tertiary subsystem). It implements the stochastic time evolution of the CO2 plume during the injection period, the stochastic time evolution of the drying front, the probabilistic evolution of the pressure front, decoupled from the CO2 plume progress front, and submodels and leakage probability functions for the major leakage risk elements (fractures/faults and wells/deep boreholes), which together define the space of events used to estimate the risks associated with the CO2 geological storage system. The activities covered by this report replace the qualitative estimation submodels of the former ABACO2G version, developed during Phase I of project ALM-10-017, with analytical, semi-analytical or numerical submodels for the main elements of risk (wells and fractures), to obtain an integrated probabilistic model of a CO2 storage complex in carbonate formations that meets the needs of the integrated behavior evaluation of the Technology Development Plant in Hontomín

  19. Hydrogeochemical evaluation for Simpevarp model version 1.2. Preliminary site description of the Simpevarp area

    International Nuclear Information System (INIS)

    Laaksoharju, Marcus

    2004-12-01

    Siting studies for SKB's programme of deep geological disposal of nuclear fuel waste currently involves the investigation of two locations, Simpevarp and Forsmark, to determine their geological, hydrogeochemical and hydrogeological characteristics. Present work completed has resulted in Model version 1.2 which represents the second evaluation of the available Simpevarp groundwater analytical data collected up to April, 2004. The deepest fracture groundwater samples with sufficient analytical data reflected depths down to 1.7 km. Model version 1.2 focusses on geochemical and mixing processes affecting the groundwater composition in the uppermost part of the bedrock, down to repository levels, and eventually extending to 1000 m depth. The groundwater flow regimes at Laxemar/Simpevarp are considered local and extend down to depths of around 600-1000 m depending on local topography. The marked differences in the groundwater flow regimes between Laxemar and Simpevarp are reflected in the groundwater chemistry where four major hydrochemical groups of groundwaters (types A-D) have been identified: TYPE A: This type comprises dilute groundwaters of Na-HCO3 type present at shallow (< 300 m) levels at Simpevarp, and at even greater depths (approx. 1200 m) at Laxemar. At Simpevarp the groundwaters are mainly Na-Ca-Cl with increasingly enhanced Br and SO4 with depth. At Laxemar they are mainly Ca-Na-Cl also with increasing enhancements of Br and SO4 with depth. Main reactions involve ion exchange (Ca). At both sites a glacial component and a deep saline component are present. At Simpevarp the saline component may be potentially non marine and/or non-marine/old Littorina marine in origin; at Laxemar it is more likely to be non-marine in origin. TYPE D: This type comprises reducing highly saline groundwaters (> 20 000 mg/L Cl; to a maximum of ∼70 g/L TDS) and only has been identified at Laxemar at depths exceeding 1200 m. It is mainly Ca-Na-Cl with higher Br but lower SO4 compared

  20. Modelling grain growth in the framework of Rational Extended Thermodynamics

    International Nuclear Information System (INIS)

    Kertsch, Lukas; Helm, Dirk

    2016-01-01

    Grain growth is a significant phenomenon for the thermomechanical processing of metals. Since the mobility of the grain boundaries is thermally activated and energy stored in the grain boundaries is released during their motion, a mutual interaction with the process conditions occurs. To model such phenomena, a thermodynamic framework for the representation of thermomechanical coupling phenomena in metals including a microstructure description is required. For this purpose, Rational Extended Thermodynamics appears to be a useful tool. We apply an entropy principle to derive a thermodynamically consistent model for grain coarsening due to the growth and shrinkage of individual grains. Despite the rather different approaches applied, we obtain a grain growth model which is similar to existing ones and can be regarded as a thermodynamic extension of that by Hillert (1965) to more general systems. To demonstrate the applicability of the model, we compare our simulation results to grain growth experiments in pure copper by different authors, which we are able to reproduce very accurately. Finally, we study the implications of the energy release due to grain growth on the energy balance. The present unified approach combining a microstructure description and continuum mechanics is ready to be further used to develop more elaborate material models for complex thermo-chemo-mechanical coupling phenomena. (paper)

  1. Modelling grain growth in the framework of Rational Extended Thermodynamics

    Science.gov (United States)

    Kertsch, Lukas; Helm, Dirk

    2016-05-01

    Grain growth is a significant phenomenon for the thermomechanical processing of metals. Since the mobility of the grain boundaries is thermally activated and energy stored in the grain boundaries is released during their motion, a mutual interaction with the process conditions occurs. To model such phenomena, a thermodynamic framework for the representation of thermomechanical coupling phenomena in metals including a microstructure description is required. For this purpose, Rational Extended Thermodynamics appears to be a useful tool. We apply an entropy principle to derive a thermodynamically consistent model for grain coarsening due to the growth and shrinkage of individual grains. Despite the rather different approaches applied, we obtain a grain growth model which is similar to existing ones and can be regarded as a thermodynamic extension of that by Hillert (1965) to more general systems. To demonstrate the applicability of the model, we compare our simulation results to grain growth experiments in pure copper by different authors, which we are able to reproduce very accurately. Finally, we study the implications of the energy release due to grain growth on the energy balance. The present unified approach combining a microstructure description and continuum mechanics is ready to be further used to develop more elaborate material models for complex thermo-chemo-mechanical coupling phenomena.
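
    The abstract describes the model as a thermodynamic extension of Hillert (1965). Below is a minimal numerical sketch of a Hillert-type mean-field law, dR/dt = M*gamma*(1/R_cr - 1/R), applied to a toy grain population; the mobility, boundary energy and initial size distribution are illustrative values, not parameters fitted to the copper experiments mentioned in the record.

        import numpy as np

        # Minimal mean-field grain growth sketch in the spirit of Hillert (1965):
        # dR_i/dt = M * gamma * (1/R_cr - 1/R_i), with R_cr tied to the mean radius.
        M, gamma = 2.0e-14, 0.5      # mobility and boundary energy; illustrative values only
        rng = np.random.default_rng(0)
        radii = np.abs(rng.normal(10e-6, 3e-6, 500))   # initial grain radii [m]
        dt, steps = 1.0, 20_000                        # time step [s] and number of steps

        for _ in range(steps):
            r_cr = radii.mean()                        # critical radius ~ mean radius (simplification)
            dr = M * gamma * (1.0 / r_cr - 1.0 / radii) * dt
            radii = np.maximum(radii + dr, 1e-9)       # small grains shrink toward the cutoff
            radii = radii[radii > 1e-8]                # ...and are removed when they vanish

        print(f"surviving grains: {radii.size}, mean radius: {radii.mean()*1e6:.2f} um")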

  2. Effects of environmental change on agriculture, nutrition and health: A framework with a focus on fruits and vegetables [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Hanna L. Tuomisto

    2017-10-01

    Environmental changes are likely to affect agricultural production over the next decades. The interactions between environmental change, agricultural yields and crop quality, and the critical pathways to future diets and health outcomes are largely undefined. There are currently no quantitative models to test the impact of multiple environmental changes on nutrition and health outcomes. Using an interdisciplinary approach, we developed a framework to link the multiple interactions between environmental change, agricultural productivity and crop quality, population-level food availability, dietary intake and health outcomes, with a specific focus on fruits and vegetables. The main components of the framework consist of: (i) socio-economic and societal factors, (ii) environmental change stressors, (iii) interventions and policies, (iv) food system activities, (v) food and nutrition security, and (vi) health and well-being outcomes. The framework, based on currently available evidence, provides an overview of the multidimensional and complex interactions with feedback between environmental change, production of fruits and vegetables, diets and health, and forms the analytical basis for future modelling and scenario testing.

  3. A modelling framework to simulate foliar fungal epidemics using functional-structural plant models.

    Science.gov (United States)

    Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe

    2014-09-01

    Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional-structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant-environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. This study provides a framework for modelling a large number of pathosystems using FSPMs. This structure can accommodate both
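
    The framework described above rests on generic data structures for lesions and dispersal units that exchange information with objects representing the canopy and its microenvironment. The sketch below is a schematic Python illustration of that idea only; the class and method names are hypothetical and do not reproduce the OpenAlea API.

        from dataclasses import dataclass, field

        @dataclass
        class Lesion:
            age: float = 0.0
            surface: float = 0.0

            def grow(self, dt, leaf):
                # Lesion growth limited by available leaf area (toy rule).
                self.age += dt
                self.surface = min(self.surface + 0.1 * dt, leaf.area)

        @dataclass
        class DispersalUnit:
            """A spore-like propagule deposited on a leaf element."""
            viable: bool = True

            def infect(self, leaf):
                # Infection succeeds only if the micro-environment is wet enough (toy rule).
                if self.viable and leaf.wetness > 0.5:
                    leaf.lesions.append(Lesion())
                    self.viable = False

        @dataclass
        class LeafElement:
            area: float = 10.0
            wetness: float = 1.0
            lesions: list = field(default_factory=list)
            deposits: list = field(default_factory=list)

        # One time step of a toy epidemic loop over a "canopy" of leaf elements.
        canopy = [LeafElement() for _ in range(3)]
        canopy[0].deposits.append(DispersalUnit())
        for leaf in canopy:
            for du in leaf.deposits:
                du.infect(leaf)
            for lesion in leaf.lesions:
                lesion.grow(dt=1.0, leaf=leaf)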

  4. Documentation for the MODFLOW 6 framework

    Science.gov (United States)

    Hughes, Joseph D.; Langevin, Christian D.; Banta, Edward R.

    2017-08-10

    MODFLOW is a popular open-source groundwater flow model distributed by the U.S. Geological Survey. Growing interest in surface and groundwater interactions, local refinement with nested and unstructured grids, karst groundwater flow, solute transport, and saltwater intrusion has led to the development of numerous MODFLOW versions. Often, there are incompatibilities between these different MODFLOW versions. This report describes a new MODFLOW framework called MODFLOW 6 that is designed to support multiple models and multiple types of models. The framework is written in Fortran using a modular object-oriented design. The primary framework components include the simulation (or main program), Timing Module, Solutions, Models, Exchanges, and Utilities. The first version of the framework focuses on numerical solutions, numerical models, and numerical exchanges. This focus on numerical models allows multiple numerical models to be tightly coupled at the matrix level.
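
    The component hierarchy listed in the report (simulation, timing loop, solutions, models, exchanges) can be pictured as below. This is a conceptual Python sketch of how such pieces could fit together, not MODFLOW 6 code, which is written in Fortran; the class names only mirror the terms used in the report.

        class NumericalModel:
            """Placeholder for a model (e.g., a groundwater flow model) added to a solution."""
            def __init__(self, name):
                self.name = name
            def formulate(self):          # contribute terms to the shared system matrix
                print(f"{self.name}: filling matrix coefficients")

        class NumericalExchange:
            """Placeholder coupling two models at the matrix level."""
            def __init__(self, model_a, model_b):
                self.model_a, self.model_b = model_a, model_b
            def formulate(self):
                print(f"exchange {self.model_a.name}<->{self.model_b.name}: adding coupling terms")

        class NumericalSolution:
            """Assembles and 'solves' one matrix shared by tightly coupled models."""
            def __init__(self):
                self.models, self.exchanges = [], []
            def solve(self, step):
                for m in self.models:
                    m.formulate()
                for e in self.exchanges:
                    e.formulate()
                print(f"time step {step}: matrix solved")

        class Simulation:
            """Main program: advances the timing loop and calls each solution."""
            def __init__(self, nsteps):
                self.nsteps, self.solutions = nsteps, []
            def run(self):
                for step in range(1, self.nsteps + 1):
                    for s in self.solutions:
                        s.solve(step)

        # Two models tightly coupled through one solution, as described in the report.
        gwf_a, gwf_b = NumericalModel("gwf-parent"), NumericalModel("gwf-child")
        sol = NumericalSolution()
        sol.models += [gwf_a, gwf_b]
        sol.exchanges.append(NumericalExchange(gwf_a, gwf_b))
        sim = Simulation(nsteps=2)
        sim.solutions.append(sol)
        sim.run()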

  5. A description of the FAMOUS (version XDBUA) climate model and control run

    Directory of Open Access Journals (Sweden)

    A. Osprey

    2008-12-01

    FAMOUS is an ocean-atmosphere general circulation model of low resolution, capable of simulating approximately 120 years of model climate per wallclock day using current high performance computing facilities. It uses most of the same code as HadCM3, a widely used climate model of higher resolution and computational cost, and has been tuned to reproduce the same climate reasonably well. FAMOUS is useful for climate simulations where the computational cost makes the application of HadCM3 unfeasible, either because of the length of simulation or the size of the ensemble desired. We document a number of scientific and technical improvements to the original version of FAMOUS. These improvements include changes to the parameterisations of ozone and sea-ice which alleviate a significant cold bias from high northern latitudes and the upper troposphere, and the elimination of volume-averaged drifts in ocean tracers. A simple model of the marine carbon cycle has also been included. A particular goal of FAMOUS is to conduct millennial-scale paleoclimate simulations of Quaternary ice ages; to this end, a number of useful changes to the model infrastructure have been made.

  6. Computer-aided modeling framework – a generic modeling template for catalytic membrane fixed bed reactors

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    2013-01-01

    and users to generate and test models systematically, efficiently and reliably. In this way, development of products and processes can be faster, cheaper and very efficient. In this contribution, as part of the framework a generic modeling template for the systematic derivation of problem specific catalytic...... membrane fixed bed models is developed. The application of the modeling template is highlighted with a case study related to the modeling of a catalytic membrane reactor coupling dehydrogenation of ethylbenzene with hydrogenation of nitrobenzene....

  7. Hydrogeochemical evaluation of the Forsmark site, model version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Laaksoharju, Marcus (ed.) [GeoPoint AB, Sollentuna (Sweden); Gimeno, Maria; Auque, Luis; Gomez, Javier [Univ. of Zaragoza (Spain). Dept. of Earth Sciences; Smellie, John [Conterra AB, Uppsala (Sweden); Tullborg, Eva-Lena [Terralogica AB, Graabo (Sweden); Gurban, Ioana [3D-Terra, Montreal (Canada)

    2004-01-01

    Siting studies for SKB's programme of deep geological disposal of nuclear fuel waste currently involves the investigation of two locations, Forsmark and Simpevarp, on the eastern coast of Sweden to determine their geological, geochemical and hydrogeological characteristics. Present work completed has resulted in model version 1.1 which represents the first evaluation of the available Forsmark groundwater analytical data collected up to May 1, 2003 (i.e. the first 'data freeze'). The HAG group had access to a total of 456 water samples collected mostly from the surface and sub-surface environment (e.g. soil pipes in the overburden, streams and lakes); only a few samples were collected from drilled boreholes. The deepest samples reflected depths down to 200 m. Furthermore, most of the waters sampled (74%) lacked crucial analytical information that restricted the evaluation. Consequently, model version 1.1 focussed on the processes taking place in the uppermost part of the bedrock rather than at repository levels. The complex groundwater evolution and patterns at Forsmark are a result of many factors such as: a) the flat topography and closeness to the Baltic Sea resulting in relative small hydrogeological driving forces which can preserve old water types from being flushed out, b) the changes in hydrogeology related to glaciation/deglaciation and land uplift, c) repeated marine/lake water regressions/transgressions, and d) organic or inorganic alteration of the groundwater caused by microbial processes or water/rock interactions. The sampled groundwaters reflect to various degrees modern or ancient water/rock interactions and mixing processes. Based on the general geochemical character and the apparent age two major water types occur in Forsmark: fresh-meteoric waters with a bicarbonate imprint and low residence times (tritium values above detection limit), and brackish-marine waters with Cl contents up to 6,000 mg/L and longer residence times (tritium

  8. Thermal modelling. Preliminary site description Simpevarp subarea - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Sundberg, Jan; Back, Paer-Erik; Bengtsson, Anna; Laendell, Maerta [Geo Innova AB, Linkoeping (Sweden)

    2005-08-15

    This report presents the thermal site descriptive model for the Simpevarp subarea, version 1.2. The main objective of this report is to present the thermal modelling work where data has been identified, quality controlled, evaluated and summarised in order to make an upscaling to lithological domain level possible. The thermal conductivity at possible canister scale has been modelled for four different lithological domains (RSMA01 (Aevroe granite), RSMB01 (Fine-grained dioritoid), RSMC01 (mixture of Aevroe granite and Quartz monzodiorite), and RSMD01 (Quartz monzodiorite)). A main modelling approach has been used to determine the mean value of the thermal conductivity. Three alternative/complementary approaches have been used to evaluate the spatial variability of the thermal conductivity at domain level. The thermal modelling approaches are based on the lithological model for the Simpevarp subarea, version 1.2 together with rock type models constituted from measured and calculated (from mineral composition) thermal conductivities. For one rock type, the Aevroe granite (501044), density loggings within the specific rock type has also been used in the domain modelling in order to consider the spatial variability within the Aevroe granite. This has been possible due to the presented relationship between density and thermal conductivity, valid for the Aevroe granite. Results indicate that the mean of thermal conductivity is expected to exhibit only a small variation between the different domains, from 2.62 W/(m.K) to 2.80 W/(m.K). The standard deviation varies according to the scale considered and for the canister scale it is expected to range from 0.20 to 0.28 W/(m.K). Consequently, the lower confidence limit (95% confidence) for the canister scale is within the range 2.04-2.35 W/(m.K) for the different domains. The temperature dependence is rather small with a decrease in thermal conductivity of 1.1-3.4% per 100 deg C increase in temperature for the dominating rock

  9. Thermal modelling. Preliminary site description Simpevarp subarea - version 1.2

    International Nuclear Information System (INIS)

    Sundberg, Jan; Back, Paer-Erik; Bengtsson, Anna; Laendell, Maerta

    2005-08-01

    This report presents the thermal site descriptive model for the Simpevarp subarea, version 1.2. The main objective of this report is to present the thermal modelling work where data has been identified, quality controlled, evaluated and summarised in order to make an upscaling to lithological domain level possible. The thermal conductivity at possible canister scale has been modelled for four different lithological domains (RSMA01 (Aevroe granite), RSMB01 (Fine-grained dioritoid), RSMC01 (mixture of Aevroe granite and Quartz monzodiorite), and RSMD01 (Quartz monzodiorite)). A main modelling approach has been used to determine the mean value of the thermal conductivity. Three alternative/complementary approaches have been used to evaluate the spatial variability of the thermal conductivity at domain level. The thermal modelling approaches are based on the lithological model for the Simpevarp subarea, version 1.2 together with rock type models constituted from measured and calculated (from mineral composition) thermal conductivities. For one rock type, the Aevroe granite (501044), density loggings within the specific rock type has also been used in the domain modelling in order to consider the spatial variability within the Aevroe granite. This has been possible due to the presented relationship between density and thermal conductivity, valid for the Aevroe granite. Results indicate that the mean of thermal conductivity is expected to exhibit only a small variation between the different domains, from 2.62 W/(m.K) to 2.80 W/(m.K). The standard deviation varies according to the scale considered and for the canister scale it is expected to range from 0.20 to 0.28 W/(m.K). Consequently, the lower confidence limit (95% confidence) for the canister scale is within the range 2.04-2.35 W/(m.K) for the different domains. The temperature dependence is rather small with a decrease in thermal conductivity of 1.1-3.4% per 100 deg C increase in temperature for the dominating rock

  10. Thermal modelling. Preliminary site description Laxemar subarea - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Sundberg, Jan; Wrafter, John; Back, Paer-Erik; Laendell, Maerta [Geo Innova AB, Linkoeping (Sweden)

    2006-02-15

    This report presents the thermal site descriptive model for the Laxemar subarea, version 1.2. The main objective of this report is to present the thermal modelling work where data has been identified, quality controlled, evaluated and summarised in order to make an upscaling to lithological domain level possible. The thermal conductivity at canister scale has been modelled for five different lithological domains: RSMA (Aevroe granite), RSMBA (mixture of Aevroe granite and fine-grained dioritoid), RSMD (quartz monzodiorite), RSME (diorite/gabbro) and RSMM (mix domain with high frequency of diorite to gabbro). A base modelling approach has been used to determine the mean value of the thermal conductivity. Four alternative/complementary approaches have been used to evaluate the spatial variability of the thermal conductivity at domain level. The thermal modelling approaches are based on the lithological domain model for the Laxemar subarea, version 1.2 together with rock type models based on measured and calculated (from mineral composition) thermal conductivities. For one rock type, Aevroe granite (501044), density loggings have also been used in the domain modelling in order to evaluate the spatial variability within the Aevroe granite. This has been possible due to an established relationship between density and thermal conductivity, valid for the Aevroe granite. Results indicate that the means of thermal conductivity for the various domains are expected to exhibit a variation from 2.45 W/(m.K) to 2.87 W/(m.K). The standard deviation varies according to the scale considered, and for the 0.8 m scale it is expected to range from 0.17 to 0.29 W/(m.K). Estimates of lower tail percentiles for the same scale are presented for all five domains. The temperature dependence is rather small with a decrease in thermal conductivity of 1.1-5.3% per 100 deg C increase in temperature for the dominant rock types. There are a number of important uncertainties associated with these
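
    The thermal reports above quote domain means, standard deviations and lower confidence limits or lower-tail percentiles of thermal conductivity at a given scale. Purely as an illustration of how a one-sided lower limit follows from a mean and standard deviation under a normality assumption (the reports derive their percentiles from their own domain modelling, so the figures will not match exactly):

        from statistics import NormalDist

        mean, std = 2.62, 0.28            # W/(m K); example values of the order quoted in the reports
        z05 = NormalDist().inv_cdf(0.05)  # one-sided 5% quantile of the standard normal (about -1.645)

        lower_limit = mean + z05 * std
        print(f"lower one-sided 95% limit: {lower_limit:.2f} W/(m K)")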

  11. Solid waste projection model: Database user's guide (Version 1.0)

    International Nuclear Information System (INIS)

    Carr, F.; Stiles, D.

    1991-01-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address Hanford solid waste management issues. This document is one of a set of documents supporting the SWPM system and providing instructions in the use and maintenance of SWPM components. This manual contains instructions for preparing to use Version 1 of the SWPM database, for entering and maintaining data, and for performing routine database functions. This document supports only those operations which are specific to SWPM database menus and functions, and does not provide instructions in the use of Paradox, the database management system in which the SWPM database is established. 3 figs., 1 tab

  12. Solid Waste Projection Model: Database user's guide (Version 1.3)

    International Nuclear Information System (INIS)

    Blackburn, C.L.

    1991-11-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address Hanford solid waste management issues. This document is one of a set of documents supporting the SWPM system and providing instructions in the use and maintenance of SWPM components. This manual contains instructions for preparing to use Version 1.3 of the SWPM database, for entering and maintaining data, and for performing routine database functions. This document supports only those operations which are specific to SWPM database menus and functions and does not provide instruction in the use of Paradox, the database management system in which the SWPM database is established

  13. HAM Construction modeling using COMSOL with MatLab Modeling Guide version 1.0.

    NARCIS (Netherlands)

    Schijndel, van A.W.M.

    2006-01-01

    This paper presents a first modeling guide for the modeling and simulation of up to full 3D dynamic Heat, Air & Moisture (HAM) transport of building constructions using COMSOL with Matlab. The modeling scripts are provided at the appendix. Furthermore, all modeling files and results are published at

  14. HAM Construction modeling using COMSOL with MatLab Modeling Guide, version 1.0

    NARCIS (Netherlands)

    Schijndel, van A.W.M.

    2006-01-01

    This paper presents a first modeling guide for the modeling and simulation of up to full 3D dynamic Heat, Air & Moisture (HAM) transport of building constructions using COMSOL with Matlab. The modeling scripts are provided at the appendix. Furthermore, all modeling files and results are published at

  15. TP-model transformation-based-control design frameworks

    CERN Document Server

    Baranyi, Péter

    2016-01-01

    This book covers new aspects and frameworks of control, design, and optimization based on the TP model transformation and its various extensions. The author outlines the three main steps of polytopic and LMI based control design: 1) development of the qLPV state-space model, 2) generation of the polytopic model; and 3) application of LMI to derive controller and observer. He goes on to describe why literature has extensively studied LMI design, but has not focused much on the second step, in part because the generation and manipulation of the polytopic form was not tractable in many cases. The author then shows how the TP model transformation facilitates this second step and hence reveals new directions, leading to powerful design procedures and the formulation of new questions. The chapters of this book, and the complex dynamical control tasks which they cover, are organized so as to present and analyze the beneficial aspect of the family of approaches (control, design, and optimization). Additionally, the b...
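
    The second design step described above, generating the polytopic form from a sampled qLPV model, is in essence a singular value decomposition of the system matrices sampled over a parameter grid. The sketch below shows that idea for a one-parameter toy system; the dynamics are invented, and the convexity (SN/NN) normalisation of the weighting functions used in the full TP model transformation is omitted.

        import numpy as np

        # Sample a simple qLPV state matrix A(p) over a grid of the scheduling parameter p.
        def A(p):
            return np.array([[0.0, 1.0],
                             [-1.0 - p, -0.5 * p]])   # hypothetical parameter-dependent dynamics

        grid = np.linspace(0.0, 1.0, 51)
        samples = np.stack([A(p) for p in grid])       # tensor of shape (51, 2, 2)

        # TP-type step: SVD of the sampled tensor unfolded along the parameter dimension.
        unfolded = samples.reshape(len(grid), -1)      # (51, 4)
        U, s, Vt = np.linalg.svd(unfolded, full_matrices=False)
        r = int(np.sum(s > 1e-10))                     # numerical rank = number of vertex systems kept

        weights = U[:, :r] * s[:r]                     # discretized weighting functions over the grid
        vertices = Vt[:r].reshape(r, 2, 2)             # "vertex" LTI systems of the polytopic form

        # Check: the weighted combination reproduces A(p) at every grid point.
        recon = np.einsum('ij,jkl->ikl', weights, vertices)
        print(r, np.allclose(recon, samples))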

  16. A Learning Framework for Control-Oriented Modeling of Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Rubio-Herrero, Javier; Chandan, Vikas; Siegel, Charles M.; Vishnu, Abhinav; Vrabie, Draguna L.

    2018-01-18

    Buildings consume a significant amount of energy worldwide. Several building optimization and control use cases require models of energy consumption which are control oriented, have high predictive capability, impose minimal data pre-processing requirements, and have the ability to be adapted continuously to account for changing conditions as new data becomes available. Data-driven modeling techniques that have been investigated so far, while promising in the context of buildings, have been unable to simultaneously satisfy all the requirements mentioned above. In this context, deep learning techniques such as Recurrent Neural Networks (RNNs) hold promise, empowered by advanced computational capabilities and big data opportunities. In this paper, we propose a deep learning based methodology for the development of control oriented models for building energy management and test it on data from a real building. Results show that the proposed methodology outperforms other data driven modeling techniques significantly. We perform a detailed analysis of the proposed methodology along dimensions such as topology, sensitivity, and downsampling. Lastly, we conclude by envisioning a building analytics suite empowered by the proposed deep framework, that can drive several use cases related to building energy management.
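
    A minimal sketch of the kind of recurrent model the paper describes is given below, using a GRU over windows of building inputs to predict next-step energy use. The feature set, window length and network size are assumptions for illustration, not the authors' exact topology.

        import torch
        import torch.nn as nn

        class EnergyRNN(nn.Module):
            """GRU over a window of weather/setpoint features -> next-step energy use."""
            def __init__(self, n_features=4, hidden=32):
                super().__init__()
                self.rnn = nn.GRU(n_features, hidden, batch_first=True)
                self.head = nn.Linear(hidden, 1)

            def forward(self, x):             # x: (batch, time, features)
                out, _ = self.rnn(x)
                return self.head(out[:, -1])  # predict from the last hidden state

        model = EnergyRNN()
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()

        # Dummy data standing in for hourly (outdoor temp, humidity, occupancy, setpoint) windows.
        x = torch.randn(64, 24, 4)
        y = torch.randn(64, 1)

        for _ in range(5):                    # a few illustrative training steps
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()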

  17. Testing a Conceptual Change Model Framework for Visual Data

    Science.gov (United States)

    Finson, Kevin D.; Pedersen, Jon E.

    2015-01-01

    An emergent data analysis technique was employed to test the veracity of a conceptual framework constructed around visual data use and instruction in science classrooms. The framework incorporated all five key components Vosniadou (2007a, 2007b) described as existing in a learner's schema: framework theory, presuppositions, conceptual domains,…

  18. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions

    Directory of Open Access Journals (Sweden)

    Hucka Michael

    2015-06-01

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.

  19. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions.

    Science.gov (United States)

    Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J

    2015-09-04

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org.
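
    To make the declarative structure concrete, the snippet below hand-builds a skeleton of an SBML Level 2 Version 5 document with Python's standard xml.etree module. It is illustrative only; real models should be created and validated with libsbml against the rules in the specification, and the namespace string is assumed to follow the usual level/version pattern.

        import xml.etree.ElementTree as ET

        # Skeleton of an SBML Level 2 Version 5 document (structure only).
        NS = "http://www.sbml.org/sbml/level2/version5"   # assumed namespace pattern
        sbml = ET.Element("sbml", xmlns=NS, level="2", version="5")
        model = ET.SubElement(sbml, "model", id="toy_pathway")

        compartments = ET.SubElement(model, "listOfCompartments")
        ET.SubElement(compartments, "compartment", id="cell", size="1")

        species = ET.SubElement(model, "listOfSpecies")
        ET.SubElement(species, "species", id="S1", compartment="cell", initialAmount="10")
        ET.SubElement(species, "species", id="S2", compartment="cell", initialAmount="0")

        reactions = ET.SubElement(model, "listOfReactions")
        r1 = ET.SubElement(reactions, "reaction", id="conversion", reversible="false")
        ET.SubElement(ET.SubElement(r1, "listOfReactants"), "speciesReference", species="S1")
        ET.SubElement(ET.SubElement(r1, "listOfProducts"), "speciesReference", species="S2")

        print(ET.tostring(sbml, encoding="unicode"))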

  20. Overview of the Meso-NH model version 5.4 and its applications

    Directory of Open Access Journals (Sweden)

    C. Lac

    2018-05-01

    This paper presents the Meso-NH model version 5.4. Meso-NH is an atmospheric non-hydrostatic research model that is applied to a broad range of resolutions, from synoptic to turbulent scales, and is designed for studies of physics and chemistry. It is a limited-area model employing advanced numerical techniques, including monotonic advection schemes for scalar transport and fourth-order centered or odd-order WENO advection schemes for momentum. The model includes state-of-the-art physics parameterization schemes that are important to represent convective-scale phenomena and turbulent eddies, as well as flows at larger scales. In addition, Meso-NH has been expanded to provide capabilities for a range of Earth system prediction applications such as chemistry and aerosols, electricity and lightning, hydrology, wildland fires, volcanic eruptions, and cyclones with ocean coupling. Here, we present the main innovations to the dynamics and physics of the code since the pioneer paper of Lafore et al. (1998) and provide an overview of recent applications and couplings.

  1. Conceptual Model of an Application for Automated Generation of Webpage Mobile Versions

    Directory of Open Access Journals (Sweden)

    Todor Rachovski

    2017-11-01

    Accessing webpages through various types of mobile devices with different screen sizes and using different browsers has put new demands on web developers. The main challenge is the development of websites with responsive design that is adaptable depending on the mobile device used. The article presents a conceptual model of an app for automated generation of mobile pages. It has a five-layer architecture: a database, a database management layer, a business logic layer, a web services layer and a presentation layer. The database stores all the data needed to run the application. The database management layer uses an ORM model to convert relational data into an object-oriented format and to control access to them. The business logic layer contains components that perform the actual work on building a mobile version of the page, including parsing, building a hierarchical model of the page and a number of transformations. The web services layer provides external applications with access to lower-level functionalities, and the presentation layer is responsible for choosing and using the appropriate CSS. A web application that uses the proposed model was developed and experiments were conducted.
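
    The business logic layer described above parses a page and builds a hierarchical model before applying transformations. The sketch below shows one simple way to build such a tree with Python's standard html.parser; it illustrates the idea only and is not code from the application in the article.

        from html.parser import HTMLParser

        class PageTreeBuilder(HTMLParser):
            """Builds a simple hierarchical model (nested dicts) of an HTML page."""
            def __init__(self):
                super().__init__()
                self.root = {"tag": "document", "children": []}
                self.stack = [self.root]

            def handle_starttag(self, tag, attrs):
                node = {"tag": tag, "attrs": dict(attrs), "children": []}
                self.stack[-1]["children"].append(node)
                self.stack.append(node)

            def handle_endtag(self, tag):
                if len(self.stack) > 1:
                    self.stack.pop()

            def handle_data(self, data):
                if data.strip():
                    self.stack[-1]["children"].append({"tag": "#text", "text": data.strip()})

        builder = PageTreeBuilder()
        builder.feed("<div class='menu'><ul><li>Home</li><li>News</li></ul></div>")
        print(builder.root)   # hierarchical model ready for mobile-oriented transformations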

  2. Understanding Global Change: Frameworks and Models for Teaching Systems Thinking

    Science.gov (United States)

    Bean, J. R.; Mitchell, K.; Zoehfeld, K.; Oshry, A.; Menicucci, A. J.; White, L. D.; Marshall, C. R.

    2017-12-01

    The scientific and education communities must impart to teachers, students, and the public an understanding of how the various factors that drive climate and global change operate, and why the rates and magnitudes of these changes related to human perturbation of Earth system processes today are cause for deep concern. Even though effective educational modules explaining components of the Earth and climate system exist, interdisciplinary learning tools are necessary to conceptually link the causes and consequences of global changes. To address this issue, the Understanding Global Change Project at the University of California Museum of Paleontology (UCMP) at UC Berkeley developed an interdisciplinary framework that organizes global change topics into three categories: (1) causes of climate change, both human and non-human (e.g., burning of fossil fuels, deforestation, Earth's tilt and orbit), (2) Earth system processes that shape the way the Earth works (e.g., Earth's energy budget, water cycle), and (3) the measurable changes in the Earth system (e.g., temperature, precipitation, ocean acidification). To facilitate student learning about the Earth as a dynamic, interacting system, a website will provide visualizations of Earth system models and written descriptions of how each framework topic is conceptually linked to other components of the framework. These visualizations and textual summarizations of relationships and feedbacks in the Earth system are a unique and crucial contribution to science communication and education, informed by a team of interdisciplinary scientists and educators. The system models are also mechanisms by which scientists can communicate how their own work informs our understanding of the Earth system. Educators can provide context and relevancy for authentic datasets and concurrently can assess student understanding of the interconnectedness of global change phenomena. The UGC resources will be available through a web-based platform and

  3. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    International Nuclear Information System (INIS)

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  4. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  5. Simulated pre-industrial climate in Bergen Climate Model (version 2): model description and large-scale circulation features

    Directory of Open Access Journals (Sweden)

    O. H. Otterå

    2009-11-01

    The Bergen Climate Model (BCM) is a fully-coupled atmosphere-ocean-sea-ice model that provides state-of-the-art computer simulations of the Earth's past, present, and future climate. Here, a pre-industrial multi-century simulation with an updated version of BCM is described and compared to observational data. The model is run without any form of flux adjustments and is stable for several centuries. The simulated climate reproduces the general large-scale circulation in the atmosphere reasonably well, except for a positive bias in the high latitude sea level pressure distribution. Also, by introducing an updated turbulence scheme in the atmosphere model a persistent cold bias has been eliminated. For the ocean part, the model drifts in sea surface temperatures and salinities are considerably reduced compared to earlier versions of BCM. Improved conservation properties in the ocean model have contributed to this. Furthermore, by choosing a reference pressure at 2000 m and including thermobaric effects in the ocean model, a more realistic meridional overturning circulation is simulated in the Atlantic Ocean. The simulated sea-ice extent in the Northern Hemisphere is in general agreement with observational data except for summer where the extent is somewhat underestimated. In the Southern Hemisphere, large negative biases are found in the simulated sea-ice extent. This is partly related to problems with the mixed layer parametrization, causing the mixed layer in the Southern Ocean to be too deep, which in turn makes it hard to maintain a realistic sea-ice cover here. However, despite some problematic issues, the pre-industrial control simulation presented here should still be appropriate for climate change studies requiring multi-century simulations.

  6. A FRAMEWORK FOR AN OPEN SOURCE GEOSPATIAL CERTIFICATION MODEL

    Directory of Open Access Journals (Sweden)

    T. U. R. Khan

    2016-06-01

    The geospatial industry is forecasted to have an enormous growth in the forthcoming years and an extended need for well-educated workforce. Hence ongoing education and training play an important role in the professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards have increasing significance in the geospatial and IT arena as well as in political discussion and legislation. Based on the Memorandum of Understanding between International Cartographic Association, OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission “Making geospatial education and opportunities accessible to all”. Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., GIS Certification Institute, GeoAcademy, ASPRS, and software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge concepts, i.e., NCGIA Core Curriculum, URISA Body Of Knowledge, USGIF Essential Body Of Knowledge, the “Geographic Information: Need to Know", currently under development, and the Geospatial Technology Competency Model (GTCM). The latter provides a US American oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the framework of certification. In addition to the theoretical analysis of existing resources the geospatial community was integrated twofold. An online survey about the relevance of Open Source was performed and

  7. A Framework for an Open Source Geospatial Certification Model

    Science.gov (United States)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecasted to have an enormous growth in the forthcoming years and an extended need for well-educated workforce. Hence ongoing education and training play an important role in the professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards have increasing significance in the geospatial and IT arena as well as in political discussion and legislation. Based on the Memorandum of Understanding between International Cartographic Association, OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., GIS Certification Institute, GeoAcademy, ASPRS, and software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge concepts, i.e., NCGIA Core Curriculum, URISA Body Of Knowledge, USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know", currently under development, and the Geospatial Technology Competency Model (GTCM). The latter provides a US American oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the framework of certification. In addition to the theoretical analysis of existing resources the geospatial community was integrated twofold. An online survey about the relevance of Open Source was performed and evaluated with 105

  8. Structural Equation Models in a Redundancy Analysis Framework With Covariates.

    Science.gov (United States)

    Lovaglio, Pietro Giorgio; Vittadini, Giorgio

    2014-01-01

    A recent method to specify and fit structural equation modeling in the Redundancy Analysis framework based on so-called Extended Redundancy Analysis (ERA) has been proposed in the literature. In this approach, the relationships between the observed exogenous variables and the observed endogenous variables are moderated by the presence of unobservable composites, estimated as linear combinations of exogenous variables. However, in the presence of direct effects linking exogenous and endogenous variables, or concomitant indicators, the composite scores are estimated by ignoring the presence of the specified direct effects. To fit structural equation models, we propose a new specification and estimation method, called Generalized Redundancy Analysis (GRA), allowing us to specify and fit a variety of relationships among composites, endogenous variables, and external covariates. The proposed methodology extends the ERA method, using a more suitable specification and estimation algorithm, by allowing for covariates that affect endogenous indicators indirectly through the composites and/or directly. To illustrate the advantages of GRA over ERA we propose a simulation study of small samples. Moreover, we propose an application aimed at estimating the impact of formal human capital on the initial earnings of graduates of an Italian university, utilizing a structural model consistent with well-established economic theory.

  9. A Production Model for Construction: A Theoretical Framework

    Directory of Open Access Journals (Sweden)

    Ricardo Antunes

    2015-03-01

    The building construction industry faces challenges such as increasing project complexity and scope requirements combined with shorter deadlines. Additionally, economic uncertainty and rising business competition with a subsequent decrease in profit margins for the industry demand the development of new approaches to construction management. However, the building construction sector relies on practices based on intuition and experience, overlooking the dynamics of its production system. Furthermore, researchers maintain that the construction industry has no history of the application of mathematical approaches to model and manage production. Much work has been carried out on how manufacturing practices apply to construction projects, mostly lean principles. Nevertheless, there has been little research to understand the fundamental mechanisms of production in construction. This study conducts an in-depth literature review to examine the existing knowledge about production models and their characteristics in order to establish a foundation for dynamic production systems management in construction. As a result, a theoretical framework is proposed, which will be instrumental in the future development of mathematical production models aimed at predicting the performance and behaviour of dynamic project-based systems in construction.

  10. Conceptual Modeling Framework for E-Area PA HELP Infiltration Model Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Dyer, J. A. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-11-30

    A conceptual modeling framework based on the proposed E-Area Low-Level Waste Facility (LLWF) closure cap design is presented for conducting Hydrologic Evaluation of Landfill Performance (HELP) model simulations of intact and subsided cap infiltration scenarios for the next E-Area Performance Assessment (PA).

  11. Airline Sustainability Modeling: A New Framework with Application of Bayesian Structural Equation Modeling

    DEFF Research Database (Denmark)

    Salarzadeh Jenatabadi, Hashem; Babashamsi, Peyman; Khajeheian, Datis

    2016-01-01

    There are many factors which could influence the sustainability of airlines. The main purpose of this study is to introduce a framework for a financial sustainability index and model it based on structural equation modeling (SEM) with maximum likelihood and Bayesian predictors. The introduced...

  12. VALIDATION OF THE ASTER GLOBAL DIGITAL ELEVATION MODEL VERSION 2 OVER THE CONTERMINOUS UNITED STATES

    Directory of Open Access Journals (Sweden)

    D. Gesch

    2012-07-01

    The ASTER Global Digital Elevation Model Version 2 (GDEM v2) was evaluated over the conterminous United States in a manner similar to the validation conducted for the original GDEM Version 1 (v1) in 2009. The absolute vertical accuracy of GDEM v2 was calculated by comparison with more than 18,000 independent reference geodetic ground control points from the National Geodetic Survey. The root mean square error (RMSE) measured for GDEM v2 is 8.68 meters. This compares with the RMSE of 9.34 meters for GDEM v1. Another important descriptor of vertical accuracy is the mean error, or bias, which indicates if a DEM has an overall vertical offset from true ground level. The GDEM v2 mean error of –0.20 meters is a significant improvement over the GDEM v1 mean error of –3.69 meters. The absolute vertical accuracy assessment results, both mean error and RMSE, were segmented by land cover to examine the effects of cover types on measured errors. The GDEM v2 mean errors by land cover class verify that the presence of aboveground features (tree canopies and built structures) cause a positive elevation bias, as would be expected for an imaging system like ASTER. In open ground classes (little or no vegetation with significant aboveground height), GDEM v2 exhibits a negative bias on the order of 1 meter. GDEM v2 was also evaluated by differencing with the Shuttle Radar Topography Mission (SRTM) dataset. In many forested areas, GDEM v2 has elevations that are higher in the canopy than SRTM.

  13. Validation of the ASTER Global Digital Elevation Model Version 2 over the conterminous United States

    Science.gov (United States)

    Gesch, Dean B.; Oimoen, Michael J.; Zhang, Zheng; Meyer, David J.; Danielson, Jeffrey J.

    2012-01-01

    The ASTER Global Digital Elevation Model Version 2 (GDEM v2) was evaluated over the conterminous United States in a manner similar to the validation conducted for the original GDEM Version 1 (v1) in 2009. The absolute vertical accuracy of GDEM v2 was calculated by comparison with more than 18,000 independent reference geodetic ground control points from the National Geodetic Survey. The root mean square error (RMSE) measured for GDEM v2 is 8.68 meters. This compares with the RMSE of 9.34 meters for GDEM v1. Another important descriptor of vertical accuracy is the mean error, or bias, which indicates if a DEM has an overall vertical offset from true ground level. The GDEM v2 mean error of -0.20 meters is a significant improvement over the GDEM v1 mean error of -3.69 meters. The absolute vertical accuracy assessment results, both mean error and RMSE, were segmented by land cover to examine the effects of cover types on measured errors. The GDEM v2 mean errors by land cover class verify that the presence of aboveground features (tree canopies and built structures) cause a positive elevation bias, as would be expected for an imaging system like ASTER. In open ground classes (little or no vegetation with significant aboveground height), GDEM v2 exhibits a negative bias on the order of 1 meter. GDEM v2 was also evaluated by differencing with the Shuttle Radar Topography Mission (SRTM) dataset. In many forested areas, GDEM v2 has elevations that are higher in the canopy than SRTM.
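
    The two accuracy measures reported, mean error (bias) and RMSE, are computed directly from the differences between DEM elevations and geodetic control-point heights. A short numpy sketch with synthetic stand-in values (the actual GCP data are not reproduced here):

        import numpy as np

        # Synthetic stand-ins for control-point and DEM elevations (meters).
        rng = np.random.default_rng(1)
        gcp_z = rng.uniform(100, 2000, size=18000)          # reference geodetic heights
        dem_z = gcp_z + rng.normal(-0.2, 8.7, size=18000)   # DEM heights with bias and noise

        error = dem_z - gcp_z
        bias = error.mean()                 # mean error: overall vertical offset
        rmse = np.sqrt(np.mean(error**2))   # root mean square error

        print(f"mean error (bias): {bias:+.2f} m, RMSE: {rmse:.2f} m")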

  14. A generic method for automatic translation between input models for different versions of simulation codes

    International Nuclear Information System (INIS)

    Serfontein, Dawid E.; Mulder, Eben J.; Reitsma, Frederik

    2014-01-01

    A computer code was developed for the semi-automatic translation of input models for the VSOP-A diffusion neutronics simulation code to the format of the newer VSOP 99/05 code. In this paper, this algorithm is presented as a generic method for producing codes for the automatic translation of input models from the format of one code version to another, or even to that of a completely different code. Normally, such translations are done manually. However, input model files, such as for the VSOP codes, often are very large and may consist of many thousands of numeric entries that make no particular sense to the human eye. Therefore the task of, for instance, nuclear regulators to verify the accuracy of such translated files can be very difficult and cumbersome. This may cause translation errors not to be picked up, which may have disastrous consequences later on when a reactor with such a faulty design is built. Therefore a generic algorithm for producing such automatic translation codes may ease the translation and verification process to a great extent. It will also remove human error from the process, which may significantly enhance the accuracy and reliability of the process. The developed algorithm also automatically creates a verification log file which permanently records the names and values of each variable used, as well as the list of meanings of all the possible values. This should greatly facilitate reactor licensing applications.

  15. A generic method for automatic translation between input models for different versions of simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Serfontein, Dawid E., E-mail: Dawid.Serfontein@nwu.ac.za [School of Mechanical and Nuclear Engineering, North West University (PUK-Campus), PRIVATE BAG X6001 (Internal Post Box 360), Potchefstroom 2520 (South Africa); Mulder, Eben J. [School of Mechanical and Nuclear Engineering, North West University (South Africa); Reitsma, Frederik [Calvera Consultants (South Africa)

    2014-05-01

    A computer code was developed for the semi-automatic translation of input models for the VSOP-A diffusion neutronics simulation code to the format of the newer VSOP 99/05 code. In this paper, this algorithm is presented as a generic method for producing codes for the automatic translation of input models from the format of one code version to another, or even to that of a completely different code. Normally, such translations are done manually. However, input model files, such as for the VSOP codes, often are very large and may consist of many thousands of numeric entries that make no particular sense to the human eye. Therefore the task of, for instance, nuclear regulators to verify the accuracy of such translated files can be very difficult and cumbersome. This may cause translation errors not to be picked up, which may have disastrous consequences later on when a reactor with such a faulty design is built. Therefore a generic algorithm for producing such automatic translation codes may ease the translation and verification process to a great extent. It will also remove human error from the process, which may significantly enhance the accuracy and reliability of the process. The developed algorithm also automatically creates a verification log file which permanently records the names and values of each variable used, as well as the list of meanings of all the possible values. This should greatly facilitate reactor licensing applications.
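
    At its core, such a translator maps entries in the old input format to the new one and writes every translated variable, together with its meaning, to a verification log. The sketch below illustrates that pattern with invented keywords and formats; it does not use the actual VSOP input formats.

        # Generic input-model translator sketch: old "KEY VALUE" lines are mapped to a new
        # keyword set, and every translated entry is written to a verification log.
        KEY_MAP = {"NCORE": "CORE_ZONES", "TIN": "INLET_TEMP", "POW": "THERMAL_POWER"}  # hypothetical
        MEANINGS = {"CORE_ZONES": "number of core zones",
                    "INLET_TEMP": "coolant inlet temperature [C]",
                    "THERMAL_POWER": "reactor thermal power [MW]"}

        old_model = ["NCORE 4", "TIN 250.0", "POW 400.0"]

        new_model, log_lines = [], []
        for line in old_model:
            key, value = line.split()
            new_key = KEY_MAP[key]
            new_model.append(f"{new_key} = {value}")
            log_lines.append(f"{key} -> {new_key} = {value}  ({MEANINGS[new_key]})")

        print("\n".join(new_model))
        print("--- verification log ---")
        print("\n".join(log_lines))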

  16. RALOC Mod 1/81: Program description of RALOC version by the structural heat model HECU

    International Nuclear Information System (INIS)

    Pham, V.T.

    1984-01-01

    The version RALOC-Mod 1/81 includes an expanded heat transfer model and a structural heat model. This feature allows for a realistic simulation of the thermodynamic and fluid-dynamic characteristics of the containment atmosphere. Steel and concrete substructures with planar or rotational symmetry can be represented. The heat transfer calculations for the structures are problem oriented, taking into account the time and space dependencies. The influence of the heat transfer on the gas transport (in particular convection) in the reactor vessel is demonstrated by the numerical calculations. In contrast to calculations without simulation of the heat storage effects of the containment structures, which show a largely homogeneous hydrogen distribution, the results based on the HECU model give an inhomogeneous distribution during the first 8 to 12 days. However, these results are only examples of the application of the RALOC-Mod 1/81 code and are not intended to contribute to the discussion of hydrogen distributions in a PWR-type reactor. (orig./GL)

  17. Structure function of holographic quark-gluon plasma: Sakai-Sugimoto model versus its noncritical version

    International Nuclear Information System (INIS)

    Bu Yanyan; Yang Jinmin

    2011-01-01

    Motivated by recent studies of deep inelastic scattering off the $\mathcal{N}=4$ super-Yang-Mills (SYM) plasma, holographically dual to an $AdS_5 \times S^5$ black hole, we use the spacelike flavor current to probe the internal structure of a holographic quark-gluon plasma described by the Sakai-Sugimoto model in its high-temperature (i.e., chiral-symmetric) phase. The plasma structure function is extracted from the retarded flavor current-current correlator. Our main aim in this paper is to explore the effect of nonconformality on these physical quantities. As usual, our study is carried out in the supergravity approximation and in the limit of a large number of colors. Although the Sakai-Sugimoto model is nonconformal, which makes the calculations more involved than in the well-studied $\mathcal{N}=4$ SYM case, the result seems to indicate that the nonconformality has little essential effect on the physical picture of the internal structure of the holographic plasma, which is consistent with the intuition from the asymptotic freedom of QCD at high energy. While the physical picture underlying our investigation is the same as for deep inelastic scattering off the $\mathcal{N}=4$ SYM plasma with(out) flavor, the plasma structure functions are quantitatively different, especially in their scaling dependence on the temperature, which can be recognized as model dependent. As a comparison, we also carry out the same analysis for the noncritical version of the Sakai-Sugimoto model, which is conformal in the sense that it has a constant dilaton vacuum. The result for this noncritical model is quite similar to the conformal $\mathcal{N}=4$ SYM plasma. We therefore attribute the above difference to the effect of the nonconformality of the Sakai-Sugimoto model.

  18. A Global Modeling Framework for Plasma Kinetics: Development and Applications

    Science.gov (United States)

    Parsey, Guy Morland

    The modern study of plasmas, and applications thereof, has developed synchronously with computer capabilities since the mid-1950s. Complexities inherent to these charged-particle, many-body systems have resulted in the development of multiple simulation methods (particle-in-cell, fluid, global modeling, etc.) in order to both explain observed phenomena and predict outcomes of plasma applications. Recognizing that different algorithms are chosen to best address specific topics of interest, this thesis centers around the development of an open-source global model framework for the focused study of non-equilibrium plasma kinetics. After verification and validation of the framework, it was used to study two physical phenomena: plasma-assisted combustion and the recently proposed optically-pumped rare gas metastable laser. Global models permeate chemistry and plasma science, relying on spatial averaging to focus attention on the dynamics of reaction networks. Defined by a set of species continuity and energy conservation equations, the required data and constructed systems are conceptually similar across most applications, providing a light platform for exploratory and result-search parameter scanning. Unfortunately, it is common practice for custom code to be developed for each application, an enormous duplication of effort which negatively affects the quality of the software produced. Presented herein, the Python-based Kinetic Global Modeling framework (KGMf) was designed to support all modeling phases: collection and analysis of reaction data, construction of an exportable system of model ODEs, and a platform for interactive evaluation and post-processing analysis. A symbolic ODE system is constructed for interactive manipulation and generation of a Jacobian, both of which are compiled as operation-optimized C-code. Plasma-assisted combustion and ignition (PAC/PAI) embody the modernization of burning fuel by opening up new avenues of control and optimization
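
    As a hedged illustration of the workflow sketched in this abstract (symbolic construction of the model ODEs plus an automatically derived Jacobian), the snippet below builds a toy two-species reaction system with SymPy, generates its Jacobian, and integrates it with a stiff solver. The reaction network and rate constants are invented; KGMf itself additionally compiles the expressions to optimized C code, which is omitted here.

    ```python
    # Toy illustration of a symbolic ODE system plus Jacobian (not the actual KGMf code).
    import numpy as np
    import sympy as sp
    from scipy.integrate import solve_ivp

    n_e, n_m = sp.symbols("n_e n_m", positive=True)            # electron and metastable densities
    k_exc, k_loss = sp.symbols("k_exc k_loss", positive=True)  # invented rate constants

    # Toy network: e + Ar -> e + Ar* (source of metastables), Ar* -> wall loss
    rhs = sp.Matrix([
        sp.Integer(0),                  # d n_e / dt (electrons conserved in this toy model)
        k_exc * n_e - k_loss * n_m,     # d n_m / dt
    ])
    jac = rhs.jacobian(sp.Matrix([n_e, n_m]))

    f = sp.lambdify((n_e, n_m, k_exc, k_loss), rhs, "numpy")
    J = sp.lambdify((n_e, n_m, k_exc, k_loss), jac, "numpy")

    params = (1.0e3, 50.0)              # illustrative values of k_exc and k_loss
    sol = solve_ivp(lambda t, y: np.ravel(f(*y, *params)),
                    (0.0, 0.1), [1.0e10, 0.0], method="BDF",
                    jac=lambda t, y: np.asarray(J(*y, *params), dtype=float))
    print("final densities:", sol.y[:, -1])
    ```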

  19. The Extrapolar SWIFT model (version 1.0): fast stratospheric ozone chemistry for global climate models

    Science.gov (United States)

    Kreyling, Daniel; Wohltmann, Ingo; Lehmann, Ralph; Rex, Markus

    2018-03-01

    The Extrapolar SWIFT model is a fast ozone chemistry scheme for interactive calculation of the extrapolar stratospheric ozone layer in coupled general circulation models (GCMs). In contrast to the widely used prescribed ozone, the SWIFT ozone layer interacts with the model dynamics and can respond to atmospheric variability or climatological trends. The Extrapolar SWIFT model employs a repro-modelling approach, in which algebraic functions are used to approximate the numerical output of a full stratospheric chemistry and transport model (ATLAS). The full model solves a coupled chemical differential equation system with 55 initial and boundary conditions (mixing ratio of various chemical species and atmospheric parameters). Hence the rate of change of ozone over 24 h is a function of 55 variables. Using covariances between these variables, we can find linear combinations in order to reduce the parameter space to the following nine basic variables: latitude, pressure altitude, temperature, overhead ozone column and the mixing ratio of ozone and of the ozone-depleting families (Cly, Bry, NOy and HOy). We will show that these nine variables are sufficient to characterize the rate of change of ozone. An automated procedure fits a polynomial function of fourth degree to the rate of change of ozone obtained from several simulations with the ATLAS model. One polynomial function is determined per month, which yields the rate of change of ozone over 24 h. A key aspect for the robustness of the Extrapolar SWIFT model is to include a wide range of stratospheric variability in the numerical output of the ATLAS model, also covering atmospheric states that will occur in a future climate (e.g. temperature and meridional circulation changes or reduction of stratospheric chlorine loading). For validation purposes, the Extrapolar SWIFT model has been integrated into the ATLAS model, replacing the full stratospheric chemistry scheme. Simulations with SWIFT in ATLAS have proven that the
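
    The repro-modelling step described above (fitting a fourth-degree polynomial to the 24 h ozone tendency of a full model) can be imitated on synthetic data as in the sketch below. The stand-in "full model", the choice of four predictors, and all parameter ranges are invented for illustration; SWIFT itself uses nine variables and fits one polynomial per month to ATLAS output.

    ```python
    # Fit a 4th-degree polynomial "repro-model" to a synthetic stand-in for a full model.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(0)
    n = 5000
    # Invented predictors: temperature [K], overhead ozone column, O3 and Cly mixing ratios
    X = np.column_stack([
        rng.uniform(190, 250, n),
        rng.uniform(100, 400, n),
        rng.uniform(0.5, 8.0, n),
        rng.uniform(0.5, 3.5, n),
    ])

    def full_model_tendency(X):
        """Synthetic stand-in for the full chemistry model's 24 h ozone change."""
        T, col, o3, cly = X.T
        return 0.01 * (230.0 - T) * o3 - 0.005 * cly**2 * o3 + 1e-4 * col

    y = full_model_tendency(X)
    repro_model = make_pipeline(PolynomialFeatures(degree=4), LinearRegression())
    repro_model.fit(X, y)
    print("R^2 of the polynomial repro-model:", repro_model.score(X, y))
    ```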

  20. Ariadne version 4 - a program for simulation of QCD cascades implementing the colour dipole model

    International Nuclear Information System (INIS)

    Loennblad, L.

    1992-01-01

    The fourth version of the Ariadne program for generating QCD cascades in the colour dipole approximation is presented. The underlying physics issues are discussed and a manual for using the program is given together with a few sample programs. The major changes from previous versions are the introduction of photon radiation from quarks and inclusion of interfaces to the LEPTO and PYTHIA programs. (orig.)

  1. Models of Recognition, Repetition Priming, and Fluency : Exploring a New Framework

    Science.gov (United States)

    Berry, Christopher J.; Shanks, David R.; Speekenbrink, Maarten; Henson, Richard N. A.

    2012-01-01

    We present a new modeling framework for recognition memory and repetition priming based on signal detection theory. We use this framework to specify and test the predictions of 4 models: (a) a single-system (SS) model, in which one continuous memory signal drives recognition and priming; (b) a multiple-systems-1 (MS1) model, in which completely…

  2. Retrofitting Non-Cognitive-Diagnostic Reading Assessment under the Generalized DINA Model Framework

    Science.gov (United States)

    Chen, Huilin; Chen, Jinsong

    2016-01-01

    Cognitive diagnosis models (CDMs) are psychometric models developed mainly to assess examinees' specific strengths and weaknesses in a set of skills or attributes within a domain. By adopting the Generalized-DINA model framework, a recently developed general modeling framework, we attempted to retrofit the PISA reading assessments, a…

  3. A computer-aided framework for development, identification and management of physiologically-based pharmacokinetic models

    DEFF Research Database (Denmark)

    Heitzig, Martina; Linninger, Andreas; Sin, Gürkan

    2014-01-01

    The objective of this work is the development of a generic computer-aided modelling framework to support the development of physiologically-based pharmacokinetic models thereby increasing the efficiency and quality of the modelling process. In particular, the framework systematizes the modelling...

  4. An Access Control Model for the Uniframe Framework

    National Research Council Canada - National Science Library

    Crespi, Alexander M

    2005-01-01

    ... security characteristics from the properties of individual components would aid in the creation of more secure systems. In this thesis, a framework for characterizing the access control properties...

  5. A unified framework for benchmark dose estimation applied to mixed models and model averaging

    DEFF Research Database (Denmark)

    Ritz, Christian; Gerhard, Daniel; Hothorn, Ludwig A.

    2013-01-01

    This article develops a framework for benchmark dose estimation that allows intrinsically nonlinear dose-response models to be used for continuous data in much the same way as is already possible for quantal data. This means that the same dose-response model equations may be applied to both... for hierarchical data structures, reflecting increasingly common types of assay data. We illustrate the usefulness of the methodology by means of a cytotoxicology example where the sensitivity of two types of assays is evaluated and compared. By means of a simulation study, we show that the proposed framework......

  6. Computer-aided modeling framework for efficient model development, analysis and identification

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Sales Cruz, Mauricio

    2011-01-01

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy, and water. This trend is set to continue due to the substantial benefits computer-aided...... methods introduce. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms, and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task....... The methodology has been implemented into a computer-aided modeling framework, which combines expert skills, tools, and database connections that are required for the different steps of the model development work-flow with the goal to increase the efficiency of the modeling process. The framework has two main...

  7. Energy Integration for 2050 - A Strategic Impact Model (2050 SIM), Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    2010-10-01

    The United States (U.S.) energy infrastructure is among the most reliable, accessible, and economic in the world. On the other hand, it is also excessively reliant on foreign energy sources, experiences high volatility in energy prices, does not always practice good stewardship of finite indigenous energy resources, and emits significant quantities of greenhouse gas. The U.S. Department of Energy is conducting research and development on advanced nuclear reactor concepts and technologies, including High Temperature Gas Reactor (HTGR) technologies, directed at helping the United States meet its current and future energy challenges. This report discusses the Draft Strategic Impact Model (SIM), an initial version of which was created during the later part of FY-2010. SIM was developed to analyze and depict the benefits of various energy sources in meeting the energy demand and to provide an overall system understanding of the tradeoffs between building and using HTGRs versus other existing technologies for providing energy (heat and electricity) to various energy-use sectors in the United States. This report also provides the assumptions used in the model, the rationale for the methodology, and the references for the source documentation and source data used in developing the SIM.

  8. Energy Integration for 2050 - A Strategic Impact Model (2050 SIM), Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    John Collins

    2011-09-01

    The United States (U.S.) energy infrastructure is among the most reliable, accessible, and economic in the world. On the other hand, it is also excessively reliant on foreign energy sources, experiences high volatility in energy prices, does not always practice good stewardship of finite indigenous energy resources, and emits significant quantities of greenhouse gas. The U.S. Department of Energy is conducting research and development on advanced nuclear reactor concepts and technologies, including High Temperature Gas Reactor (HTGR) technologies, directed at helping the United States meet its current and future energy challenges. This report discusses the Draft Strategic Impact Model (SIM), an initial version of which was created during the later part of FY-2010. SIM was developed to analyze and depict the benefits of various energy sources in meeting the energy demand and to provide an overall system understanding of the tradeoffs between building and using HTGRs versus other existing technologies for providing energy (heat and electricity) to various energy-use sectors in the United States. This report also provides the assumptions used in the model, the rationale for the methodology, and the references for the source documentation and source data used in developing the SIM.

  9. Planar version of the CPT-even gauge sector of the standard model extension

    International Nuclear Information System (INIS)

    Ferreira Junior, Manoel M.; Casana, Rodolfo; Gomes, Adalto Rodrigues; Carvalho, Eduardo S.

    2011-01-01

    The CPT-even abelian gauge sector of the Standard Model Extension is represented by the Maxwell term supplemented by $(K_F)_{\mu\nu\rho\sigma}F^{\mu\nu}F^{\rho\sigma}$, where the Lorentz-violating background tensor $(K_F)_{\mu\nu\rho\sigma}$ possesses the symmetries of the Riemann tensor and a double null trace, which leaves nineteen independent components. Of these, ten components yield birefringence while nine are nonbirefringent. In the present work, we examine the planar version of this theory, obtained by means of a typical dimensional reduction procedure to (1+2) dimensions. We obtain a kind of planar scalar electrodynamics, which is composed of a gauge sector containing six Lorentz-violating coefficients, a scalar field endowed with a noncanonical kinetic term, and a coupling term that links the scalar and gauge sectors. The dispersion relation is determined exactly, revealing that the six parameters related to the pure electromagnetic sector do not yield birefringence at any order. In this model, birefringence may appear only as a second-order effect associated with the coupling tensor linking the gauge and scalar sectors. The equations of motion are written and solved in the stationary regime. The Lorentz-violating parameters do not alter the asymptotic behavior of the fields but induce an angular dependence not observed in the Maxwell planar theory. The energy-momentum tensor is evaluated as well, revealing that the theory presents energy stability. (author)

  10. A multi-sectoral version of the Post-Keynesian growth model

    Directory of Open Access Journals (Sweden)

    Ricardo Azevedo Araujo

    2015-03-01

    Full Text Available Abstract With this inquiry, we seek to develop a disaggregated version of the post-Keynesian approach to economic growth, by showing that it can indeed be treated as a particular case of the Pasinettian model of structural change and economic expansion. By relying upon vertical integration it becomes possible to carry out the analysis initiated by Kaldor (1956) and Robinson (1956, 1962), and followed by Dutt (1984), Rowthorn (1982) and later Bhaduri and Marglin (1990), in a multi-sectoral model in which demand and productivity increase at different paces in each sector. By adopting this approach it is possible to show that the structural economic dynamics is conditioned not only by patterns of evolving demand and the diffusion of technological progress but also by the distributive features of the economy, which can give rise to different regimes of economic growth. Besides, we find it possible to determine the natural rate of profit that keeps the mark-up rate constant over time.

  11. Systems Security Engineering Capability Maturity Model (SSECMM), Model Description, Version 1.1

    National Research Council Canada - National Science Library

    1997-01-01

    This document is designed to acquaint the reader with the SSE-CMM Project as a whole and present the project's major work product - the Systems Security Engineering Capability Maturity Model (SSE- CMM...

  12. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation, the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models - from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  13. The Nexus Land-Use model version 1.0, an approach articulating biophysical potentials and economic dynamics to model competition for land-use

    Science.gov (United States)

    Souty, F.; Brunelle, T.; Dumas, P.; Dorin, B.; Ciais, P.; Crassous, R.; Müller, C.; Bondeau, A.

    2012-10-01

    Interactions between food demand, biomass energy and forest preservation are driving both food prices and land-use changes, regionally and globally. This study presents a new model called Nexus Land-Use version 1.0, which describes these interactions through a generic representation of agricultural intensification mechanisms within agricultural lands. The Nexus Land-Use model equations combine biophysics and economics into a single coherent framework to calculate crop yields, food prices, and resulting pasture and cropland areas within 12 regions inter-connected with each other by international trade. The representation of cropland and livestock production systems in each region relies on three components: (i) a biomass production function derived from the crop yield response function to inputs such as industrial fertilisers; (ii) a detailed representation of the livestock production system subdivided into an intensive and an extensive component, and (iii) a spatially explicit distribution of potential (maximal) crop yields prescribed from the Lund-Potsdam-Jena global vegetation model for managed Land (LPJmL). The economic principles governing decisions about land-use and intensification are adapted from the Ricardian rent theory, assuming cost minimisation for farmers. In contrast to the other land-use models linking economy and biophysics, crops are aggregated as a representative product in calories and intensification for the representative crop is a non-linear function of chemical inputs. The model equations and parameter values are first described in detail. Then, idealised scenarios exploring the impact of forest preservation policies or rising energy prices on agricultural intensification are described, and their impacts on pasture and cropland areas are investigated.
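
    A minimal sketch of the two economic ingredients named above, a saturating (non-linear) yield response to chemical inputs and a Ricardian cost-minimisation rule, is given below. The functional form and all parameter values are illustrative assumptions, not those of the Nexus Land-Use model.

    ```python
    # Saturating yield response and cost-minimising input choice (illustrative values only).
    import numpy as np
    from scipy.optimize import minimize_scalar

    y_max = 10.0       # potential yield per hectare, e.g. prescribed from an LPJmL-like map
    p_input = 40.0     # price per unit of chemical inputs
    land_rent = 120.0  # cost of using one hectare (the Ricardian rent term)

    def crop_yield(x):
        """Non-linear, saturating yield response to input intensity x."""
        return y_max * (1.0 - np.exp(-0.4 * x))

    def cost_per_unit_output(x):
        """Total cost on one hectare divided by output, which the farmer minimises."""
        return (land_rent + p_input * x) / crop_yield(x)

    res = minimize_scalar(cost_per_unit_output, bounds=(0.01, 50.0), method="bounded")
    print(f"cost-minimising input intensity: {res.x:.2f}, unit cost: {res.fun:.2f}")
    ```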

  14. The Nexus Land-Use model version 1.0, an approach articulating biophysical potentials and economic dynamics to model competition for land-use

    Directory of Open Access Journals (Sweden)

    F. Souty

    2012-10-01

    Full Text Available Interactions between food demand, biomass energy and forest preservation are driving both food prices and land-use changes, regionally and globally. This study presents a new model called Nexus Land-Use version 1.0, which describes these interactions through a generic representation of agricultural intensification mechanisms within agricultural lands. The Nexus Land-Use model equations combine biophysics and economics into a single coherent framework to calculate crop yields, food prices, and resulting pasture and cropland areas within 12 regions inter-connected with each other by international trade. The representation of cropland and livestock production systems in each region relies on three components: (i) a biomass production function derived from the crop yield response function to inputs such as industrial fertilisers; (ii) a detailed representation of the livestock production system subdivided into an intensive and an extensive component, and (iii) a spatially explicit distribution of potential (maximal) crop yields prescribed from the Lund-Potsdam-Jena global vegetation model for managed Land (LPJmL). The economic principles governing decisions about land-use and intensification are adapted from the Ricardian rent theory, assuming cost minimisation for farmers. In contrast to the other land-use models linking economy and biophysics, crops are aggregated as a representative product in calories and intensification for the representative crop is a non-linear function of chemical inputs. The model equations and parameter values are first described in detail. Then, idealised scenarios exploring the impact of forest preservation policies or rising energy prices on agricultural intensification are described, and their impacts on pasture and cropland areas are investigated.

  15. Hydrogeochemical evaluation for Simpevarp model version 1.2. Preliminary site description of the Simpevarp area

    Energy Technology Data Exchange (ETDEWEB)

    Laaksoharju, Marcus (ed.) [Geopoint AB, Stockholm (Sweden)

    2004-12-01

    Siting studies for SKB's programme of deep geological disposal of nuclear fuel waste currently involve the investigation of two locations, Simpevarp and Forsmark, to determine their geological, hydrogeochemical and hydrogeological characteristics. The work completed to date has resulted in Model version 1.2, which represents the second evaluation of the available Simpevarp groundwater analytical data collected up to April 2004. The deepest fracture groundwater samples with sufficient analytical data reflected depths down to 1.7 km. Model version 1.2 focuses on geochemical and mixing processes affecting the groundwater composition in the uppermost part of the bedrock, down to repository levels, and eventually extending to 1000 m depth. The groundwater flow regimes at Laxemar/Simpevarp are considered local and extend down to depths of around 600-1000 m depending on local topography. The marked differences in the groundwater flow regimes between Laxemar and Simpevarp are reflected in the groundwater chemistry where four major hydrochemical groups of groundwaters (types A-D) have been identified: TYPE A: This type comprises dilute groundwaters (< 1000 mg/L Cl; 0.5-2.0 g/L TDS) of Na-HCO3 type present at shallow (<200 m) depths at Simpevarp, but at greater depths (0-900 m) at Laxemar. At both localities the groundwaters are marginally oxidising close to the surface, but otherwise reducing. Main reactions involve weathering, ion exchange (Ca, Mg), surface complexation, and dissolution of calcite. Redox reactions include precipitation of Fe-oxyhydroxides and some microbially mediated reactions (SRB). Meteoric recharge water is mainly present at Laxemar whilst at Simpevarp potential mixing of recharge meteoric water and a modern sea component is observed. Localised mixing of meteoric water with deeper saline groundwaters is indicated at both Laxemar and Simpevarp. TYPE B: This type comprises brackish groundwaters (1000-6000 mg/L Cl; 5-10 g/L TDS) present at

  16. Exploring Higher Education Governance: Analytical Models and Heuristic Frameworks

    Directory of Open Access Journals (Sweden)

    Burhan FINDIKLI

    2017-08-01

    Full Text Available Governance in higher education, both at institutional and systemic levels, has experienced substantial changes within recent decades because of a range of world-historical processes such as massification, growth, globalization, marketization, public sector reforms, and the emergence of the knowledge economy and society. These developments have made governance arrangements and decision-making processes in higher education more complex and multidimensional than ever and have forced scholars to build new analytical and heuristic tools and strategies to grasp the intricacy and diversity of higher education governance dynamics. This article provides a systematic discussion of how, and through which tools, prominent scholars of higher education have analyzed governance in this sector by examining certain heuristic frameworks and analytical models. Additionally, the article shows how social scientific analysis of governance in higher education has proceeded in a cumulative way, with certain revisions and syntheses rather than radical conceptual and theoretical ruptures, from Burton R. Clark’s seminal work to the present, revealing conceptual and empirical junctures between them.

  17. A NetCDF version of the two-dimensional energy balance model based on the full multigrid algorithm

    Science.gov (United States)

    Zhuang, Kelin; North, Gerald R.; Stevens, Mark J.

    A NetCDF version of the two-dimensional energy balance model based on the full multigrid method in Fortran is introduced for both pedagogical and research purposes. Based on the land-sea-ice distribution, orbital elements, greenhouse gases concentration, and albedo, the code calculates the global seasonal surface temperature. A step-by-step guide with examples is provided for practice.
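
    The published code is a two-dimensional, Fortran, full-multigrid implementation; the short relaxation sketch below only illustrates the underlying energy balance (absorbed solar radiation balanced by linearised outgoing longwave radiation and meridional diffusion) on a one-dimensional, zonally averaged grid with invented parameter values.

    ```python
    # Zonally averaged energy balance model, relaxed to equilibrium (invented parameters).
    import numpy as np

    nlat = 90
    x = np.linspace(-0.99, 0.99, nlat)     # x = sin(latitude): an equal-area grid
    dx = x[1] - x[0]

    S0 = 1365.0 / 4.0                                     # global mean insolation [W m-2]
    S = S0 * (1.0 - 0.482 * 0.5 * (3.0 * x**2 - 1.0))     # simple latitudinal distribution
    albedo = np.where(np.abs(x) > np.sin(np.deg2rad(70.0)), 0.62, 0.30)  # crude ice caps
    A, B = 203.3, 2.09                                    # OLR = A + B*T [W m-2], T in deg C
    D = 0.55                                              # diffusion coefficient [W m-2 K-1]

    T = np.zeros(nlat)
    dt = 2e-4
    for _ in range(100_000):               # explicit relaxation toward equilibrium
        flux = D * (1.0 - x[:-1]**2) * np.diff(T) / dx
        div_flux = np.concatenate(([flux[0]], np.diff(flux), [-flux[-1]])) / dx
        T += dt * (S * (1.0 - albedo) - (A + B * T) + div_flux)

    # On a grid uniform in sin(latitude), the simple mean is already area weighted.
    print("global mean surface temperature [deg C]:", round(float(np.mean(T)), 2))
    ```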

  18. RAMS Model for Terrestrial Pathways Version 3. 0 (for microcomputers). Model-Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Niebla, E.

    1989-01-01

    The RAMS Model for Terrestrial Pathways is a computer program for calculation of numeric criteria for land application and distribution and marketing of sludges under the sewage-sludge regulations at 40 CFR Part 503. The risk-assessment models covered assume that municipal sludge with specified characteristics is spread across a defined area of ground at a known rate once each year for a given number of years. Risks associated both with direct land application of sludge and with sludge applied after distribution and marketing are calculated. The computer program calculates the maximum annual loading of contaminants that can be land applied and still meet the risk criteria specified as input. Software Description: The program is written in the Turbo/Basic programming language for implementation on IBM PC/AT or compatible machines using the DOS 3.0 or higher operating system. Minimum core storage is 512K.
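
    The kind of calculation described above can be illustrated, in heavily simplified form, by back-calculating a maximum annual loading from a risk-based soil concentration limit. The numbers and the simple mass balance below are placeholders, not values or equations from the 40 CFR Part 503 methodology.

    ```python
    # Simplified back-calculation of a maximum annual loading (placeholder numbers).
    soil_limit_mg_per_kg = 300.0      # risk-based soil concentration criterion (model input)
    background_mg_per_kg = 20.0       # pre-existing soil concentration
    plow_depth_m = 0.15               # depth over which the sludge is incorporated
    soil_bulk_density_kg_m3 = 1300.0
    application_years = 20            # number of annual applications assumed

    soil_mass_kg_per_ha = plow_depth_m * soil_bulk_density_kg_m3 * 10_000.0
    allowable_total_kg_per_ha = (soil_limit_mg_per_kg - background_mg_per_kg) \
        * soil_mass_kg_per_ha * 1e-6
    max_annual_loading_kg_per_ha = allowable_total_kg_per_ha / application_years

    print(f"maximum annual contaminant loading: {max_annual_loading_kg_per_ha:.1f} kg/ha/yr")
    ```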

  19. The Community WRF-Hydro Modeling System Version 4 Updates: Merging Toward Capabilities of the National Water Model

    Science.gov (United States)

    McAllister, M.; Gochis, D.; Dugger, A. L.; Karsten, L. R.; McCreight, J. L.; Pan, L.; Rafieeinasab, A.; Read, L. K.; Sampson, K. M.; Yu, W.

    2017-12-01

    The community WRF-Hydro modeling system is publicly available and provides researchers and operational forecasters a flexible and extensible capability for multi-scale, multi-physics hydrologic modeling that can be run either independently or fully interactively with the WRF atmospheric model. The core WRF-Hydro physics model contains very high-resolution descriptions of terrestrial hydrologic process representations such as land-atmosphere exchanges of energy and moisture, snowpack evolution, infiltration, terrain routing, channel routing, basic reservoir representation and hydrologic data assimilation. Complementing the core physics components of WRF-Hydro is an ecosystem of pre- and post-processing tools that facilitate the preparation of terrain and meteorological input data, an open-source hydrologic model evaluation toolset (Rwrfhydro), hydrologic data assimilation capabilities with DART, and advanced model visualization capabilities. The National Center for Atmospheric Research (NCAR), through collaborative support from the National Science Foundation and other funding partners, provides community support for the entire WRF-Hydro system through a variety of mechanisms. This presentation summarizes the enhanced user support capabilities that are being developed for the community WRF-Hydro modeling system. These products and services include a new website, open-source code repositories, documentation and user guides, test cases, online training materials, live hands-on training sessions, an email listserv, and individual user support via email through a new help desk ticketing system. The WRF-Hydro modeling system and supporting tools, which now include re-gridding scripts and model calibration tools, have recently been updated to Version 4 and are merging toward the capabilities of the National Water Model.

  20. A Hybrid Programming Framework for Modeling and Solving Constraint Satisfaction and Optimization Problems

    OpenAIRE

    Sitek, Paweł; Wikarek, Jarosław

    2016-01-01

    This paper proposes a hybrid programming framework for modeling and solving of constraint satisfaction problems (CSPs) and constraint optimization problems (COPs). Two paradigms, CLP (constraint logic programming) and MP (mathematical programming), are integrated in the framework. The integration is supplemented with the original method of problem transformation, used in the framework as a presolving method. The transformation substantially reduces the feasible solution space. The framework a...
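
    As a generic illustration of the presolving idea mentioned in the abstract (reducing the feasible space before search), the toy below prunes variable domains with simple propagation rules and then searches the reduced space exhaustively. It is not the CLP/MP hybrid itself; the variables and constraints are invented.

    ```python
    # Toy constraint problem: presolve-style domain reduction followed by exhaustive search.
    from itertools import product

    domains = {"x": set(range(10)), "y": set(range(10)), "z": set(range(10))}

    constraints = [
        lambda a: a["x"] + a["y"] + a["z"] == 12,
        lambda a: a["x"] < a["y"],
        lambda a: a["z"] == 2 * a["x"],
    ]

    # "Presolve": prune values that can never satisfy an individual constraint
    domains["z"] = {z for z in domains["z"] if z % 2 == 0}            # z = 2x must be even
    domains["x"] = {x for x in domains["x"] if 2 * x in domains["z"]}
    domains["y"] = {y for y in domains["y"] if y > min(domains["x"])}

    # Exhaustive search over the (much smaller) reduced space
    solutions = []
    for values in product(*domains.values()):
        assignment = dict(zip(domains, values))
        if all(constraint(assignment) for constraint in constraints):
            solutions.append(assignment)
    print(solutions)
    ```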

  1. ASTER Global Digital Elevation Model Version 2 - summary of validation results

    Science.gov (United States)

    Tachikawa, Tetushi; Kaku, Manabu; Iwasaki, Akira; Gesch, Dean B.; Oimoen, Michael J.; Zhang, Z.; Danielson, Jeffrey J.; Krieger, Tabatha; Curtis, Bill; Haase, Jeff; Abrams, Michael; Carabajal, C.; Meyer, Dave

    2011-01-01

    On June 29, 2009, NASA and the Ministry of Economy, Trade and Industry (METI) of Japan released a Global Digital Elevation Model (GDEM) to users worldwide at no charge as a contribution to the Global Earth Observing System of Systems (GEOSS). This “version 1” ASTER GDEM (GDEM1) was compiled from over 1.2 million scene-based DEMs covering land surfaces between 83°N and 83°S latitudes. A joint U.S.-Japan validation team assessed the accuracy of the GDEM1, augmented by a team of 20 cooperators. The GDEM1 was found to have an overall accuracy of around 20 meters at the 95% confidence level. The team also noted several artifacts associated with poor stereo coverage at high latitudes, cloud contamination, water masking issues and the stacking process used to produce the GDEM1 from individual scene-based DEMs (ASTER GDEM Validation Team, 2009). Two independent horizontal resolution studies estimated the effective spatial resolution of the GDEM1 to be on the order of 120 meters.

  2. Modelling Supported Driving as an Optimal Control Cycle : Framework and Model Characteristics

    NARCIS (Netherlands)

    Wang, M.; Treiber, M.; Daamen, W.; Hoogendoorn, S.P.; Van Arem, B.

    2013-01-01

    Driver assistance systems support drivers in operating vehicles in a safe, comfortable and efficient way, and thus may induce changes in traffic flow characteristics. This paper puts forward a receding horizon control framework to model driver assistance and cooperative systems. The accelerations of

  3. Atmospheric radionuclide transport model with radon postprocessor and SBG module. Model description version 2.8.0; ARTM. Atmosphaerisches Radionuklid-Transport-Modell mit Radon Postprozessor und SBG-Modul. Modellbeschreibung zu Version 2.8.0

    Energy Technology Data Exchange (ETDEWEB)

    Richter, Cornelia; Sogalla, Martin; Thielen, Harald; Martens, Reinhard

    2015-04-20

    The study on the atmospheric radionuclide transport model with radon postprocessor and SBG module (model description version 2.8.0) covers the following issues: determination of emissions, radioactive decay, atmospheric dispersion calculation for radioactive gases, atmospheric dispersion calculation for radioactive dusts, determination of the gamma cloud radiation (gamma submersion), terrain roughness, effective source height, calculation area and model points, geographic reference systems and coordinate transformations, meteorological data, use of invalid meteorological data sets, consideration of statistical uncertainties, consideration of housings, consideration of bumpiness, consideration of terrain roughness, use of frequency distributions of the hourly dispersion situation, consideration of the vegetation period (summer), the radon post processor radon.exe, the SBG module, modeling of wind fields, shading settings.

  4. The BlueSky Smoke Modeling Framework: Recent Developments

    Science.gov (United States)

    Sullivan, D. C.; Larkin, N.; Raffuse, S. M.; Strand, T.; ONeill, S. M.; Leung, F. T.; Qu, J. J.; Hao, X.

    2012-12-01

    - The Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis Real-Time (TMPA-RT) data set is being used to improve dead fuel moisture estimates. - EastFire live fuel moisture estimates, which are derived from NASA's MODIS direct broadcast, are being used to improve live fuel moisture estimates. - NASA's Multi-angle Imaging Spectroradiometer (MISR) stereo heights are being used to improve estimates of plume injection heights. Further, the Fire Location and Modeling of Burning Emissions (FLAMBÉ) model was incorporated into the BlueSky Framework as an alternative means of calculating fire emissions. FLAMBÉ directly estimates emissions on the basis of fire detections and radiance measures from NASA's MODIS and NOAA's GOES satellites. (The authors gratefully acknowledge NASA's Applied Sciences Program [Grant Nos. NN506AB52A and NNX09AV76G], the USDA Forest Service, and the Joint Fire Science Program for their support.)

  5. A device model framework for magnetoresistive sensors based on the Stoner–Wohlfarth model

    International Nuclear Information System (INIS)

    Bruckner, Florian; Bergmair, Bernhard; Brueckl, Hubert; Palmesi, Pietro; Buder, Anton; Satz, Armin; Suess, Dieter

    2015-01-01

    The Stoner–Wohlfarth (SW) model provides an efficient analytical description of the behavior of magnetic layers within magnetoresistive sensors. Combined with a proper description of magneto-resistivity, an efficient device model can be derived, which is necessary for an optimal electric circuit design. Parameters of the model are determined by global optimization of an application-specific cost function which contains measured resistances for different applied fields. Several application cases are examined and used for validation of the device model. - Highlights: • An efficient device model framework for various types of magnetoresistive sensors is presented. • The model is based on the analytical solution of the Stoner–Wohlfarth model. • Numerical optimization methods provide optimal model parameters for different application cases. • The model is applied to several application cases and is able to reproduce measured hysteresis and switching behavior
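
    A hedged sketch of a Stoner-Wohlfarth-type device model is shown below: for each applied field the magnetisation angle is obtained by minimising the reduced single-domain energy, and an assumed magnetoresistive mapping converts it into a resistance that could then be fitted to measurements. All parameter values and the resistance mapping are invented; the sketch takes the global energy minimum and is therefore anhysteretic, whereas a real device model would track the history-dependent local minimum.

    ```python
    # Stoner-Wohlfarth-type device model sketch: energy minimisation plus an assumed MR mapping.
    import numpy as np
    from scipy.optimize import minimize_scalar

    def magnetisation_angle(h, psi):
        """Angle of M minimising the reduced single-domain energy for reduced field h at angle psi."""
        energy = lambda theta: 0.5 * np.sin(theta) ** 2 - h * np.cos(theta - psi)
        # several bounded starts approximate a global search; a real device model would
        # instead follow the history-dependent local minimum to reproduce hysteresis
        starts = np.linspace(-np.pi, np.pi, 7)
        best = min((minimize_scalar(energy, bounds=(t0 - 1.5, t0 + 1.5), method="bounded")
                    for t0 in starts), key=lambda r: r.fun)
        return best.x

    def resistance(h, psi, r0=1000.0, dr=50.0):
        """Assumed magnetoresistive mapping R = r0 + dr*sin^2(theta); r0 and dr would be fitted."""
        return r0 + dr * np.sin(magnetisation_angle(h, psi)) ** 2

    fields = np.linspace(-2.0, 2.0, 9)    # reduced applied field sweep
    print([round(float(resistance(h, np.deg2rad(30.0))), 1) for h in fields])
    ```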

  6. GLOFRIM v1.0-A globally applicable computational framework for integrated hydrological-hydrodynamic modelling

    NARCIS (Netherlands)

    Hoch, Jannis M.; Neal, Jeffrey C.; Baart, Fedor; Van Beek, Rens; Winsemius, Hessel C.; Bates, Paul D.; Bierkens, Marc F.P.

    2017-01-01

    We here present GLOFRIM, a globally applicable computational framework for integrated hydrological-hydrodynamic modelling. GLOFRIM facilitates spatially explicit coupling of hydrodynamic and hydrologic models and caters for an ensemble of models to be coupled. It currently encompasses the global

  7. Managing uncertainty in integrated environmental modelling: The UncertWeb framework.

    NARCIS (Netherlands)

    Bastin, L.; Cornford, D.; Jones, R.; Heuvelink, G.B.M.; Pebesma, E.; Stasch, C.; Nativi, S.; Mazzetti, P.

    2013-01-01

    Web-based distributed modelling architectures are gaining increasing recognition as potentially useful tools to build holistic environmental models, combining individual components in complex workflows. However, existing web-based modelling frameworks currently offer no support for managing

  8. A unified effective-field renormalization-group framework approach for the quenched diluted Ising models

    Science.gov (United States)

    de Albuquerque, Douglas F.; Fittipaldi, I. P.

    1994-05-01

    A unified effective-field renormalization-group framework (EFRG) for both quenched bond- and site-diluted Ising models is herein developed by extending recent works. The method, as in the previous works, follows the same strategy as the mean-field renormalization-group scheme (MFRG), and is achieved by introducing an alternative way of constructing classical effective-field equations of state, based on rigorous Ising spin identities. The concentration dependence of the critical temperature, Tc(p), and the critical concentrations of magnetic atoms, pc, at which the transition temperature goes to zero, are evaluated for several two- and three-dimensional lattice structures. The obtained values of Tc and pc and the resulting phase diagrams for both bond and site cases are much more accurate than those estimated by the standard MFRG approach. While preserving the same level of simplicity as the MFRG, the present EFRG method, even in its simplest size-cluster version, is shown to provide results that correctly distinguish those lattices that have the same coordination number but differ in dimensionality or geometry.

  9. Landscape Environmental Assessment Framework

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-20

    LEAF Version 2.0 is a framework comprising three models: RUSLE2, WEPS, and AGNPS. The framework can predict row crop, crop residue, and energy crop yields at sub-field resolution for various combinations of soil, climate, and crop management and residue harvesting practices. It estimates the loss of soil, carbon, and nutrients to the atmosphere, to the groundwater, and to runoff. It also models the overland flow of water and washed-off sediments, nutrients and other chemicals to provide estimates of sediment, nutrient, and chemical loadings to water bodies within a watershed. The AGNPS model and the wash-off calculations are new additions in this version of LEAF. Development of the LEAF software is supported by DOE's BETO program.

  10. 3D Geological Framework Models as a Teaching Aid for Geoscience

    Science.gov (United States)

    Kessler, H.; Ward, E.; Geological Models Teaching Project Team

    2010-12-01

    3D geological models have great potential as a resource for universities when teaching foundation geological concepts, as they allow the student to visualise and interrogate UK geology. They are especially useful when dealing with the conversion of 2D field, map and GIS outputs into three dimensional geological units, which is a common problem for all students of geology. Today’s earth science students use a variety of skills and processes during their learning experience including the application of schemas, spatial thinking, image construction, detecting patterns, memorising figures, mental manipulation and interpretation, making predictions and deducing the orientation of themselves and the rocks. 3D geological models can reinforce spatial thinking strategies and encourage students to think about processes and properties, in turn helping the student to recognise pre-learnt geological principles in the field and to convert what they see at the surface into a picture of what is going on at depth. Learning issues faced by students may also be encountered by experts, policy managers, and stakeholders when dealing with environmental problems. Therefore educational research of student learning in earth science may also improve environmental decision making. 3D geological framework models enhance the learning of Geosciences because they: ● enable a student to observe, manipulate and interpret geology; in particular the models instantly convert two-dimensional geology (maps, boreholes and cross-sections) into three dimensions which is a notoriously difficult geospatial skill to acquire. ● can be orientated to whatever the user finds comfortable and most aids recognition and interpretation. ● can be used either to teach geosciences to complete beginners or to add to an experienced student's body of knowledge (whatever point that may be at). Models could therefore be packaged as a complete educational journey or students and tutors can select certain areas of the model

  11. Hydrogeochemical evaluation of the Simpevarp area, model version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Laaksoharju, Marcus (ed.) [Geopoint AB, Stockholm (Sweden); Smellie, John [Conterra AB, Uppsala (Sweden); Gimeno, Maria; Auque, Luis; Gomez, Javier [Univ. of Zaragoza (Spain). Dept. of Earth Sciences; Tullborg, Eva-Lena [Terralogica AB, Graabo (Sweden); Gurban, Ioana [3D-Terra (Sweden)

    2004-02-01

    Siting studies for SKB's programme of deep geological disposal of nuclear fuel waste currently involve the investigation of two locations, Simpevarp and Forsmark, on the eastern coast of Sweden to determine their geological, hydrogeochemical and hydrogeological characteristics. The work completed to date has resulted in model version 1.1, which represents the first evaluation of the available Simpevarp groundwater analytical data collected up to July 1st, 2003 (i.e. the first 'data freeze' of the site). The HAG (Hydrochemical Analytical Group) had access to a total of 535 water samples collected from the surface and sub-surface environment (e.g. soil pipes in the overburden, streams and lakes); only a few samples were collected from drilled boreholes. The deepest fracture groundwater samples with sufficient analytical data reflected depths down to 250 m. Furthermore, most of the waters sampled (79%) lacked crucial analytical information, which restricted the evaluation. Consequently, model version 1.1 focused on the processes taking place in the uppermost part of the bedrock rather than at repository levels. The complex groundwater evolution and patterns at Simpevarp are a result of many factors such as: a) the flat topography and proximity to the Baltic Sea, b) changes in hydrogeology related to glaciation/deglaciation and land uplift, c) repeated marine/lake water regressions/transgressions, and d) organic or inorganic alteration of the groundwater composition caused by microbial processes or water/rock interactions. The sampled groundwaters reflect, to varying degrees, modern or ancient water/rock interactions and mixing processes. Higher topography to the west of Simpevarp has resulted in hydraulic gradients which have partially flushed out old water types. Except for sea waters, most surface waters and some groundwaters from percussion boreholes are fresh, non-saline waters according to the classification used for Aespoe groundwaters. The rest

  12. Airline Sustainability Modeling: A New Framework with Application of Bayesian Structural Equation Modeling

    Directory of Open Access Journals (Sweden)

    Hashem Salarzadeh Jenatabadi

    2016-11-01

    Full Text Available There are many factors which could influence the sustainability of airlines. The main purpose of this study is to introduce a framework for a financial sustainability index and model it based on structural equation modeling (SEM) with maximum likelihood and Bayesian predictors. The introduced framework includes economic performance, operational performance, cost performance, and financial performance. Based on both Bayesian SEM (Bayesian-SEM) and Classical SEM (Classical-SEM), it was found that economic performance, together with operational performance and cost performance, is significantly related to the financial performance index. Four mathematical indices (root mean square error, coefficient of determination, mean absolute error, and mean absolute percentage error) are employed to compare the efficiency of Bayesian-SEM and Classical-SEM in predicting airline financial performance. The outputs confirmed that the framework with Bayesian prediction delivered a good fit with the data, although the framework predicted with a Classical-SEM approach did not produce a well-fitting model. The reasons for this discrepancy between Classical and Bayesian predictions, as well as the potential advantages and caveats of applying the Bayesian approach in airline sustainability studies, are discussed.
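
    The four comparison indices named in the abstract are standard point-prediction error measures; a small helper such as the one below (illustrative, not the authors' code) is sufficient to compare the Bayesian-SEM and Classical-SEM predictions on held-out observations.

    ```python
    # Point-prediction comparison indices used to contrast the two SEM approaches.
    import numpy as np

    def comparison_indices(y_true, y_pred):
        y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
        resid = y_true - y_pred
        return {
            "RMSE": float(np.sqrt(np.mean(resid ** 2))),
            "R2": float(1.0 - np.sum(resid ** 2) / np.sum((y_true - y_true.mean()) ** 2)),
            "MAE": float(np.mean(np.abs(resid))),
            "MAPE (%)": float(100.0 * np.mean(np.abs(resid / y_true))),  # assumes no zero observations
        }

    print(comparison_indices([3.1, 2.8, 3.6, 4.0], [3.0, 2.9, 3.4, 4.2]))
    ```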

  13. User's guide to the MESOI diffusion model: Version 1.1 (for Data General Eclipse S/230 with AFOS)

    International Nuclear Information System (INIS)

    Athey, G.F.; Ramsdell, J.V.

    1982-09-01

    MESOI is an interactive, Lagrangian puff trajectory model. The model theory is documented separately (Ramsdell and Athey, 1981). Version 1.1 is a modified form of the original Version 1.0. It is designed to run on a Data General Eclipse computer. The model has improved support features which make it useful as an emergency response tool. This report is intended to provide the user with the information necessary to successfully conduct model simulations using MESOI Version 1.1 and to use the support programs STAPREP and EXPLT. The user is also provided with information on the use of the data file maintenance and review program UPDATE. Examples are given for the operation of the program. Test data sets are described which allow the user to practice with the programs and to confirm proper implementation and execution.

  14. GENII Version 2 Users’ Guide

    Energy Technology Data Exchange (ETDEWEB)

    Napier, Bruce A.

    2004-03-08

    The GENII Version 2 computer code was developed for the Environmental Protection Agency (EPA) at Pacific Northwest National Laboratory (PNNL) to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) and the radiological risk estimating procedures of Federal Guidance Report 13 into updated versions of existing environmental pathway analysis models. The resulting environmental dosimetry computer codes are compiled in the GENII Environmental Dosimetry System. The GENII system was developed to provide a state-of-the-art, technically peer-reviewed, documented set of programs for calculating radiation dose and risk from radionuclides released to the environment. The codes were designed with the flexibility to accommodate input parameters for a wide variety of generic sites. Operation of a new version of the codes, GENII Version 2, is described in this report. Two versions of the GENII Version 2 code system are available, a full-featured version and a version specifically designed for demonstrating compliance with the dose limits specified in 40 CFR 61.93(a), the National Emission Standards for Hazardous Air Pollutants (NESHAPS) for radionuclides. The only differences lie in the limitation of the capabilities of the user to change specific parameters in the NESHAPS version. This report describes the data entry, accomplished via interactive, menu-driven user interfaces. Default exposure and consumption parameters are provided for both the average (population) and maximum individual; however, these may be modified by the user. Source term information may be entered as radionuclide release quantities for transport scenarios, or as basic radionuclide concentrations in environmental media (air, water, soil). For input of basic or derived concentrations, decay of parent radionuclides and ingrowth of radioactive decay products prior to the start of the exposure scenario may be considered. A single code run can

  15. Water, Energy, and Biogeochemical Model (WEBMOD), user’s manual, version 1

    Science.gov (United States)

    Webb, Richard M.T.; Parkhurst, David L.

    2017-02-08

    The Water, Energy, and Biogeochemical Model (WEBMOD) uses the framework of the U.S. Geological Survey (USGS) Modular Modeling System to simulate fluxes of water and solutes through watersheds. WEBMOD divides watersheds into model response units (MRU) where fluxes and reactions are simulated for the following eight hillslope reservoir types: canopy; snowpack; ponding on impervious surfaces; O-horizon; two reservoirs in the unsaturated zone, which represent preferential flow and matrix flow; and two reservoirs in the saturated zone, which also represent preferential flow and matrix flow. The reservoir representing ponding on impervious surfaces, currently not functional (2016), will be implemented once the model is applied to urban areas. MRUs discharge to one or more stream reservoirs that flow to the outlet of the watershed. Hydrologic fluxes in the watershed are simulated by modules derived from the USGS Precipitation Runoff Modeling System; the National Weather Service Hydro-17 snow model; and a topography-driven hydrologic model (TOPMODEL). Modifications to the standard TOPMODEL include the addition of heterogeneous vertical infiltration rates; irrigation; lateral and vertical preferential flows through the unsaturated zone; pipe flow draining the saturated zone; gains and losses to regional aquifer systems; and the option to simulate baseflow discharge by using an exponential, parabolic, or linear decrease in transmissivity. PHREEQC, an aqueous geochemical model, is incorporated to simulate chemical reactions as waters evaporate, mix, and react within the various reservoirs of the model. The reactions that can be specified for a reservoir include equilibrium reactions among water; minerals; surfaces; exchangers; and kinetic reactions such as kinetic mineral dissolution or precipitation, biologically mediated reactions, and radioactive decay. WEBMOD also simulates variations in the concentrations of the stable isotopes deuterium and oxygen-18 as a result of
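
    Of the three baseflow options mentioned above, the exponential decrease in transmissivity corresponds to the classical TOPMODEL recession, sketched below with invented parameter values; the parabolic and linear options follow the same pattern with different decay shapes and are not reproduced here.

    ```python
    # Exponential-transmissivity baseflow recession (classical TOPMODEL form, invented parameters).
    import numpy as np

    q0 = 5.0   # baseflow when the mean saturation deficit is zero [mm/day]
    m = 30.0   # shape parameter of the exponential transmissivity profile [mm]

    def baseflow_exponential(mean_deficit_mm):
        """Baseflow for a transmissivity that decreases exponentially with saturation deficit."""
        return q0 * np.exp(-np.asarray(mean_deficit_mm, float) / m)

    deficits = np.array([0.0, 15.0, 30.0, 60.0, 120.0])
    for s, q in zip(deficits, baseflow_exponential(deficits)):
        print(f"deficit {s:5.0f} mm -> baseflow {q:.3f} mm/day")
    ```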

  16. Insight into model mechanisms through automatic parameter fitting: a new methodological framework for model development.

    Science.gov (United States)

    Tøndel, Kristin; Niederer, Steven A; Land, Sander; Smith, Nicolas P

    2014-05-20

    Striking a balance between the degree of model complexity and parameter identifiability, while still producing biologically feasible simulations, is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms. The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input-output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model, are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data. Our analysis revealed that model parameters could be constrained to a standard deviation of on
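
    The combination of metamodelling and iterative zooming described above can be caricatured as in the sketch below: a cheap polynomial surrogate is fitted to simulator runs, and the sampling box is repeatedly shrunk around parameter combinations whose surrogate output matches the "measured" value. The two-parameter simulator and all settings are invented; note how the coupling between the two parameters keeps the box wide along one direction, which is the kind of identifiability information the abstract refers to.

    ```python
    # Polynomial metamodel of a cheap stand-in simulator plus an iterative parameter-space zoom.
    import numpy as np

    def simulator(p):
        """Stand-in for an expensive deterministic model with two parameters and one output."""
        k1, k2 = p
        return k1 * np.exp(-k2)

    target = simulator((2.0, 0.5))                 # "measured" data (synthetic here)
    lo, hi = np.array([0.1, 0.1]), np.array([5.0, 2.0])
    rng = np.random.default_rng(1)

    for iteration in range(4):
        P = rng.uniform(lo, hi, size=(200, 2))     # experimental design in the current box
        y = np.array([simulator(p) for p in P])
        # quadratic metamodel with terms 1, k1, k2, k1^2, k2^2, k1*k2
        X = np.column_stack([np.ones(len(P)), P, P ** 2, P[:, :1] * P[:, 1:]])
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
        # keep the quarter of the design whose metamodel output is closest to the data, shrink box
        keep = P[np.argsort(np.abs(X @ coeffs - target))[:50]]
        lo, hi = keep.min(axis=0), keep.max(axis=0)
        print(f"iteration {iteration}: box {np.round(lo, 2)} .. {np.round(hi, 2)}")
    # The box stays wide along the ridge k1*exp(-k2) = target: the two parameters are coupled
    # and not individually identifiable from this single output, mirroring the analysis above.
    ```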

  17. Theories, models and frameworks used in capacity building interventions relevant to public health: a systematic review.

    Science.gov (United States)

    Bergeron, Kim; Abdi, Samiya; DeCorby, Kara; Mensah, Gloria; Rempel, Benjamin; Manson, Heather

    2017-11-28

    There is limited research on capacity building interventions that include theoretical foundations. The purpose of this systematic review is to identify underlying theories, models and frameworks used to support capacity building interventions relevant to public health practice. The aim is to inform and improve capacity building practices and services offered by public health organizations. Four search strategies were used: 1) electronic database searching; 2) reference lists of included papers; 3) key informant consultation; and 4) grey literature searching. Inclusion and exclusion criteria are outlined, with included papers focusing on capacity building, learning plans, or professional development plans in combination with tools, resources, processes, procedures, steps, a model, framework, or guideline, described in a public health or healthcare setting, or in non-government, government, or community organizations as they relate to healthcare, and explicitly or implicitly mentioning a theory, model and/or framework that grounds the type of capacity building approach developed. Quality assessment was performed on all included articles. Data analysis included a process for synthesizing, analyzing and presenting descriptive summaries, categorizing theoretical foundations according to which theory, model and/or framework was used and whether or not the theory, model or framework was implied or explicitly identified. Nineteen articles were included in this review. A total of 28 theories, models and frameworks were identified. Of this number, two theories (Diffusion of Innovations and Transformational Learning), two models (Ecological and Interactive Systems Framework for Dissemination and Implementation) and one framework (Bloom's Taxonomy of Learning) were identified as the most frequently cited. This review identifies specific theories, models and frameworks to support capacity building interventions relevant to public health organizations. It provides public health practitioners

  18. Modeling sports highlights using a time-series clustering framework and model interpretation

    Science.gov (United States)

    Radhakrishnan, Regunathan; Otsuka, Isao; Xiong, Ziyou; Divakaran, Ajay

    2005-01-01

    In our past work on sports highlights extraction, we have shown the utility of detecting audience reaction using an audio classification framework. The audio classes in the framework were chosen based on intuition. In this paper, we present a systematic way of identifying the key audio classes for sports highlights extraction using a time series clustering framework. We treat the low-level audio features as a time series and model the highlight segments as "unusual" events in a background of a "usual" process. The set of audio classes to characterize the sports domain is then identified by analyzing the consistent patterns in each of the clusters output from the time series clustering framework. The distribution of features from the training data so obtained for each of the key audio classes is parameterized by a Minimum Description Length Gaussian Mixture Model (MDL-GMM). We also interpret the meaning of each of the mixture components of the MDL-GMM for the key audio class (the "highlight" class) that is correlated with highlight moments. Our results show that the "highlight" class is a mixture of audience cheering and commentator's excited speech. Furthermore, we show that the precision-recall performance for highlights extraction based on this "highlight" class is better than that of our previous approach which uses only audience cheering as the key highlight class.
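
    A rough illustration of the class-modelling step is given below: Gaussian mixture models of increasing order are fitted to (synthetic) feature vectors of one audio class and the order is selected by BIC, used here as a convenient stand-in for the Minimum Description Length criterion of the paper.

    ```python
    # Fit GMMs of increasing order to one audio class and pick the order by BIC (MDL stand-in).
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Synthetic "highlight" features: two behaviours, e.g. cheering and excited speech
    features = np.vstack([
        rng.normal([0.0, 1.0], 0.3, size=(300, 2)),
        rng.normal([2.0, 0.0], 0.5, size=(200, 2)),
    ])

    candidates = [GaussianMixture(n_components=k, random_state=0).fit(features)
                  for k in range(1, 6)]
    best = min(candidates, key=lambda m: m.bic(features))

    print("selected number of mixture components:", best.n_components)
    print("component weights:", np.round(best.weights_, 2))
    ```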

  19. Single-Column Modeling of Convection During the CINDY2011/DYNAMO Field Campaign With the CNRM Climate Model Version 6

    Science.gov (United States)

    Abdel-Lathif, Ahmat Younous; Roehrig, Romain; Beau, Isabelle; Douville, Hervé

    2018-03-01

    A single-column model (SCM) approach is used to assess the CNRM climate model (CNRM-CM) version 6 ability to represent the properties of the apparent heat source (Q1) and moisture sink (Q2) as observed during the 3 month CINDY2011/DYNAMO field campaign, over its Northern Sounding Array (NSA). The performance of the CNRM SCM is evaluated in a constrained configuration in which the latent and sensible heat surface fluxes are prescribed, as, when forced by observed sea surface temperature, the model is strongly limited by the underestimate of the surface fluxes, most probably related to the SCM forcing itself. The model exhibits a significant cold bias in the upper troposphere, near 200 hPa, and strong wet biases close to the surface and above 700 hPa. The analysis of the Q1 and Q2 profile distributions emphasizes the properties of the convective parameterization of the CNRM-CM physics. The distribution of the Q2 profile is particularly challenging. The model strongly underestimates the frequency of occurrence of the deep moistening profiles, which likely involve misrepresentation of the shallow and congestus convection. Finally, a statistical approach is used to objectively define atmospheric regimes and construct a typical convection life cycle. A composite analysis shows that the CNRM SCM captures the general transition from bottom-heavy to mid-heavy to top-heavy convective heating. Some model errors are shown to be related to the stratiform regimes. The moistening observed during the shallow and congestus convection regimes also requires further improvements of this CNRM-CM physics.
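
    For reference, the apparent heat source and apparent moisture sink mentioned above are conventionally defined from the large-scale budgets (the standard Yanai-type definitions in advective form, not reproduced from the article itself):

        Q_1 = \frac{\partial \bar{s}}{\partial t}
              + \bar{\mathbf{v}} \cdot \nabla \bar{s}
              + \bar{\omega}\, \frac{\partial \bar{s}}{\partial p},
        \qquad
        Q_2 = -L \left( \frac{\partial \bar{q}}{\partial t}
              + \bar{\mathbf{v}} \cdot \nabla \bar{q}
              + \bar{\omega}\, \frac{\partial \bar{q}}{\partial p} \right),

    where s = c_p T + gz is the dry static energy, q the specific humidity, omega the pressure velocity, L the latent heat of vaporisation, and overbars denote averages over the sounding array.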

  20. Designing a framework to design a business model for the 'bottom of the pyramid' population

    NARCIS (Netherlands)

    Ver loren van Themaat, Tanye; Schutte, Cornelius S.L.; Lutters, Diederick

    2013-01-01

    This article presents a framework for developing and designing a business model to target the bottom of the pyramid (BoP) population. Using blue ocean strategy and business model literature, integrated with research on the BoP, the framework offers a systematic approach for organisations to analyse

  1. Professional Development Recognizing Technology Integration Modeled after the TPACK Framework

    Science.gov (United States)

    McCusker, Laura

    2017-01-01

    Public school teachers within a Pennsylvania intermediate unit are receiving inadequate job-embedded professional development that recognizes knowledge of content, pedagogy, and technology integration, as outlined by Mishra and Koehler's Technological Pedagogical Content Knowledge (TPACK) framework (2006). A school environment where teachers are…

  2. Modeling CANDU type fuel behaviour during extended burnup irradiations using a revised version of the ELESIM code

    International Nuclear Information System (INIS)

    Arimescu, V.I.; Richmond, W.R.

    1992-05-01

    The high-burnup database for CANDU fuel, with a variety of cases, offers a good opportunity to check models of fuel behaviour and to identify areas for improvement. Good agreement of calculated values of fission-gas release and sheath hoop strain with experimental data indicates that the global behaviour of the fuel element is adequately simulated by a computer code. Using the ELESIM computer code, the fission-gas release, swelling, and fuel pellet expansion models were analysed, and changes made for gaseous swelling and diffusional release of fission-gas atoms to the grain boundaries. Using this revised version of ELESIM, satisfactory agreement between measured and calculated values of fission-gas release was found for most of the high-burnup database cases. It is concluded that the revised version of the ELESIM code is able to simulate with reasonable accuracy high-burnup as well as low-burnup CANDU fuel.

  3. A conceptual modeling framework for discrete event simulation using hierarchical control structures.

    Science.gov (United States)

    Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D

    2015-08-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.
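
    The separation of system behaviour from control policies that the framework emphasises can be illustrated with a toy event loop; the sketch below is a generic discrete event simulation with a pluggable dispatching rule, not the HCCM framework itself, and all names in it are invented.

        # Toy illustration: the event flow ("behaviour") is fixed, while the
        # dispatching rule ("control") is passed in as a policy function.
        import heapq

        def run(jobs, policy):
            """jobs: list of (arrival_time, processing_time); policy picks the next job."""
            events = [(t, "arrival", p) for t, p in jobs]
            heapq.heapify(events)
            queue, busy_until, finished = [], 0.0, []
            while events:
                clock, kind, proc = heapq.heappop(events)
                if kind == "arrival":
                    queue.append(proc)
                else:                                   # "done": server becomes free
                    finished.append(clock)
                if queue and clock >= busy_until:       # control decision point
                    nxt = policy(queue)                 # e.g. shortest processing time
                    queue.remove(nxt)
                    busy_until = clock + nxt
                    heapq.heappush(events, (busy_until, "done", nxt))
            return finished

        spt = lambda q: min(q)                          # one possible dispatching policy
        print(run([(0.0, 3.0), (1.0, 1.0), (2.0, 2.0)], policy=spt))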

  4. A NetCDF version of the two-dimensional energy balance model based on the full multigrid algorithm

    Directory of Open Access Journals (Sweden)

    Kelin Zhuang

    2017-01-01

    Full Text Available A NetCDF version of the two-dimensional energy balance model based on the full multigrid method in Fortran is introduced for both pedagogical and research purposes. Based on the land–sea–ice distribution, orbital elements, greenhouse gas concentrations, and albedo, the code calculates the global seasonal surface temperature. A step-by-step guide with examples is provided for practice.
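
    In its generic form, the calculation described above solves the standard two-dimensional seasonal energy balance equation (written here from the general EBM literature, not copied from the cited code):

        C(\hat{r})\, \frac{\partial T(\hat{r},t)}{\partial t}
          - \nabla \cdot \left[ D(\hat{r})\, \nabla T(\hat{r},t) \right]
          + A + B\, T(\hat{r},t)
          = Q\, S(\hat{r},t)\, \left[ 1 - \alpha(\hat{r}) \right],

    with C the position-dependent effective heat capacity (set by the land-sea-ice mask), D the diffusion coefficient, A + BT the linearised outgoing longwave radiation, Q one quarter of the solar constant, S the seasonal insolation distribution fixed by the orbital elements, and alpha the albedo.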

  5. Programs OPTMAN and SHEMMAN Version 6 (1999) - Coupled-Channels optical model and collective nuclear structure calculation -

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Jong Hwa; Lee, Jeong Yeon; Lee, Young Ouk; Sukhovitski, Efrem Sh [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-01-01

    Programs SHEMMAN and OPTMAN (Version 6) have been developed for the determination of nuclear Hamiltonian parameters and for optical model calculations, respectively. The optical model calculations by OPTMAN with coupling schemes built on wave functions of a non-axial soft rotator are self-consistent, since the parameters of the nuclear Hamiltonian are determined by adjusting the energies of collective levels to experimental values with SHEMMAN prior to the optical model calculation. The programs have been installed at the Nuclear Data Evaluation Laboratory of KAERI. This report is intended as a brief manual of these codes. 43 refs., 9 figs., 1 tab. (Author)

  6. Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) (Final Report, Version 2)

    Science.gov (United States)

    EPA's announced the availability of the final report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) (Version 2). This update furthered land change modeling by providing nationwide housing developmen...

  7. Globally COnstrained Local Function Approximation via Hierarchical Modelling, a Framework for System Modelling under Partial Information

    DEFF Research Database (Denmark)

    Øjelund, Henrik; Sadegh, Payman

    2000-01-01

    Local function approximations concern fitting low order models to weighted data in neighbourhoods of the points where the approximations are desired. Despite their generality and convenience of use, local models typically suffer, among others, from difficulties arising in physical interpretation ... be obtained. This paper presents a new approach for system modelling under partial (global) information (or the so-called Gray-box modelling) that seeks to preserve the benefits of the global as well as local methodologies within a unified framework. While the proposed technique relies on local approximations, ... simultaneously with the (local estimates of) function values. The approach is applied to modelling of a linear time variant dynamic system under prior linear time invariant structure where local regression fails as a result of high dimensionality.

  8. A Merging Framework for Rainfall Estimation at High Spatiotemporal Resolution for Distributed Hydrological Modeling in a Data-Scarce Area

    Directory of Open Access Journals (Sweden)

    Yinping Long

    2016-07-01

    Full Text Available Merging satellite and rain gauge data by combining accurate quantitative rainfall from stations with spatially continuous information from remote sensing observations provides a practical method of estimating rainfall. However, generating high spatiotemporal rainfall fields for catchment-distributed hydrological modeling is a problem when only a sparse rain gauge network and coarse spatial resolution of satellite data are available. The objective of the study is to present a satellite and rain gauge data-merging framework adapted for coarse resolution and data-sparse designs. In the framework, a statistical spatial downscaling method based on the relationships among precipitation, topographical features, and weather conditions was used to downscale the 0.25° daily rainfall field derived from the Tropical Rainfall Measuring Mission (TRMM) Multisatellite Precipitation Analysis (TMPA) precipitation product version 7. The nonparametric merging technique of double kernel smoothing, adapted for data-sparse designs, was combined with the global optimization method of shuffled complex evolution to merge the downscaled TRMM and gauged rainfall with minimum cross-validation error. An indicator field representing the presence and absence of rainfall was generated using the indicator kriging technique and applied to the previously merged result to account for the spatial intermittency of daily rainfall. The framework was applied to estimate daily precipitation at a 1 km resolution in the Qinghai Lake Basin, a data-scarce area in the northeast of the Qinghai-Tibet Plateau. The final estimates not only captured the spatial pattern of daily and annual precipitation with a relatively small estimation error, but also performed very well in stream flow simulation when applied to force the geomorphology-based hydrological model (GBHM). The proposed framework thus appears feasible for rainfall estimation at high spatiotemporal resolution in data-scarce areas.
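
    A heavily simplified, hypothetical sketch of the two-stage idea described above: regress coarse rainfall on a terrain covariate to downscale it, then correct the fine-scale field with interpolated gauge residuals. The real framework uses double kernel smoothing, shuffled complex evolution and indicator kriging; the functions below (and their argument names) only illustrate the overall flow.

        # Simplified merging sketch: downscale by regression, then add
        # inverse-distance-weighted gauge residuals. Illustration only.
        import numpy as np

        def downscale(rain_coarse, elev_coarse, elev_fine):
            """Fit rain ~ a + b*elevation at the coarse scale, apply at the fine scale."""
            A = np.column_stack([np.ones_like(elev_coarse), elev_coarse])
            coef, *_ = np.linalg.lstsq(A, rain_coarse, rcond=None)
            return coef[0] + coef[1] * elev_fine

        def merge_with_gauges(rain_fine, xy_fine, xy_gauge, rain_gauge, power=2.0):
            """Correct the downscaled field with inverse-distance-weighted gauge residuals."""
            dist = np.linalg.norm(xy_fine[:, None, :] - xy_gauge[None, :, :], axis=2)
            nearest_cell = dist.argmin(axis=0)                 # fine cell nearest each gauge
            residual = rain_gauge - rain_fine[nearest_cell]    # gauge value minus downscaled value
            weights = 1.0 / np.maximum(dist, 1e-6) ** power
            return rain_fine + (weights * residual).sum(axis=1) / weights.sum(axis=1)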

  9. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    Science.gov (United States)

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM’s applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model’s system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  10. U.S. Nuclear Regulatory Commission Extremely Low Probability of Rupture pilot study : xLPR framework model user's guide.

    Energy Technology Data Exchange (ETDEWEB)

    Kalinich, Donald A.; Sallaberry, Cedric M.; Mattie, Patrick D.

    2010-12-01

    For the U.S. Nuclear Regulatory Commission (NRC) Extremely Low Probability of Rupture (xLPR) pilot study, Sandia National Laboratories (SNL) was tasked to develop and evaluate a probabilistic framework using a commercial software package for Version 1.0 of the xLPR Code. Version 1.0 of the xLPR code is focused on assessing the probability of rupture due to primary water stress corrosion cracking in dissimilar metal welds in pressurizer surge nozzles. Future versions of this framework will expand the capabilities to other cracking mechanisms, and other piping systems for both pressurized water reactors and boiling water reactors. The goal of the pilot study project is to plan the xLPR framework transition from Version 1.0 to Version 2.0; hence the initial Version 1.0 framework and code development will be used to define the requirements for Version 2.0. The software documented in this report has been developed and tested solely for this purpose. This framework and demonstration problem will be used to evaluate the commercial software's capabilities and applicability for use in creating the final version of the xLPR framework. This report details the design, system requirements, and the steps necessary to use the commercial-code based xLPR framework developed by SNL.

  11. SHADOW3: a new version of the synchrotron X-ray optics modelling package

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez del Rio, Manuel, E-mail: srio@esrf.eu [European Synchrotron Radiation Facility, 6 Jules Horowitz, 38000 Grenoble (France); Canestrari, Niccolo [CNRS, Grenoble (France); European Synchrotron Radiation Facility, 6 Jules Horowitz, 38000 Grenoble (France); Jiang, Fan; Cerrina, Franco [Boston University, 8 St Mary’s Street, Boston, MA 02215 (United States)

    2011-09-01

    SHADOW3, a new version of the X-ray tracing code SHADOW, is introduced. A new version of the popular X-ray tracing code SHADOW is presented. An important step has been made in restructuring the code following new computer engineering standards, ending with a modular Fortran 2003 structure and an application programming interface (API). The new code has been designed to be compatible with the original file-oriented SHADOW philosophy, but simplifying the compilation, installation and use. In addition, users can now become programmers using the newly designed SHADOW3 API for creating scripts, macros and programs; being able to deal with optical system optimization, image simulation, and also low transmission calculations requiring a large number of rays (>10^6). Plans for future development and questions on how to accomplish them are also discussed.

  12. SHADOW3: a new version of the synchrotron X-ray optics modelling package

    International Nuclear Information System (INIS)

    Sanchez del Rio, Manuel; Canestrari, Niccolo; Jiang, Fan; Cerrina, Franco

    2011-01-01

    SHADOW3, a new version of the X-ray tracing code SHADOW, is introduced. A new version of the popular X-ray tracing code SHADOW is presented. An important step has been made in restructuring the code following new computer engineering standards, ending with a modular Fortran 2003 structure and an application programming interface (API). The new code has been designed to be compatible with the original file-oriented SHADOW philosophy, but simplifying the compilation, installation and use. In addition, users can now become programmers using the newly designed SHADOW3 API for creating scripts, macros and programs; being able to deal with optical system optimization, image simulation, and also low transmission calculations requiring a large number of rays (>10^6). Plans for future development and questions on how to accomplish them are also discussed.

  13. Strategic assessment of capacity consumption in railway networks: Framework and model

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex; Nielsen, Otto Anker

    2017-01-01

    In this paper, we develop a new framework for strategic planning purposes to calculate railway infrastructure occupation and capacity consumption in networks, independent of a timetable. Furthermore, a model implementing the framework is presented. In this model different train sequences...... are obtained efficiently with little input. The case illustrates the model's ability to quantify the capacity gain from infrastructure scenario to infrastructure scenario which can be used to increase the number of trains or improve the robustness of the system....

  14. A Framework Proposal For Choosing A New Business Implementation Model In Henkel

    OpenAIRE

    Li, Tsz Wan

    2015-01-01

    Henkel's New Business team is a corporate venturing unit that explores corporate entrepreneurial activities on behalf of Henkel Adhesives Technologies. The new business ideas are implemented through one of these models: incubator, venturing or innovation ecosystem. In current practice, there is no systematic framework in place to choose the implementation model. The goal of the thesis is to propose a framework for choosing the most appropriate model for implementation of a new business idea i...

  15. Framework for product knowledge and product related knowledge which supports product modelling for mass customization

    DEFF Research Database (Denmark)

    Riis, Jesper; Hansen, Benjamin Loer; Hvam, Lars

    2003-01-01

    The article presents a framework for product knowledge and product related knowledge which can be used to support the product modelling process which is needed for developing IT systems. These IT systems are important tools for many companies when they aim at achieving mass customization and personalization. The framework for product knowledge and product related knowledge is based on the following theories: axiomatic design, technical systems, theory of domains, theory of structuring, theory of properties and the framework for the content of product and product related models. The framework is built on experience from product modelling projects in several companies, among them, for example, companies manufacturing electronic switchboards, spray dryer systems and air conditioning equipment. The framework is divided into three views: the product knowledge view, the life phase system view and the transformation ...

  16. Applying the Nominal Response Model within a Longitudinal Framework to Construct the Positive Family Relationships Scale

    Science.gov (United States)

    Preston, Kathleen Suzanne Johnson; Parral, Skye N.; Gottfried, Allen W.; Oliver, Pamella H.; Gottfried, Adele Eskeles; Ibrahim, Sirena M.; Delany, Danielle

    2015-01-01

    A psychometric analysis was conducted using the nominal response model under the item response theory framework to construct the Positive Family Relationships scale. Using data from the Fullerton Longitudinal Study, this scale was constructed within a long-term longitudinal framework spanning middle childhood through adolescence. Items tapping…

  17. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2011-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor...

  18. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2010-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor ...

  19. Assessing Students' Understandings of Biological Models and Their Use in Science to Evaluate a Theoretical Framework

    Science.gov (United States)

    Grünkorn, Juliane; Upmeier zu Belzen, Annette; Krüger, Dirk

    2014-01-01

    Research in the field of students' understandings of models and their use in science describes different frameworks concerning these understandings. Currently, there is no conjoint framework that combines these structures and so far, no investigation has focused on whether it reflects students' understandings sufficiently (empirical evaluation).…

  20. Risk Assessment of Bioaccumulation Substances. Part II: Description of a Model Framework

    NARCIS (Netherlands)

    Tamis, J.E.; Vries, de P.; Karman, C.C.

    2009-01-01

    This report provides a proposal for a framework for risk assessment of bioaccumulative substances, either from produced water discharges or present as background contamination. The proposed framework is such that it is compatible with the current EIF risk assessment models that are used in the …

  1. A framework for performance evaluation of model-based optical trackers

    NARCIS (Netherlands)

    Smit, F.A.; Liere, van R.

    2008-01-01

    We describe a software framework to evaluate the performance of model-based optical trackers in virtual environments. The framework can be used to evaluate and compare the performance of different trackers under various conditions, to study the effects of varying intrinsic and extrinsic camera

  2. Model-based visual tracking the OpenTL framework

    CERN Document Server

    Panin, Giorgio

    2011-01-01

    This book has two main goals: to provide a unified and structured overview of this growing field, as well as to propose a corresponding software framework, the OpenTL library, developed by the author and his working group at TUM-Informatik. The main objective of this work is to show how most real-world application scenarios can be naturally cast into a common description vocabulary, and therefore implemented and tested in a fully modular and scalable way, through the definition of a layered, object-oriented software architecture. The resulting architecture covers in a seamless way all processing …

  3. A model-based Bayesian framework for ECG beat segmentation

    International Nuclear Information System (INIS)

    Sayadi, O; Shamsollahi, M B

    2009-01-01

    The study of electrocardiogram (ECG) waveform amplitudes, timings and patterns has been the subject of intense research, for it provides a deep insight into the diagnostic features of the heart's functionality. In some recent works, a Bayesian filtering paradigm has been proposed for denoising and compression of ECG signals. In this paper, it is shown that this framework may be effectively used for ECG beat segmentation and extraction of fiducial points. Analytic expressions for the determination of points and intervals are derived and evaluated on various real ECG signals. Simulation results show that the method can contribute to and enhance the clinical ECG beat segmentation performance

  4. Item and response-category functioning of the Persian version of the KIDSCREEN-27: Rasch partial credit model

    Directory of Open Access Journals (Sweden)

    Jafari Peyman

    2012-10-01

    Full Text Available Background: The purpose of the study was to determine whether the Persian version of the KIDSCREEN-27 has the optimal number of response categories to measure health-related quality of life (HRQoL) in children and adolescents. Moreover, we aimed to determine if all the items contributed adequately to their own domain. Findings: The Persian version of the KIDSCREEN-27 was completed by 1083 school children and 1070 of their parents. The Rasch partial credit model (PCM) was used to investigate item statistics and ordering of response categories. The PCM showed that no item was misfitting. The PCM also revealed that successive response categories for all items were located in the expected order except for category 1 in self- and proxy-reports. Conclusions: Although Rasch analysis confirms that all the items belong to their own underlying construct, response categories should be reorganized and evaluated in further studies, especially in children with chronic conditions.
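
    For reference, the partial credit model used in such analyses is conventionally written as follows (the standard Rasch/Masters form, not reproduced from the article):

        P(X_{ij} = x) =
          \frac{\exp\!\left[ \sum_{k=0}^{x} (\theta_j - \delta_{ik}) \right]}
               {\sum_{h=0}^{m_i} \exp\!\left[ \sum_{k=0}^{h} (\theta_j - \delta_{ik}) \right]},
        \qquad x = 0, 1, \dots, m_i,

    where theta_j is the latent trait of person j, delta_ik the k-th threshold of item i, and the k = 0 term is defined as zero. The disordered category 1 reported above corresponds to thresholds delta_i1 that do not increase in the expected order.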

  5. EarthCube - Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks

    Science.gov (United States)

    Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.

    2014-12-01

    In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a
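
    A minimal, hypothetical sketch of what a framework-agnostic Basic Model Interface looks like in practice: the method names below follow the published BMI conventions (initialize/update/finalize/get_value and related calls), while the toy linear-reservoir "model" and its variable name are invented purely for illustration.

        # Hypothetical BMI-style wrapper around a toy model; not from any real framework.
        import numpy as np

        class ToyModelBMI:
            def initialize(self, config_file=None):
                self.time, self.dt, self.end = 0.0, 1.0, 10.0
                self.storage = np.array([100.0])        # state exposed to the framework

            def update(self):
                self.storage *= 0.9                     # toy linear-reservoir decay
                self.time += self.dt

            def finalize(self):
                self.storage = None

            def get_current_time(self): return self.time
            def get_end_time(self): return self.end
            def get_output_var_names(self): return ("reservoir_storage",)
            def get_value(self, name): return self.storage.copy()

        model = ToyModelBMI()
        model.initialize()
        while model.get_current_time() < model.get_end_time():
            model.update()
        print(model.get_value("reservoir_storage"))
        model.finalize()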

  6. UNSAT-H Version 3.0: Unsaturated Soil Water and Heat Flow Model Theory, User Manual, and Examples

    International Nuclear Information System (INIS)

    Fayer, M.J.

    2000-01-01

    The UNSAT-H model was developed at Pacific Northwest National Laboratory (PNNL) to assess the water dynamics of arid sites and, in particular, estimate recharge fluxes for scenarios pertinent to waste disposal facilities. During the last 4 years, the UNSAT-H model received support from the Immobilized Waste Program (IWP) of the Hanford Site's River Protection Project. This program is designing and assessing the performance of on-site disposal facilities to receive radioactive wastes that are currently stored in single- and double-shell tanks at the Hanford Site (LMHC 1999). The IWP is interested in estimates of recharge rates for current conditions and long-term scenarios involving the vadose zone disposal of tank wastes. Simulation modeling with UNSAT-H is one of the methods being used to provide those estimates (e.g., Rockhold et al. 1995; Fayer et al. 1999). To achieve the above goals for assessing water dynamics and estimating recharge rates, the UNSAT-H model addresses soil water infiltration, redistribution, evaporation, plant transpiration, deep drainage, and soil heat flow as one-dimensional processes. The UNSAT-H model simulates liquid water flow using Richards' equation (Richards 1931), water vapor diffusion using Fick's law, and sensible heat flow using the Fourier equation. This report documents UNSAT-H Version 3.0. The report includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plants, and the code manual. Version 3.0 is an enhanced-capability update of UNSAT-H Version 2.0 (Fayer and Jones 1990). New features include hysteresis, an iterative solution of head and temperature, an energy balance check, the modified Picard solution technique, additional hydraulic functions, multiple-year simulation capability, and general enhancements.
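
    The three governing equations named above take the following standard one-dimensional forms (written from the general vadose-zone literature, not copied from the report):

        \frac{\partial \theta}{\partial t}
          = \frac{\partial}{\partial z}\!\left[ K(h) \left( \frac{\partial h}{\partial z} + 1 \right) \right] - S(z,t)
          \quad \text{(Richards' equation, liquid flow)},

        q_v = -D_v\, \frac{\partial \rho_v}{\partial z}
          \quad \text{(Fick's law, vapour diffusion)},
        \qquad
        q_h = -k_T\, \frac{\partial T}{\partial z}
          \quad \text{(Fourier's law, heat conduction)},

    where theta is the volumetric water content, h the matric head, K(h) the unsaturated hydraulic conductivity, S a sink term representing plant uptake, rho_v the vapour density, D_v the vapour diffusivity, and k_T the soil thermal conductivity.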

  7. A Hybrid Programming Framework for Modeling and Solving Constraint Satisfaction and Optimization Problems

    Directory of Open Access Journals (Sweden)

    Paweł Sitek

    2016-01-01

    Full Text Available This paper proposes a hybrid programming framework for modeling and solving of constraint satisfaction problems (CSPs) and constraint optimization problems (COPs). Two paradigms, CLP (constraint logic programming) and MP (mathematical programming), are integrated in the framework. The integration is supplemented with the original method of problem transformation, used in the framework as a presolving method. The transformation substantially reduces the feasible solution space. The framework automatically generates CSP and COP models based on current values of data instances, questions asked by a user, and set of predicates and facts of the problem being modeled, which altogether constitute a knowledge database for the given problem. This dynamic generation of dedicated models, based on the knowledge base, together with the parameters changing externally, for example, the user’s questions, is the implementation of the autonomous search concept. The models are solved using the internal or external solvers integrated with the framework. The architecture of the framework as well as its implementation outline is also included in the paper. The effectiveness of the framework regarding the modeling and solution search is assessed through the illustrative examples relating to scheduling problems with additional constrained resources.

  8. eTOXlab, an open source modeling framework for implementing predictive models in production environments.

    Science.gov (United States)

    Carrió, Pau; López, Oriol; Sanz, Ferran; Pastor, Manuel

    2015-01-01

    Computational models based on Quantitative Structure-Activity Relationship (QSAR) methodologies are widely used tools for predicting the biological properties of new compounds. In many instances, such models are used as a routine in the industry (e.g. food, cosmetic or pharmaceutical industry) for the early assessment of the biological properties of new compounds. However, most of the tools currently available for developing QSAR models are not well suited for supporting the whole QSAR model life cycle in production environments. We have developed eTOXlab, an open-source modeling framework designed to be used at the core of a self-contained virtual machine that can be easily deployed in production environments, providing predictions as web services. eTOXlab consists of a collection of object-oriented Python modules with methods mapping common tasks of standard modeling workflows. This framework allows building and validating QSAR models as well as predicting the properties of new compounds using either a command line interface or a graphic user interface (GUI). Simple models can be easily generated by setting a few parameters, while more complex models can be implemented by overriding pieces of the original source code. eTOXlab benefits from the object-oriented capabilities of Python for providing high flexibility: any model implemented using eTOXlab inherits the features implemented in the parent model, like common tools and services or the automatic exposure of the models as prediction web services. The particular eTOXlab architecture as a self-contained, portable prediction engine allows building models with confidential information within corporate facilities, which can be safely exported and used for prediction without disclosing the structures of the training series. The software presented here provides full support to the specific needs of users that want to develop, use and maintain predictive models in corporate environments. The technologies used by e
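
    A generic sketch of the kind of QSAR workflow such a framework automates: fit a model on molecular descriptors, cross-validate it, and predict new compounds. This is not the eTOXlab API; it uses scikit-learn with invented placeholder descriptor and activity arrays.

        # Generic QSAR-style sketch with placeholder data; not eTOXlab code.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 30))                        # 200 compounds x 30 descriptors
        y = 2.0 * X[:, 0] + rng.normal(scale=0.3, size=200)   # synthetic activity values

        model = RandomForestRegressor(n_estimators=200, random_state=0)
        q2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()   # cross-validated fit
        model.fit(X, y)

        X_new = rng.normal(size=(3, 30))                      # "new compounds" to predict
        print(f"cross-validated R2 ~ {q2:.2f}; predictions:", model.predict(X_new))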

  9. Simulations of the Mid-Pliocene Warm Period Using Two Versions of the NASA-GISS ModelE2-R Coupled Model

    Science.gov (United States)

    Chandler, M. A.; Sohl, L. E.; Jonas, J. A.; Dowsett, H. J.; Kelley, M.

    2013-01-01

    The mid-Pliocene Warm Period (mPWP) bears many similarities to aspects of future global warming as projected by the Intergovernmental Panel on Climate Change (IPCC, 2007). Both marine and terrestrial data point to high-latitude temperature amplification, including large decreases in sea ice and land ice, as well as expansion of warmer climate biomes into higher latitudes. Here we present our most recent simulations of the mid-Pliocene climate using the CMIP5 version of the NASA-GISS Earth System Model (ModelE2-R). We describe the substantial impact associated with a recent correction made in the implementation of the Gent-McWilliams ocean mixing scheme (GM), which has a large effect on the simulation of ocean surface temperatures, particularly in the North Atlantic Ocean. The effect of this correction on the Pliocene climate results would not have been easily determined from examining its impact on the preindustrial runs alone, a useful demonstration of how the consequences of code improvements as seen in modern climate control runs do not necessarily portend the impacts in extreme climates. Both the GM-corrected and GM-uncorrected simulations were contributed to the Pliocene Model Intercomparison Project (PlioMIP) Experiment 2. Many findings presented here corroborate results from other PlioMIP multi-model ensemble papers, but we also emphasize features in the ModelE2-R simulations that are unlike the ensemble means. The corrected version yields results that more closely resemble the ocean core data as well as the PRISM3D reconstructions of the mid-Pliocene, especially the dramatic warming in the North Atlantic and Greenland-Iceland-Norwegian Sea, which in the new simulation appears to be far more realistic than previously found with older versions of the GISS model. Our belief is that continued development of key physical routines in the atmospheric model, along with higher resolution and recent corrections to mixing parameterisations in the ocean model, have led

  10. Unified and Modular Modeling and Functional Verification Framework of Real-Time Image Signal Processors

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2016-01-01

    Full Text Available In VLSI industry, image signal processing algorithms are developed and evaluated using software models before implementation of RTL and firmware. After the finalization of the algorithm, software models are used as a golden reference model for the image signal processor (ISP) RTL and firmware development. In this paper, we are describing the unified and modular modeling framework of image signal processing algorithms used for different applications such as ISP algorithms development, reference for hardware (HW) implementation, reference for firmware (FW) implementation, and bit-true certification. The universal verification methodology (UVM) based functional verification framework of image signal processors using software reference models is described. Further, IP-XACT based tools for automatic generation of functional verification environment files and model map files are described. The proposed framework is developed both with host interface and with core using virtual register interface (VRI) approach. This modeling and functional verification framework is used in real-time image signal processing applications including cellphone, smart cameras, and image compression. The main motivation behind this work is to propose the best efficient, reusable, and automated framework for modeling and verification of image signal processor (ISP) designs. The proposed framework shows better results and significant improvement is observed in product verification time, verification cost, and quality of the designs.

  11. MoVES - A Framework for Modelling and Verifying Embedded Systems

    DEFF Research Database (Denmark)

    Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2009-01-01

    The MoVES framework is being developed to assist in the early phases of embedded systems design. A system is modelled as an application running on an execution platform. The application is modelled through the individual tasks, and the execution platform is modelled through the processing elements. ... The paper illustrates, through examples, how MoVES can be used to model and analyze embedded systems.

  12. Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)

    Science.gov (United States)

    Gray, Justin S.; Briggs, Jeffery L.

    2008-01-01

    The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so ROSE frees the modeler to develop a library of standard modeling processes such as Design of Experiments, optimizers, parameter studies, and sensitivity studies which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
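
    The design idea above, in which any model exposing a common run interface can be driven by any reusable execution process, can be sketched as follows; this is a hypothetical illustration of the pattern with invented names, not the ROSE API.

        # Toy illustration of divorcing execution processes from models: any object
        # with run(inputs) -> outputs can be driven by any generic process.
        import itertools

        class ParabolaModel:
            def run(self, inputs):
                x, y = inputs["x"], inputs["y"]
                return {"f": (x - 1.0) ** 2 + (y + 2.0) ** 2}

        def full_factorial_study(model, grid):
            """A reusable 'process': evaluate the model over a factorial grid of inputs."""
            names, levels = zip(*grid.items())
            results = []
            for combo in itertools.product(*levels):
                inputs = dict(zip(names, combo))
                results.append((inputs, model.run(inputs)))
            return results

        study = full_factorial_study(ParabolaModel(),
                                     {"x": [0.0, 1.0, 2.0], "y": [-3.0, -2.0, -1.0]})
        best = min(study, key=lambda r: r[1]["f"])
        print("best point:", best)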

  13. Finite element model updating using bayesian framework and modal properties

    CSIR Research Space (South Africa)

    Marwala, T

    2005-01-01

    Full Text Available Finite element (FE) models are widely used to predict the dynamic characteristics of aerospace structures. These models often give results that differ from measured results and therefore need to be updated to match measured results. Some...

  14. Framework Model for Database Replication within the Availability Zones

    OpenAIRE

    Al-Mughrabi, Ala'a Atallah; Owaied, Hussein

    2013-01-01

    This paper presents a proposed model for database replication in private cloud availability regions, which is an enhancement of the SQL Server AlwaysOn Layers of Protection Model presented by Microsoft in 2012. The enhancement concentrates on database replication for private cloud availability regions through the use of primary and secondary servers. The processes of the proposed model when the client sends a Write/Read request to the server, in synchronous and semi-synchronous replicatio...

  15. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Science.gov (United States)

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
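
    A simplified sketch of the computational idea, assuming a synthetic data set: re-estimate the coefficient on a focal variable under every subset of candidate control variables and summarise the resulting modeling distribution. The authors provide dedicated software for this; the snippet below only illustrates the mechanics with plain least squares and invented variable names.

        # Multimodel analysis sketch: enumerate control subsets, refit, summarise.
        import itertools
        import numpy as np

        rng = np.random.default_rng(2)
        n = 300
        focal = rng.normal(size=n)
        controls = {f"z{i}": rng.normal(size=n) for i in range(4)}
        y = 0.5 * focal + 0.3 * controls["z0"] + rng.normal(size=n)

        estimates = []
        names = list(controls)
        for k in range(len(names) + 1):
            for subset in itertools.combinations(names, k):
                X = np.column_stack([np.ones(n), focal] + [controls[c] for c in subset])
                beta, *_ = np.linalg.lstsq(X, y, rcond=None)
                estimates.append(beta[1])            # coefficient on the focal variable

        estimates = np.array(estimates)
        print(f"{len(estimates)} specifications; mean={estimates.mean():.3f}, "
              f"min={estimates.min():.3f}, max={estimates.max():.3f}")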

  16. A Generic Software Framework for Data Assimilation and Model Calibration

    NARCIS (Netherlands)

    Van Velzen, N.

    2010-01-01

    The accuracy of dynamic simulation models can be increased by using observations in conjunction with a data assimilation or model calibration algorithm. However, implementing such algorithms usually increases the complexity of the model software significantly. By using concepts from object oriented

  17. A model-based framework for the analysis of team communication in nuclear power plants

    International Nuclear Information System (INIS)

    Chung, Yun Hyung; Yoon, Wan Chul; Min, Daihwan

    2009-01-01

    Advanced human-machine interfaces are rapidly changing the interaction between humans and systems, with the level of abstraction of the presented information, the human task characteristics, and the modes of communication all affected. To accommodate the changes in the human/system co-working environment, an extended communication analysis framework is needed that can describe and relate the tasks, verbal exchanges, and information interface. This paper proposes an extended analytic framework, referred to as the H-H-S (human-human-system) communication analysis framework, which can model the changes in team communication that are emerging in these new working environments. The stage-specific decision-making model and analysis tool of the proposed framework make the analysis of team communication easier by providing visual clues. The usefulness of the proposed framework is demonstrated with an in-depth comparison of the characteristics of communication in the conventional and advanced main control rooms of nuclear power plants

  18. A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service

    Science.gov (United States)

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016

  19. A framework for sharing and integrating remote sensing and GIS models based on Web service.

    Science.gov (United States)

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a "black box" and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users.

  20. Categorical Inputs, Sensitivity Analysis, Optimization and Importance Tempering with tgp Version 2, an R Package for Treed Gaussian Process Models

    Directory of Open Access Journals (Sweden)

    Robert B. Gramacy

    2010-02-01

    Full Text Available This document describes the new features in version 2.x of the tgp package for R, implementing treed Gaussian process (GP) models. The topics covered include methods for dealing with categorical inputs and excluding inputs from the tree or GP part of the model; fully Bayesian sensitivity analysis for inputs/covariates; sequential optimization of black-box functions; and a new Monte Carlo method for inference in multi-modal posterior distributions that combines simulated tempering and importance sampling. These additions extend the functionality of tgp across all models in the hierarchy: from Bayesian linear models, to classification and regression trees (CART), to treed Gaussian processes with jumps to the limiting linear model. It is assumed that the reader is familiar with the baseline functionality of the package, outlined in the first vignette (Gramacy 2007).

  1. Land-total and Ocean-total Precipitation and Evaporation from a Community Atmosphere Model version 5 Perturbed Parameter Ensemble

    Energy Technology Data Exchange (ETDEWEB)

    Covey, Curt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lucas, Donald D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Trenberth, Kevin E. [National Center for Atmospheric Research, Boulder, CO (United States)

    2016-03-02

    This document presents the large scale water budget statistics of a perturbed input-parameter ensemble of atmospheric model runs. The model is Version 5.1.02 of the Community Atmosphere Model (CAM). These runs are the “C-Ensemble” described by Qian et al., “Parametric Sensitivity Analysis of Precipitation at Global and Local Scales in the Community Atmosphere Model CAM5” (Journal of Advances in Modeling the Earth System, 2015). As noted by Qian et al., the simulations are “AMIP type” with temperature and sea ice boundary conditions chosen to match surface observations for the five year period 2000-2004. There are 1100 ensemble members in addition to one run with default input-parameter values.

  2. Evaluation of dust and trace metal estimates from the Community Multiscale Air Quality (CMAQ model version 5.0

    Directory of Open Access Journals (Sweden)

    K. W. Appel

    2013-07-01

    Full Text Available The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transformation, transport, and fate of the many different air pollutant species that comprise particulate matter (PM), including dust (or soil). The CMAQ model version 5.0 (CMAQv5.0) has several enhancements over the previous version of the model for estimating the emission and transport of dust, including the ability to track the specific elemental constituents of dust and have the model-derived concentrations of those elements participate in chemistry. The latest version of the model also includes a parameterization to estimate emissions of dust due to wind action. The CMAQv5.0 modeling system was used to simulate the entire year 2006 for the continental United States, and the model estimates were evaluated against daily surface-based measurements from several air quality networks. The CMAQ modeling system overall did well replicating the observed soil concentrations in the western United States (mean bias generally around ±0.5 μg m−3); however, the model consistently overestimated the observed soil concentrations in the eastern United States (mean bias generally between 0.5–1.5 μg m−3), regardless of season. The performance of the individual trace metals was highly dependent on the network, species, and season, with relatively small biases for Fe, Al, Si, and Ti throughout the year at the Interagency Monitoring of Protected Visual Environments (IMPROVE) sites, while Ca, K, and Mn were overestimated and Mg underestimated. For the urban Chemical Speciation Network (CSN) sites, Fe, Mg, and Mn, while overestimated, had comparatively better performance throughout the year than the other trace metals, which were consistently overestimated, including very large overestimations of Al (380%), Ti (370%), and Si (470%) in the fall. An underestimation of nighttime mixing in the urban areas appears to contribute to the overestimation of
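
    The evaluation statistics quoted above reduce to simple paired comparisons of modelled and observed concentrations; a sketch with invented placeholder arrays (generic evaluation arithmetic, not CMAQ code):

        # Paired model-observation statistics: mean bias and normalised mean bias.
        import numpy as np

        obs = np.array([0.8, 1.2, 0.5, 2.0, 1.1])        # observed soil PM, ug/m3 (placeholder)
        mod = np.array([1.3, 1.9, 0.9, 2.6, 1.8])        # modelled soil PM, ug/m3 (placeholder)

        mean_bias = (mod - obs).mean()                   # ug/m3
        nmb = 100.0 * (mod - obs).sum() / obs.sum()      # normalised mean bias, percent
        print(f"mean bias = {mean_bias:.2f} ug/m3, NMB = {nmb:.0f}%")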

  3. Comparing droplet activation parameterisations against adiabatic parcel models using a novel inverse modelling framework

    Science.gov (United States)

    Partridge, Daniel; Morales, Ricardo; Stier, Philip

    2015-04-01

    Many previous studies have compared droplet activation parameterisations against adiabatic parcel models (e.g. Ghan et al., 2001). However, these have often involved comparisons for a limited number of parameter combinations based upon certain aerosol regimes. Recent studies (Morales et al., 2014) have used wider ranges when evaluating their parameterisations; however, no study has explored the full possible multi-dimensional parameter space that would be experienced by droplet activation within a global climate model (GCM). It is important to be able to efficiently highlight regions of the entire multi-dimensional parameter space in which we can expect the largest discrepancy between parameterisation and cloud parcel models in order to ascertain which regions simulated by a GCM can be expected to be a less accurate representation of the process of cloud droplet activation. This study provides a new, efficient, inverse modelling framework for comparing droplet activation parameterisations to more complex cloud parcel models. To achieve this we couple a Markov Chain Monte Carlo algorithm (Partridge et al., 2012) to two independent adiabatic cloud parcel models and four droplet activation parameterisations. This framework is computationally faster than employing a brute force Monte Carlo simulation, and allows us to transparently highlight which parameterisation provides the closest representation across all aerosol physicochemical and meteorological environments. The parameterisations are demonstrated to perform well for a large proportion of possible parameter combinations; however, for certain key parameters, most notably the vertical velocity and accumulation-mode aerosol concentration, large discrepancies are highlighted. These discrepancies correspond to parameter combinations that result in very high/low simulated values of maximum supersaturation. By identifying parameter interactions or regimes within the multi-dimensional parameter space we hope to guide
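
    A schematic sketch of the inverse idea described above: a random-walk Metropolis sampler whose acceptance ratio is driven by the discrepancy between a fast parameterisation and a reference calculation, so that sampling concentrates where they disagree most. The two toy functions below are invented stand-ins for the real parcel model and activation parameterisation, and the parameter ranges are illustrative only.

        # Schematic Metropolis random walk over (updraught, aerosol number) space,
        # targeting regions of large parameterisation/reference disagreement.
        import numpy as np

        reference = lambda w, na: np.tanh(0.8 * w) / (1.0 + 1e-3 * na)      # toy "parcel model"
        parameterisation = lambda w, na: 0.7 * w / (1.0 + 1.2e-3 * na)      # toy fast scheme
        discrepancy = lambda p: abs(reference(*p) - parameterisation(*p))

        rng = np.random.default_rng(3)
        p = np.array([1.0, 500.0])                       # (updraught m/s, aerosol cm-3)
        samples = []
        for _ in range(5000):
            prop = p + rng.normal(scale=[0.2, 50.0])     # random-walk proposal
            prop = np.clip(prop, [0.1, 50.0], [5.0, 3000.0])
            if rng.random() < min(1.0, discrepancy(prop) / max(discrepancy(p), 1e-12)):
                p = prop
            samples.append(p.copy())

        samples = np.array(samples)
        print("high-discrepancy region (median w, Na):", np.median(samples, axis=0))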

  4. Modeling the marketing strategy-performance relationship : towards an hierarchical marketing performance framework

    NARCIS (Netherlands)

    Huizingh, Eelko K.R.E.; Zengerink, Evelien

    2001-01-01

    Accurate measurement of marketing performance is an important topic for both marketing academics and marketing managers. Many researchers have recognized that marketing performance measurement should go beyond financial measurement. In this paper we propose a conceptual framework that models

  5. A conceptual framework to model long-run qualitative change in the energy system

    OpenAIRE

    Ebersberger, Bernd

    2004-01-01

    A conceptual framework to model long-run qualitative change in the energy system / A. Pyka, B. Ebersberger, H. Hanusch. - In: Evolution and economic complexity / ed. J. Stanley Metcalfe ... - Cheltenham [et al.]: Elgar, 2004, pp. 191-213

  6. Result Summary for the Area 5 Radioactive Waste Management Site Performance Assessment Model Version 4.110

    International Nuclear Information System (INIS)

    2011-01-01

    Results for Version 4.110 of the Area 5 Radioactive Waste Management Site (RWMS) performance assessment (PA) model are summarized. Version 4.110 includes the fiscal year (FY) 2010 inventory estimate, including a future inventory estimate. Version 4.110 was implemented in GoldSim 10.11(SP4). The following changes have been implemented since the last baseline model, Version 4.105: (1) Updated the inventory and disposal unit configurations with data through the end of FY 2010. (2) Implemented Federal Guidance Report 13 Supplemental CD dose conversion factors (U.S. Environmental Protection Agency, 1999). Version 4.110 PA results comply with air pathway and all-pathways annual total effective dose (TED) performance objectives (Tables 2 and 3, Figures 1 and 2). Air pathway results decrease moderately for all scenarios. The time of the maximum for the air pathway open rangeland scenario shifts from 1,000 to 100 years (y). All-pathways annual TED increases for all scenarios except the resident scenario. The maximum member of public all-pathways dose occurs at 1,000 y for the resident farmer scenario. The resident farmer dose was predominantly due to technetium-99 (Tc-99) (82 percent) and lead-210 (Pb-210) (13 percent). Pb-210 present at 1,000 y is produced predominantly by radioactive decay of uranium-234 (U-234) present at the time of disposal. All results for the postdrilling and intruder-agriculture scenarios comply with the performance objectives (Tables 4 and 5, Figures 3 and 4). The postdrilling intruder results are similar to Version 4.105 results. The intruder-agriculture results are similar to Version 4.105, except for the Pit 6 Radium Disposal Unit (RaDU). The intruder-agriculture result for the Shallow Land Burial (SLB) disposal units is a significant fraction of the performance objective and exceeds the performance objective at the 95th percentile. The intruder-agriculture dose is due predominantly to Tc-99 (75 percent) and U-238 (9.5 percent). The acute

  7. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    Science.gov (United States)

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework makes it feasible to learn the relationships between biochemical reactants qualitatively and to make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
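
    A minimal sketch of the quantitative stage described above: simulated annealing over a kinetic rate constant so that a toy first-order reaction matches target time-series data. The one-parameter model and the "observations" are invented here; the paper's framework handles full reaction networks.

        # Simulated annealing sketch: tune k so exp(-k t) matches target data.
        import math
        import random

        times = [0.0, 1.0, 2.0, 3.0, 4.0]
        target = [1.00, 0.61, 0.37, 0.22, 0.14]          # roughly exp(-0.5 t)

        def cost(k):
            return sum((math.exp(-k * t) - obs) ** 2 for t, obs in zip(times, target))

        random.seed(0)
        k, temperature = 2.0, 1.0
        for step in range(2000):
            candidate = abs(k + random.gauss(0.0, 0.1))          # perturb the rate constant
            delta = cost(candidate) - cost(k)
            if delta < 0 or random.random() < math.exp(-delta / temperature):
                k = candidate                                    # accept better or, sometimes, worse
            temperature *= 0.995                                 # geometric cooling schedule

        print(f"fitted rate constant k ~ {k:.2f}")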

  8. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    Science.gov (United States)

    Hagos, Samson; Feng, Zhe; Plant, Robert S.; Houze, Robert A.; Xiao, Heng

    2018-02-01

    A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach to constructing a master equation for representing the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces observed behavior of convective cell populations and CPM simulated cloud-base mass flux variability under diurnally varying forcing. In addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.
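
    In its generic birth-death form, the kind of master equation referred to above can be written as follows (a standard textbook form, not the paper's exact notation):

        \frac{dP_n(t)}{dt} = \lambda_{n-1}\, P_{n-1}(t) + \mu_{n+1}\, P_{n+1}(t)
          - \left( \lambda_n + \mu_n \right) P_n(t),

    where P_n is the probability of finding n convective cells in a given size class, and lambda_n and mu_n are the forcing-dependent growth and decay rates; together with a per-cell cloud-base mass flux, the evolving population yields the mass-flux closure.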

  9. National culture and business model change: a framework for successful expansions

    DEFF Research Database (Denmark)

    Dalby, J.; Nielsen, L.S.; Lueg, Rainer

    2014-01-01

    Dalby, J., Nielsen, L. S., Lueg, R., Pedersen, L., & Tomoni, A. C. 2014. National culture and business model change: A framework for successful expansions. Journal of Enterprising Culture, 22(4): 379-498.

  10. A generic framework for individual-based modelling and physical-biological interaction

    DEFF Research Database (Denmark)

    Christensen, Asbjørn; Mariani, Patrizio; Payne, Mark R.

    2018-01-01

    The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences to the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian...... scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessing ocean conditions...

  11. A scalable delivery framework and a pricing model for streaming media with advertisements

    Science.gov (United States)

    Al-Hadrusi, Musab; Sarhan, Nabil J.

    2008-01-01

    This paper presents a delivery framework for streaming media with advertisements and an associated pricing model. The delivery model combines the benefits of periodic broadcasting and stream merging. The advertisements' revenues are used to subsidize the price of the media content. The pricing is determined based on the total ads' viewing time. Moreover, this paper presents an efficient ad allocation scheme and three modified scheduling policies that are well suited to the proposed delivery framework. Furthermore, we study the effectiveness of the delivery framework and various scheduling policies through extensive simulation in terms of numerous metrics, including customer defection probability, average number of ads viewed per client, price, arrival rate, profit, and revenue.

  12. Introducing MERGANSER: A Flexible Framework for Ecological Niche Modeling

    Science.gov (United States)

    Klawonn, M.; Dow, E. M.

    2015-12-01

    Ecological Niche Modeling (ENM) is a collection of techniques to find a "fundamental niche", the range of environmental conditions suitable for a species' survival in the absence of inter-species interactions, given a set of environmental parameters. Traditional approaches to ENM face a number of obstacles including limited data accessibility, data management problems, computational costs, interface usability, and model validation. The MERGANSER system, which stands for Modeling Ecological Residency Given A Normalized Set of Environmental Records, addresses these issues through powerful data persistence and flexible data access, coupled with a clear presentation of results and fine-tuned control over model parameters. MERGANSER leverages data measuring 72 weather related phenomena, land cover, soil type, population, species occurrence, general species information, and elevation, totaling over 1.5 TB of data. To the best of the authors' knowledge, MERGANSER uses higher-resolution spatial data sets than previously published models. Since MERGANSER stores data in an instance of Apache SOLR, layers generated in support of niche models are accessible to users via simplified Apache Lucene queries. This is made even simpler via an HTTP front end that generates Lucene queries automatically. Specifically, a user need only enter the name of a place and a species to run a model. Using this approach to synthesizing model layers, the MERGANSER system has successfully reproduced previously published niche model results with a simplified user experience. Input layers for the model are generated dynamically using OpenStreetMap and SOLR's spatial search functionality. Models are then run using either user-specified or automatically determined parameters after normalizing them into a common grid. Finally, results are visualized in the web interface, which allows for quick validation. Model results and all surrounding metadata are also accessible to the user for further study.
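
    The record does not give MERGANSER's actual endpoints or schema, but the kind of simplified SOLR/Lucene request it describes (a species name plus a place resolved to a point) might look like the following sketch; the host, core and field names are invented, while the query parameters ({!geofilt}, pt, sfield, d) are standard SOLR.

      import requests

      def species_occurrences(host, species, lat, lon, radius_km=50, rows=100):
          """Fetch occurrence records for a species within a radius of a point."""
          params = {
              "q": f'species:"{species}"',
              "fq": "{!geofilt}",                  # spatial filter around the point
              "pt": f"{lat},{lon}",
              "sfield": "location",
              "d": radius_km,
              "fl": "species,location,elevation,landcover",
              "rows": rows,
              "wt": "json",
          }
          return requests.get(f"{host}/solr/occurrences/select", params=params).json()

      # e.g. species_occurrences("http://localhost:8983", "Anas platyrhynchos", 42.3, -71.1)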

  13. Scoping review identifies significant number of knowledge translation theories, models and frameworks with limited use.

    Science.gov (United States)

    Strifler, Lisa; Cardoso, Roberta; McGowan, Jessie; Cogo, Elise; Nincic, Vera; Khan, Paul A; Scott, Alistair; Ghassemi, Marco; MacDonald, Heather; Lai, Yonda; Treister, Victoria; Tricco, Andrea C; Straus, Sharon E

    2018-04-13

    To conduct a scoping review of knowledge translation (KT) theories, models and frameworks that have been used to guide dissemination or implementation of evidence-based interventions targeted to prevention and/or management of cancer or other chronic diseases. We used a comprehensive multistage search process from 2000-2016, which included traditional bibliographic database searching, searching using names of theories, models and frameworks, and cited reference searching. Two reviewers independently screened the literature and abstracted data. We found 596 studies reporting on the use of 159 KT theories, models or frameworks. A majority (87%) of the identified theories, models or frameworks were used in five or fewer studies, with 60% used once. The theories, models and frameworks were most commonly used to inform planning/design, implementation and evaluation activities, and least commonly used to inform dissemination and sustainability/scalability activities. Twenty-six were used across the full implementation spectrum (from planning/design to sustainability/scalability) either within or across studies. All were used for at least individual-level behavior change, while 48% were used for organization-level, 33% for community-level and 17% for system-level change. We found a significant number of KT theories, models and frameworks with a limited evidence base describing their use. Copyright © 2018. Published by Elsevier Inc.

  14. Towards a Two-Dimensional Framework for User Models

    NARCIS (Netherlands)

    Vrieze, P.T. de; Bommel, P. van; Klok, J.; Weide, Th.P. van der; Gensel, Jérôme; Sèdes, Florence; Martin, Hervé

    2003-01-01

    The focus of this paper is user modeling in the context of personalisation of information systems. Such a personalisation is essential to give users the feeling that the system is easily accessible. The way this adaptive personalization works is very dependent on the adaptation model that is

  15. Biochemical Space: A Framework for Systemic Annotation of Biological Models

    Czech Academy of Sciences Publication Activity Database

    Klement, M.; Děd, T.; Šafránek, D.; Červený, Jan; Müller, Stefan; Steuer, Ralf

    2014-01-01

    Vol. 306, JUL (2014), pp. 31-44 ISSN 1571-0661 R&D Projects: GA MŠk(CZ) EE2.3.20.0256 Institutional support: RVO:67179843 Keywords: biological models * model annotation * systems biology * cyanobacteria Subject RIV: EH - Ecology, Behaviour

  16. A business model for IPTV service: A dynamic framework

    NARCIS (Netherlands)

    Bouwman, H.; Zhengjia, M.; Duin, P. van der; Limonard, S.

    2008-01-01

    Purpose - The purpose of this paper is to investigate a possible business model for telecom operators for entering the IPTV (digital television) market. Design/methodology/approach - The approach takes the form of a case study, literature search and interviews. Findings - The IPTV business model

  17. The Conceptual Framework of Factors Affecting Shared Mental Model

    Science.gov (United States)

    Lee, Miyoung; Johnson, Tristan; Lee, Youngmin; O'Connor, Debra; Khalil, Mohammed

    2004-01-01

    Many researchers have paid attention to the potential and possibilities of the shared mental model because it enables teammates to perform their jobs better by sharing team knowledge, skills, attitudes, dynamics and environments. Even though theoretical and experimental evidence supports a close relationship between the shared mental model and…

  18. An ontological framework for model-based problem-solving

    NARCIS (Netherlands)

    Scholten, H.; Beulens, A.J.M.

    2012-01-01

    Multidisciplinary projects to solve real world problems of increasing complexity are more and more plagued by obstacles such as miscommunication between modellers with different disciplinary backgrounds and bad modelling practices. To tackle these difficulties, a body of knowledge on problems, on

  19. Integrating environmental component models. Development of a software framework

    NARCIS (Netherlands)

    Schmitz, O.

    2014-01-01

    Integrated models consist of interacting component models that represent various natural and social systems. They are important tools to improve our understanding of environmental systems, to evaluate cause–effect relationships of human–natural interactions, and to forecast the behaviour of

  20. Deep Modeling: Circuit Characterization Using Theory Based Models in a Data Driven Framework

    Energy Technology Data Exchange (ETDEWEB)

    Bolme, David S [ORNL]; Mikkilineni, Aravind K [ORNL]; Rose, Derek C [ORNL]; Yoginath, Srikanth B [ORNL]; Holleman, Jeremy [University of Tennessee, Knoxville (UTK)]; Judy, Mohsen [University of Tennessee, Knoxville (UTK), Department of Electrical Engineering and Computer Science]

    2017-01-01

    Analog computational circuits have been demonstrated to provide substantial improvements in power and speed relative to digital circuits, especially for applications requiring extreme parallelism but only modest precision. Deep machine learning is one such area and stands to benefit greatly from analog and mixed-signal implementations. However, even at modest precisions, offsets and non-linearity can degrade system performance. Furthermore, in all but the simplest systems, it is impossible to directly measure the intermediate outputs of all sub-circuits. The result is that circuit designers are unable to accurately evaluate the non-idealities of computational circuits in-situ and are therefore unable to fully utilize measurement results to improve future designs. In this paper we present a technique to use deep learning frameworks to model physical systems. Recently developed libraries like TensorFlow make it possible to use back propagation to learn parameters in the context of modeling circuit behavior. Offsets and scaling errors can be discovered even for sub-circuits that are deeply embedded in a computational system and not directly observable. The learned parameters can be used to refine simulation methods or to identify appropriate compensation strategies. We demonstrate the framework using a mixed-signal convolution operator as an example circuit.
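
    The core idea, using automatic differentiation to recover the offset and gain of a sub-circuit that is only observable through a downstream nonlinearity, can be sketched in TensorFlow as follows. The circuit structure and parameter values are invented for illustration; this is not the paper's model.

      import numpy as np
      import tensorflow as tf

      # "measured" behaviour: hidden first stage with gain/offset errors, then a tanh stage
      true_gain, true_offset = 0.9, 0.05
      x = np.random.uniform(-1, 1, size=(1000, 1)).astype(np.float32)
      y = np.tanh(true_gain * x + true_offset)      # only the final output is observable

      gain = tf.Variable(1.0)                       # nominal design values as a starting point
      offset = tf.Variable(0.0)
      opt = tf.keras.optimizers.Adam(0.05)

      for _ in range(300):
          with tf.GradientTape() as tape:
              pred = tf.tanh(gain * x + offset)     # same structure as the physical circuit
              loss = tf.reduce_mean((pred - y) ** 2)
          grads = tape.gradient(loss, [gain, offset])
          opt.apply_gradients(zip(grads, [gain, offset]))

      print(float(gain), float(offset))             # should approach 0.9 and 0.05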

  1. A general framework for modeling growth and division of mammalian cells.

    Science.gov (United States)

    Gauthier, John H; Pohl, Phillip I

    2011-01-06

    Modeling the cell-division cycle has been practiced for many years. As time has progressed, this work has gone from understanding the basic principles to addressing distinct biological problems, e.g., the nature of the restriction point, how checkpoints operate, the nonlinear dynamics of the cell cycle, the effect of localization, etc. Most models consist of coupled ordinary differential equations developed by the researchers, restricted to deal with the interactions of a limited number of molecules. In the future, cell-cycle modeling--and indeed all modeling of complex biologic processes--will increase in scope and detail. A framework for modeling complex cell-biologic processes is proposed here. The framework is based on two constructs: one describing the entire lifecycle of a molecule and the second describing the basic cellular machinery. Use of these constructs allows complex models to be built in a straightforward manner that fosters rigor and completeness. To demonstrate the framework, an example model of the mammalian cell cycle is presented that consists of several hundred differential equations of simple mass action kinetics. The model calculates energy usage, amino acid and nucleotide usage, membrane transport, RNA synthesis and destruction, and protein synthesis and destruction for 33 proteins to give an in-depth look at the cell cycle. The framework presented here addresses how to develop increasingly descriptive models of complex cell-biologic processes. The example model of cellular growth and division constructed with the framework demonstrates that large structured models can be created with the framework, and these models can generate non-trivial descriptions of cellular processes. Predictions from the example model include those at both the molecular level--e.g., Wee1 spontaneously reactivates--and at the system level--e.g., pathways for timing-critical processes must shut down redundant pathways. A future effort is to automatically estimate
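
    A two-species fragment of the kind of simple mass-action system the framework assembles might look like the sketch below; the species and rate constants are hypothetical, and the full model described above couples several hundred such equations.

      from scipy.integrate import solve_ivp

      k_syn, k_act, k_deg = 1.0, 0.5, 0.2   # synthesis, activation and degradation rates

      def rhs(t, y):
          cyclin, active_cdk = y
          d_cyclin = k_syn - k_act * cyclin - k_deg * cyclin
          d_cdk = k_act * cyclin - k_deg * active_cdk
          return [d_cyclin, d_cdk]

      sol = solve_ivp(rhs, (0.0, 50.0), [0.0, 0.0])
      print(sol.y[:, -1])                   # approach to steady state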

  2. Interaction between GIS and hydrologic model: A preliminary approach using ArcHydro Framework Data Model

    Directory of Open Access Journals (Sweden)

    Silvio Jorge C. Simões

    2013-08-01

    Full Text Available In different regions of Brazil, population growth and economic development can degrade water quality, compromising watershed health and human supply. Because of its ability to combine spatial and temporal data in the same environment and to create water resources management (WRM) models, the Geographical Information System (GIS) is a powerful tool for managing water resources, preventing floods and estimating water supply. This paper discusses the integration between GIS and hydrological models and presents a case study relating to the upper section of the Paraíba do Sul Basin (Sao Paulo State portion), situated in the Southeast of Brazil. The case study presented in this paper has a database suitable for the basin's dimensions, including digitized topographic maps at a 1:50,000 scale. From an ArcGIS®/ArcHydro Framework Data Model, a geometric network was created to produce different raster products. The first grid derived from the digital elevation model (DEM) is the flow direction map, which is followed by the flow accumulation, stream and catchment maps. The next steps in this research are to include the different multipurpose reservoirs situated along the Paraíba do Sul River and to incorporate rainfall time series data in ArcHydro to build a hydrologic data model within a GIS environment in order to produce a comprehensive spatial-temporal model.
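
    The raster sequence mentioned above (flow direction, then flow accumulation, stream and catchment grids) starts from a steepest-descent analysis of the DEM. A toy D8 flow-direction computation is sketched below; ArcHydro's own implementation differs in detail, and the DEM values are invented.

      import numpy as np

      def d8_flow_direction(dem):
          """Index (0-7) of the steepest-descent neighbour, or -1 for pits and edges."""
          offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                     (1, 1), (1, 0), (1, -1), (0, -1)]
          rows, cols = dem.shape
          fdir = np.full(dem.shape, -1, dtype=int)
          for r in range(1, rows - 1):
              for c in range(1, cols - 1):
                  drops = [dem[r, c] - dem[r + dr, c + dc] for dr, dc in offsets]
                  if max(drops) > 0:
                      fdir[r, c] = int(np.argmax(drops))
          return fdir

      dem = np.array([[5, 5, 5, 5],
                      [5, 4, 3, 5],
                      [5, 3, 2, 5],
                      [5, 5, 1, 5]], dtype=float)
      print(d8_flow_direction(dem))     # flow accumulation would be computed from this grid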

  3. Intercomparison of Streamflow Simulations between WRF-Hydro and Hydrology Laboratory-Research Distributed Hydrologic Model Frameworks

    Science.gov (United States)

    KIM, J.; Smith, M. B.; Koren, V.; Salas, F.; Cui, Z.; Johnson, D.

    2017-12-01

    The National Oceanic and Atmospheric Administration (NOAA)-National Weather Service (NWS) developed the Hydrology Laboratory-Research Distributed Hydrologic Model (HL-RDHM) framework as an initial step towards spatially distributed modeling at River Forecast Centers (RFCs). Recently, the NOAA/NWS worked with the National Center for Atmospheric Research (NCAR) to implement the National Water Model (NWM) for nationally-consistent water resources prediction. The NWM is based on the WRF-Hydro framework and is run at a 1km spatial resolution and 1-hour time step over the contiguous United States (CONUS) and contributing areas in Canada and Mexico. In this study, we compare streamflow simulations from HL-RDHM and WRF-Hydro to observations from 279 USGS stations. For streamflow simulations, HL-RDHM is run on 4km grids with a temporal resolution of 1 hour for a 5-year period (Water Years 2008-2012), using a priori parameters provided by NOAA-NWS. The WRF-Hydro streamflow simulations for the same time period are extracted from NCAR's 23-year retrospective run of the NWM (version 1.0) over CONUS based on 1km grids. We choose 279 USGS stations that are relatively less affected by dams or reservoirs, in the domains of six different RFCs. We use the daily average values of simulations and observations for ease of comparison. The main purpose of this research is to evaluate how HL-RDHM and WRF-Hydro perform at USGS gauge stations. We compare daily time series of observations and both simulations, and calculate the error values using a variety of error functions. Using these plots and error values, we evaluate the performance of the HL-RDHM and WRF-Hydro models. Our results show a mix of model performance across geographic regions.
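
    Typical error functions for such daily streamflow comparisons are the Nash-Sutcliffe efficiency and percent bias; a small sketch is given below with invented discharge values (the study's own choice of metrics is not listed in the record).

      import numpy as np

      def nse(sim, obs):
          """Nash-Sutcliffe efficiency (1 is a perfect fit)."""
          sim, obs = np.asarray(sim, float), np.asarray(obs, float)
          return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def pbias(sim, obs):
          """Percent bias of simulated relative to observed volume."""
          sim, obs = np.asarray(sim, float), np.asarray(obs, float)
          return 100.0 * np.sum(sim - obs) / np.sum(obs)

      obs = [10.0, 12.0, 30.0, 22.0, 15.0]            # daily mean discharge, m3/s
      sim_hlrdhm = [11.0, 13.0, 26.0, 20.0, 14.0]
      sim_wrfhydro = [9.0, 15.0, 33.0, 25.0, 13.0]
      for name, sim in [("HL-RDHM", sim_hlrdhm), ("WRF-Hydro", sim_wrfhydro)]:
          print(name, round(nse(sim, obs), 3), round(pbias(sim, obs), 1))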

  4. A framework to establish credibility of computational models in biology.

    Science.gov (United States)

    Patterson, Eann A; Whelan, Maurice P

    2017-10-01

    Computational models in biology and biomedical science are often constructed to aid people's understanding of phenomena or to inform decisions with socioeconomic consequences. Model credibility is the willingness of people to trust a model's predictions and is often difficult to establish for computational biology models. A 3 × 3 matrix has been proposed to allow such models to be categorised with respect to their testability and epistemic foundation in order to guide the selection of an appropriate process of validation to supply evidence to establish credibility. Three approaches to validation are identified that can be deployed depending on whether a model is deemed untestable, testable or lies somewhere in between. In the latter two cases, the validation process involves the quantification of uncertainty which is a key output. The issues arising due to the complexity and inherent variability of biological systems are discussed and the creation of 'digital twins' proposed as a means to alleviate the issues and provide a more robust, transparent and traceable route to model credibility and acceptance. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  5. A Generalized Framework for Modeling Next Generation 911 Implementations.

    Energy Technology Data Exchange (ETDEWEB)

    Kelic, Andjelka; Aamir, Munaf Syed; Kelic, Andjelka; Jrad, Ahmad M.; Mitchell, Roger

    2018-02-01

    This document summarizes the current state of Sandia 911 modeling capabilities and then addresses key aspects of Next Generation 911 (NG911) architectures for expansion of existing models. Analysis of three NG911 implementations was used to inform heuristics, associated key data requirements, and assumptions needed to capture NG911 architectures in the existing models. Modeling of NG911 necessitates careful consideration of its complexity and the diversity of implementations. Draft heuristics for constructing NG911 models are presented based on the analysis, along with a summary of current challenges and ways to improve future NG911 modeling efforts. We found that NG911 relies on Enhanced 911 (E911) assets such as 911 selective routers to route calls originating from traditional telephony service, which are a majority of 911 calls. We also found that the diversity and transitional nature of NG911 implementations necessitates significant and frequent data collection to ensure that adequate models are available for crisis action support.

  6. Calculation of real optical model potential for heavy ions in the framework of the folding model

    International Nuclear Information System (INIS)

    Goncharov, S.A.; Timofeyuk, N.K.; Kazacha, G.S.

    1987-01-01

    A code for the calculation of a real optical model potential in the framework of the folding model is realized. A program for numerical Fourier-Bessel transformation based on Filon's integration rule is used. The accuracy of the numerical calculations is ~10^-4 for a distance interval up to about (2.5-3) times the size of the nuclei. The potentials are calculated for interactions of 3,4He with nuclei from 9Be to 27Al with different effective NN interactions and densities obtained from electron scattering data. The calculated potentials are similar to phenomenological potentials of Woods-Saxon form. With the calculated potentials, the available elastic scattering data for the considered nuclei in the energy interval 18-56 MeV are analysed. The needed renormalizations of the folding potentials are less than or approximately 20%.
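
    The single-folding form of such a potential is commonly written (generic notation, not quoted from the report) as

      V_F(\mathbf{R}) = \lambda \int \rho(\mathbf{r})\, v_{NN}\!\left(|\mathbf{R}-\mathbf{r}|\right)\, \mathrm{d}^3 r ,

    where \rho is the nuclear density (here taken from electron scattering data), v_{NN} is the effective nucleon-nucleon interaction, and \lambda is the renormalization factor of order unity discussed above.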

  7. Extending the Modelling Framework for Gas-Particle Systems

    DEFF Research Database (Denmark)

    Rosendahl, Lasse Aistrup

    , with very good results. Single particle combustion has been tested using a number of different particle combustion models applied to coal and straw particles. Comparing the results of these calculations to measurements on straw burnout, the results indicate that for straw, existing heterogeneous combustion...... models perform well, and may be used in high temperature ranges. Finally, the particle tracking and combustion model is applied to an existing coal and straw co- fuelled burner. The results indicate that again, the straw follows very different trajectories than the coal particles, and also that burnout...

  8. Update of the Polar SWIFT model for polar stratospheric ozone loss (Polar SWIFT version 2)

    Directory of Open Access Journals (Sweden)

    I. Wohltmann

    2017-07-01

    Full Text Available The Polar SWIFT model is a fast scheme for calculating the chemistry of stratospheric ozone depletion in polar winter. It is intended for use in global climate models (GCMs and Earth system models (ESMs to enable the simulation of mutual interactions between the ozone layer and climate. To date, climate models often use prescribed ozone fields, since a full stratospheric chemistry scheme is computationally very expensive. Polar SWIFT is based on a set of coupled differential equations, which simulate the polar vortex-averaged mixing ratios of the key species involved in polar ozone depletion on a given vertical level. These species are O3, chemically active chlorine (ClOx, HCl, ClONO2 and HNO3. The only external input parameters that drive the model are the fraction of the polar vortex in sunlight and the fraction of the polar vortex below the temperatures necessary for the formation of polar stratospheric clouds. Here, we present an update of the Polar SWIFT model introducing several improvements over the original model formulation. In particular, the model is now trained on vortex-averaged reaction rates of the ATLAS Chemistry and Transport Model, which enables a detailed look at individual processes and an independent validation of the different parameterizations contained in the differential equations. The training of the original Polar SWIFT model was based on fitting complete model runs to satellite observations and did not allow for this. A revised formulation of the system of differential equations is developed, which closely fits vortex-averaged reaction rates from ATLAS that represent the main chemical processes influencing ozone. In addition, a parameterization for the HNO3 change by denitrification is included. The rates of change of the concentrations of the chemical species of the Polar SWIFT model are purely chemical rates of change in the new version, whereas in the original Polar SWIFT model, they included a transport effect
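
    A schematic caricature of the structure described above (two coupled vortex-averaged equations driven only by the sunlight fraction and the PSC-temperature fraction) is sketched below; these are not the actual Polar SWIFT equations or coefficients.

      from scipy.integrate import solve_ivp

      def rhs(t, y, f_sun, f_psc):
          clox, o3 = y
          d_clox = 0.3 * f_psc * (1.0 - clox) - 0.05 * (1.0 - f_psc) * clox  # activation vs. deactivation
          d_o3 = -0.2 * f_sun * clox * o3                                    # catalytic loss requires sunlight
          return [d_clox, d_o3]

      sol = solve_ivp(rhs, (0, 90), [0.05, 1.0], args=(0.4, 0.7))            # 90 days, arbitrary units
      print(sol.y[:, -1])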

  9. Implementation of methane cycling for deep time, global warming simulations with the DCESS Earth System Model (Version 1.2)

    DEFF Research Database (Denmark)

    Shaffer, Gary; Villanueva, Esteban Fernández; Rondanelli, Roberto

    2017-01-01

    Geological records reveal a number of ancient, large and rapid negative excursions of carbon-13 isotope. Such excursions can only be explained by massive injections of depleted carbon to the Earth System over a short duration. These injections may have forced strong global warming events, sometimes....... With this improved DCESS model version and paleo-reconstructions, we are now better armed to gauge the amounts, types, time scales and locations of methane injections driving specific, observed deep time, global warming events....

  10. Vortex dynamics in nonrelativistic version of Abelian Higgs model: Effects of the medium on the vortex motion

    Directory of Open Access Journals (Sweden)

    Kozhevnikov Arkadii

    2016-01-01

    Full Text Available The dynamics of a closed vortex is considered in the nonrelativistic version of the Abelian Higgs Model. The effect on the vortex string motion of the exchange of excitations propagating in the medium is taken into account. The effective action and the equation of motion are obtained, both including the exchange of propagating excitations between distant segments of the vortex and the possibility of its interaction with a static fermion-asymmetric background. These are applied to derive the time dependence of the basic geometrical characteristics of the contour.

  11. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NARCIS (Netherlands)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D A; Brogaard, Sara; Van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-01-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS

  12. Model Adaptation for Prognostics in a Particle Filtering Framework

    Data.gov (United States)

    National Aeronautics and Space Administration — One of the key motivating factors for using particle filters for prognostics is the ability to include model parameters as part of the state vector to be estimated....

  13. A parametric framework for modelling of bioelectrical signals

    CERN Document Server

    Mughal, Yar Muhammad

    2016-01-01

    This book examines non-invasive, electrical-based methods for disease diagnosis and assessment of heart function. In particular, a formalized signal model is proposed since this offers several advantages over methods that rely on measured data alone. By using a formalized representation, the parameters of the signal model can be easily manipulated and/or modified, thus providing mechanisms that allow researchers to reproduce and control such signals. In addition, having such a formalized signal model makes it possible to develop computer tools that can be used for manipulating and understanding how signal changes result from various heart conditions, as well as for generating input signals for experimenting with and evaluating the performance of e.g. signal extraction methods. The work focuses on bioelectrical information, particularly electrical bio-impedance (EBI). Once the EBI has been measured, the corresponding signals have to be modelled for analysis. This requires a structured approach in order to move...

  14. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  15. Public–private partnership conceptual framework and models for the ...

    African Journals Online (AJOL)

    investment and inefficiencies within the public expenditure management systems are particularly detrimental, e.g., there .... a consolidation of different views, ideas, perceptions and ... contingency) exists between the PPP models key financial.

  16. A Flexible Framework Hydrological Informatic Modeling System - HIMS

    Science.gov (United States)

    WANG, L.; Wang, Z.; Changming, L.; Li, J.; Bai, P.

    2017-12-01

    Simulating the water cycle in a way that fits the temporal and spatial characteristics of the study area is important for accurate flood prediction and streamflow simulation, because soil properties, landscape, climate, and land management are the critical factors shaping the non-linear rainfall-runoff relationship at watershed scales. Most existing hydrological models, with their fixed single structure and mode, cannot simulate the water cycle at different places with customized mechanisms. To address this problem, this study develops the Hydro-Informatic Modeling System (HIMS) model, in which each critical hydrological process is implemented as a module with multiple options for various scenarios. HIMS has a structure accounting for the two runoff generation mechanisms of infiltration excess and saturation excess, and estimates runoff with different methods, including the Time Variance Gain Model (TVGM) and LCM, which performs well in ungauged areas, besides the widely used Soil Conservation Service Curve Number (SCS-CN) method. The channel routing component includes the widely used Muskingum method and a kinematic wave equation with a new solution method. HIMS model performance with its representative runoff generation model, LCM, was evaluated through comparison with observed streamflow datasets of the Lhasa River watershed at hourly, daily, and monthly time steps. Simulated and observed streamflows agreed well, with NSE higher than 0.87 and WE within ±20%. A water balance analysis of precipitation, streamflow, actual evapotranspiration (ET), and soil moisture change was conducted at the annual time step, and comparison with literature results for the Lhasa River watershed showed that HIMS model performance is reliable.
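
    The SCS-CN runoff option mentioned above follows the standard Curve Number relations; a small sketch (with an arbitrary CN value) is given below.

      def scs_cn_runoff(p_mm, cn, ia_ratio=0.2):
          """Direct runoff depth (mm) for storm rainfall p_mm and curve number cn."""
          s = 25400.0 / cn - 254.0          # potential maximum retention, mm
          ia = ia_ratio * s                 # initial abstraction
          if p_mm <= ia:
              return 0.0
          return (p_mm - ia) ** 2 / (p_mm - ia + s)

      print(scs_cn_runoff(80.0, cn=75))     # runoff from an 80 mm storm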

  17. Model Adaptation for Prognostics in a Particle Filtering Framework

    Directory of Open Access Journals (Sweden)

    Bhaskar Saha

    2011-01-01

    Full Text Available One of the key motivating factors for using particle filters for prognostics is the ability to include model parameters as part of the state vector to be estimated. This performs model adaptation in conjunction with state tracking and thus produces a tuned model that can be used for long-term predictions. This feature of particle filters works in large part because they are not subject to the “curse of dimensionality”, i.e. the exponential growth of computational complexity with state dimension. However, in practice, this property holds only for “well-designed” particle filters as dimensionality increases. This paper explores the notion of wellness of design in the context of predicting remaining useful life for individual discharge cycles of Li-ion and Li-Polymer batteries. Prognostic metrics are used to analyze the tradeoff between different model designs and prediction performance. Results demonstrate how sensitivity analysis may be used to arrive at a well-designed prognostic model that can take advantage of the model adaptation properties of a particle filter.
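
    The state-augmentation idea described above can be sketched with a bootstrap particle filter in which a degradation parameter (here an exponential capacity-fade rate) is appended to the state vector and adapted while the state is tracked. The data and noise levels are synthetic, not battery measurements.

      import numpy as np

      rng = np.random.default_rng(0)
      true_rate, T = 0.02, 60
      capacity = np.exp(-true_rate * np.arange(1, T + 1))
      obs = capacity + rng.normal(0, 0.01, T)              # noisy capacity measurements

      N = 2000
      particles = np.column_stack([
          np.full(N, 1.0),                                 # state: capacity
          rng.uniform(0.0, 0.1, N),                        # parameter: fade rate
      ])

      for z in obs:
          # propagate: each particle decays at its own rate, plus small process noise
          particles[:, 0] *= np.exp(-particles[:, 1])
          particles[:, 0] += rng.normal(0, 0.002, N)
          particles[:, 1] = np.maximum(particles[:, 1] + rng.normal(0, 0.0005, N), 0.0)
          # weight by the measurement likelihood and resample
          w = np.exp(-0.5 * ((z - particles[:, 0]) / 0.01) ** 2) + 1e-300
          particles = particles[rng.choice(N, size=N, p=w / w.sum())]

      print("estimated fade rate:", particles[:, 1].mean())   # should be close to 0.02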

  18. Model Adaptation for Prognostics in a Particle Filtering Framework

    Science.gov (United States)

    Saha, Bhaskar; Goebel, Kai Frank

    2011-01-01

    One of the key motivating factors for using particle filters for prognostics is the ability to include model parameters as part of the state vector to be estimated. This performs model adaptation in conjunction with state tracking and thus produces a tuned model that can be used for long-term predictions. This feature of particle filters works in large part because they are not subject to the "curse of dimensionality", i.e. the exponential growth of computational complexity with state dimension. However, in practice, this property holds only for "well-designed" particle filters as dimensionality increases. This paper explores the notion of wellness of design in the context of predicting remaining useful life for individual discharge cycles of Li-ion batteries. Prognostic metrics are used to analyze the tradeoff between different model designs and prediction performance. Results demonstrate how sensitivity analysis may be used to arrive at a well-designed prognostic model that can take advantage of the model adaptation properties of a particle filter.

  19. Design theoretic analysis of three system modeling frameworks.

    Energy Technology Data Exchange (ETDEWEB)

    McDonald, Michael James

    2007-05-01

    This paper analyzes three simulation architectures from the context of modeling scalability to address System of System (SoS) and Complex System problems. The paper first provides an overview of the SoS problem domain and reviews past work in analyzing model and general system complexity issues. It then identifies and explores the issues of vertical and horizontal integration as well as coupling and hierarchical decomposition as the system characteristics and metrics against which the tools are evaluated. In addition, it applies Nam Suh's Axiomatic Design theory as a construct for understanding coupling and its relationship to system feasibility. Next it describes the application of MATLAB, Swarm, and Umbra (three modeling and simulation approaches) to modeling swarms of Unmanned Flying Vehicle (UAV) agents in relation to the chosen characteristics and metrics. Finally, it draws general conclusions for analyzing model architectures that go beyond those analyzed. In particular, it identifies decomposition along phenomena of interaction and modular system composition as enabling features for modeling large heterogeneous complex systems.

  20. Preparatory planning framework for Created Out of Mind: Shaping perceptions of dementia through art and science [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Emilie Brotherhood

    2017-11-01

    Full Text Available Created Out of Mind is an interdisciplinary project, comprised of individuals from arts, social sciences, music, biomedical sciences, humanities and operational disciplines. Collaboratively we are working to shape perceptions of dementias through the arts and sciences, from a position within the Wellcome Collection. The Collection is a public building, above objects and archives, with a porous relationship between research, museum artefacts, and the public.  This pre-planning framework will act as an introduction to Created Out of Mind. The framework explains the rationale and aims of the project, outlines our focus for the project, and explores a number of challenges we have encountered by virtue of working in this way.

  1. Introducing a boreal wetland model within the Earth System model framework

    Science.gov (United States)

    Getzieh, R. J.; Brovkin, V.; Reick, C.; Kleinen, T.; Raddatz, T.; Raivonen, M.; Sevanto, S.

    2009-04-01

    Wetlands of the northern high latitudes with their low temperatures and waterlogged conditions are prerequisite for peat accumulation. They store at least 25% of the global soil organic carbon and constitute currently the largest natural source of methane. These boreal and subarctic peat carbon pools are sensitive to climate change since the ratio of carbon sequestration and emission is closely dependent on hydrology and temperature. Global biogeochemistry models used for simulations of CO2 dynamics in the past and future climates usually ignore changes in the peat storages. Our approach aims at the evaluation of the boreal wetland feedback to climate through the CO2 and CH4 fluxes on decadal to millennial time scales. A generic model of organic matter accumulation and decay in boreal wetlands is under development in the MPI for Meteorology in cooperation with the University of Helsinki. Our approach is to develop a wetland model which is consistent with the physical and biogeochemical components of the land surface module JSBACH as a part of the Earth System model framework ECHAM5-MPIOM-JSBACH. As prototypes, we use modelling approach by Frolking et al. (2001) for the peat dynamics and the wetland model by Wania (2007) for vegetation cover and plant productivity. An initial distribution of wetlands follows the GLWD-3 map by Lehner and Döll (2004). First results of the modelling approach will be presented. References: Frolking, S. E., N. T. Roulet, T. R. Moore, P. J. H. Richard, M. Lavoie and S. D. Muller (2001): Modeling Northern Peatland Decomposition and Peat Accumulation, Ecosystems, 4, 479-498. Lehner, B., Döll P. (2004): Development and validation of a global database of lakes, reservoirs and wetlands. Journal of Hydrology 296 (1-4), 1-22. Wania, R. (2007): Modelling northern peatland land surface processes, vegetation dynamics and methane emissions. PhD thesis, University of Bristol, 122 pp.

  2. A Reusable Framework for Regional Climate Model Evaluation

    Science.gov (United States)

    Hart, A. F.; Goodale, C. E.; Mattmann, C. A.; Lean, P.; Kim, J.; Zimdars, P.; Waliser, D. E.; Crichton, D. J.

    2011-12-01

    Climate observations are currently obtained through a diverse network of sensors and platforms that include space-based observatories, airborne and seaborne platforms, and distributed, networked, ground-based instruments. These global observational measurements are critical inputs to the efforts of the climate modeling community and can provide a corpus of data for use in analysis and validation of climate models. The Regional Climate Model Evaluation System (RCMES) is an effort currently being undertaken to address the challenges of integrating this vast array of observational climate data into a coherent resource suitable for performing model analysis at the regional level. Developed through a collaboration between the NASA Jet Propulsion Laboratory (JPL) and the UCLA Joint Institute for Regional Earth System Science and Engineering (JIFRESSE), the RCMES uses existing open source technologies (MySQL, Apache Hadoop, and Apache OODT), to construct a scalable, parametric, geospatial data store that incorporates decades of observational data from a variety of NASA Earth science missions, as well as other sources into a consistently annotated, highly available scientific resource. By eliminating arbitrary partitions in the data (individual file boundaries, differing file formats, etc), and instead treating each individual observational measurement as a unique, geospatially referenced data point, the RCMES is capable of transforming large, heterogeneous collections of disparate observational data into a unified resource suitable for comparison to climate model output. This facility is further enhanced by the availability of a model evaluation toolkit which consists of a set of Python libraries, a RESTful web service layer, and a browser-based graphical user interface that allows for orchestration of model-to-data comparisons by composing them visually through web forms. This combination of tools and interfaces dramatically simplifies the process of interacting with and

  3. Using social capital to construct a conceptual International Classification of Functioning, Disability, and Health Children and Youth version-based framework for stronger inclusive education policies in Europe.

    Science.gov (United States)

    Maxwell, Gregor; Koutsogeorgou, Eleni

    2012-02-01

    Inclusive education is part of social inclusion; therefore, social capital can be linked to an inclusive education policy and practice. This association is explored in this article, and a practical measure is proposed. Specifically, the World Health Organization's International Classification of Functioning, Disability and Health Children and Youth Version (ICF-CY) is proposed as the link between social capital and inclusive education. By mapping participation and trust indicators of social capital to the ICF-CY and by using the Matrix to Analyse Functioning in Education Systems (MAFES) to analyze the functioning of inclusive education policies and systems, a measure for stronger inclusive education policies is proposed. Such a tool can be used for policy planning and monitoring to ensure better inclusive education environments. In conclusion, combining enhanced social capital linked to stronger inclusive education policies, by using the ICF-CY, can lead to better health and well-being for all.

  4. A framework for modelling the complexities of food and water security under globalisation

    Science.gov (United States)

    Dermody, Brian J.; Sivapalan, Murugesu; Stehfest, Elke; van Vuuren, Detlef P.; Wassen, Martin J.; Bierkens, Marc F. P.; Dekker, Stefan C.

    2018-01-01

    We present a new framework for modelling the complexities of food and water security under globalisation. The framework sets out a method to capture regional and sectoral interdependencies and cross-scale feedbacks within the global food system that contribute to emergent water use patterns. The framework integrates aspects of existing models and approaches in the fields of hydrology and integrated assessment modelling. The core of the framework is a multi-agent network of city agents connected by infrastructural trade networks. Agents receive socio-economic and environmental constraint information from integrated assessment models and hydrological models respectively and simulate complex, socio-environmental dynamics that operate within those constraints. The emergent changes in food and water resources are aggregated and fed back to the original models with minimal modification of the structure of those models. It is our conviction that the framework presented can form the basis for a new wave of decision tools that capture complex socio-environmental change within our globalised world. In doing so they will contribute to illuminating pathways towards a sustainable future for humans, ecosystems and the water they share.
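
    A minimal sketch of the core structure (city agents on an infrastructural trade network exchanging a resource under externally supplied constraints, with the emergent flows aggregated for feedback) is given below; the cities, capacities and allocation rule are invented.

      import networkx as nx

      g = nx.DiGraph()
      g.add_nodes_from([
          ("CityA", {"demand": 120, "production": 200}),
          ("CityB", {"demand": 180, "production": 90}),
      ])
      g.add_edge("CityA", "CityB", capacity=100)      # infrastructural trade link

      def trade_step(graph):
          """Ship surplus towards deficit, limited by link capacity."""
          for u, v, data in graph.edges(data=True):
              surplus = graph.nodes[u]["production"] - graph.nodes[u]["demand"]
              deficit = graph.nodes[v]["demand"] - graph.nodes[v]["production"]
              data["flow"] = max(0, min(surplus, deficit, data["capacity"]))
          return {(u, v): d["flow"] for u, v, d in graph.edges(data=True)}

      print(trade_step(g))    # aggregated flows would be fed back to the constraint models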

  5. A framework for modelling the complexities of food and water security under globalisation

    Directory of Open Access Journals (Sweden)

    B. J. Dermody

    2018-01-01

    Full Text Available We present a new framework for modelling the complexities of food and water security under globalisation. The framework sets out a method to capture regional and sectoral interdependencies and cross-scale feedbacks within the global food system that contribute to emergent water use patterns. The framework integrates aspects of existing models and approaches in the fields of hydrology and integrated assessment modelling. The core of the framework is a multi-agent network of city agents connected by infrastructural trade networks. Agents receive socio-economic and environmental constraint information from integrated assessment models and hydrological models respectively and simulate complex, socio-environmental dynamics that operate within those constraints. The emergent changes in food and water resources are aggregated and fed back to the original models with minimal modification of the structure of those models. It is our conviction that the framework presented can form the basis for a new wave of decision tools that capture complex socio-environmental change within our globalised world. In doing so they will contribute to illuminating pathways towards a sustainable future for humans, ecosystems and the water they share.

  6. National water, food, and trade modeling framework: The case of Egypt.

    Science.gov (United States)

    Abdelkader, A; Elshorbagy, A; Tuninetti, M; Laio, F; Ridolfi, L; Fahmy, H; Hoekstra, A Y

    2018-05-22

    This paper introduces a modeling framework for the analysis of real and virtual water flows at national scale. The framework has two components: (1) a national water model that simulates agricultural, industrial and municipal water uses, and available water and land resources; and (2) an international virtual water trade model that captures national virtual water exports and imports related to trade in crops and animal products. This National Water, Food & Trade (NWFT) modeling framework is applied to Egypt, a water-poor country and the world's largest importer of wheat. Egypt's food and water gaps and the country's food (virtual water) imports are estimated over a baseline period (1986-2013) and projected up to 2050 based on four scenarios. Egypt's food and water gaps are growing rapidly as a result of steep population growth and limited water resources. The NWFT modeling framework shows the nexus of the population dynamics, water uses for different sectors, and their compounding effects on Egypt's food gap and water self-sufficiency. The sensitivity analysis reveals that for solving Egypt's water and food problem non-water-based solutions like educational, health, and awareness programs aimed at lowering population growth will be an essential addition to the traditional water resources development solution. Both the national and the global models project similar trends of Egypt's food gap. The NWFT modeling framework can be easily adapted to other nations and regions. Copyright © 2018. Published by Elsevier B.V.

  7. Stochastic programming framework for Lithuanian pension payout modelling

    Directory of Open Access Journals (Sweden)

    Audrius Kabašinskas

    2014-12-01

    Full Text Available The paper provides a scientific approach to the problem of selecting a pension fund, taking into account some specific characteristics of the Lithuanian Republic (LR) pension accumulation system. A decision-making model that can be used to plan the long-term pension accrual of LR citizens in an optimal way is presented. The model focuses on factors that influence the sustainability of the pension system selection under macroeconomic, social and demographic uncertainty. It is formalized as a single-stage stochastic optimization problem in which the long-term optimal strategy is obtained from the possible scenarios generated for a particular participant. Stochastic programming methods allow the pension fund rebalancing moment and direction of investment to be included, and possible changes in personal income, in society and in the global financial market to be taken into account. The collection of methods used to generate scenario trees was found useful for solving strategic planning problems.
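
    A toy single-stage, scenario-based formulation in the spirit described (with invented return distributions) is sketched below: choose the allocation between two pension funds that maximises expected terminal wealth with a penalty for downside scenarios.

      import numpy as np

      rng = np.random.default_rng(1)
      n_scen, years = 500, 30
      returns = np.stack([rng.normal(0.03, 0.02, (n_scen, years)),    # conservative fund
                          rng.normal(0.06, 0.12, (n_scen, years))])   # risky fund

      def score(weight_risky, penalty=2.0):
          """Expected terminal wealth minus a penalty on mean shortfall below break-even."""
          mix = (1 - weight_risky) * returns[0] + weight_risky * returns[1]
          wealth = np.prod(1 + mix, axis=1)
          shortfall = np.clip(1.0 - wealth, 0, None).mean()
          return wealth.mean() - penalty * shortfall

      best = max(np.linspace(0, 1, 21), key=score)
      print("chosen share in the risky fund:", round(float(best), 2))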

  8. A GBT-framework towards modal modelling of steel structures

    DEFF Research Database (Denmark)

    Hansen, Anders Bau; Jönsson, Jeppe

    2017-01-01

    In modern structural steel frame design, the modelling of joints between beams and columns is based on very simple assumptions. The joints are most often assumed to behave as a perfect hinge or as a rigid joint. This means that in the overall static analysis relative rotations and changes...... the rotational stiffness of a connection. Based on a modelling of any beam-to-column joint using finite shell elements and springs for single components such as bolts, the primary hypothesis is that it is possible to formulate a generalized connection model with few degrees of freedom related to a relevant...... set of deformation modes. This hypothesis is based on the idea of modal decomposition performed in generalized beam theories (GBT). The question is – is it possible to formulate an eigenvalue problem with a solution corresponding to mode shapes for the deformation of the joint by using the finite...

  9. gamboostLSS: An R Package for Model Building and Variable Selection in the GAMLSS Framework

    OpenAIRE

    Hofner, Benjamin; Mayr, Andreas; Schmid, Matthias

    2014-01-01

    Generalized additive models for location, scale and shape are a flexible class of regression models that allow one to model multiple parameters of a distribution function, such as the mean and the standard deviation, simultaneously. With the R package gamboostLSS, we provide a boosting method to fit these models. Variable selection and model choice are naturally available within this regularized regression framework. To introduce and illustrate the R package gamboostLSS and its infrastructure, we...

  10. Recent extensions and use of the statistical model code EMPIRE-II - version: 2.17 Millesimo

    International Nuclear Information System (INIS)

    Herman, M.

    2003-01-01

    These lecture notes describe new features of the modular code EMPIRE-2.17, designed to perform comprehensive calculations of nuclear reactions using a variety of nuclear reaction models. Compared to version 2.13, the current release has been extended by including the coupled-channels mechanism, the exciton model, a Monte Carlo approach to preequilibrium emission, use of microscopic level densities, the width fluctuation correction, detailed calculation of the recoil spectra, and powerful plotting capabilities provided by the ZVView package. The second part of this lecture concentrates on the use of the code in practical calculations, with emphasis on the aspects relevant to nuclear data evaluation. In particular, adjusting model parameters is discussed in detail. (author)

  11. Development of a practical modeling framework for estimating the impact of wind technology on bird populations

    Energy Technology Data Exchange (ETDEWEB)

    Morrison, M.L. [California State Univ., Sacramento, CA (United States)]; Pollock, K.H. [North Carolina State Univ., Raleigh, NC (United States)]

    1997-11-01

    One of the most pressing environmental concerns related to wind project development is the potential for avian fatalities caused by the turbines. The goal of this project is to develop a useful, practical modeling framework for evaluating potential wind power plant impacts that can be generalized to most bird species. This modeling framework could be used to get a preliminary understanding of the likelihood of significant impacts to birds, in a cost-effective way. The authors accomplish this by (1) reviewing the major factors that can influence the persistence of a wild population; (2) briefly reviewing various models that can aid in estimating population status and trend, including methods of evaluating model structure and performance; (3) reviewing survivorship and population projections; and (4) developing a framework for using models to evaluate the potential impacts of wind development on birds.

  12. Development of a practical modeling framework for estimating the impact of wind technology on bird populations

    International Nuclear Information System (INIS)

    Morrison, M.L.; Pollock, K.H.

    1997-11-01

    One of the most pressing environmental concerns related to wind project development is the potential for avian fatalities caused by the turbines. The goal of this project is to develop a useful, practical modeling framework for evaluating potential wind power plant impacts that can be generalized to most bird species. This modeling framework could be used to get a preliminary understanding of the likelihood of significant impacts to birds, in a cost-effective way. The authors accomplish this by (1) reviewing the major factors that can influence the persistence of a wild population; (2) briefly reviewing various models that can aid in estimating population status and trend, including methods of evaluating model structure and performance; (3) reviewing survivorship and population projections; and (4) developing a framework for using models to evaluate the potential impacts of wind development on birds

  13. Intrinsic flexibility of porous materials; theory, modelling and the flexibility window of the EMT zeolite framework

    International Nuclear Information System (INIS)

    Fletcher, Rachel E.; Wells, Stephen A.; Leung, Ka Ming; Edwards, Peter P.; Sartbaeva, Asel

    2015-01-01

    Framework materials possess intrinsic flexibility which can be investigated using geometric simulation. We review framework flexibility properties in energy materials and present novel results on the flexibility window of the EMT zeolite framework containing 18-crown-6 ether as a structure directing agent (SDA). Framework materials have structures containing strongly bonded polyhedral groups of atoms connected through their vertices. Typically the energy cost for variations of the inter-polyhedral geometry is much less than the cost of distortions of the polyhedra themselves – as in the case of silicates, where the geometry of the SiO4 tetrahedral group is much more strongly constrained than the Si—O—Si bridging angle. As a result, framework materials frequently display intrinsic flexibility, and their dynamic and static properties are strongly influenced by low-energy collective motions of the polyhedra. Insight into these motions can be obtained in reciprocal space through the ‘rigid unit mode’ (RUM) model, and in real space through template-based geometric simulations. We briefly review the framework flexibility phenomena in energy-relevant materials, including ionic conductors, perovskites and zeolites. In particular we examine the ‘flexibility window’ phenomenon in zeolites and present novel results on the flexibility window of the EMT framework, which shed light on the role of structure-directing agents. Our key finding is that the crown ether, despite its steric bulk, does not limit the geometric flexibility of the framework.

  14. Surgical model-view-controller simulation software framework for local and collaborative applications.

    Science.gov (United States)

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
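
    The decoupled, multi-rate idea can be illustrated with plain threads: one shared model updated by a fast "haptic controller" loop and read by a slower viewer loop, each at its own rate. This is a schematic Python illustration only, not the authors' haptics framework.

      import threading
      import time

      class Model:
          def __init__(self):
              self._lock = threading.Lock()
              self._state = 0.0

          def update(self, force):
              with self._lock:
                  self._state += 0.001 * force

          def read(self):
              with self._lock:
                  return self._state

      def haptic_controller(model, stop):
          while not stop.is_set():
              model.update(force=1.0)
              time.sleep(0.001)            # aim for roughly 1000 Hz updates

      def viewer(model, stop):
          while not stop.is_set():
              print("render state:", round(model.read(), 3))
              time.sleep(1 / 30)           # roughly 30 Hz rendering

      model, stop = Model(), threading.Event()
      workers = [threading.Thread(target=haptic_controller, args=(model, stop)),
                 threading.Thread(target=viewer, args=(model, stop))]
      for w in workers:
          w.start()
      time.sleep(0.5)
      stop.set()
      for w in workers:
          w.join()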

  15. An Empirical Generative Framework for Computational Modeling of Language Acquisition

    Science.gov (United States)

    Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-01-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…

  16. Interagency Modeling Atmospheric Assessment Center Local Jurisdiction: IMAAC Operations Framework

    Science.gov (United States)

    2010-03-01

    proposed model (Daft & Lengel, 1986). All six Ohio LINC Cities were interviewed face-to-face, providing the basis for the research evaluating...Cincinnati, DHS should work in partnership with Cincinnati Urban Area Leadership to convene a randomly selected, but statistically-significant, UASI...response system. Internal document. Daft, R. L. & Lengel, R. H. (1986). Organizational Information Requirements, Media Richness and Structural

  17. A Framework for Modelling Connective Tissue Changes in VIIP Syndrome

    Science.gov (United States)

    Ethier, C. R.; Best, L.; Gleason, R.; Mulugeta, L.; Myers, J. G.; Nelson, E. S.; Samuels, B. C.

    2014-01-01

    Insertion of astronauts into microgravity induces a cascade of physiological adaptations, notably including a cephalad fluid shift. Longer-duration flights carry an increased risk of developing Visual Impairment and Intracranial Pressure (VIIP) syndrome, a spectrum of ophthalmic changes including posterior globe flattening, choroidal folds, distension of the optic nerve sheath, kinking of the optic nerve and potentially permanent degradation of visual function. The slow onset of changes in VIIP, their chronic nature, and the similarity of certain clinical features of VIIP to ophthalmic findings in patients with raised intracranial pressure strongly suggest that: (i) biomechanical factors play a role in VIIP, and (ii) connective tissue remodeling must be accounted for if we wish to understand the pathology of VIIP. Our goal is to elucidate the pathophysiology of VIIP and suggest countermeasures based on biomechanical modeling of ocular tissues, suitably informed by experimental data, and followed by validation and verification. We specifically seek to understand the quasi-homeostatic state that evolves over weeks to months in space, during which ocular tissue remodeling occurs. This effort is informed by three bodies of work: (i) modeling of cephalad fluid shifts; (ii) modeling of ophthalmic tissue biomechanics in glaucoma; and (iii) modeling of connective tissue changes in response to biomechanical loading.

  18. Iberian (South American) Model of Judicial Review: Toward Conceptual Framework

    Science.gov (United States)

    Klishas, Andrey A.

    2016-01-01

    The paper explores Latin American countries legislation with the view to identify specific features of South American model of judicial review. The research methodology rests on comparative approach to analyzing national constitutions' provisions and experts' interpretations thereof. The constitutional provisions of Brazil, Peru, Mexico, and…

  19. Framework for Modelling Multiple Input Complex Aggregations for Interactive Installations

    DEFF Research Database (Denmark)

    Padfield, Nicolas; Andreasen, Troels

    2012-01-01

    on fuzzy logic and provides a method for variably balancing interaction and user input with the intention of the artist or director. An experimental design is presented, demonstrating an intuitive interface for parametric modelling of a complex aggregation function. The aggregation function unifies...

  20. The heterogeneous heuristic modeling framework for inferring decision processes

    NARCIS (Netherlands)

    Zhu, W.; Timmermans, H.J.P.; Rasouli, S.; Timmermans, H.J.P.

    2015-01-01

    Purpose – Increasing evidence suggests that choice behaviour in the real world may be guided by principles of bounded rationality, as opposed to the typically assumed fully rational behaviour based on the principle of utility-maximization. Under such circumstances, conventional rational choice models