WorldWideScience

Sample records for modeling framework enabling

  1. A Multiscale, Nonlinear, Modeling Framework Enabling the Design and Analysis of Composite Materials and Structures

    Science.gov (United States)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2012-01-01

    A framework for the multiscale design and analysis of composite materials and structures is presented. The ImMAC software suite, developed at NASA Glenn Research Center, embeds efficient, nonlinear micromechanics capabilities within higher scale structural analysis methods such as finite element analysis. The result is an integrated, multiscale tool that relates global loading to the constituent scale, captures nonlinearities at this scale, and homogenizes local nonlinearities to predict their effects at the structural scale. Example applications of the multiscale framework are presented for the stochastic progressive failure of a SiC/Ti composite tensile specimen and the effects of microstructural variations on the nonlinear response of woven polymer matrix composites.
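
    The abstract above describes embedding a micromechanics model inside a structural-scale analysis. As a hedged illustration of that coupling pattern only (the function names, the 1D rule-of-mixtures homogenization and the material constants are invented for this sketch and are not the ImMAC implementation), a structural loop can query a homogenized constituent response at each integration point:

```python
# Hypothetical sketch of the multiscale coupling pattern: a structural-scale
# solver queries a micromechanics model at each integration point and uses the
# homogenized stress. Constituent laws and constants are invented (1D toy case).
import numpy as np

def matrix_stress(strain, E=3.5e9, yield_strain=0.02):
    """Nonlinear (elastic-perfectly-plastic) matrix response in 1D."""
    return E * np.clip(strain, -yield_strain, yield_strain)

def fiber_stress(strain, E=400e9):
    """Linear elastic fiber response in 1D."""
    return E * strain

def homogenized_stress(strain, fiber_volume_fraction=0.35):
    """Voigt (rule-of-mixtures) homogenization of the constituent stresses."""
    vf = fiber_volume_fraction
    return vf * fiber_stress(strain) + (1.0 - vf) * matrix_stress(strain)

# Structural scale: evaluate the homogenized response at each integration point.
for eps in np.linspace(0.0, 0.03, 7):
    print(f"strain={eps:.3f}  homogenized stress={homogenized_stress(eps) / 1e6:.1f} MPa")
```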

  2. Software Frameworks for Model Composition

    Directory of Open Access Journals (Sweden)

    Mikel D. Petty

    2014-01-01

    A software framework is an architecture or infrastructure intended to enable the integration and interoperation of software components. Specialized types of software frameworks are those specifically intended to support the composition of models or other components within a simulation system. Such frameworks are intended to simplify the process of assembling a complex model or simulation system from simpler component models as well as to promote the reuse of the component models. Several different types of software frameworks for model composition have been designed and implemented; those types include common library, product line architecture, interoperability protocol, object model, formal, and integrative environment. The various framework types have different components, processes for composing models, and intended applications. In this survey the fundamental terms and concepts of software frameworks for model composition are presented, the different types of such frameworks are explained and compared, and important examples of each type are described.
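
    As a rough sketch of the "common library" style of composition framework mentioned above (the Component interface, the example models and the compose() driver are all hypothetical, invented for illustration), component models sharing one interface can be assembled into a larger simulation:

```python
# Illustrative sketch (not any specific framework): composing component models
# that share a common step() interface, in the spirit of a "common library"
# composition framework. All names here are invented.
from typing import Protocol

class Component(Protocol):
    def step(self, state: dict, dt: float) -> dict: ...

class Weather:
    def step(self, state, dt):
        state["temperature"] = state.get("temperature", 20.0) + 0.1 * dt
        return state

class CropGrowth:
    def step(self, state, dt):
        growth_rate = 0.01 * max(state.get("temperature", 0.0) - 10.0, 0.0)
        state["biomass"] = state.get("biomass", 1.0) + growth_rate * dt
        return state

def compose(components, state, dt, steps):
    """Run the composed model: each component updates the shared state in turn."""
    for _ in range(steps):
        for component in components:
            state = component.step(state, dt)
    return state

print(compose([Weather(), CropGrowth()], {}, dt=1.0, steps=10))
```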

  3. A framework to promote collective action within the One Health community of practice: Using participatory modelling to enable interdisciplinary, cross-sectoral and multi-level integration

    Directory of Open Access Journals (Sweden)

    Aurelie Binot

    2015-12-01

    The implementation of a One Health (OH) approach in this context calls for improved integration among disciplines and improved cross-sectoral collaboration, involving stakeholders at different levels. Such integration is not achieved spontaneously; it requires methodological guidelines and carries transaction costs. We explore pathways for implementing such collaboration in the SEA context, highlighting the main challenges faced by researchers and other target groups involved in OH actions. On this basis, we propose a conceptual framework of OH integration. Through three components (field-based data management, professional training workshops and higher education), we suggest developing a new culture of networking involving actors from various disciplines, sectors and levels (from the municipality to the Ministries) through a participatory modelling process, fostering synergies and cooperation. This framework could stimulate a long-term dialogue process based on a combination of case study implementation and capacity building. It aims to implement both institutional OH dynamics (multi-stakeholder and cross-sectoral) and research approaches that promote systems thinking and involve the social sciences to follow up and strengthen collective action.

  4. A framework to promote collective action within the One Health community of practice: Using participatory modelling to enable interdisciplinary, cross-sectoral and multi-level integration.

    Science.gov (United States)

    Binot, Aurelie; Duboz, Raphaël; Promburom, Panomsak; Phimpraphai, Waraphon; Cappelle, Julien; Lajaunie, Claire; Goutard, Flavie Luce; Pinyopummintr, Tanu; Figuié, Muriel; Roger, François Louis

    2015-12-01

    As Southeast Asia (SEA) is characterized by high human and domestic animal densities, growing intensification of trade, drastic land use changes and biodiversity erosion, this region appears to be a hotspot for studying the complex dynamics of zoonosis emergence and health issues at the Animal-Human-Environment interface. Zoonotic diseases and environmental health issues can have devastating socioeconomic and wellbeing impacts. Assessing and managing the related risks requires taking into account the ecological and social dynamics at play, in connection with epidemiological patterns. The implementation of a One Health (OH) approach in this context calls for improved integration among disciplines and improved cross-sectoral collaboration, involving stakeholders at different levels. Such integration is not achieved spontaneously; it requires methodological guidelines and carries transaction costs. We explore pathways for implementing such collaboration in the SEA context, highlighting the main challenges faced by researchers and other target groups involved in OH actions. On this basis, we propose a conceptual framework of OH integration. Through three components (field-based data management, professional training workshops and higher education), we suggest developing a new culture of networking involving actors from various disciplines, sectors and levels (from the municipality to the Ministries) through a participatory modelling process, fostering synergies and cooperation. This framework could stimulate a long-term dialogue process based on a combination of case study implementation and capacity building. It aims to implement both institutional OH dynamics (multi-stakeholder and cross-sectoral) and research approaches that promote systems thinking and involve the social sciences to follow up and strengthen collective action.

  5. A Registry Framework Enabling Patient-Centred Care.

    Science.gov (United States)

    Bellgard, Matthew I; Napier, Kathryn; Render, Lee; Radochonski, Maciej; Lamont, Leanne; Graham, Caroline; Wilton, Steve D; Fletcher, Sue; Goldblatt, Jack; Hunter, Adam A; Weeramanthri, Tarun

    2015-01-01

    Clinical decisions rely on expert knowledge that draws on quality patient phenotypic and physiological data. In this regard, systems that can support patient-centric care are essential. Patient registries are a key component of patient-centred care and can come in many forms, such as disease-specific, recruitment, clinical, contact, post-market and surveillance registries. There are, however, a number of significant challenges to overcome in order to maximise the utility of these information management systems to facilitate improved patient-centred care. Registries need to be harmonised regionally, nationally and internationally. However, the majority are implemented as standalone systems without consideration for data standards or system interoperability. Hence the task of harmonisation can become daunting. Fortunately, there are strategies to address this. In this paper, a disease registry framework is outlined that enables efficient deployment of national and international registries that can be modified dynamically as registry requirements evolve. This framework provides a basis for the development and implementation of data standards and enables patients to seamlessly belong to multiple registries. Other significant advances include the ability for registry curators to create and manage registries themselves without the need to contract software developers, and the concept of a registry description language for ease of registry template sharing.
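
    To make the "registry description language" idea concrete, here is a minimal hypothetical sketch (the template fields and the validate() helper are invented for illustration and are not the framework's actual syntax) of a declarative registry description from which forms and validation rules could be generated:

```python
# Hypothetical sketch of a declarative "registry description" that a curator
# could author without programming; field names are illustrative only.
registry_template = {
    "registry": "ExampleDiseaseRegistry",
    "sections": [
        {
            "name": "Demographics",
            "fields": [
                {"name": "date_of_birth", "type": "date", "required": True},
                {"name": "sex", "type": "enum", "values": ["F", "M", "Other"]},
            ],
        },
        {
            "name": "Genetics",
            "fields": [
                {"name": "gene", "type": "string", "required": False},
                {"name": "variant", "type": "string", "required": False},
            ],
        },
    ],
}

def validate(record, template):
    """Check a patient record against the declarative template."""
    errors = []
    for section in template["sections"]:
        for field in section["fields"]:
            value = record.get(field["name"])
            if field.get("required") and value is None:
                errors.append(f"missing required field: {field['name']}")
            if field["type"] == "enum" and value is not None and value not in field["values"]:
                errors.append(f"invalid value for {field['name']}: {value}")
    return errors

print(validate({"sex": "F"}, registry_template))
```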

  6. An Integrated Conceptual Framework for RFID Enabled Healthcare

    Directory of Open Access Journals (Sweden)

    Gaurav Gupta

    2015-12-01

    Radio frequency identification (RFID) technology is a wireless communication technology that facilitates automatic identification and data capture without human intervention. Since the 2000s, RFID applications in the health care industry have been increasing. RFID has brought improvements in areas such as patient care, patient safety, equipment tracking, resource utilization and processing time reduction. On the other hand, the deployment of RFID is often questioned over issues such as high capital investment, technological complexity, and privacy concerns. Exploration of the existing literature indicates the presence of works on topics such as asset management, patient management, staff management, institutional advantages, and organizational issues. However, most of these works focus on a single issue; scholarly attempts to integrate all the facets of RFID-enabled healthcare remain limited. In this paper, we propose a conceptual framework that represents the scope for implementation of this technology and the various dimensions of RFID-enabled healthcare, and demonstrate them in detail. We also discuss the critical issues that could become barriers to its successful implementation, current approaches to resolving them, and some of the regulatory initiatives encouraging its adoption in the healthcare industry. Finally, we highlight future research opportunities in this domain.

  7. A Working Framework for Enabling International Science Data System Interoperability

    Science.gov (United States)

    Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.

    2016-07-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independent of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.

  8. Enabling model customization and integration

    Science.gov (United States)

    Park, Minho; Fishwick, Paul A.

    2003-09-01

    Until fairly recently, dynamic model content and presentation were treated synonymously. For example, if one were to take a data flow network, which captures the dynamics of a target system in terms of the flow of data through nodal operators, one would often standardize on rectangles and arrows for the model display. The increasing web emphasis on XML, however, suggests that the network model can have its content specified in an XML language, and the model can then be represented in a number of ways depending on the chosen style. We have developed a formal method, based on styles, that permits a model to be specified in XML and presented in 1D (text), 2D, and 3D. This method allows customization and personalization to exert their benefits beyond e-commerce, in the area of model structures used in computer simulation. This customization leads naturally to solving the bigger problem of model integration: the act of taking models of a scene and integrating them with that scene so that there is only one unified modeling interface. This work focuses mostly on customization, but we address the integration issue in the future work section.
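
    A minimal sketch of the content/presentation separation described above, assuming an invented XML vocabulary and a plain-text "1D" style rather than the authors' actual schema:

```python
# Minimal sketch of the content/presentation split: the model topology lives in
# XML, and an interchangeable "style" decides how it is rendered (here, a 1D
# text style). The XML vocabulary is invented for illustration.
import xml.etree.ElementTree as ET

MODEL_XML = """
<dataflow>
  <node id="source"/>
  <node id="filter"/>
  <node id="sink"/>
  <edge from="source" to="filter"/>
  <edge from="filter" to="sink"/>
</dataflow>
"""

def render_text(root):
    """A '1D' style: print each edge as 'from -> to'."""
    return "\n".join(f"{e.get('from')} -> {e.get('to')}" for e in root.findall("edge"))

root = ET.fromstring(MODEL_XML)
print(render_text(root))   # a 2D or 3D style would consume the same content
```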

  9. A federated semantic metadata registry framework for enabling interoperability across clinical research and care domains.

    Science.gov (United States)

    Sinaci, A Anil; Laleci Erturkmen, Gokce B

    2013-10-01

    In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between the clinical care and research domains, this paper introduces a unified methodology and supporting framework that brings together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework by extending the ISO/IEC 11179 standard, and enable integration of data element registries through Linked Open Data (LOD) principles, whereby each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and its components are maintained as LOD resources, enabling semantic links with other CDEs, terminology systems and implementation-dependent content models; this facilitates semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability. The open source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems. Copyright © 2013 Elsevier Inc. All rights reserved.
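
    As a hedged illustration of publishing a Common Data Element as a Linked Open Data resource (the namespaces, property names and concept code below are placeholders, not the SALUS or ISO/IEC 11179 vocabularies), rdflib can be used roughly as follows:

```python
# Hedged sketch: publishing a Common Data Element (CDE) as a Linked Open Data
# resource so it can be uniquely referenced and linked to terminology systems.
# The namespaces, properties and concept code are placeholders for illustration.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

MDR = Namespace("http://example.org/mdr/")
SNOMED = Namespace("http://snomed.info/id/")

g = Graph()
cde = MDR["cde/systolic-blood-pressure"]
g.add((cde, RDF.type, MDR.DataElement))
g.add((cde, RDFS.label, Literal("Systolic blood pressure")))
g.add((cde, MDR.unit, Literal("mmHg")))
# A semantic link to a terminology concept enables cross-registry search and reuse.
g.add((cde, MDR.meaning, SNOMED["271649006"]))

print(g.serialize(format="turtle"))
```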

  10. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    ... In this contribution, the concept of template-based modeling is presented and its application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated into a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse ... with a case study related to the modeling of a catalytic membrane reactor coupling dehydrogenation of ethylbenzene with hydrogenation of nitrobenzene, and for the performance evaluation of an atomizer product. In the first case study, the reactor type is one where the reactions are thermodynamically limited ..., such as steam reforming and the production of olefins from inexpensive paraffins via dehydrogenation. The generated process model is based on a Fickian diffusion model, which is the model most widely used to account for intraparticle mass transfer resistance. The model of the process can help to predict ...

  11. A framework for sustainable interorganizational business model

    OpenAIRE

    Neupane, Ganesh Prasad; Haugland, Sven A.

    2016-01-01

    Drawing on literature on business model innovations and sustainability, this paper develops a framework for sustainable interorganizational business models. The aim of the framework is to enhance the sustainability of firms’ business models by enabling firms to create future value by taking into account environmental, social and economic factors. The paper discusses two themes: (1) application of the term sustainability to business model innovation, and (2) implications of integrating sustain...

  12. SDN-Enabled Communication Network Framework for Energy Internet

    Directory of Open Access Journals (Sweden)

    Zhaoming Lu

    2017-01-01

    To support distributed energy generators and improve energy utilization, the energy Internet has attracted global research focus. In China, the energy Internet has been identified as an important issue by government and research institutes. However, managing a large number of distributed generators requires smart, low-latency, reliable, and safe networking infrastructure, which cannot be supported by traditional networks in power grids. In order to design and construct a smart and flexible energy Internet, we propose a software-defined network framework with both a microgrid cluster level and a global grid level, designed in a hierarchical manner, which brings flexibility, efficiency, and reliability to power grid networks. Finally, we evaluate and verify the performance of this framework in terms of latency, reliability, and security by both theoretical analysis and real-world experiments.

  13. A framework for WRF to WRF-IBM grid nesting to enable multiscale simulations

    Energy Technology Data Exchange (ETDEWEB)

    Wiersema, David John [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Univ. of California, Berkeley, CA (United States); Lundquist, Katherine A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chow, Fotini Katapodes [Univ. of California, Berkeley, CA (United States)

    2016-09-29

    With advances in computational power, mesoscale models, such as the Weather Research and Forecasting (WRF) model, are often pushed to higher resolutions. As the model’s horizontal resolution is refined, the maximum resolved terrain slope will increase. Because WRF uses a terrain-following coordinate, this increase in resolved terrain slopes introduces additional grid skewness. At high resolutions and over complex terrain, this grid skewness can introduce large numerical errors that require methods, such as the immersed boundary method, to keep the model accurate and stable. Our implementation of the immersed boundary method in the WRF model, WRF-IBM, has proven effective at microscale simulations over complex terrain. WRF-IBM uses a non-conforming grid that extends beneath the model’s terrain. Boundary conditions at the immersed boundary, the terrain, are enforced by introducing a body force term to the governing equations at points directly beneath the immersed boundary. Nesting between a WRF parent grid and a WRF-IBM child grid requires a new framework for initialization and forcing of the child WRF-IBM grid. This framework will enable concurrent multi-scale simulations within the WRF model, improving the accuracy of high-resolution simulations and enabling simulations across a wide range of scales.
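
    The direct-forcing idea, adding a body-force term that drives the solution toward the boundary condition at points beneath the immersed boundary, can be illustrated with a toy 1D diffusion problem; this is a schematic sketch only, not WRF-IBM code, and all numbers are arbitrary:

```python
# Schematic 1D illustration of direct forcing at an immersed boundary (not
# WRF-IBM code): grid points lying inside the "terrain" receive a body-force
# term that drives the solution to the boundary value.
import numpy as np

nx, dt, nu = 50, 1e-3, 0.1
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
solid = x < 0.3                 # points "beneath" the immersed boundary
u = np.ones(nx)                 # initial velocity field

for _ in range(200):
    # explicit diffusion step stands in for the governing equations
    lap = np.zeros(nx)
    lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]
    rhs = nu * lap / dx**2
    # direct forcing: inside the immersed body, force u toward the no-slip value 0
    forcing = np.where(solid, (0.0 - u) / dt - rhs, 0.0)
    u += dt * (rhs + forcing)

print("velocity at immersed points is ~0:", bool(np.allclose(u[solid], 0.0, atol=1e-9)))
```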

  14. A Framework for BIM-enabled Life-cycle Information Management of Construction Project

    OpenAIRE

    Xu, Xun; Ma, Ling; Ding, Lieyun

    2014-01-01

    BIM has been widely used in project management, but on the whole the applications have been scattered and the BIM models have not been deployed throughout the whole project life-cycle. Each participant builds their own BIM, so there is a major problem in how to integrate these dynamic and fragmented data together. In order to solve this problem, this paper focuses on BIM-based life-cycle information management and builds a framework for BIM-enabled life-cycle information management. To organ...

  15. Towards Cache-Enabled, Order-Aware, Ontology-Based Stream Reasoning Framework

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Rui; Praggastis, Brenda L.; Smith, William P.; McGuinness, Deborah L.

    2016-08-16

    While streaming data have become increasingly more popular in business and research communities, semantic models and processing software for streaming data have not kept pace. Traditional semantic solutions have not addressed transient data streams. Semantic web languages (e.g., RDF, OWL) have typically addressed static data settings and linked data approaches have predominantly addressed static or growing data repositories. Streaming data settings have some fundamental differences; in particular, data are consumed on the fly and data may expire. Stream reasoning, a combination of stream processing and semantic reasoning, has emerged with the vision of providing "smart" processing of streaming data. C-SPARQL is a prominent stream reasoning system that handles semantic (RDF) data streams. Many stream reasoning systems including C-SPARQL use a sliding window and use data arrival time to evict data. For data streams that include expiration times, a simple arrival time scheme is inadequate if the window size does not match the expiration period. In this paper, we propose a cache-enabled, order-aware, ontology-based stream reasoning framework. This framework consumes RDF streams with expiration timestamps assigned by the streaming source. Our framework utilizes both arrival and expiration timestamps in its cache eviction policies. In addition, we introduce the notion of "semantic importance" which aims to address the relevance of data to the expected reasoning, thus enabling the eviction algorithms to be more context- and reasoning-aware when choosing what data to maintain for question answering. We evaluate this framework by implementing three different prototypes and utilizing five metrics. The trade-offs of deploying the proposed framework are also discussed.
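
    A small sketch of the eviction idea described above, assuming invented class and field names: expired entries are dropped first, and the least "semantically important" entries are evicted once the cache is full:

```python
# Illustrative sketch of a cache eviction policy in the spirit described above;
# the importance scores are assumed to come from the reasoning context.
import heapq, time

class StreamCache:
    """Evict expired entries first, then the least 'semantically important'
    (and soonest-to-expire) entries once capacity is reached."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = []          # heap of (importance, expiration, triple)

    def insert(self, triple, expiration, importance, now=None):
        now = time.time() if now is None else now
        # 1) drop anything whose expiration timestamp has passed
        self.entries = [e for e in self.entries if e[1] > now]
        heapq.heapify(self.entries)
        # 2) if still full, evict the least important / soonest-expiring entry
        if len(self.entries) >= self.capacity:
            heapq.heappop(self.entries)
        heapq.heappush(self.entries, (importance, expiration, triple))

cache = StreamCache(capacity=2)
t0 = time.time()
cache.insert(("sensor1", "hasTemp", "21"), expiration=t0 + 5, importance=0.2, now=t0)
cache.insert(("sensor2", "hasTemp", "35"), expiration=t0 + 5, importance=0.9, now=t0)
cache.insert(("sensor3", "alarm", "on"), expiration=t0 + 60, importance=0.95, now=t0)
print([e[2] for e in cache.entries])    # sensor1 (lowest importance) was evicted
```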

  16. A Framework for BIM-Enabled Life-Cycle Information Management of Construction Project

    Directory of Open Access Journals (Sweden)

    Xun Xu

    2014-08-01

    Full Text Available BIM has been widely used in project management, but on the whole the applications have been scattered and the BIM models have not been deployed throughout the whole project life-cycle. Each participant builds their own BIM, so there is a major problem in how to integrate these dynamic and fragmented data together. In order to solve this problem, this paper focuses on BIM-based life-cycle information management and builds a framework for BIM-enabled life-cycle information management. To organize the life-cycle information well, the information components and information flow during the project life-cycle are defined. Then, the application of BIM in life-cycle information management is analysed. This framework will provide a unified platform for information management and ensure data integrity.

  17. Geologic Framework Model (GFM2000)

    Energy Technology Data Exchange (ETDEWEB)

    T. Vogt

    2004-08-26

    The purpose of this report is to document the geologic framework model, version GFM2000 with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, and the differences between GFM2000 and previous versions. The version number of this model reflects the year during which the model was constructed. This model supersedes the previous model version, documented in Geologic Framework Model (GFM 3.1) (CRWMS M&O 2000 [DIRS 138860]). The geologic framework model represents a three-dimensional interpretation of the geology surrounding the location of the monitored geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain. The geologic framework model encompasses and is limited to an area of 65 square miles (168 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the geologic framework model (shown in Figure 1-1) were chosen to encompass the exploratory boreholes and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The upper surface of the model is made up of the surface topography and the depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The geologic framework model was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. The intended use of the geologic framework model is to provide a geologic framework over the area of interest consistent with the level of detail needed for hydrologic flow and radionuclide transport modeling through the UZ and for repository design. The model is limited by the availability of data and relative amount of geologic complexity found in an area. The geologic framework model is inherently limited by scale and content. The grid spacing used in the

  18. Geologic Framework Model (GFM2000)

    International Nuclear Information System (INIS)

    T. Vogt

    2004-01-01

    The purpose of this report is to document the geologic framework model, version GFM2000 with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, and the differences between GFM2000 and previous versions. The version number of this model reflects the year during which the model was constructed. This model supersedes the previous model version, documented in Geologic Framework Model (GFM 3.1) (CRWMS M and O 2000 [DIRS 138860]). The geologic framework model represents a three-dimensional interpretation of the geology surrounding the location of the monitored geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain. The geologic framework model encompasses and is limited to an area of 65 square miles (168 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the geologic framework model (shown in Figure 1-1) were chosen to encompass the exploratory boreholes and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The upper surface of the model is made up of the surface topography and the depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The geologic framework model was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. The intended use of the geologic framework model is to provide a geologic framework over the area of interest consistent with the level of detail needed for hydrologic flow and radionuclide transport modeling through the UZ and for repository design. The model is limited by the availability of data and relative amount of geologic complexity found in an area. The geologic framework model is inherently limited by scale and content. The grid spacing used in

  19. Sparsity enabled cluster reduced-order models for control

    Science.gov (United States)

    Kaiser, Eurika; Morzyński, Marek; Daviller, Guillaume; Kutz, J. Nathan; Brunton, Bingni W.; Brunton, Steven L.

    2018-01-01

    Characterizing and controlling nonlinear, multi-scale phenomena are central goals in science and engineering. Cluster-based reduced-order modeling (CROM) was introduced to exploit the underlying low-dimensional dynamics of complex systems. CROM builds a data-driven discretization of the Perron-Frobenius operator, resulting in a probabilistic model for ensembles of trajectories. A key advantage of CROM is that it embeds nonlinear dynamics in a linear framework, which enables the application of standard linear techniques to the nonlinear system. CROM is typically computed on high-dimensional data; however, access to and computations on this full-state data limit the online implementation of CROM for prediction and control. Here, we address this key challenge by identifying a small subset of critical measurements to learn an efficient CROM, referred to as sparsity-enabled CROM. In particular, we leverage compressive measurements to faithfully embed the cluster geometry and preserve the probabilistic dynamics. Further, we show how to identify fewer optimized sensor locations tailored to a specific problem that outperform random measurements. Both of these sparsity-enabled sensing strategies significantly reduce the burden of data acquisition and processing for low-latency in-time estimation and control. We illustrate this unsupervised learning approach on three different high-dimensional nonlinear dynamical systems from fluids with increasing complexity, with one application in flow control. Sparsity-enabled CROM is a critical facilitator for real-time implementation on high-dimensional systems where full-state information may be inaccessible.
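
    The core CROM construction, clustering snapshots and estimating a cluster-to-cluster transition matrix as a discretization of the Perron-Frobenius operator, can be sketched on toy data as follows (this is an illustrative reconstruction, not the authors' code, and omits the sparse-sensing step):

```python
# Sketch of the core CROM construction: cluster the snapshot sequence, then
# estimate the transition probability matrix between clusters.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 2000)
# toy "full-state" data: a noisy limit cycle embedded in 10 dimensions
snapshots = np.column_stack([np.sin(t + 0.3 * k) for k in range(10)])
snapshots += 0.05 * rng.standard_normal(snapshots.shape)

k = 5
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(snapshots)

# P[i, j] = Prob(cluster j at step n+1 | cluster i at step n)
P = np.zeros((k, k))
for a, b in zip(labels[:-1], labels[1:]):
    P[a, b] += 1.0
P /= P.sum(axis=1, keepdims=True)
print(np.round(P, 2))
```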

  20. A FRAMEWORK FOR INTELLIGENT VOICE-ENABLED E-EDUCATION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Azeta A. A.

    2009-07-01

    Although the Internet has received significant attention in recent years, voice is still the most convenient and natural way of communicating, whether human to human or human to computer. In voice applications, users may have different needs, which require the ability of the system to reason, make decisions, be flexible and adapt to requests during interaction. These needs have placed new requirements on voice application development, such as the use of advanced models, techniques and methodologies which take into account the needs of different users and environments. The ability of a system to behave close to human reasoning is often mentioned as one of the major requirements for the development of voice applications. In this paper, we present a framework for an intelligent voice-enabled e-Education application and an adaptation of the framework for the development of a prototype Course Registration and Examination (CourseRegExamOnline) module. This study is a preliminary report of an ongoing e-Education project containing the following modules: enrollment, course registration and examination, enquiries/information, messaging/collaboration, e-Learning and library. The CourseRegExamOnline module was developed using VoiceXML for the voice user interface (VUI), PHP for the web user interface (WUI), Apache as the middleware and MySQL as the back-end database. The system offers dual access modes using the VUI and WUI. The framework would serve as a reference model for developing voice-based e-Education applications. The e-Education system, when fully developed, would meet the needs of students who are normal users and those with certain forms of disabilities, such as visual impairment and repetitive strain injury (RSI), that make reading and writing difficult.

  1. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    Science.gov (United States)

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service-Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.
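
    A minimal, purely local illustration of the MapReduce pattern this framework builds on (the real system runs over Hadoop/HBase in the cloud; the cell names and values below are made up):

```python
# Local illustration of the MapReduce pattern: mappers emit (cell, value) pairs
# from independent time slices, and reducers aggregate per grid cell.
from collections import defaultdict
from multiprocessing import Pool

def mapper(time_slice):
    """Emit (grid_cell_id, value) pairs for one time slice."""
    return [(cell, value) for cell, value in time_slice.items()]

def reducer(cell, values):
    """Aggregate all values observed for one grid cell (here: the mean)."""
    return cell, sum(values) / len(values)

def mapreduce(time_slices):
    with Pool() as pool:
        mapped = pool.map(mapper, time_slices)              # parallel map phase
    groups = defaultdict(list)                              # shuffle phase
    for pairs in mapped:
        for cell, value in pairs:
            groups[cell].append(value)
    return dict(reducer(c, v) for c, v in groups.items())   # reduce phase

if __name__ == "__main__":
    slices = [{"cell_0": 280.1, "cell_1": 281.4},
              {"cell_0": 279.8, "cell_1": 282.0},
              {"cell_0": 280.5, "cell_1": 281.1}]
    print(mapreduce(slices))   # per-cell mean temperature (values in kelvin)
```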

  2. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service-Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.

  3. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    Science.gov (United States)

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service-Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012

  4. CMAQ Model Evaluation Framework

    Science.gov (United States)

    CMAQ is tested to establish the modeling system’s credibility in predicting pollutants such as ozone and particulate matter. Evaluation of CMAQ has been designed to assess the model’s performance for specific time periods and for specific uses.

  5. A framework for smartphone-enabled, patient-generated health data analysis

    Directory of Open Access Journals (Sweden)

    Shreya S. Gollamudi

    2016-08-01

    Background: Digital medicine and smartphone-enabled health technologies provide a novel source of human health and human biology data. However, in part due to its intricacies, few methods have been established to analyze and interpret data in this domain. We previously conducted a six-month interventional trial examining the efficacy of a comprehensive smartphone-based health monitoring program for individuals with chronic disease. This included 38 individuals with hypertension who recorded 6,290 blood pressure readings over the trial. Methods: In the present study, we provide a hypothesis testing framework for unstructured time series data, typical of patient-generated mobile device data. We used a mixed model approach for unequally spaced repeated measures using autoregressive and generalized autoregressive models, and applied this to the blood pressure data generated in this trial. Results: We were able to detect, roughly, a 2 mmHg decrease in both systolic and diastolic blood pressure over the course of the trial despite considerable intra- and inter-individual variation. Furthermore, by supplementing this finding by using a sequential analysis approach, we observed this result over three months prior to the official study end—highlighting the effectiveness of leveraging the digital nature of this data source to form timely conclusions. Conclusions: Health data generated through the use of smartphones and other mobile devices allow individuals the opportunity to make informed health decisions, and provide researchers the opportunity to address innovative health and biology questions. The hypothesis testing framework we present can be applied in future studies utilizing digital medicine technology or implemented in the technology itself to support the quantified self.
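
    A simplified sketch of this kind of analysis, a mixed model of blood pressure against time with a random intercept per subject, fitted to simulated data; note the study itself used autoregressive covariance structures for unequally spaced measures, which this toy example does not reproduce:

```python
# Simplified sketch (simulated data, random-intercept model only) of testing
# for a trend in repeated, unequally spaced blood pressure measurements.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for subject in range(38):
    baseline = 135 + rng.normal(0, 8)
    days = np.sort(rng.uniform(0, 180, rng.integers(20, 60)))  # unequally spaced visits
    for d in days:
        sbp = baseline - (2.0 / 180.0) * d + rng.normal(0, 6)  # ~2 mmHg drop over the trial
        rows.append({"subject": subject, "day": d, "sbp": sbp})
df = pd.DataFrame(rows)

model = smf.mixedlm("sbp ~ day", df, groups=df["subject"])
result = model.fit()
print(result.params["day"] * 180)   # estimated change in systolic BP over 180 days
```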

  6. Demand Response Advanced Controls Framework and Assessment of Enabling Technology Costs

    Energy Technology Data Exchange (ETDEWEB)

    Potter, Jennifer; Cappers, Peter

    2017-08-28

    The Demand Response Advanced Controls Framework and Assessment of Enabling Technology Costs research describes a variety of DR opportunities and the various bulk power system services they can provide. The bulk power system services are mapped to a generalized taxonomy of DR “service types”, which allows DR opportunities and bulk power system services to be discussed in fewer yet broader categories that share similar technological requirements, which in turn mainly drive DR enablement costs. The research presents a framework for the costs of automating DR and describes the various elements that drive enablement costs. The report introduces the various DR enabling technologies and end-uses, identifies the various services that each can provide to the grid, and provides a cost assessment for each enabling technology. In addition to the report, this research includes a Demand Response Advanced Controls Database and User Manual, which provide users with the data underlying this research and instructions for using the database effectively and efficiently.

  7. A framework for benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-10-01

    Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their skill in simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure the performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performance and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics for measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data–model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties
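
    One way to make the "metrics plus scoring system" idea concrete is sketched below; the variables, weights and the exp(-NRMSE) skill mapping are illustrative assumptions, not the benchmarks proposed in the paper:

```python
# Illustrative benchmark scoring: per-variable mismatch between model output
# and a benchmark dataset, normalized and combined into a single skill score.
import numpy as np

def normalized_rmse(model, obs):
    return np.sqrt(np.mean((model - obs) ** 2)) / (np.std(obs) + 1e-12)

def benchmark_score(model_outputs, benchmarks, weights):
    """Combine per-variable mismatches into one score in (0, 1]; 1 is perfect."""
    score = 0.0
    for var, weight in weights.items():
        nrmse = normalized_rmse(model_outputs[var], benchmarks[var])
        score += weight * np.exp(-nrmse)     # map mismatch to a (0, 1] skill value
    return score / sum(weights.values())

rng = np.random.default_rng(2)
obs = {"gpp": rng.normal(5, 1, 120), "et": rng.normal(2, 0.5, 120)}
mod = {"gpp": obs["gpp"] + rng.normal(0, 0.5, 120), "et": obs["et"] * 1.1}
print(round(benchmark_score(mod, obs, {"gpp": 0.6, "et": 0.4}), 3))
```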

  8. A Market Framework for Enabling Electric Vehicles Flexibility Procurement at the Distribution Level Considering Grid Constraints

    DEFF Research Database (Denmark)

    Gadea, Ana; Marinelli, Mattia; Zecchino, Antonio

    2018-01-01

    In a context of extensive electrification of the transport sector, the use of flexibility services from electric vehicles (EVs) is becoming of paramount importance. This paper defines a market framework for enabling EVs flexibility at the distribution level, considering grid constraints. The main...... objective is to establish an adequate incentive system and proceed with an evaluation of EVs grid support for both users and DSOs, benchmarking it against the typical reinforcement solution. To exploit this framework, a billing process based on a two-price system is proposed for the controlled EV charging....... The derived methodology is applied to a piece of semi-urban Danish distribution grid consisting of 42 customers. The service remuneration spans from 16 €/year to 51 €/year per customer, depending on the incentive scheme, and avoids a standard reinforcement of approximately 6200 €/year. It is demonstrated...

  9. Towards an Integrated Framework for SDGs: Ultimate and Enabling Goals for the Case of Energy

    Directory of Open Access Journals (Sweden)

    Tetsuro Yoshida

    2013-09-01

    Discussions on how to define, design, and implement sustainable development goals (SDGs) have taken center stage in the United Nations since the Rio+20 summit. Energy is one of the issues that enjoyed consensus, before and after Rio, as an important area for SDGs to address. Many proposals have been put forward on how SDGs should be formulated and what areas they should cover, but there have been few attempts to develop a generic integrated framework within which diverse areas can be accommodated and treated in a coherent way. The purpose of this paper is to develop such a framework for SDGs and to demonstrate its application by elaborating specific target areas for the energy sector. Based on a review and integration of global debates around SDGs and energy, the framework puts human wellbeing at the center of the agenda, with the supporting resource base and global public goods forming additional tiers. A complementary set of enabling goals is suggested with four layers: capacity & knowledge, governance & institutions, public policy, and investment & finance. An energy SDG is elaborated to illustrate the application of the framework. The illustrative SDG architecture for energy includes eight target areas: basic energy access, energy for economic development, sufficiency, renewable supply, efficiency, infrastructure, greenhouse gas emissions and security. These target areas are relevant to energy in all countries, but depending on national circumstances, such as levels of development, the relative emphasis will differ between countries and over time.

  10. Modeling-Enabled Systems Nutritional Immunology

    Directory of Open Access Journals (Sweden)

    Meghna Verma

    2016-02-01

    This review highlights the fundamental role of nutrition in the maintenance of health, the immune response and disease prevention. Emerging global mechanistic insights in the field of nutritional immunology cannot be gained through reductionist methods alone or by analyzing a single nutrient at a time. We propose to investigate nutritional immunology as a massively interacting system of interconnected multistage and multiscale networks that encompass hidden mechanisms by which nutrition, microbiome, metabolism, genetic predisposition and the immune system interact to delineate health and disease. The review sets an unconventional path to applying complex science methodologies to nutritional immunology research, discovery and development through ‘use cases’ centered around the impact of nutrition on the gut microbiome and immune responses. Our systems nutritional immunology analyses, that include modeling and informatics methodologies in combination with pre-clinical and clinical studies, have the potential to discover emerging systems-wide properties at the interface of the immune system, nutrition, microbiome, and metabolism.

  11. Frameworks for understanding and describing business models

    DEFF Research Database (Denmark)

    Nielsen, Christian; Roslender, Robin

    2014-01-01

    This chapter provides in a chronological fashion an introduction to six frameworks that one can apply to describing, understanding and also potentially innovating business models. These six frameworks have been chosen carefully as they represent six very different perspectives on business models: ... Maps (2001), Intellectual Capital Statements (2003), Chesbrough’s framework for Open Business Models (2006), Business Model Canvas (2008) ...

  12. Software to Enable Modeling & Simulation as a Service

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a Modeling and Simulation as a Service (M&SaaS) software service infrastructure to enable most modeling and simulation (M&S) activities to be...

  13. A Unified Framework for Systematic Model Improvement

    DEFF Research Database (Denmark)

    Kristensen, Niels Rode; Madsen, Henrik; Jørgensen, Sten Bay

    2003-01-01

    A unified framework for improving the quality of continuous time models of dynamic systems based on experimental data is presented. The framework is based on an interplay between stochastic differential equation (SDE) modelling, statistical tests and multivariate nonparametric regression...
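
    As a minimal example of the kind of continuous-time stochastic model the framework targets, an SDE (here an Ornstein-Uhlenbeck process, chosen for illustration) can be simulated with the Euler-Maruyama scheme; the parameter-estimation and statistical-testing steps of the framework are not shown:

```python
# Minimal example of the kind of model the framework targets: a stochastic
# differential equation simulated with the Euler-Maruyama scheme.
import numpy as np

def euler_maruyama(drift, diffusion, x0, t_end, dt, rng):
    n = int(t_end / dt)
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        dw = rng.normal(0.0, np.sqrt(dt))
        x[i + 1] = x[i] + drift(x[i]) * dt + diffusion(x[i]) * dw
    return x

# Ornstein-Uhlenbeck process: dX = theta * (mu - X) dt + sigma dW
theta, mu, sigma = 2.0, 1.0, 0.3
path = euler_maruyama(lambda x: theta * (mu - x), lambda x: sigma,
                      x0=0.0, t_end=5.0, dt=0.001, rng=np.random.default_rng(3))
print("mean of second half (should approach mu=1):", round(path[len(path) // 2:].mean(), 2))
```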

  14. Creating an enabling environment for adolescent sexual and reproductive health: a framework and promising approaches.

    Science.gov (United States)

    Svanemyr, Joar; Amin, Avni; Robles, Omar J; Greene, Margaret E

    2015-01-01

    This article provides a conceptual framework and points out the key elements for creating enabling environments for adolescent sexual and reproductive health (ASRH). An ecological framework is applied to organize the key elements of enabling environments for ASRH. At the individual level, strategies that are being implemented and seem promising are those that empower girls, build their individual assets, and create safe spaces. At the relationship level, strategies that are being implemented and seem promising include efforts to build parental support and communication as well as peer support networks. At the community level, strategies to engage men and boys and the wider community to transform gender and other social norms are being tested and may hold promise. Finally, at the broadest societal level, efforts to promote laws and policies that protect and promote human rights and address societal awareness about ASRH issues, including through mass media approaches, need to be considered. Copyright © 2015 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  15. Requirements for plug and play information infrastructure frameworks and architectures to enable virtual enterprises

    Science.gov (United States)

    Bolton, Richard W.; Dewey, Allen; Horstmann, Paul W.; Laurentiev, John

    1997-01-01

    This paper examines the role virtual enterprises will have in supporting future business engagements and the resulting technology requirements. Two representative end-user scenarios are proposed that define the requirements for 'plug-and-play' information infrastructure frameworks and architectures necessary to enable 'virtual enterprises' in US manufacturing industries. The scenarios provide a high-level 'needs analysis' for identifying key technologies, defining a reference architecture, and developing compliant reference implementations. Virtual enterprises are short-term consortia or alliances of companies formed to address fast-changing opportunities. Members of a virtual enterprise carry out their tasks as if they all worked for a single organization under 'one roof', using 'plug-and-play' information infrastructure frameworks and architectures to access and manage all information needed to support the product cycle. 'Plug-and-play' information infrastructure frameworks and architectures are required to enhance collaboration between companies working together on different aspects of a manufacturing process. This new form of collaborative computing will decrease cycle time and increase responsiveness to change.

  16. Geologic Framework Model Analysis Model Report

    International Nuclear Information System (INIS)

    Clayton, R.

    2000-01-01

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M and O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and

  17. Geologic Framework Model Analysis Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  18. Enabling Sustainability: Hierarchical Need-Based Framework for Promoting Sustainable Data Infrastructure in Developing Countries

    Directory of Open Access Journals (Sweden)

    David O. Yawson

    2009-11-01

    The paper presents thoughts on Sustainable Data Infrastructure (SDI) development and its user requirements bases. It brings Maslow's motivational theory to the fore and proposes it as a rationalization mechanism for entities (mostly governmental) that aim at realizing SDI. Maslow's theory, though well known, is somewhat new in geospatial circles; this is where the novelty of the paper resides. SDI has been shown to enable and aid development in diverse ways. However, stimulating developing countries to appreciate the utility of SDI, and to implement and use SDI in achieving sustainable development, has proven to be an imposing challenge. One of the key reasons for this could be the absence of a widely accepted psychological theory to drive needs assessment and intervention design for the purpose of SDI development. As a result, it is reasonable to explore Maslow’s theory of human motivation as a psychological theory for promoting SDI in developing countries. In this article, we review and adapt Maslow’s hierarchy of needs as a framework for the assessment of the needs of developing nations. The paper concludes with the implications of this framework for policy, with the view to stimulating the implementation of SDI in developing nations.

  19. GeoSpark SQL: An Effective Framework Enabling Spatial Queries on Spark

    Directory of Open Access Journals (Sweden)

    Zhou Huang

    2017-09-01

    In the era of big data, Internet-based geospatial information services such as various LBS apps are deployed everywhere, followed by an increasing number of queries against massive spatial data. As a result, traditional relational spatial databases (e.g., PostgreSQL with PostGIS, and Oracle Spatial) cannot adapt well to the needs of large-scale spatial query processing. Spark is an emerging, outstanding distributed computing framework in the Hadoop ecosystem. This paper aims to address the increasingly large-scale spatial query-processing requirement in the era of big data, and proposes an effective framework, GeoSpark SQL, which enables spatial queries on Spark. On the one hand, GeoSpark SQL provides a convenient SQL interface; on the other hand, GeoSpark SQL achieves both efficient storage management and high-performance parallel computing through integrating Hive and Spark. In this study, the following key issues are discussed and addressed: (1) storage management methods under the GeoSpark SQL framework, (2) the spatial operator implementation approach in the Spark environment, and (3) spatial query optimization methods under Spark. Experimental evaluation is also performed and the results show that GeoSpark SQL is able to achieve real-time query processing. It should be noted that Spark is not a panacea. It is observed that the traditional spatial database PostGIS/PostgreSQL performs better than GeoSpark SQL in some query scenarios, especially for spatial queries with high selectivity, such as the point query and the window query. In general, GeoSpark SQL performs better when dealing with compute-intensive spatial queries such as the kNN query and the spatial join query.
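
    A hedged sketch of a window (bounding-box) query expressed through Spark SQL; this uses plain PySpark with coordinate predicates for illustration, whereas GeoSpark SQL itself adds true spatial operators and storage management on top of Hive and Spark:

```python
# Window query sketch using plain Spark SQL (requires a local PySpark setup);
# the points and bounding box are made up for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("window-query-sketch").getOrCreate()

points = spark.createDataFrame(
    [(1, 116.38, 39.90), (2, 121.47, 31.23), (3, 113.26, 23.13)],
    ["id", "lon", "lat"],
)
points.createOrReplaceTempView("points")

# window query: all points inside the bounding box [115, 120] x [35, 41]
result = spark.sql("""
    SELECT id, lon, lat
    FROM points
    WHERE lon BETWEEN 115 AND 120
      AND lat BETWEEN 35 AND 41
""")
result.show()
spark.stop()
```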

  20. GeoFramework: A Modeling Framework for Solid Earth Geophysics

    Science.gov (United States)

    Gurnis, M.; Aivazis, M.; Tromp, J.; Tan, E.; Thoutireddy, P.; Liu, Q.; Choi, E.; Dicaprio, C.; Chen, M.; Simons, M.; Quenette, S.; Appelbe, B.; Aagaard, B.; Williams, C.; Lavier, L.; Moresi, L.; Law, H.

    2003-12-01

    As data sets in geophysics become larger and of greater relevance to other earth science disciplines, and as earth science becomes more interdisciplinary in general, modeling tools are being driven in new directions. There is now a greater need to link modeling codes to one another, link modeling codes to multiple datasets, and to make modeling software available to non-modeling specialists. Coupled with rapid progress in computer hardware (including the computational speed afforded by massively parallel computers), progress in numerical algorithms, and the introduction of software frameworks, these lofty goals of merging software in geophysics are now possible. The GeoFramework project, a collaboration between computer scientists and geoscientists, is a response to these needs and opportunities. GeoFramework is based on and extends Pyre, a Python-based modeling framework, recently developed to link solid (Lagrangian) and fluid (Eulerian) models, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. The utility and generality of Pyre as a general purpose framework in science is now being recognized. Besides its use in engineering and geophysics, it is also being used in particle physics and astronomy. Geology and geophysics impose their own unique requirements on software frameworks which are not generally available in existing frameworks and so there is a need for research in this area. One of the special requirements is the way Lagrangian and Eulerian codes will need to be linked in time and space within a plate tectonics context. GeoFramework has grown beyond its initial goal of linking a limited number of existing codes together. The following codes are now being reengineered within the context of Pyre: Tecton, 3-D FE Visco-elastic code for lithospheric relaxation; CitComS, a code for spherical mantle convection; SpecFEM3D, a SEM code for global and regional seismic waves; eqsim, a FE code for dynamic
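
    The record is truncated, but the coupling idea it describes can be sketched generically: a Python-level driver advances an Eulerian (fluid) solver and a Lagrangian (solid) solver in lock-step, exchanging boundary fields each step. The classes and methods below are invented stand-ins for illustration and are not Pyre's or GeoFramework's actual interfaces.

```python
# Generic sketch of time-coupling an Eulerian and a Lagrangian solver from a
# Python driver. All classes, methods, and fields here are hypothetical.
class EulerianMantleSolver:
    def advance(self, dt, surface_velocity):
        # ...solve convection for one step; return tractions on the interface
        return {"traction": 0.0}

class LagrangianCrustSolver:
    def advance(self, dt, traction):
        # ...deform the crust for one step; return interface velocities
        return {"surface_velocity": 0.0}

def couple(mantle, crust, dt, n_steps):
    surface_velocity = 0.0
    for _ in range(n_steps):
        mantle_out = mantle.advance(dt, surface_velocity)
        crust_out = crust.advance(dt, mantle_out["traction"])
        surface_velocity = crust_out["surface_velocity"]

couple(EulerianMantleSolver(), LagrangianCrustSolver(), dt=1.0e3, n_steps=10)
```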

  1. Smart Cities as Organizational Fields: A Framework for Mapping Sustainability-Enabling Configurations

    Directory of Open Access Journals (Sweden)

    Paul Pierce

    2017-08-01

    Full Text Available Despite the impressive growth of smart city initiatives worldwide, an organizational theory of smart city has yet to be developed, and we lack models addressing the unprecedented organizational and management challenges that emerge in smart city contexts. Traditional models are often of little use, because smart cities pursue different goals than traditional organizations, are based on networked, cross-boundary activity systems, rely on distributed innovation processes, and imply adaptive policy-making. Complex combinations of factors may lead to vicious or virtuous cycles in smart city initiatives, but we know very little about how these factors may be identified and mapped. Based on an inductive study of a set of primary and secondary sources, we develop a framework for the configurational analysis of smart cities viewed as place-specific organizational fields. This framework identifies five key dimensions in the configurations of smart city fields; these five dimensions are mapped through five sub-frameworks, which can be used both separately as well as for an integrated analysis. Our contribution is conceived to support longitudinal studies, natural experiments and comparative analyses on smart city fields, and to improve our understanding of how different combinations of factors affect the capability of smart innovations to translate into city resilience, sustainability and quality of life. In addition, our results suggest that new forms of place-based entrepreneurship constitute the engine that allows for the dynamic collaboration between government, citizens and research centers in successful smart city organizational fields.

  2. Green communication: The enabler to multiple business models

    DEFF Research Database (Denmark)

    Lindgren, Peter; Clemmensen, Suberia; Taran, Yariv

    2010-01-01

    Companies stand at the forefront of a new business model reality with new potentials - that will change their basic understanding and practice of running their business models radically. One of the drivers to this change is green communication, its strong relation to green business models and its possibility to enable lower energy consumption. This paper shows how green communication enables innovation of green business models and multiple business models running simultaneously in different markets to different customers.

  3. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    Science.gov (United States)

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in-depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and the access of information related to interactions, mechanism of actions, functions, etc., which ultimately provide a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. Our integrative analytical approach provides novel means for a systematic integrative

  4. A Modeling Framework for Improved Agricultural Water Supply Forecasting

    Science.gov (United States)

    Leavesley, G. H.; David, O.; Garen, D. C.; Lea, J.; Marron, J. K.; Pagano, T. C.; Perkins, T. R.; Strobel, M. L.

    2008-12-01

    The National Water and Climate Center (NWCC) of the USDA Natural Resources Conservation Service is moving to augment seasonal, regression-equation based water supply forecasts with distributed-parameter, physical process models enabling daily, weekly, and seasonal forecasting using an Ensemble Streamflow Prediction (ESP) methodology. This effort involves the development and implementation of a modeling framework, and associated models and tools, to provide timely forecasts for use by the agricultural community in the western United States where snowmelt is a major source of water supply. The framework selected to support this integration is the USDA Object Modeling System (OMS). OMS is a Java-based modular modeling framework for model development, testing, and deployment. It consists of a library of stand-alone science, control, and database components (modules), and a means to assemble selected components into a modeling package that is customized to the problem, data constraints, and scale of application. The framework is supported by utility modules that provide a variety of data management, land unit delineation and parameterization, sensitivity analysis, calibration, statistical analysis, and visualization capabilities. OMS uses an open source software approach to enable all members of the scientific community to collaboratively work on addressing the many complex issues associated with the design, development, and application of distributed hydrological and environmental models. A long-term goal in the development of these water-supply forecasting capabilities is the implementation of an ensemble modeling approach. This would provide forecasts using the results of multiple hydrologic models run on each basin.
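
    As a sketch of the Ensemble Streamflow Prediction (ESP) idea mentioned above, the toy example below runs a placeholder hydrologic model forward from the current basin state once for each historical weather year and summarizes the resulting seasonal volumes. The model function, state, and traces are stand-ins for illustration; in the real system, OMS science components and observed data would take their place.

```python
# Toy Ensemble Streamflow Prediction (ESP) sketch: one forecast trace per
# historical weather year, all started from the same current basin state.
# The "hydrologic model" below is a placeholder, not an OMS component.
import statistics

def toy_model(initial_storage, precip_trace, melt_factor=0.6):
    """Return a seasonal runoff volume for one weather trace (placeholder physics)."""
    storage, runoff = initial_storage, 0.0
    for p in precip_trace:
        storage += p
        release = melt_factor * storage
        runoff += release
        storage -= release
    return runoff

current_state = 120.0  # e.g., snow-water equivalent now (arbitrary units)
historical_traces = {  # hypothetical monthly precipitation for past years
    1999: [30, 40, 25, 10], 2000: [55, 60, 35, 20], 2001: [15, 20, 10, 5],
}

ensemble = sorted(toy_model(current_state, trace) for trace in historical_traces.values())
median = statistics.median(ensemble)
print(f"ESP forecast volumes: min={ensemble[0]:.0f}, median={median:.0f}, max={ensemble[-1]:.0f}")
```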

  5. Cytoview: Development of a cell modelling framework

    Indian Academy of Sciences (India)

    2007-07-06

    Jul 6, 2007 ... The framework serves as a first step in integrating different levels of data available for a biological cell and has the potential to lead to development of computational models in our pursuit to ...

  6. Enabling pathways to health equity: developing a framework for implementing social capital in practice.

    Science.gov (United States)

    Putland, Christine; Baum, Fran; Ziersch, Anna; Arthurson, Kathy; Pomagalska, Dorota

    2013-05-29

    relationship requires long term vision, endorsement for cross-sectoral work, well-developed relationships and theoretical and practical knowledge. Attention to the practical application of social capital theory shows that community projects require structural support in their efforts to improve health and wellbeing and reduce health inequities. Sound community development techniques are essential but do not operate independently from frameworks and policies at the highest levels of government. Recognition of the interdependence of policy and practice will enable government to achieve these goals more effectively.

  7. Exploring How Usage-Focused Business Models Enable Circular Economy through Digital Technologies

    Directory of Open Access Journals (Sweden)

    Gianmarco Bressanelli

    2018-02-01

    Full Text Available Recent studies advocate that digital technologies are key enabling factors for the introduction of servitized business models. At the same time, these technologies support the implementation of the circular economy (CE) paradigm into businesses. Despite this general agreement, the literature still overlooks how digital technologies enable such a CE transition. To fill the gap, this paper develops a conceptual framework, based on the literature and a case study of a company implementing a usage-focused servitized business model in the household appliance industry. This study focuses on the Internet of Things (IoT), Big Data, and analytics, and identifies eight specific functionalities enabled by such technologies (improving product design, attracting target customers, monitoring and tracking product activity, providing technical support, providing preventive and predictive maintenance, optimizing the product usage, upgrading the product, enhancing renovation and end-of-life activities). By investigating how these functionalities affect three CE value drivers (increasing resource efficiency, extending lifespan, and closing the loop), the conceptual framework developed in this paper advances knowledge about the role of digital technologies as an enabler of the CE within usage-focused business models. Finally, this study shows how digital technologies help overcome the drawback of usage-focused business models for the adoption of CE pointed out by previous literature.

  8. A Simulation and Modeling Framework for Space Situational Awareness

    Science.gov (United States)

    Olivier, S.

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. This framework includes detailed models for threat scenarios, signatures, sensors, observables and knowledge extraction algorithms. The framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the details of the modeling and simulation framework, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical and infra-red brightness calculations, generic radar system models, generic optical and infra-red system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The specific modeling of the Space Surveillance Network is performed in collaboration with the Air Force Space Command Space Control Group. We will demonstrate the use of this integrated simulation and modeling framework on specific threat scenarios, including space debris and satellite maneuvers, and we will examine the results of case studies involving the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.

  9. A community-based framework for aquatic ecosystem models

    DEFF Research Database (Denmark)

    Trolle, Didde; Hamilton, D. P.; Hipsey, M. R.

    2012-01-01

    Here, we communicate a point of departure in the development of aquatic ecosystem models, namely a new community-based framework, which supports an enhanced and transparent union between the collective expertise that exists in the communities of traditional ecologists and model developers. Through a literature survey, we document the growing importance of numerical aquatic ecosystem models while also noting the difficulties, up until now, of the aquatic scientific community to make significant advances in these models during the past two decades. Through a common forum for aquatic ecosystem modellers we aim to (i) advance collaboration within the aquatic ecosystem modelling community, (ii) enable increased use of models for research, policy and ecosystem-based management, (iii) facilitate a collective framework using common (standardised) code to ensure that model development is incremental, (iv) ...

  10. Plasma Modeling Enabled Technology Development Empowered by Fundamental Scattering Data

    Science.gov (United States)

    Kushner, Mark J.

    2016-05-01

    Technology development increasingly relies on modeling to speed the innovation cycle. This is particularly true for systems using low temperature plasmas (LTPs) and their role in enabling energy efficient processes with minimal environmental impact. In the innovation cycle, LTP modeling supports investigation of fundamental processes that seed the cycle, optimization of newly developed technologies, and prediction of performance of unbuilt systems for new applications. Although proof-of-principle modeling may be performed for idealized systems in simple gases, technology development must address physically complex systems that use complex gas mixtures that now may be multi-phase (e.g., in contact with liquids). The variety of fundamental electron and ion scattering, and radiation transport data (FSRD) required for this modeling increases as the innovation cycle progresses, while the accuracy required of that data depends on the intended outcome. In all cases, the fidelity, depth and impact of the modeling depends on the availability of FSRD. Modeling and technology development are, in fact, empowered by the availability and robustness of FSRD. In this talk, examples of the impact of and requirements for FSRD in the innovation cycle enabled by plasma modeling will be discussed using results from multidimensional and global models. Examples of fundamental studies and technology optimization will focus on microelectronics fabrication and on optically pumped lasers. Modeling of systems as yet unbuilt will address the interaction of atmospheric pressure plasmas with liquids. Work supported by DOE Office of Fusion Energy Science and the National Science Foundation.

  11. Crystallization Kinetics within a Generic Modeling Framework

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; von Solms, Nicolas; Gernaey, Krist V.

    2014-01-01

    A new and extended version of a generic modeling framework for analysis and design of crystallization operations is presented. The new features of this framework are described, with focus on development, implementation, identification, and analysis of crystallization kinetic models. Issues related to the modeling of various kinetic phenomena like nucleation, growth, agglomeration, and breakage are discussed in terms of model forms, model parameters, their availability and/or estimation, and their selection and application for specific crystallization operational scenarios under study. The advantages of employing a well-structured model library for storage, use/reuse, and analysis of the kinetic models are highlighted. Examples illustrating the application of the modeling framework for kinetic model discrimination related to simulation of specific crystallization scenarios and for kinetic model parameter ...
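
    As an illustration of the kinetic model forms such a library would store for phenomena like nucleation and growth, empirical power-law expressions are commonly used; the symbols below are generic and are not specific to the framework in the abstract.

```latex
% Common empirical crystallization kinetics (illustrative forms only):
% B: nucleation rate, G: growth rate, \Delta c = c - c^{*}: supersaturation,
% k_b, k_g, b, g: empirical parameters estimated from data.
B = k_b \, \Delta c^{\, b}, \qquad G = k_g \, \Delta c^{\, g}
```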

  12. The Knowledge-Inducing Culture — An Integrative Framework of Cultural Enablers of Knowledge Management

    OpenAIRE

    Wei Zheng

    2009-01-01

    Organisational cultural factors are key to knowledge management effectiveness. Existing research on cultural factors that facilitate knowledge management has been fragmented. This paper proposes a theoretical framework that integrates existing research on cultural factors that influence knowledge management. This framework incorporates three cultural categories: cultural factors related to the orientation to knowledge, cultural factors related to the orientation to people and cultural factors...

  13. Towards A Framework For ICT-Enabled Materials Management In Complex Projects

    Directory of Open Access Journals (Sweden)

    N. B. Kasim

    2011-10-01

    Full Text Available This paper describes a research project, aimed at developing a system to integrate RFID-based materials management with resources modelling in project management to improve on-site materials tracking and inventory management processes. In order to develop the system, a comprehensive literature review and exploratory case studies were conducted to investigate current practices, problems, implementation of ICT and potential use of emerging technologies (such as RFID and wireless technologies) in overcoming the logistical difficulties associated with materials management. An initial assessment revealed that there is a potential to improve the tracking and management of materials using modern ICT, thus enhancing the operational efficiency of the project delivery process. Moreover, sophisticated technologies such as wireless systems and tagging are not generally used to overcome human error in materials identification and the space constraints inherent in many projects. This paper concludes with the findings from the case studies for developing a real-time materials tracking framework to support construction professionals in handling materials more effectively.

  14. ARCHITECTURES AND ALGORITHMS FOR COGNITIVE NETWORKS ENABLED BY QUALITATIVE MODELS

    DEFF Research Database (Denmark)

    Balamuralidhar, P.

    2013-01-01

    The complexity of communication networks is ever increasing and is compounded by their heterogeneity and dynamism. Traditional techniques are facing challenges in network performance management. Cognitive networking is an emerging paradigm to make networks more intelligent, thereby overcoming ... of the cognitive engine that incorporates a context space based information structure to its knowledge model. I propose a set of guiding principles behind a cognitive system to be autonomic and use them with additional requirements to build a detailed architecture for the cognitive engine. I define a context space ... structure integrating various information structures that are required for the knowledge model. Using graphical models to represent and reason about the context space is the direction followed here. Specifically, I analyze the framework of qualitative models for their suitability to represent the dynamic...

  15. GeoPro: Technology to Enable Scientific Modeling

    International Nuclear Information System (INIS)

    C. Juan

    2004-01-01

    Development of the ground-water flow model for the Death Valley Regional Groundwater Flow System (DVRFS) required integration of numerous supporting hydrogeologic investigations. The results from recharge, discharge, hydraulic properties, water level, pumping, model boundaries, and geologic studies were integrated to develop the required conceptual and 3-D framework models, and the flow model itself. To support the complex modeling process and the needs of the multidisciplinary DVRFS team, a hardware and software system called GeoPro (Geoscience Knowledge Integration Protocol) was developed. A primary function of GeoPro is to manage the large volume of disparate data compiled for the 100,000-square-kilometer area of southern Nevada and California. The data are primarily from previous investigations and regional flow models developed for the Nevada Test Site and Yucca Mountain projects. GeoPro utilizes relational database technology (Microsoft SQL Server™) to store and manage these tabular point data, groundwater flow model ASCII data, 3-D hydrogeologic framework data, 2-D and 2.5-D GIS data, and text documents. Data management consists of versioning, tracking, and reporting data changes as multiple users access the centralized database. GeoPro also supports the modeling process by automating the routine data transformations required to integrate project software. This automation is also crucial to streamlining pre- and post-processing of model data during model calibration. Another function of GeoPro is to facilitate the dissemination and use of the model data and results through web-based documents by linking and allowing access to the underlying database and analysis tools. The intent is to convey to end-users the complex flow model product in a manner that is simple, flexible, and relevant to their needs. GeoPro is evolving from a prototype system to a production-level product. Currently the DVRFS pre- and post-processing modeling tools are being re

  16. BIM-Enabled Conceptual Modelling and Representation of Building Circulation

    Directory of Open Access Journals (Sweden)

    Jin Kook Lee

    2014-08-01

    Full Text Available This paper describes how a building information modelling (BIM)-based approach for building circulation enables us to change the process of building design in terms of its computational representation and processes, focusing on the conceptual modelling and representation of circulation within buildings. BIM has been designed for use by several BIM authoring tools, in particular with the widely known interoperable industry foundation classes (IFCs), which follow an object-oriented data modelling methodology. Advances in BIM authoring tools, using space objects and their relations defined in an IFC's schema, have made it possible to model, visualize and analyse circulation within buildings prior to their construction. Agent-based circulation has long been an interdisciplinary topic of research across several areas, including design computing, computer science, architectural morphology, human behaviour and environmental psychology. Such conventional approaches to building circulation are centred on navigational knowledge about built environments, and represent specific circulation paths and regulations. This paper, however, places emphasis on the use of ‘space objects’ in BIM-enabled design processes rather than on circulation agents, the latter of which are not defined in the IFCs' schemas. By introducing and reviewing some associated research and projects, this paper also surveys how such a circulation representation is applicable to the analysis of building circulation-related rules.

  17. A useful framework for optimal replacement models

    International Nuclear Information System (INIS)

    Aven, Terje; Dekker, Rommert

    1997-01-01

    In this note we present a general framework for optimization of replacement times. It covers a number of models, including various age and block replacement models, and allows a uniform analysis for all these models. A relation to the marginal cost concept is described
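
    As a concrete instance of the class of models such a framework covers, the classic age-replacement policy chooses the preventive replacement age T that minimizes the long-run expected cost per unit time. This is a standard result in replacement theory; the notation is introduced here for illustration and is not taken from the note.

```latex
% Age replacement: replace preventively at age T at cost c_p, or at failure at cost c_f.
% F is the lifetime distribution function and \bar{F} = 1 - F its survival function.
C(T) = \frac{c_p \, \bar{F}(T) + c_f \, F(T)}{\int_0^{T} \bar{F}(t) \, \mathrm{d}t},
\qquad T^{*} = \arg\min_{T > 0} C(T)
```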

  18. Spatially enabling the Global Framework for Climate Services: Reviewing geospatial solutions to efficiently share and integrate climate data & information

    Directory of Open Access Journals (Sweden)

    Gregory Giuliani

    2017-12-01

    Considering that climate data is part of the broader Earth observation and geospatial data domain, the aim of this paper is to review the state-of-the-art geospatial technologies that can support the delivery of efficient and effective climate services, and enhancing the value chain of climate data in support of the objectives of the Global Framework for Climate Services. The major benefit of spatially-enabling climate services is that it brings interoperability along the entire climate data value chain. It facilitates storing, visualizing, accessing, processing/analyzing, and integrating climate data and information and enables users to create value-added products and services.

  19. A Modular Swarm Optimization Framework Enabling Multi-Vehicle Coordinated Path Planning, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The advancement of Unmanned Aerial Systems (UAS) with computing power and communications hardware has enabled an increased capability set for multi-vehicle...

  20. A Modular Swarm Optimization Framework Enabling Multi-Vehicle Coordinated Path Planning, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The advancement of Unmanned Aerial Systems (UAS) with computing power and communications hardware has enabled an increased capability set for multi-vehicle...

  1. Graphical Model Debugger Framework for Embedded Systems

    DEFF Research Database (Denmark)

    Zeng, Kebin

    2010-01-01

    Model Driven Software Development has offered a faster way to design and implement embedded real-time software by moving the design to a model level, and by transforming models to code. However, the testing of embedded systems has remained at the code level. This paper presents a Graphical Model Debugger Framework, providing an auxiliary avenue of analysis of system models at runtime by executing generated code and updating models synchronously, which allows embedded developers to focus on the model level. With the model debugger, embedded developers can graphically test their design model and check the running status of the system, which offers a debugging capability on a higher level of abstraction. The framework intends to contribute a tool to the Eclipse community, especially suitable for model-driven development of embedded systems.

  2. Cooperative cognitive radio networking system model, enabling techniques, and performance

    CERN Document Server

    Cao, Bin; Mark, Jon W

    2016-01-01

    This SpringerBrief examines the active cooperation between users of Cooperative Cognitive Radio Networking (CCRN), exploring the system model, enabling techniques, and performance. The brief provides a systematic study on active cooperation between primary users and secondary users, i.e., (CCRN), followed by the discussions on research issues and challenges in designing spectrum-energy efficient CCRN. As an effort to shed light on the design of spectrum-energy efficient CCRN, they model the CCRN based on orthogonal modulation and orthogonally dual-polarized antenna (ODPA). The resource allocation issues are detailed with respect to both models, in terms of problem formulation, solution approach, and numerical results. Finally, the optimal communication strategies for both primary and secondary users to achieve spectrum-energy efficient CCRN are analyzed.

  3. DEFINE: A Service-Oriented Dynamically Enabling Function Model

    Directory of Open Access Journals (Sweden)

    Tan Wei-Yi

    2017-01-01

    In this paper, we introduce an innovative Dynamically Enable Function In Network Equipment (DEFINE) to allow tenants to get network services quickly. First, DEFINE decouples an application into different functional components, and connects these function components in a reconfigurable method. Second, DEFINE provides a programmable interface to the third party, who can develop their own processing modules according to their own needs. To verify the effectiveness of this model, we set up an evaluation network with an FPGA-based OpenFlow switch prototype, and deployed several applications on it. Our results show that DEFINE has excellent flexibility and performance.

  4. Futures Business Models for an IoT Enabled Healthcare Sector: A Causal Layered Analysis Perspective

    Directory of Open Access Journals (Sweden)

    Julius Francis Gomes

    2016-12-01

    Full Text Available Purpose: To facilitate futures business research by proposing a novel way to combine business models as a conceptual tool with futures research techniques. Design: A futures perspective is adopted to foresight business models of the Internet of Things (IoT) enabled healthcare sector by using business models as a futures business research tool. In doing so, business models is coupled with one of the most prominent foresight methodologies, Causal Layered Analysis (CLA). Qualitative analysis provides deeper understanding of the phenomenon through the layers of CLA: litany, social causes, worldview and myth. Findings: It is difficult to predict the far future for a technology oriented sector like healthcare. This paper presents three scenarios for short-, medium- and long-term future. Based on these scenarios we also present a set of business model elements for different future time frames. This paper shows a way to combine business models with CLA, a foresight methodology, in order to apply business models in futures business research. Besides offering early results for futures business research, this study proposes a conceptual space to work with individual business models for managerial stakeholders. Originality / Value: Much research on business models has offered conceptualization of the phenomenon, innovation through business model and transformation of business models. However, existing literature does not offer much on using business model as a futures research tool. Enabled by futures thinking, we collected key business model elements and building blocks for the futures market and analyzed them through the CLA framework.

  5. Supply chain risk management enablers - A framework development through systematic review of the literature from 2000 to 2015

    Directory of Open Access Journals (Sweden)

    Kilubi, I.

    2015-08-01

    Full Text Available The present paper delivers a robust and systematic literature review (SLR) on supply chain risk management (SCRM) with the purpose to (a) review and analyse the literature concerning definitions and research methodologies applied, to (b) develop a classificatory framework which clusters existing enablers on SCRM, and to (c) examine the linkage between SCRM and performance. The findings reveal that not only is SCRM loosely defined, but that there are various fragmented supply chain risks enablers and that there is a strong need for a clear terminology for its building enablers. In addition to that, the review points to a lack of empirical confirmation concerning the connection between SCRM and performance. This paper contributes an overview of 80 peer-reviewed journal articles on SCRM from 2000 to the beginning of 2015. We offer an overarching definition of SCRM, synthesise and assemble the numerous enablers into preventive and responsive strategies by means of a conceptual framework. Moreover, indicating the social network theory (SNT) as a potential theoretical foundation for SCRM, we further contribute to the supply chain management (SCM) literature by providing propositions that guide future research.

  6. Enabling frameworks for low-carbon technology transfer to small emerging economies: Analysis of ten case studies in Chile

    International Nuclear Information System (INIS)

    Pueyo, Ana

    2013-01-01

    Technology transfer is crucial to reduce the carbon intensity of developing countries. Enabling frameworks need to be in place to allow foreign technologies to flow, to be absorbed and to bring about technological change in the recipient country. This paper contributes to identifying these enabling factors by analysing 10 case studies of low-carbon technology transfer processes based in Chile. Our findings show the importance of strong economic and institutional fundamentals, a sound knowledge base, a sizable and stable demand and a functioning local industry. Policy recommendations are derived to improve the penetration of foreign low-carbon technologies in developing countries, focusing on the particularities of small and medium emerging economies. - Highlights: ► We analyse 10 case studies of low carbon technology transfer to Chile. ► We identify enablers of technology transfer to developing countries. ► We provide policy recommendations focusing on small and medium economies.

  7. Driver Performance Model: 1. Conceptual Framework

    National Research Council Canada - National Science Library

    Heimerl, Joseph

    2001-01-01

    ...'. At the present time, no such comprehensive model exists. This report discusses a conceptual framework designed to encompass the relationships, conditions, and constraints related to direct, indirect, and remote modes of driving and thus provides a guide or 'road map' for the construction and creation of a comprehensive driver performance model.

  8. Calibration in a Bayesian modelling framework

    NARCIS (Netherlands)

    Jansen, M.J.W.; Hagenaars, T.H.J.

    2004-01-01

    Bayesian statistics may constitute the core of a consistent and comprehensive framework for the statistical aspects of modelling complex processes that involve many parameters whose values are derived from many sources. Bayesian statistics holds great promise for model calibration, provides the

  9. Cytoview: Development of a cell modelling framework

    Indian Academy of Sciences (India)

    2007-07-06

    Jul 6, 2007 ... Here we report a framework to model various aspects of a cell and integrate knowledge encoded at different levels of abstraction, with cell morphologies at one end to atomic structures at the other. The different issues that have been addressed are ontologies, feature description and model building.

  10. A Volunteered Geographic Information Framework to Enable Bottom-Up Disaster Management Platforms

    Directory of Open Access Journals (Sweden)

    Mohammad Ebrahim Poorazizi

    2015-08-01

    Full Text Available Recent disasters, such as the 2010 Haiti earthquake, have drawn attention to the potential role of citizens as active information producers. By using location-aware devices such as smartphones to collect geographic information in the form of geo-tagged text, photos, or videos, and sharing this information through online social media, such as Twitter, citizens create Volunteered Geographic Information (VGI). To effectively use this information for disaster management, we developed a VGI framework for the discovery of VGI. This framework consists of four components: (i) a VGI brokering module to provide a standard service interface to retrieve VGI from multiple resources based on spatial, temporal, and semantic parameters; (ii) a VGI quality control component, which employs semantic filtering and cross-referencing techniques to evaluate VGI; (iii) a VGI publisher module, which uses a service-based delivery mechanism to disseminate VGI, and (iv) a VGI discovery component to locate, browse, and query metadata about available VGI datasets. In a case study we employed a FOSS (Free and Open Source Software) strategy, open standards/specifications, and free/open data to show the utility of the framework. We demonstrate that the framework can facilitate data discovery for disaster management. The addition of quality metrics and a single aggregated source of relevant crisis VGI will allow users to make informed policy choices that could save lives, meet basic humanitarian needs earlier, and perhaps limit environmental and economic damage.
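
    To make the brokering idea concrete, the sketch below queries a hypothetical VGI broker endpoint by spatial, temporal, and semantic parameters. The URL and parameter names are invented for illustration and do not come from the paper.

```python
# Hypothetical query to a VGI brokering service by spatial/temporal/semantic
# filters. The endpoint and parameter names are illustrative, not the paper's API.
import requests

params = {
    "bbox": "-73.0,18.0,-72.0,19.0",               # spatial filter (lon/lat bounding box)
    "start": "2010-01-12T00:00:00Z",               # temporal filter
    "end": "2010-01-19T00:00:00Z",
    "keywords": "earthquake,collapsed building",   # semantic filter
    "format": "geojson",
}
response = requests.get("https://example.org/vgi-broker/search", params=params, timeout=30)
response.raise_for_status()
for feature in response.json().get("features", []):
    print(feature["properties"].get("source"), feature["geometry"]["coordinates"])
```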

  11. SWAN-Fly : A flexible cloud-enabled framework for context-aware applications in smartphones

    NARCIS (Netherlands)

    Bharath Das, R.; van Halteren, A.T.; Bal, H.E.

    2016-01-01

    Smartphones are equipped with various hardware sensors to enrich the user experience. SWAN is a middleware framework that supports easy collection and processing of sensor data. However, the limited resources of the smartphones prevent the apps from supporting big data applications that need to

  12. Advancing a framework to enable characterization and evaluation of data streams useful for biosurveillance.

    Directory of Open Access Journals (Sweden)

    Kristen J Margevicius

    Full Text Available In recent years, biosurveillance has become the buzzword under which a diverse set of ideas and activities regarding detecting and mitigating biological threats are incorporated depending on context and perspective. Increasingly, biosurveillance practice has become global and interdisciplinary, requiring information and resources across public health, One Health, and biothreat domains. Even within the scope of infectious disease surveillance, multiple systems, data sources, and tools are used with varying and often unknown effectiveness. Evaluating the impact and utility of state-of-the-art biosurveillance is, in part, confounded by the complexity of the systems and the information derived from them. We present a novel approach conceptualizing biosurveillance from the perspective of the fundamental data streams that have been or could be used for biosurveillance and to systematically structure a framework that can be universally applicable for use in evaluating and understanding a wide range of biosurveillance activities. Moreover, the Biosurveillance Data Stream Framework and associated definitions are proposed as a starting point to facilitate the development of a standardized lexicon for biosurveillance and characterization of currently used and newly emerging data streams. Criteria for building the data stream framework were developed from an examination of the literature, analysis of information on operational infectious disease biosurveillance systems, and consultation with experts in the area of biosurveillance. To demonstrate utility, the framework and definitions were used as the basis for a schema of a relational database for biosurveillance resources and in the development and use of a decision support tool for data stream evaluation.

  13. Advancing a framework to enable characterization and evaluation of data streams useful for biosurveillance.

    Science.gov (United States)

    Margevicius, Kristen J; Generous, Nicholas; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    In recent years, biosurveillance has become the buzzword under which a diverse set of ideas and activities regarding detecting and mitigating biological threats are incorporated depending on context and perspective. Increasingly, biosurveillance practice has become global and interdisciplinary, requiring information and resources across public health, One Health, and biothreat domains. Even within the scope of infectious disease surveillance, multiple systems, data sources, and tools are used with varying and often unknown effectiveness. Evaluating the impact and utility of state-of-the-art biosurveillance is, in part, confounded by the complexity of the systems and the information derived from them. We present a novel approach conceptualizing biosurveillance from the perspective of the fundamental data streams that have been or could be used for biosurveillance and to systematically structure a framework that can be universally applicable for use in evaluating and understanding a wide range of biosurveillance activities. Moreover, the Biosurveillance Data Stream Framework and associated definitions are proposed as a starting point to facilitate the development of a standardized lexicon for biosurveillance and characterization of currently used and newly emerging data streams. Criteria for building the data stream framework were developed from an examination of the literature, analysis of information on operational infectious disease biosurveillance systems, and consultation with experts in the area of biosurveillance. To demonstrate utility, the framework and definitions were used as the basis for a schema of a relational database for biosurveillance resources and in the development and use of a decision support tool for data stream evaluation.

  14. Computer-aided modeling framework – a generic modeling template

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    This work focuses on the development of a computer-aided modeling framework. The framework is a knowledge-based system that is built on a generic modeling language and structured on workflows for different modeling tasks. The overall objective is to support model developers and users to generate ....... The application of the modeling template is highlighted with a case study related to the modeling of a catalytic membrane reactor coupling dehydrogenation of ethylbenzene with hydrogenation of nitrobenzene...

  15. Framework Architecture Enabling an Agent-Based Inter-Company Integration with XML

    Directory of Open Access Journals (Sweden)

    Klement Fellner

    2000-11-01

    Full Text Available More and more cooperating companies utilize the World Wide Web (WWW) to federate and further integrate their heterogeneous business application systems. At the same time, innovative business strategies, like virtual organizations, supply chain management or one-to-one marketing, as well as trendsetting competitive strategies, like mass customisation, are realisable. Both the necessary integration and the innovative concepts demand software that supports automation of communication as well as coordination across system boundaries. In this paper, we describe a framework architecture for intercompany integration of business processes based on commonly accepted and (partially) standardized concepts and techniques. Further on, it is shown how the framework architecture helps to automate procurement processes and how a cost-saving black-box re-use is achieved following a component-oriented implementation paradigm.

  16. Business Model Concept: An Integrative Framework Proposal

    Directory of Open Access Journals (Sweden)

    Marko Peric

    2017-09-01

    Full Text Available Every firm employs a particular business model seeking competitive advantage. However, this pursuit is difficult, and sometimes unsuccessful. The reasons for failure should be sought in the managers’ lack of understanding of their organisations’ business models, their unique building blocks, and the potential that they have. To help managers better understand business models, this paper reviews the extant literature and identifies the elements of business models cited therein. Further, considering the new needs on the changing markets and the prevailing search for sustainability beyond profit, this paper portrays essential business model elements in an integrated framework. An updated generic business model framework consists of four primary categories, namely, value proposition, value capture, value creation, and value network, and could be useful for a variety of organisations, profit and non-profit, with various mission and vision orientations and interaction with the environment.

  17. MDM: A Mode Diagram Modeling Framework

    DEFF Research Database (Denmark)

    Wang, Zheng; Pu, Geguang; Li, Jianwen

    2012-01-01

    systems are widely used in the above-mentioned safety-critical embedded domains, there is a lack of domain-specific formal modelling languages for such systems in the relevant industry. To address this problem, we propose a formal visual modeling framework called mode diagram as a concise and precise way to specify and analyze such systems. To capture the temporal properties of periodic control systems, we provide, along with mode diagram, a property specification language based on interval logic for the description of concrete temporal requirements the engineers are concerned with. The statistical model checking technique can then be used to verify the mode diagram models against desired properties. To demonstrate the viability of our approach, we have applied our modelling framework to some real life case studies from industry and helped detect two design defects for some spacecraft control systems.

  18. A Model-Driven Framework to Develop Personalized Health Monitoring

    Directory of Open Access Journals (Sweden)

    Algimantas Venčkauskas

    2016-07-01

    Full Text Available Both distributed healthcare systems and the Internet of Things (IoT) are currently hot topics. The latter is a new computing paradigm to enable advanced capabilities in engineering various applications, including those for healthcare. For such systems, the core social requirement is the privacy/security of the patient information along with the technical requirements (e.g., energy consumption and capabilities for adaptability and personalization). Typically, the functionality of the systems is predefined by the patient’s data collected using sensor networks along with medical instrumentation; then, the data is transferred through the Internet for treatment and decision-making. Therefore, systems creation is indeed challenging. In this paper, we propose a model-driven framework to develop the IoT-based prototype and its reference architecture for personalized health monitoring (PHM) applications. The framework contains a multi-layered structure with feature-based modeling and feature model transformations at the top and the application software generation at the bottom. We have validated the framework using available tools and developed an experimental PHM to test some aspects of the functionality of the reference architecture in real time. The main contribution of the paper is the development of the model-driven computational framework with emphasis on the synergistic effect of security and energy issues.

  19. Submicrometric Magnetic Nanoporous Carbons Derived from Metal-Organic Frameworks Enabling Automated Electromagnet-Assisted Online Solid-Phase Extraction.

    Science.gov (United States)

    Frizzarin, Rejane M; Palomino Cabello, Carlos; Bauzà, Maria Del Mar; Portugal, Lindomar A; Maya, Fernando; Cerdà, Víctor; Estela, José M; Turnes Palomino, Gemma

    2016-07-19

    We present the first application of submicrometric magnetic nanoporous carbons (μMNPCs) as sorbents for automated solid-phase extraction (SPE). Small zeolitic imidazolate framework-67 crystals are obtained at room temperature and directly carbonized under an inert atmosphere to obtain submicrometric nanoporous carbons containing magnetic cobalt nanoparticles. The μMNPCs have a high contact area, high stability, and their preparation is simple and cost-effective. The prepared μMNPCs are exploited as sorbents in a microcolumn format in a sequential injection analysis (SIA) system with online spectrophotometric detection, which includes a specially designed three-dimensional (3D)-printed holder containing an automatically actuated electromagnet. The combined action of permanent magnets and an automatically actuated electromagnet enabled the movement of the solid bed of particles inside the microcolumn, preventing their aggregation, increasing the versatility of the system, and increasing the preconcentration efficiency. The method was optimized using a full factorial design and Doehlert Matrix. The developed system was applied to the determination of anionic surfactants, exploiting the retention of the ion-pairs formed with Methylene Blue on the μMNPC. Using sodium dodecyl sulfate as a model analyte, quantification was linear from 50 to 1000 μg L⁻¹, and the detection limit was equal to 17.5 μg L⁻¹, the coefficient of variation (n = 8; 100 μg L⁻¹) was 2.7%, and the analysis throughput was 13 h⁻¹. The developed approach was applied to the determination of anionic surfactants in water samples (natural water, groundwater, and wastewater), yielding recoveries of 93% to 110% (95% confidence level).

  20. An IoT-enabled supply chain integration framework : empirical case studies

    OpenAIRE

    Wakenshaw, Susan Y. L.; Maple, Carsten; Chen, Daqiang; Micillo, Rosario

    2017-01-01

    Supply chain integration is crucial for supply chain performance, particularly in industry 4.0. With the proliferation of Internet of Things (IoT) and the use of cyber-physical systems, supply chain integration needs to be greatly enhanced. In this paper, we explore supply integration (process and application) in the supply chain network enabled by IoT. Using the case study method, we investigate technical and business applications of IoT in supply chains and how it can interface with the pro...

  1. Titan: An Enabling Framework for Activity-Aware "Pervasive Apps" in Opportunistic Personal Area Networks

    Directory of Open Access Journals (Sweden)

    Roggen Daniel

    2011-01-01

    Full Text Available Upcoming ambient intelligence environments will boast ever larger number of sensor nodes readily available on body, in objects, and in the user's surroundings. We envision "Pervasive Apps", user-centric activity-aware pervasive computing applications. They use available sensors for activity recognition. They are downloadable from application repositories, much like current Apps for mobile phones. A key challenge is to provide Pervasive Apps in open-ended environments where resource availability cannot be predicted. We therefore introduce Titan, a service-oriented framework supporting design, development, deployment, and execution of activity-aware Pervasive Apps. With Titan, mobile devices inquire surrounding nodes about available services. Internet-based application repositories compose applications based on available services as a service graph. The mobile device maps the service graph to Titan Nodes. The execution of the service graph is distributed and can be remapped at run time upon changing resource availability. The framework is geared to streaming data processing and machine learning, which is key for activity recognition. We demonstrate Titan in a pervasive gaming application involving smart dice and a sensorized wristband. We comparatively present the implementation cost and performance and discuss how novel machine learning methodologies may enhance the flexibility of the mapping of service graphs to opportunistically available nodes.
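
    A minimal sketch of the mapping step described above: a service graph (nodes are processing services, edges are data streams) is assigned to whichever surrounding devices currently advertise the required services, and can be remapped when availability changes. The data structures and the greedy assignment below are illustrative stand-ins, not Titan's implementation.

```python
# Illustrative mapping of a service graph onto opportunistically available nodes.
# Structures and the greedy strategy are hypothetical, not Titan's actual design.
service_graph = {  # service -> downstream services (a tiny recognition pipeline)
    "accelerometer": ["feature_extraction"],
    "feature_extraction": ["classifier"],
    "classifier": [],
}
available_nodes = {  # node -> services it advertises
    "wristband": {"accelerometer", "feature_extraction"},
    "phone": {"feature_extraction", "classifier"},
}

def map_services(graph, nodes):
    mapping = {}
    for service in graph:
        host = next((n for n, offered in nodes.items() if service in offered), None)
        if host is None:
            raise RuntimeError(f"no node currently offers {service}; remap later")
        mapping[service] = host
    return mapping

print(map_services(service_graph, available_nodes))
# If the wristband disappears, calling map_services again remaps the graph.
```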

  2. Highly ordered mesoporous few-layer graphene frameworks enabled by Fe3O4 nanocrystal superlattices.

    Science.gov (United States)

    Jiao, Yucong; Han, Dandan; Liu, Limin; Ji, Li; Guo, Guannan; Hu, Jianhua; Yang, Dong; Dong, Angang

    2015-05-04

    While great progress has been achieved in the synthesis of ordered mesoporous carbons in the past decade, it still remains a challenge to prepare highly graphitic frameworks with ordered mesoporosity and high surface area. Reported herein is a simple synthetic methodology, based on the conversion of self-assembled superlattices of Fe3O4 nanocrystals, to fabricate highly ordered mesoporous graphene frameworks (MGFs) with ultrathin pore walls consisting of three to six stacking graphene layers. The MGFs possess face-centered-cubic symmetry with interconnected mesoporosity, tunable pore width, and high surface area. Because of their unique architectures and superior structural durability, the MGFs exhibit excellent cycling stability and rate performance when used as anode materials for lithium-ion batteries, thus retaining a specific capacity of 520 mAh g⁻¹ at a current density of 300 mA g⁻¹ after 400 cycles. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Spatial Modeling for Resources Framework (SMRF)

    Science.gov (United States)

    Spatial Modeling for Resources Framework (SMRF) was developed by Dr. Scott Havens at the USDA Agricultural Research Service (ARS) in Boise, ID. SMRF was designed to increase the flexibility of taking measured weather data and distributing the point measurements across a watershed. SMRF was developed...
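
    One common way to distribute point weather measurements across a watershed grid, as described above, is inverse-distance weighting; the short sketch below shows the idea. It is illustrative only, and SMRF's actual distribution methods may differ (for example, by including elevation adjustments).

```python
# Inverse-distance weighting (IDW) of station measurements to a target location.
# Illustrative only; not SMRF's implementation.
def idw(stations, target, power=2.0):
    """stations: list of ((x, y), value); target: (x, y). Returns interpolated value."""
    num = den = 0.0
    for (x, y), value in stations:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            return value                      # exactly at a station
        w = 1.0 / d2 ** (power / 2.0)         # weight = 1 / distance**power
        num += w * value
        den += w
    return num / den

stations = [((0.0, 0.0), 2.1), ((5.0, 1.0), 3.4), ((2.0, 6.0), 1.8)]  # e.g., air temp (C)
print(round(idw(stations, (2.5, 2.5)), 2))
```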

  4. Cytoview: Development of a cell modelling framework

    Indian Academy of Sciences (India)

    PRAKASH KUMAR

    after image processing we used virtual reality modelling language (VRML). Rendering and interactive visualization provided by VRML are compatible with CellML. VRML has been used not only to enable 3D visualization of cells, but also to represent the information with a minimum amount of data still representing it to the ...

  5. Modeling Plume-Triggered, Melt-Enabled Lithospheric Delamination

    Science.gov (United States)

    Perry-Houts, J.; Humphreys, G.

    2015-12-01

    It has been suggested that arrival of the Yellowstone plume below North America triggered a lithospheric foundering event which aided the eruption of the Columbia River flood basalts. This hypothesis potentially accounts for some of the biggest mysteries related to the CRBs, including their location as "off-track" plume volcanism and the anomalous chemical signatures of the most voluminous units. The foundered lithosphere appears to be a remnant chunk of Farallon slab, which had been stranded beneath the Blue Mountains terrane since the accretion of Siletzia. If this is the case, then the mechanisms by which this slab stayed metastable between Siletzia accretion and CRB time, and then so suddenly broke loose, are unclear. The addition of heat and mantle buoyancy supplied by the Yellowstone plume provides a clue, but the geodynamic process by which the slab was able to detach remains unclear. Efforts to model numerically the underlying processes behind delamination events have been gaining popularity. Typically, such models have relied on drastically weakened regions within the crust, or highly non-linear rheologies, to enable initiation and propagation of lithosphere removal. Rather than impose such a weak region a priori, we investigated the role of mantle and crustal melt, generated by the addition of plume heat, as the source of such a rheologic boundary. We track melt generation and migration through geodynamic models using the Eulerian finite element code, ASPECT. Melt moves relative to the permeable, compacting, and viscously-deforming mantle using the approach of (Keller, et al. 2013) with the notable exception that ASPECT currently cannot model elasticity. Dike and sill emplacement is therefore still a work in progress. This work is still in the preliminary stages and results are as yet inconclusive.

  6. An Extensible Model and Analysis Framework

    Science.gov (United States)

    2010-11-01

    of a pre-existing, open-source modeling and analysis framework known as Ptolemy II (http://ptolemy.org). The University of California, Berkeley...worked with the Air Force Research Laboratory, Rome Research Site on adapting Ptolemy II for modeling and simulation of large scale dynamics of Political...capabilities were prototyped in Ptolemy II and delivered via version control and software releases. Each of these capabilities specifically supports one or

  7. Enabling Proactivity in Context-aware Middleware Systems by means of a Planning Framework based on HTN Planning

    Directory of Open Access Journals (Sweden)

    Preeti Bhargava

    2015-08-01

    Full Text Available Today's context-aware systems tend to be reactive or 'pull' based: the user requests or queries for some information and the system responds with the requested information. However, none of these systems anticipate the user's intent and behavior, or take into account his current events and activities, to proactively 'push' relevant information to the user. Proactive context-aware systems, on the other hand, can predict and anticipate user intent and behavior, and act proactively on the users' behalf without explicit requests from them. Two fundamental capabilities of such systems are prediction and autonomy. In this paper, we address the second capability required by a context-aware system to act proactively, i.e., acting autonomously without an explicit user request. To address it, we present a new paradigm for enabling proactivity in context-aware middleware systems by means of a Planning Framework based on HTN planning. We present the design of a Planning Framework within the infrastructure of our intelligent context-aware middleware called Rover II, implement this framework, and evaluate its utility with several use cases. We also highlight the benefits of using such a framework in dynamic ubiquitous systems.
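
    The abstract does not expose the Rover II planner internals, but the core idea of HTN planning (decomposing abstract tasks into primitive actions via applicable methods) can be illustrated with a minimal sketch. The domain, task names and state fields below are hypothetical and are not taken from the paper.

```python
# Minimal HTN-style decomposition sketch (hypothetical domain, not the Rover II planner).
# Compound tasks are expanded by methods; primitive tasks map directly to operators.

operators = {  # primitive task -> action (here we just record what would be executed)
    "fetch_weather": lambda state: state["plan"].append("fetch weather for " + state["city"]),
    "notify_user":   lambda state: state["plan"].append("push notification"),
}

methods = {  # compound task -> (applicability test, ordered subtasks)
    "proactive_briefing": [
        (lambda s: s["user_is_commuting"], ["fetch_weather", "notify_user"]),
        (lambda s: True, []),  # fallback: do nothing
    ],
}

def plan(task, state):
    """Recursively decompose `task` until only primitive operators remain."""
    if task in operators:
        operators[task](state)
        return True
    for applicable, subtasks in methods.get(task, []):
        if applicable(state):
            return all(plan(sub, state) for sub in subtasks)
    return False

state = {"city": "College Park", "user_is_commuting": True, "plan": []}
plan("proactive_briefing", state)
print(state["plan"])  # ['fetch weather for College Park', 'push notification']
```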

  8. A framework for API solubility modelling

    DEFF Research Database (Denmark)

    Conte, Elisa; Gani, Rafiqul; Crafts, Peter

    The solubility of solid organic compounds in water and organic solvents is a fundamental thermodynamic property for many purposes, such as product-process design and optimization, in the chemical and pharmaceutical industries. Experimental literature solubility data are usually scarce, and temperature-dependent measurements are expensive in terms of time and resources. The few available data are badly organized and difficult to use for fast solubility calculations and solvent screening. Available models often require time-consuming and complex implementation together with good user expertise for their efficient use. In addition, most of the models are not predictive and require experimental data for the calculation of the needed parameters. This work aims at developing an efficient framework for the solubility modelling of Active Pharmaceutical Ingredients (API) in water and organic solvents. With this framework...
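
    The abstract does not specify which constitutive models the framework uses; a common starting point for solid solubility calculations is the ideal-solubility (van't Hoff) expression, which needs only the melting temperature and enthalpy of fusion of the compound. The sketch below shows that expression on an illustrative, hypothetical API; a real calculation would add an activity-coefficient model.

```python
# Ideal solubility of a solid solute from the van't Hoff expression:
#   ln x_ideal = (dH_fus / R) * (1/T_m - 1/T)
# Real API solubility would additionally require an activity-coefficient model,
# which is outside this sketch.
import math

R = 8.314  # J mol-1 K-1

def ideal_solubility(dh_fus_j_per_mol, t_melt_k, t_k):
    """Ideal mole-fraction solubility of a solid at temperature t_k."""
    return math.exp((dh_fus_j_per_mol / R) * (1.0 / t_melt_k - 1.0 / t_k))

# Illustrative (hypothetical) API: dH_fus = 25 kJ/mol, T_m = 420 K
for t in (288.15, 298.15, 310.15):
    print(f"T = {t:.2f} K  x_ideal = {ideal_solubility(25e3, 420.0, t):.4f}")
```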

  9. An optoelectronic framework enabled by low-dimensional phase-change films

    Science.gov (United States)

    Hosseini, Peiman; Wright, C. David; Bhaskaran, Harish

    2014-07-01

    The development of materials whose refractive index can be optically transformed as desired, such as chalcogenide-based phase-change materials, has revolutionized the media and data storage industries by providing inexpensive, high-speed, portable and reliable platforms able to store vast quantities of data. Phase-change materials switch between two solid states--amorphous and crystalline--in response to a stimulus, such as heat, with an associated change in the physical properties of the material, including optical absorption, electrical conductance and Young's modulus. The initial applications of these materials (particularly the germanium antimony tellurium alloy Ge2Sb2Te5) exploited the reversible change in their optical properties in rewritable optical data storage technologies. More recently, the change in their electrical conductivity has also been extensively studied in the development of non-volatile phase-change memories. Here we show that by combining the optical and electronic property modulation of such materials, display and data visualization applications that go beyond data storage can be created. Using extremely thin phase-change materials and transparent conductors, we demonstrate electrically induced stable colour changes in both reflective and semi-transparent modes. Further, we show how a pixelated approach can be used in displays on both rigid and flexible films. This optoelectronic framework using low-dimensional phase-change materials has many likely applications, such as ultrafast, entirely solid-state displays with nanometre-scale pixels, semi-transparent `smart' glasses, `smart' contact lenses and artificial retina devices.

  10. An optoelectronic framework enabled by low-dimensional phase-change films.

    Science.gov (United States)

    Hosseini, Peiman; Wright, C David; Bhaskaran, Harish

    2014-07-10

    The development of materials whose refractive index can be optically transformed as desired, such as chalcogenide-based phase-change materials, has revolutionized the media and data storage industries by providing inexpensive, high-speed, portable and reliable platforms able to store vast quantities of data. Phase-change materials switch between two solid states--amorphous and crystalline--in response to a stimulus, such as heat, with an associated change in the physical properties of the material, including optical absorption, electrical conductance and Young's modulus. The initial applications of these materials (particularly the germanium antimony tellurium alloy Ge2Sb2Te5) exploited the reversible change in their optical properties in rewritable optical data storage technologies. More recently, the change in their electrical conductivity has also been extensively studied in the development of non-volatile phase-change memories. Here we show that by combining the optical and electronic property modulation of such materials, display and data visualization applications that go beyond data storage can be created. Using extremely thin phase-change materials and transparent conductors, we demonstrate electrically induced stable colour changes in both reflective and semi-transparent modes. Further, we show how a pixelated approach can be used in displays on both rigid and flexible films. This optoelectronic framework using low-dimensional phase-change materials has many likely applications, such as ultrafast, entirely solid-state displays with nanometre-scale pixels, semi-transparent 'smart' glasses, 'smart' contact lenses and artificial retina devices.

  11. Integration of Utilities Infrastructures in a Future Internet Enabled Smart City Framework

    Directory of Open Access Journals (Sweden)

    Luis Sánchez

    2013-10-01

    Full Text Available Improving efficiency of city services and facilitating a more sustainable development of cities are the main drivers of the smart city concept. Information and Communication Technologies (ICT) play a crucial role in making cities smarter, more accessible and more open. In this paper we present a novel architecture exploiting major concepts from the Future Internet (FI) paradigm addressing the challenges that need to be overcome when creating smarter cities. This architecture takes advantage of both the critical communications infrastructures already in place and owned by the utilities as well as of the infrastructure belonging to the city municipalities to accelerate efficient provision of existing and new city services. The paper highlights how FI technologies create the necessary glue and logic that allows the integration of current vertical and isolated city services into a holistic solution, which enables a huge forward leap for the efficiency and sustainability of our cities. Moreover, the paper describes a real-world prototype that instantiates the aforementioned architecture, deployed in one of the parks of the city of Santander and providing an autonomous public street lighting adaptation service. This prototype is a showcase of how added-value services can be seamlessly created on top of the proposed architecture.

  12. Integration of utilities infrastructures in a future internet enabled smart city framework.

    Science.gov (United States)

    Sánchez, Luis; Elicegui, Ignacio; Cuesta, Javier; Muñoz, Luis; Lanza, Jorge

    2013-10-25

    Improving efficiency of city services and facilitating a more sustainable development of cities are the main drivers of the smart city concept. Information and Communication Technologies (ICT) play a crucial role in making cities smarter, more accessible and more open. In this paper we present a novel architecture exploiting major concepts from the Future Internet (FI) paradigm addressing the challenges that need to be overcome when creating smarter cities. This architecture takes advantage of both the critical communications infrastructures already in place and owned by the utilities as well as of the infrastructure belonging to the city municipalities to accelerate efficient provision of existing and new city services. The paper highlights how FI technologies create the necessary glue and logic that allows the integration of current vertical and isolated city services into a holistic solution, which enables a huge forward leap for the efficiency and sustainability of our cities. Moreover, the paper describes a real-world prototype that instantiates the aforementioned architecture, deployed in one of the parks of the city of Santander and providing an autonomous public street lighting adaptation service. This prototype is a showcase of how added-value services can be seamlessly created on top of the proposed architecture.

  13. Cloud-enabled large-scale land surface model simulations with the NASA Land Information System

    Science.gov (United States)

    Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.

    2017-12-01

    Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA in LIS, meaningful simulations containing a large multi-model ensemble will be enabled and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists that are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple run-time environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that have taken weeks and months can now be performed in minutes. This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and

  14. LQCD workflow execution framework: Models, provenance and fault-tolerance

    International Nuclear Information System (INIS)

    Piccoli, Luciano; Simone, James N; Kowalkowlski, James B; Dubey, Abhishek

    2010-01-01

    Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, specifically when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as a participant. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities, called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first-order predicate logic, enabling a dynamic management design that reduces manual administrative workload and increases cluster productivity.
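
    The paper describes reflex engines as state machines that evaluate health rules for a participant and trigger mitigation. A stripped-down sketch of that pattern is shown below; the rule names, thresholds and mitigation action are invented for illustration and are not the LQCD framework's actual rules.

```python
# Sketch of a reflex-engine-style monitor: a small state machine that evaluates
# health rules for one workflow participant and fires a mitigation action when a
# rule is violated. Thresholds and actions are illustrative only.

class ReflexEngine:
    def __init__(self, name, rules, mitigate):
        self.name, self.rules, self.mitigate = name, rules, mitigate
        self.state = "OK"

    def observe(self, metrics):
        for rule_name, predicate in self.rules.items():
            if not predicate(metrics):
                self.state = "FAULT"
                self.mitigate(self.name, rule_name, metrics)
                return self.state
        self.state = "OK"
        return self.state

def restart_participant(name, rule, metrics):
    print(f"[{name}] rule '{rule}' violated ({metrics}); requesting restart")

engine = ReflexEngine(
    "participant-17",
    rules={
        "disk_free": lambda m: m["disk_free_gb"] > 5,
        "heartbeat": lambda m: m["seconds_since_heartbeat"] < 120,
    },
    mitigate=restart_participant,
)

print(engine.observe({"disk_free_gb": 40, "seconds_since_heartbeat": 10}))  # OK
print(engine.observe({"disk_free_gb": 2,  "seconds_since_heartbeat": 10}))  # FAULT
```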

  15. Environmental Modeling Framework using Stacked Gaussian Processes

    OpenAIRE

    Abdelfatah, Kareem; Bao, Junshu; Terejanu, Gabriel

    2016-01-01

    A network of independently trained Gaussian processes (StackedGP) is introduced to obtain predictions of quantities of interest with quantified uncertainties. The main applications of the StackedGP framework are to integrate different datasets through model composition, enhance predictions of quantities of interest through a cascade of intermediate predictions, and to propagate uncertainties through emulated dynamical systems driven by uncertain forcing variables. By using analytical first an...
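
    The full StackedGP formulation is not given in this record, but the cascade idea (feeding the prediction of one Gaussian process into the next and propagating its uncertainty) can be sketched with scikit-learn regressors. The kernels, synthetic data and the crude sampling-based uncertainty propagation below are placeholders, not the authors' configuration.

```python
# Two independently trained GPs in a cascade: GP1 maps x -> z, GP2 maps z -> y.
# A crude way to propagate GP1's uncertainty is to sample z before querying GP2.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 60)[:, None]
z = np.sin(x).ravel() + 0.05 * rng.standard_normal(60)   # intermediate quantity
y = z**2 + 0.05 * rng.standard_normal(60)                # quantity of interest

gp1 = GaussianProcessRegressor(RBF() + WhiteKernel()).fit(x, z)
gp2 = GaussianProcessRegressor(RBF() + WhiteKernel()).fit(z[:, None], y)

x_new = np.array([[2.5]])
z_mean, z_std = gp1.predict(x_new, return_std=True)
z_samples = rng.normal(z_mean, z_std, size=200)[:, None]  # propagate uncertainty
y_samples = gp2.predict(z_samples)
print(f"y prediction: {y_samples.mean():.3f} +/- {y_samples.std():.3f}")
```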

  16. Nowcasting Ground Magnetic Perturbations with the Space Weather Modeling Framework

    Science.gov (United States)

    Welling, D. T.; Toth, G.; Singer, H. J.; Millward, G. H.; Gombosi, T. I.

    2015-12-01

    Predicting ground-based magnetic perturbations is a critical step towards specifying and predicting geomagnetically induced currents (GICs) in high-voltage transmission lines. Currently, the Space Weather Modeling Framework (SWMF), a flexible modeling framework for simulating the multi-scale space environment, is being transitioned from research to operational use (R2O) by NOAA's Space Weather Prediction Center. Upon completion of this transition, the SWMF will provide localized dB/dt predictions using real-time solar wind observations from L1 and the F10.7 proxy for EUV as model input. This presentation describes the operational SWMF setup and summarizes the changes made to the code to enable R2O progress. The framework's algorithm for calculating ground-based magnetometer observations will be reviewed. Metrics from data-model comparisons will be reviewed to illustrate predictive capabilities. Early data products, such as regional K-index and grids of virtual magnetometer stations, will be presented. Finally, early successes will be shared, including the code's ability to reproduce the recent March 2015 St. Patrick's Day Storm.
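
    The record mentions a regional K-index product derived from predicted ground perturbations. As a simplified illustration (not the SWMF algorithm), the sketch below maps the range of the horizontal perturbation in a 3-hour window onto a K value using the commonly quoted quasi-logarithmic bins for a station whose K=9 lower limit is 500 nT; the thresholds and the synthetic data are illustrative, and operational codes also remove the solar-quiet daily variation, which is skipped here.

```python
# Simplified K-index from a 3-hour window of horizontal magnetic perturbation.
import numpy as np

K_LOWER_LIMITS_NT = [0, 5, 10, 20, 40, 70, 120, 200, 330, 500]  # K9 = 500 nT station

def k_index(delta_b_h_nt):
    """Map the range of the horizontal perturbation (nT) in a 3-hour window to K."""
    amplitude = np.max(delta_b_h_nt) - np.min(delta_b_h_nt)
    k = 0
    for level, threshold in enumerate(K_LOWER_LIMITS_NT):
        if amplitude >= threshold:
            k = level
    return k

# Synthetic 3-hour window sampled once per minute (illustrative values).
rng = np.random.default_rng(1)
window = 60.0 * np.sin(np.linspace(0, np.pi, 180)) + 5.0 * rng.standard_normal(180)
print("K =", k_index(window))
```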

  17. Framework for the Parametric System Modeling of Space Exploration Architectures

    Science.gov (United States)

    Komar, David R.; Hoffman, Jim; Olds, Aaron D.; Seal, Mike D., II

    2008-01-01

    This paper presents a methodology for performing architecture definition and assessment prior to, or during, program formulation that utilizes a centralized, integrated architecture modeling framework operated by a small, core team of general space architects. This framework, known as the Exploration Architecture Model for IN-space and Earth-to-orbit (EXAMINE), enables: 1) a significantly larger fraction of an architecture trade space to be assessed in a given study timeframe; and 2) the complex element-to-element and element-to-system relationships to be quantitatively explored earlier in the design process. Discussion of the methodology's advantages and disadvantages with respect to the distributed study team approach typically used within NASA to perform architecture studies is presented, along with an overview of EXAMINE's functional components and tools. An example Mars transportation system architecture model is used to demonstrate EXAMINE's capabilities in this paper. However, the framework is generally applicable for exploration architecture modeling with destinations to any celestial body in the solar system.
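
    EXAMINE's internal models are not described in this record; parametric in-space transportation sizing of the kind mentioned typically starts from the rocket equation, as in the minimal sketch below. The stage parameters and delta-v values are hypothetical placeholders, not values from the Mars architecture model.

```python
# Parametric propellant sizing for a single in-space stage via the rocket equation:
#   dv = Isp * g0 * ln(m0 / mf)  =>  m_prop = mf * (exp(dv / (Isp * g0)) - 1)
# Payload, Isp and delta-v values are placeholders for illustration.
import math

G0 = 9.80665  # m/s^2

def propellant_mass(payload_plus_dry_kg, delta_v_ms, isp_s):
    return payload_plus_dry_kg * (math.exp(delta_v_ms / (isp_s * G0)) - 1.0)

cases = {
    "chemical (Isp 450 s), dv 3.6 km/s": (25_000.0, 3600.0, 450.0),
    "NTP (Isp 900 s), dv 3.6 km/s":      (25_000.0, 3600.0, 900.0),
}
for label, (m_final, dv, isp) in cases.items():
    print(f"{label}: propellant = {propellant_mass(m_final, dv, isp)/1000:.1f} t")
```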

  18. An IT-enabled supply chain model: a simulation study

    Science.gov (United States)

    Cannella, Salvatore; Framinan, Jose M.; Barbosa-Póvoa, Ana

    2014-11-01

    During the last decades, supply chain collaboration practices and the underlying enabling technologies have evolved from the classical electronic data interchange (EDI) approach to web-based and radio frequency identification (RFID)-enabled collaboration. In this field, most of the literature has focused on the study of optimal parameters for reducing the total cost of suppliers, by adopting operational research (OR) techniques. Herein we are interested in showing that the considered information technology (IT)-enabled structure is resilient, that is, it works well across a reasonably broad range of parameter settings. By adopting a methodological approach based on system dynamics, we study a multi-tier collaborative supply chain. Results show that the IT-enabled supply chain improves operational performance and customer service level. Nonetheless, the benefits for geographically dispersed networks are comparatively minor.
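
    The system-dynamics model itself is not reproduced in this record. The sketch below only illustrates the kind of multi-tier simulation involved: each tier follows an order-up-to policy and the order-variance ratio serves as a crude bullwhip indicator. The demand series, inventory levels and policy parameters are invented.

```python
# Three-tier supply chain (retailer -> wholesaler -> factory) with an order-up-to
# replenishment policy; the order-variance ratio is a crude bullwhip indicator.
import statistics

def simulate(demand, cover=2):
    tiers = [{"inv": 20, "orders": []} for _ in range(3)]  # retailer, wholesaler, factory
    incoming = list(demand)
    for tier in tiers:
        placed = []
        for d in incoming:
            tier["inv"] -= d                            # ship what the downstream tier ordered
            order = max(0, cover * d - tier["inv"])     # order-up-to policy
            tier["inv"] += order                        # assume immediate replenishment
            placed.append(order)
        tier["orders"] = placed
        incoming = placed                               # the upstream tier sees these orders
    return tiers

demand = [10] * 10 + [14] * 50                          # step change in customer demand
tiers = simulate(demand)
var_d = statistics.pvariance(demand)
for i, tier in enumerate(tiers, start=1):
    ratio = statistics.pvariance(tier["orders"]) / var_d
    print(f"tier {i}: order-variance / demand-variance = {ratio:.2f}")
```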

  19. Reconfigurable Model Execution in the OpenMDAO Framework

    Science.gov (United States)

    Hwang, John T.

    2017-01-01

    NASA's OpenMDAO framework facilitates constructing complex models and computing their derivatives for multidisciplinary design optimization. Decomposing a model into components that follow a prescribed interface enables OpenMDAO to assemble multidisciplinary derivatives from the component derivatives using what amounts to the adjoint method, direct method, chain rule, global sensitivity equations, or any combination thereof, using the MAUD architecture. OpenMDAO also handles the distribution of processors among the disciplines by hierarchically grouping the components, and it automates the data transfer between components that are on different processors. These features have made OpenMDAO useful for applications in aircraft design, satellite design, wind turbine design, and aircraft engine design, among others. This paper presents new algorithms for OpenMDAO that enable reconfigurable model execution. This concept refers to dynamically changing, during execution, one or more of the following: the variable sizes, the solution algorithm, the parallel load balancing, or the set of variables, i.e., adding and removing components, perhaps to switch to a higher-fidelity sub-model. Any component can reconfigure at any point, even when running in parallel with other components, and the reconfiguration algorithm presented here performs the synchronized updates to all other components that are affected. A reconfigurable software framework for multidisciplinary design optimization enables new adaptive solvers, adaptive parallelization, and new applications such as gradient-based optimization with overset flow solvers and adaptive mesh refinement. Benchmarking results demonstrate the time savings for reconfiguration compared to setting up the model again from scratch, which can be significant in large-scale problems. Additionally, the new reconfigurability feature is applied to a mission profile optimization problem for commercial aircraft where both the parametrization of the mission profile and the

  20. A Framework for Intelligent Voice-Enabled E-Education Systems

    OpenAIRE

    A., Azeta A.; K., Ayo C.; A., Ikhu-Omoregbe N.; A., Atayero A.

    2009-01-01

    Although the Internet has received significant attention in recent years, voice is still the most convenient and natural way of communicating, whether between humans or between a human and a computer. In voice applications, users may have different needs which will require the ability of the system to reason, make decisions, be flexible and adapt to requests during interaction. These needs have placed new requirements on voice application development, such as the use of advanced models, techniques and methodolo...

  1. Creating Data and Modeling Enabled Hydrology Instruction Using Collaborative Approach

    Science.gov (United States)

    Merwade, V.; Rajib, A.; Ruddell, B. L.; Fox, S.

    2017-12-01

    Hydrology instruction typically involves teaching of the hydrologic cycle and the processes associated with it, such as precipitation, evapotranspiration, infiltration, runoff generation and hydrograph analysis. With the availability of observed and remotely sensed data related to many hydrologic fluxes, there is an opportunity to use these data for place-based learning in hydrology classrooms. However, it is not always easy or possible for an instructor to complement an existing hydrology course with new material that requires both time and technical expertise, which the instructor may not have. The work presented here describes an effort where students create the data- and modeling-driven instruction material as part of their class assignment for a hydrology course at Purdue University. The data-driven hydrology education project within the Science Education Resources Center (SERC) is used as a platform to publish and share the instruction material so it can be used by future students in the same course or any other course anywhere in the world. Students in the class were divided into groups, and each group was assigned a topic such as precipitation, evapotranspiration, streamflow, flow duration curve and frequency analysis. Each student in the group was then asked to obtain data and do some analysis for an area with specific landuse characteristics such as urban, rural and agricultural. The student contributions were then organized into learning units such that someone can do a flow duration curve analysis or flood frequency analysis to see how it changes for a rural area versus an urban area. The hydrology education project within the SERC cyberinfrastructure enables any other instructor to adopt this material as is, or through modification, to suit his/her place-based instruction needs.
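
    One of the learning units mentioned is flow duration curve analysis; a minimal version of that computation is sketched below, with a synthetic daily streamflow series standing in for observed gauge data.

```python
# Flow duration curve: exceedance probability versus discharge, using the
# Weibull plotting position p = rank / (n + 1). The streamflow series is synthetic.
import numpy as np

rng = np.random.default_rng(42)
daily_flow_cms = rng.lognormal(mean=2.0, sigma=0.8, size=365)  # stand-in for gauge data

sorted_flow = np.sort(daily_flow_cms)[::-1]                    # descending
ranks = np.arange(1, sorted_flow.size + 1)
exceedance_pct = 100.0 * ranks / (sorted_flow.size + 1)

for p in (5, 50, 95):                                          # Q5, Q50, Q95
    q = np.interp(p, exceedance_pct, sorted_flow)
    print(f"Q{p:02d} = {q:.1f} m^3/s")
```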

  2. Modelling multimedia teleservices with OSI upper layers framework: Short paper

    Science.gov (United States)

    Widya, I.; Vanrijssen, E.; Michiels, E.

    The paper presents the use of the concepts and modelling principles of the Open Systems Interconnection (OSI) upper layers structure in the modelling of multimedia teleservices, with emphasis on the revised Application Layer Structure (OSI/ALS). OSI/ALS is an object-based reference model which intends to coordinate the development of application-oriented services and protocols in a consistent and modular way, enabling the rapid deployment and integrated use of these services. The paper further emphasizes the nesting structure defined in OSI/ALS, which allows the design of scalable and user-tailorable/controllable teleservices. OSI/ALS-consistent teleservices are moreover implementable on communication platforms of different capabilities. An analysis of distributed multimedia architectures found in the literature confirms the ability of the OSI/ALS framework to model the interworking functionalities of teleservices.

  3. `Dhara': An Open Framework for Critical Zone Modeling

    Science.gov (United States)

    Le, P. V.; Kumar, P.

    2016-12-01

    Processes in the Critical Zone, which sustain terrestrial life, are tightly coupled across hydrological, physical, biological, chemical, pedological, geomorphological and ecological domains over both short and long timescales. Observations and quantification of the Earth's surface across these domains using emerging high-resolution measurement technologies such as light detection and ranging (lidar) and hyperspectral remote sensing are enabling us to characterize fine-scale landscape attributes over large spatial areas. This presents a unique opportunity to develop novel approaches to model the Critical Zone that can capture fine-scale intricate dependencies across the different processes in 3D. The development of interdisciplinary tools that transcend individual disciplines and capture new levels of complexity and emergent properties is at the core of Critical Zone science. Here we introduce an open, high-performance computing framework (`Dhara') for modeling complex processes in the Critical Zone. The framework is designed to be modular in structure, with the aim of creating uniform and efficient tools to facilitate and leverage process modeling. It also provides flexibility to maintain, collaborate, and co-develop additional components with the scientific community. We show the essential framework that simulates ecohydrologic dynamics and surface-subsurface coupling in 3D using hybrid CPU-GPU parallelism. We demonstrate that the open framework in Dhara is feasible for detailed, multi-process, large-scale modeling of the Critical Zone, which opens up exciting possibilities. We will also present outcomes from a Modeling Summer Institute led by the Intensively Managed Landscapes Critical Zone Observatory (IMLCZO) with representation from several CZOs and international representatives.

  4. An automated framework for QSAR model building.

    Science.gov (United States)

    Kausar, Samina; Falcao, Andre O

    2018-01-16

    In silico quantitative structure-activity relationship (QSAR) model-based tools are widely used to screen huge databases of compounds in order to determine the biological properties of chemical molecules based on their chemical structure. With the passage of time, the exponentially growing amount of synthesized and known chemicals data demands computationally efficient automated QSAR modeling tools, available to researchers who may lack extensive knowledge of machine learning modeling. Thus, a fully automated and advanced modeling platform can be an important addition to the QSAR community. In the presented workflow the process from data preparation to model building and validation has been completely automated. The focus is on the most critical modeling tasks (data curation, data set characteristics evaluation, variable selection and validation) that largely influence the performance of QSAR models. The workflow also includes the ability to quickly evaluate the feasibility of a given data set to be modeled. The developed framework is tested on data sets of thirty different problems. The best-optimized feature selection methodology in the developed workflow is able to remove 62-99% of all redundant data. On average, about 19% of the prediction error was reduced by using feature selection, producing an increase of 49% in the percentage of variance explained (PVE) compared to models without feature selection. Selecting only the models with a modelability score above 0.6, average PVE scores were 0.71. A strong correlation was verified between the modelability scores and the PVE of the models produced with variable selection. We developed an extendable and highly customizable, fully automated QSAR modeling framework. This designed workflow does not require any advanced parameterization nor depend on users' decisions or expertise in machine learning/programming. With just a given target or problem, the workflow follows an unbiased standard protocol to develop reliable QSAR models
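
    The exact protocol cannot be reconstructed from this record; the sketch below only shows the general shape of such a workflow (removing redundant descriptors, selecting informative ones, and scoring a cross-validated model by the percentage of variance explained) on synthetic data with scikit-learn. The selector, estimator and data are placeholders, not the authors' pipeline.

```python
# Skeleton of an automated QSAR-style workflow: drop near-constant descriptors,
# select informative ones, and score a cross-validated model by the percentage
# of variance explained (PVE ~ R^2). Data are synthetic, not a real QSAR set.
from sklearn.datasets import make_regression
from sklearn.feature_selection import VarianceThreshold, SelectKBest, f_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

X, y = make_regression(n_samples=200, n_features=300, n_informative=15,
                       noise=10.0, random_state=0)

pipeline = make_pipeline(
    VarianceThreshold(threshold=1e-6),        # remove constant/redundant descriptors
    SelectKBest(f_regression, k=30),          # univariate feature selection
    RandomForestRegressor(n_estimators=200, random_state=0),
)

pve = cross_val_score(pipeline, X, y, cv=5, scoring="r2")
print(f"cross-validated PVE: {pve.mean():.2f} +/- {pve.std():.2f}")
```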

  5. Model-based version management system framework

    International Nuclear Information System (INIS)

    Mehmood, W.

    2016-01-01

    In this paper we present a model-based version management system. A version management system (VMS), a branch of software configuration management (SCM), aims to provide a controlling mechanism for the evolution of software artifacts created during the software development process. Controlling the evolution requires many activities, such as construction and creation of versions, identification of differences between versions, conflict detection and merging. Traditional VMSs are file-based and consider software systems as a set of text files. File-based VMSs are not adequate for performing software configuration management activities such as version control on software artifacts produced in earlier phases of the software life cycle. New challenges of model differencing, merging, and evolution control arise when using models as the central artifact. The goal of this work is to present a generic framework for a model-based VMS which can be used to overcome the problems of traditional file-based VMSs and provide model versioning services. (author)
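
    Model differencing is only named, not specified, in this record. A toy element-level diff between two versions of a model, the kind of operation a model-based VMS has to support before conflict detection and merging, might look like the sketch below; the element identifiers are invented.

```python
# Toy model diff: each model version is a mapping from element id to a dict of
# properties. The diff reports added, removed and changed elements, which is the
# minimal information a model-based VMS needs for conflict detection and merging.

def diff_models(base, revised):
    added   = {k: revised[k] for k in revised.keys() - base.keys()}
    removed = {k: base[k]    for k in base.keys() - revised.keys()}
    changed = {k: (base[k], revised[k])
               for k in base.keys() & revised.keys() if base[k] != revised[k]}
    return {"added": added, "removed": removed, "changed": changed}

v1 = {"Class:Pump":  {"attrs": ["flowRate"]},
      "Class:Valve": {"attrs": ["state"]}}
v2 = {"Class:Pump":  {"attrs": ["flowRate", "pressure"]},
      "Class:Tank":  {"attrs": ["level"]}}

print(diff_models(v1, v2))
```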

  6. A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice

    NARCIS (Netherlands)

    Flottorp, S.A.; Oxman, A.D.; Krause, J.; Musila, N.R.; Wensing, M.; Godycki-Cwirko, M.; Baker, R.; Eccles, M.P.

    2013-01-01

    BACKGROUND: Determinants of practice are factors that might prevent or enable improvements. Several checklists, frameworks, taxonomies, and classifications of determinants of healthcare professional practice have been published. In this paper, we describe the development of a comprehensive,

  7. Understanding enabling capacities for managing the 'wicked problem' of nonpoint source water pollution in catchments: a conceptual framework.

    Science.gov (United States)

    Patterson, James J; Smith, Carl; Bellamy, Jennifer

    2013-10-15

    Nonpoint source (NPS) water pollution in catchments is a 'wicked' problem that threatens water quality, water security, ecosystem health and biodiversity, and thus the provision of ecosystem services that support human livelihoods and wellbeing from local to global scales. However, it is a difficult problem to manage because water catchments are linked human and natural systems that are complex, dynamic, multi-actor, and multi-scalar in nature. This in turn raises questions about understanding and influencing change across multiple levels of planning, decision-making and action. A key challenge in practice is enabling implementation of local management action, which can be influenced by a range of factors across multiple levels. This paper reviews and synthesises important 'enabling' capacities that can influence implementation of local management action, and develops a conceptual framework for understanding and analysing these in practice. Important enabling capacities identified include: history and contingency; institutional arrangements; collaboration; engagement; vision and strategy; knowledge building and brokerage; resourcing; entrepreneurship and leadership; and reflection and adaptation. Furthermore, local action is embedded within multi-scalar contexts and therefore, is highly contextual. The findings highlight the need for: (1) a systemic and integrative perspective for understanding and influencing change for managing the wicked problem of NPS water pollution; and (2) 'enabling' social and institutional arenas that support emergent and adaptive management structures, processes and innovations for addressing NPS water pollution in practice. These findings also have wider relevance to other 'wicked' natural resource management issues facing similar implementation challenges. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. MDM: A Mode Diagram Modeling Framework

    Directory of Open Access Journals (Sweden)

    Zheng Wang

    2012-12-01

    Full Text Available Periodic control systems used in spacecraft and automotives are usually period-driven and can be decomposed into different modes, with each mode representing a system state observed from outside. Such systems may also involve intensive computing in their modes. Despite the fact that such control systems are widely used in the above-mentioned safety-critical embedded domains, there is a lack of domain-specific formal modelling languages for such systems in the relevant industry. To address this problem, we propose a formal visual modeling framework called mode diagram as a concise and precise way to specify and analyze such systems. To capture the temporal properties of periodic control systems, we provide, along with mode diagrams, a property specification language based on interval logic for the description of the concrete temporal requirements the engineers are concerned with. The statistical model checking technique can then be used to verify the mode diagram models against desired properties. To demonstrate the viability of our approach, we have applied our modelling framework to some real-life case studies from industry and helped detect two design defects in some spacecraft control systems.
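
    The mode diagram semantics cannot be reproduced from an abstract; the sketch below only illustrates the basic ingredient, a period-driven controller that switches between modes when guard conditions hold. The modes, guards and sensor trace are invented and do not come from the spacecraft case studies.

```python
# A period-driven mode machine: each tick evaluates guard conditions and either
# stays in the current mode or transitions. Modes and guards are illustrative only.

TRANSITIONS = {
    "standby":  [("detumble", lambda s: s["spin_rate"] > 1.0)],
    "detumble": [("standby",  lambda s: s["spin_rate"] <= 0.1)],
}

def run(initial_mode, sensor_trace):
    mode = initial_mode
    for tick, sensors in enumerate(sensor_trace):
        for target, guard in TRANSITIONS.get(mode, []):
            if guard(sensors):
                mode = target
                break
        print(f"tick {tick}: mode = {mode}, spin_rate = {sensors['spin_rate']}")

trace = [{"spin_rate": r} for r in (0.05, 2.0, 1.5, 0.4, 0.08, 0.05)]
run("standby", trace)
```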

  9. The SCEC Unified Community Velocity Model (UCVM) Software Framework for Distributing and Querying Seismic Velocity Models

    Science.gov (United States)

    Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.

    2017-12-01

    Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications
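
    UCVM's actual query interface and model registry are not reproduced here; the sketch below only illustrates the underlying idea of answering a single material-property query by combining a regional 3D model with a 1D background model. The region bounds and property values are invented and do not come from any SCEC model.

```python
# Conceptual sketch of a unified velocity-model query: return (Vp, Vs, density)
# from a regional 3D model when the point falls inside its coverage, otherwise
# fall back to a 1D depth-dependent background model. Values are illustrative.

REGIONAL_BOUNDS = {"lon": (-119.0, -116.0), "lat": (33.0, 35.0), "max_depth_m": 50_000}

def regional_model(lon, lat, depth_m):
    # Stand-in for an interpolated 3D model lookup.
    vs = 500.0 + 0.06 * depth_m
    return {"vp": 1.8 * vs, "vs": vs, "rho": 1700.0 + 0.01 * depth_m}

def background_1d(depth_m):
    # Simple layered 1D background.
    if depth_m < 5_000:
        return {"vp": 5000.0, "vs": 2890.0, "rho": 2600.0}
    return {"vp": 6500.0, "vs": 3750.0, "rho": 2900.0}

def query(lon, lat, depth_m):
    inside = (REGIONAL_BOUNDS["lon"][0] <= lon <= REGIONAL_BOUNDS["lon"][1]
              and REGIONAL_BOUNDS["lat"][0] <= lat <= REGIONAL_BOUNDS["lat"][1]
              and depth_m <= REGIONAL_BOUNDS["max_depth_m"])
    return regional_model(lon, lat, depth_m) if inside else background_1d(depth_m)

print(query(-118.2, 34.05, 1000.0))   # inside the regional model
print(query(-110.0, 40.00, 1000.0))   # falls back to the 1D background
```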

  10. Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model

    Science.gov (United States)

    Berman, Jeanette; Smyth, Robyn

    2015-01-01

    This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…

  11. FLUKA-LIVE-an embedded framework, for enabling a computer to execute FLUKA under the control of a Linux OS

    International Nuclear Information System (INIS)

    Cohen, A.; Battistoni, G.; Mark, S.

    2008-01-01

    This paper describes a Linux-based OS framework for integrating the FLUKA Monte Carlo software (currently distributed only for Linux) into a CD-ROM, resulting in a complete environment for a scientist to edit, link and run FLUKA routines without the need to install a UNIX/Linux operating system. The building process includes generating from scratch a complete operating system distribution which will, when operative, build all necessary components for successful operation of the FLUKA software and libraries. Various source packages, as well as the latest kernel sources, are freely available from the Internet. These sources are used to create a functioning Linux system that integrates several core utilities in line with the main idea: enabling FLUKA to act as if it were running under a popular Linux distribution or even a proprietary UNIX workstation. On boot-up a file system is created and the contents of the CD are uncompressed and completely loaded into RAM, after which the presence of the CD is no longer necessary and it can be removed for use on a second computer. The system can operate on any i386 PC as long as it can boot from a CD

  12. Versatile Surface Functionalization of Metal-Organic Frameworks through Direct Metal Coordination with a Phenolic Lipid Enables Diverse Applications

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Wei [Univ. of New Mexico, Albuquerque, NM (United States); Xiang, Guolei [Univ. of Cambridge (United Kingdom); Shang, Jin [Univ. of Hong Kong (China); Guo, Jimin [Univ. of New Mexico, Albuquerque, NM (United States); Motevalli, Benyamin [Monash Univ., Clayton, VIC (Australia); Durfee, Paul [Univ. of New Mexico, Albuquerque, NM (United States); Agola, Jacob Ongudi [Univ. of New Mexico, Albuquerque, NM (United States); Coker, Eric N. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brinker, C. Jeffrey [Univ. of New Mexico, Albuquerque, NM (United States); Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2018-02-22

    Here, a novel strategy for the versatile functionalization of the external surface of metal-organic frameworks (MOFs) has been developed based on the direct coordination of a phenolic-inspired lipid molecule, DPGG (1,2-dipalmitoyl-sn-glycero-3-galloyl), with metal nodes/sites surrounding the MOF surface. X-ray diffraction and argon sorption analysis prove that the modified MOF particles retain their structural integrity and porosity after surface modification. Density functional theory calculations reveal that strong chelation between the metal sites and the galloyl head group of DPGG is the basic prerequisite for successful coating. Due to the pH-responsive nature of metal-phenol complexation, the modification process is reversible by simple washing in weakly acidic water, showing an excellent regeneration ability for water-stable MOFs. Moreover, the colloidal stability of the modified MOFs in nonpolar solvents allows them to be further organized into two-dimensional MOF or MOF/polymer monolayers by evaporation-induced interfacial assembly conducted at an air/water interface. Lastly, the easy fusion of a second functional layer onto DPGG-modified MOF cores enabled a series of MOF-based functional nanoarchitectures, such as MOFs encapsulated within hybrid supported lipid bilayers (so-called protocells), polyhedral core-shell structures, hybrid lipid-modified plasmonic vesicles and multicomponent supraparticles with target functionalities, to be generated for a wide range of applications.

  13. Enabling Business Model Change: Evidence from High-Technology Firms

    Directory of Open Access Journals (Sweden)

    Christiana Müller

    2015-01-01

    Full Text Available Companies today face volatile environments, short product life cycles, and changing customer requirements, which is especially the case in high-technology fields. In such environments, concentrating only on technological and product innovations is not sufficient to gain competitive advantages. Instead, companies need innovative business models in order to stand out from their competitors. To successfully change business models, companies require appropriate competencies. Thus, the objective of this research is to identify how companies can prepare their business model(s) to counteract environmental changes flexibly. With the aid of the chosen exploratory, qualitative research design, we investigate companies operating in high-technology branches. In total, 20 companies participated in our study. The interviews were conducted with CEOs, vice-presidents, product managers or other managers responsible for business model developments. The research revealed that companies can prepare the business model and its elements ex ante through developing capabilities in order to raise the flexibility of the business model. These capabilities have to be developed with regard to several internal and external issues driving these changes.

  14. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general, and the CORAS application of MBRA in particular, have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures, and also in other application domains such as the nuclear field, can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  15. A Smallholder Socio-hydrological Modelling Framework

    Science.gov (United States)

    Pande, S.; Savenije, H.; Rathore, P.

    2014-12-01

    Smallholders are farmers who own less than 2 ha of farmland. They often have low productivity and thus remain at subsistence level. The fact that nearly 80% of Indian farmers are smallholders, who own merely a third of total farmland and belong to the poorest quartile, yet produce nearly 40% of the country's foodgrains, underlines the importance of understanding the socio-hydrology of a smallholder. We present a framework to understand the socio-hydrological system dynamics of a smallholder. It couples the dynamics of six main variables that are most relevant at the scale of a smallholder: local storage (soil moisture and other water storage), capital, knowledge, livestock production, soil fertility and grass biomass production. The model incorporates rule-based adaptation mechanisms (for example: adjusting expenditures on food and fertilizers, selling livestock, etc.) of smallholders when they face adverse socio-hydrological conditions, such as low annual rainfall, higher intra-annual variability in rainfall or variability in agricultural prices. It allows us to study the sustainability of smallholder farming systems under various settings. We apply the framework to understand the socio-hydrology of smallholders in Aurangabad, Maharashtra, India. This district has witnessed suicides of many sugarcane farmers who could not extricate themselves from the debt trap. These farmers lack irrigation and are susceptible to fluctuating sugar prices and intra-annual hydroclimatic variability. This presentation discusses two aspects in particular: whether government intervention to absolve the debt of farmers is enough, and what the value is of investing in local storage that can buffer intra-annual variability in rainfall and of strengthening safety nets, either by creating opportunities for alternative sources of income or by crop diversification.
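
    The governing equations of the six-variable model are not given in this record; the loop below is a deliberately simplified, discrete-time caricature of coupled storage-capital-livestock dynamics with one rule-based adaptation (selling livestock when capital runs low). All coefficients, prices and the rainfall series are invented, not the authors' parameterization.

```python
# Highly simplified annual update of a smallholder system: water storage drives
# crop income, capital pays for inputs and food, and livestock is sold as a
# buffer when capital falls below a threshold. Coefficients are illustrative only.

def simulate(rainfall_mm):
    state = {"storage": 100.0, "capital": 500.0, "livestock": 4}
    history = []
    for year, rain in enumerate(rainfall_mm):
        state["storage"] = 0.5 * state["storage"] + 0.4 * rain    # carry-over + recharge
        crop_income = 3.0 * min(state["storage"], 300.0)          # water-limited yield
        expenses = 600.0 + 50.0 * state["livestock"]              # food, inputs, fodder
        state["capital"] += crop_income - expenses
        if state["capital"] < 0 and state["livestock"] > 0:       # rule-based adaptation
            state["livestock"] -= 1
            state["capital"] += 300.0                             # sale price of one animal
        history.append(dict(state, year=year, rain=rain))
    return history

for record in simulate([600, 350, 200, 550, 700]):
    print(record)
```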

  16. Computer Modeling of Carbon Metabolism Enables Biofuel Engineering (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2011-09-01

    In an effort to reduce the cost of biofuels, the National Renewable Energy Laboratory (NREL) has merged biochemistry with modern computing and mathematics. The result is a model of carbon metabolism that will help researchers understand and engineer the process of photosynthesis for optimal biofuel production.

  17. Systematic identification of crystallization kinetics within a generic modelling framework

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Meisler, Kresten Troelstrup; Gernaey, Krist

    2012-01-01

    A systematic development of constitutive models within a generic modelling framework has been developed for use in design, analysis and simulation of crystallization operations. The framework contains a tool for model identification connected with a generic crystallizer modelling tool-box, a tool...

  18. SPSPR Model - Framework for ICT Services Management

    Directory of Open Access Journals (Sweden)

    Jiri Vorisek

    2011-04-01

    Full Text Available In this paper we discuss existing frameworks for the management of ICT services and their limitations in the context of emerging enterprise computing environment characterized by use of externally sourced services. We identify the requirements for a service management framework with particular focus on definition and categorization of ICT services that facilitates the development of a service catalogue. The main section of this paper describes our approach to ICT service management as embodied in the SPSPR framework.

  19. Domain-specific modeling enabling full code generation

    CERN Document Server

    Kelly, Steven

    2007-01-01

    Domain-Specific Modeling (DSM) is the latest approach to software development, promising to greatly increase the speed and ease of software creation. Early adopters of DSM have been enjoying productivity increases of 500–1000% in production for over a decade. This book introduces DSM and offers examples from various fields to illustrate to experienced developers how DSM can improve software development in their teams. Two authorities in the field explain what DSM is, why it works, and how to successfully create and use a DSM solution to improve productivity and quality. Divided into four parts, the book covers: background and motivation; fundamentals; in-depth examples; and creating DSM solutions. There is an emphasis throughout the book on practical guidelines for implementing DSM, including how to identify the necessary language constructs, how to generate full code from models, and how to provide tool support for a new DSM language. The example cases described in the book are available on the book's website, www.dsmbook....

  20. Exploring Business Models for NFC Enabled Mobile Payment Services

    OpenAIRE

    Chae, Sang-Un; Hedman, Jonas

    2013-01-01

    Over the past few years, mobile payments have been present like a storm on the horizon. They have generated a lot of attention, yet have not reached wide adoption. Issues such as the complexity of the mobile payment ecosystem and the lack of sustainable business models have been cited as reasons for the slow market penetration. With the rise of new technologies such as NFC, the mobile payment sphere is experiencing a new height of talk, which has materialized in a second wave of companies enteri...

  1. Web-Enabled Distributed Health-Care Framework for Automated Malaria Parasite Classification: an E-Health Approach.

    Science.gov (United States)

    Maity, Maitreya; Dhane, Dhiraj; Mungle, Tushar; Maiti, A K; Chakraborty, Chandan

    2017-10-26

    Web-enabled e-healthcare systems and computer-assisted disease diagnosis have the potential to improve the quality and service of conventional healthcare delivery. The article describes the design and development of a web-based distributed healthcare management system for medical information and the quantitative evaluation of microscopic images using a machine learning approach for malaria. In the proposed study, all the health-care centres are connected in a distributed computer network. Each peripheral centre manages its own health-care service independently and communicates with the central server for remote assistance. The proposed methodology for automated evaluation of parasites includes pre-processing of blood smear microscopic images followed by erythrocyte segmentation. To differentiate between different parasites, a total of 138 quantitative features characterising colour, morphology, and texture are extracted from segmented erythrocytes. An integrated pattern classification framework is designed where four feature selection methods, viz. Correlation-based Feature Selection (CFS), Chi-square, Information Gain, and RELIEF, are employed with three different classifiers, i.e. Naive Bayes, C4.5, and Instance-Based Learning (IB1), individually. The optimal feature subset with the best classifier is selected for achieving maximum diagnostic precision. The proposed method achieved 99.2% sensitivity and 99.6% specificity by combining CFS and C4.5, in comparison with other methods. Moreover, the web-based tool is entirely designed using open standards like Java for the web application, ImageJ for image processing, and WEKA for data mining, considering its feasibility in rural places with minimal health care facilities.
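
    The study pairs four feature selectors with three classifiers over 138 colour, morphology and texture features. The sketch below mimics one such pairing on synthetic data (information-gain-style selection followed by a decision tree as a stand-in for C4.5, since scikit-learn has no C4.5 implementation) and reports sensitivity and specificity from a held-out split; the features and all parameters are placeholders.

```python
# One selector/classifier pairing in the spirit of the study: mutual-information
# feature selection followed by a decision tree (a stand-in for C4.5), evaluated
# by sensitivity and specificity. The 138 synthetic features are placeholders
# for the real colour/morphology/texture descriptors.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

X, y = make_classification(n_samples=600, n_features=138, n_informative=20,
                           weights=[0.7, 0.3], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

model = make_pipeline(SelectKBest(mutual_info_classif, k=30),
                      DecisionTreeClassifier(random_state=0))
model.fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, model.predict(X_te)).ravel()
print(f"sensitivity = {tp / (tp + fn):.3f}, specificity = {tn / (tn + fp):.3f}")
```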

  2. Modeling interfacial dynamics using nonequilibrium thermodynamics frameworks

    NARCIS (Netherlands)

    Sagis, L.M.C.

    2013-01-01

    In recent years several nonequilibrium thermodynamic frameworks have been developed capable of describing the dynamics of multiphase systems with complex microstructured interfaces. In this paper we present an overview of these frameworks. We will discuss interfacial dynamics in the context of the

  3. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    Energy Technology Data Exchange (ETDEWEB)

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both the integration of multiple data sources and the development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and drew on data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied to disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD, in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome, as well as integrated security and role-based access control for communicating
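
    Neither the space-time cluster algorithm nor the GANN is specified in enough detail here to reproduce. A much simpler temporal aberration detector of the kind often used as a baseline in syndromic surveillance, flagging days whose counts exceed the trailing baseline mean plus three standard deviations, is sketched below on synthetic counts; the window length and threshold are illustrative choices, not the project's parameters.

```python
# Baseline temporal aberration detection on daily syndromic counts: flag a day
# when its count exceeds mean + 3*std of a trailing baseline window.
import numpy as np

rng = np.random.default_rng(7)
counts = rng.poisson(lam=20, size=60)
counts[45:50] += 25                      # injected outbreak signal

def flag_aberrations(counts, baseline_days=28, z=3.0):
    alerts = []
    for day in range(baseline_days, len(counts)):
        baseline = counts[day - baseline_days:day]
        threshold = baseline.mean() + z * baseline.std(ddof=1)
        if counts[day] > threshold:
            alerts.append((day, int(counts[day]), round(float(threshold), 1)))
    return alerts

for day, count, threshold in flag_aberrations(counts):
    print(f"day {day}: count {count} exceeds threshold {threshold}")
```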

  4. The framework for simulation of dynamics of mechanical aggregates

    OpenAIRE

    Ivankov, Petr R.; Ivankov, Nikolay P.

    2007-01-01

    A framework for the simulation of the dynamics of mechanical aggregates has been developed. This framework enables us to build a model of an aggregate from models of its parts. The framework is part of a universal framework for science and engineering.

  5. SPSPR Model - Framework for ICT Services Management

    OpenAIRE

    Jiri Vorisek; Jaroslav Jandos; Jiri Feuerlicht

    2011-01-01

    In this paper we discuss existing frameworks for the management of ICT services and their limitations in the context of emerging enterprise computing environment characterized by use of externally sourced services. We identify the requirements for a service management framework with particular focus on definition and categorization of ICT services that facilitates the development of a service catalogue. The main section of this paper describes our approach to ICT service management as embodie...

  6. Business model framework applications in health care: A systematic review.

    Science.gov (United States)

    Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl

    2017-11-01

    It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.

  7. A Framework for Formal Modeling and Analysis of Organizations

    NARCIS (Netherlands)

    Jonker, C.M.; Sharpanskykh, O.; Treur, J.; P., Yolum

    2007-01-01

    A new, formal, role-based, framework for modeling and analyzing both real world and artificial organizations is introduced. It exploits static and dynamic properties of the organizational model and includes the (frequently ignored) environment. The transition is described from a generic framework of

  8. TESTING BRAND VALUE MEASUREMENT METHODS IN A RANDOM COEFFICIENT MODELING FRAMEWORK

    OpenAIRE

    Szõcs Attila

    2014-01-01

    Our objective is to provide a framework for measuring brand equity, that is, the added value to the product endowed by the brand. Based on a demand and supply model, we propose a structural model that enables testing the structural effect of brand equity (demand side effect) on brand value (supply side effect), using Monte Carlo simulation. Our main research question is which of the three brand value measurement methods (price premium, revenue premium and profit premium) is more suitable from...

  9. A Framework of Multi Objectives Negotiation for Dynamic Supply Chain Model

    Science.gov (United States)

    Chai, Jia Yee; Sakaguchi, Tatsuhiko; Shirase, Keiichi

    Trends of globalization and advances in Information Technology (IT) have created opportunities for collaborative manufacturing across national borders. A dynamic supply chain utilizes these advances to enable more flexibility in business cooperation. This research proposes a concurrent decision-making framework for a three-echelon dynamic supply chain model. The dynamic supply chain is formed by autonomous negotiation among agents based on a multi-agent approach. Instead of generating negotiation aspects (such as amount, price, and due date) arbitrarily, this framework proposes to utilize the information available at the operational level of an organization in order to generate realistic negotiation aspects. The effectiveness of the proposed model is demonstrated by various case studies.

  10. Software Infrastructure to Enable Modeling & Simulation as a Service (M&SaaS), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR Phase 2 project will produce a software service infrastructure that enables most modeling and simulation (M&S) activities from code development and...

  11. IDEF method-based simulation model design and development framework

    Directory of Open Access Journals (Sweden)

    Ki-Young Jeong

    2009-09-01

    Full Text Available The purpose of this study is to provide an IDEF method-based integrated framework for a business process simulation model to reduce the model development time by increasing the communication and knowledge reusability during a simulation project. In this framework, simulation requirements are collected by a function modeling method (IDEF0 and a process modeling method (IDEF3. Based on these requirements, a common data model is constructed using the IDEF1X method. From this reusable data model, multiple simulation models are automatically generated using a database-driven simulation model development approach. The framework is claimed to help both requirement collection and experimentation phases during a simulation project by improving system knowledge, model reusability, and maintainability through the systematic use of three descriptive IDEF methods and the features of the relational database technologies. A complex semiconductor fabrication case study was used as a testbed to evaluate and illustrate the concepts and the framework. Two different simulation software products were used to develop and control the semiconductor model from the same knowledge base. The case study empirically showed that this framework could help improve the simulation project processes by using IDEF-based descriptive models and the relational database technology. Authors also concluded that this framework could be easily applied to other analytical model generation by separating the logic from the data.

  12. A Modeling Framework for Conventional and Heat Integrated Distillation Columns

    DEFF Research Database (Denmark)

    Bisgaard, Thomas; Huusom, Jakob Kjøbsted; Abildskov, Jens

    2013-01-01

    In this paper, a generic, modular model framework for describing fluid separation by distillation is presented. At present, the framework is able to describe a conventional distillation column and a heat-integrated distillation column, but due to a modular structure the database can be further...... extended by additional configurations. The framework provides the basis for fair comparison of both steady state and dynamic performance of the different column configurations for a given binary or multicomponent separation....

  13. Solving system integration and interoperability problems using a model reference systems engineering framework

    Science.gov (United States)

    Makhlouf, Mahmoud A.

    2001-09-01

    This paper presents a model-reference systems engineering framework, which is applied to a number of ESC projects. This framework provides an architecture-driven system engineering process supported by a tool kit. This kit is built incrementally using an integrated set of commercial and government-developed tools. These tools include project management, systems engineering, military worth-analysis and enterprise collaboration tools. Products developed using these tools enable the specification and visualization of an executable model of the integrated system architecture as it evolves from a low-fidelity concept into a high-fidelity system model. This enables end users of system products, system designers, and decision-makers to perform what-if analyses on system design alternatives before making costly final system acquisition decisions.

  14. An Ontology-Based Framework for Modeling User Behavior

    DEFF Research Database (Denmark)

    Razmerita, Liana

    2011-01-01

    This paper focuses on the role of user modeling and semantically enhanced representations for personalization. This paper presents a generic Ontology-based User Modeling framework (OntobUMf), its components, and its associated user modeling processes. This framework models the behavior of the users...... and classifies its users according to their behavior. The user ontology is the backbone of OntobUMf and has been designed according to the Information Management System Learning Information Package (IMS LIP). The user ontology includes a Behavior concept that extends IMS LIP specification and defines....... The results of this research may contribute to the development of other frameworks for modeling user behavior, other semantically enhanced user modeling frameworks, or other semantically enhanced information systems....

  15. Service business model framework and the service innovation scope

    OpenAIRE

    van der Aa, W.; van der Rhee, B.; Victorino, L.

    2011-01-01

    In this paper we present a framework for service business models. We build on three streams of research. The first stream is the service management and marketing literature that focuses on the specific challenges of managing a service business. The second stream consists of research on e-business models. The third and most recent stream of research includes frameworks and business models from strategic management and innovation management. The next step in our research is the development of a...

  16. Generic Model Predictive Control Framework for Advanced Driver Assistance Systems

    NARCIS (Netherlands)

    Wang, M.

    2014-01-01

    This thesis deals with a model predictive control framework for control design of Advanced Driver Assistance Systems, where car-following tasks are under control. The framework is applied to design several autonomous and cooperative controllers and to examine the controller properties at the

  17. The Guided System Development Framework: Modeling and Verifying Communication Systems

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.; Nielson, Flemming

    2014-01-01

    . The Guided System Development framework contributes to more secure communication systems by aiding the development of such systems. The framework features a simple modelling language, step-wise refinement from models to implementation, interfaces to security verification tools, and code generation from...... the verified specification. The refinement process carries thus security properties from the model to the implementation. Our approach also supports verification of systems previously developed and deployed. Internally, the reasoning in our framework is based on the Beliefs and Knowledge tool, a verification...

  18. Conceptualising Business Models: Definitions, Frameworks and Classifications

    Directory of Open Access Journals (Sweden)

    Erwin Fielt

    2013-12-01

    Full Text Available The business model concept is gaining traction in different disciplines but is still criticized for being fuzzy and vague and lacking consensus on its definition and compositional elements. In this paper we set out to advance our understanding of the business model concept by addressing three areas of foundational research: business model definitions, business model elements, and business model archetypes. We define a business model as a representation of the value logic of an organization in terms of how it creates and captures customer value. This abstract and generic definition is made more specific and operational by the compositional elements that need to address the customer, value proposition, organizational architecture (firm and network level) and economics dimensions. Business model archetypes complement the definition and elements by providing a more concrete and empirical understanding of the business model concept. The main contributions of this paper are (1) explicitly including the customer value concept in the business model definition and focussing on value creation, (2) presenting four core dimensions that business model elements need to cover, (3) arguing for flexibility by adapting and extending business model elements to cater for different purposes and contexts (e.g. technology, innovation, strategy), (4) stressing a more systematic approach to business model archetypes by using business model elements for their description, and (5) suggesting to use business model archetype research for the empirical exploration and testing of business model elements and their relationships.

  19. A Policy Framework for Joint Use: Enabling and Supporting Community Use of K-12 Public School Facilities

    Science.gov (United States)

    Filardo, Mary; Vincent, Jeffrey M.

    2014-01-01

    Joint use of public school facilities is a complex but manageable approach to efficiently enhancing the services and programs available to students and supporting the community use of public schools. Building upon on our 2010 paper titled "Joint Use of Public Schools: A Framework for a New Social Contract," this paper identifies the…

  20. POSITIVE LEADERSHIP MODELS: THEORETICAL FRAMEWORK AND RESEARCH

    Directory of Open Access Journals (Sweden)

    Javier Blanch, Francisco Gil

    2016-09-01

    Full Text Available The objective of this article is twofold: firstly, we establish the theoretical boundaries of positive leadership and the reasons for its emergence. It is related to the new paradigm of positive psychology that has recently been shaping the scope of organizational knowledge. This conceptual framework has triggered the development of the various forms of positive leadership (i.e. transformational, servant, spiritual, authentic, and positive). Although the construct does not seem univocally defined, these different types of leadership overlap and share a significant affinity. Secondly, we review the empirical evidence that shows the impact of positive leadership in organizations and we highlight the positive relationship between these forms of leadership and key positive organizational variables. Lastly, we analyse future research areas in order to further develop this concept.

  1. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring explicit knowledge of the generative model. We also introduce different notions of distance on the space of such model forms, which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification and for time series forecasting/anomaly detection in a power grid application.
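
    The linear model forms described in this abstract can, in the simplest setting, be identified from data with standard techniques for approximating Koopman spectral properties, such as dynamic mode decomposition (DMD). The sketch below illustrates that generic step on a delay-embedded toy signal; it is not the paper's specific algorithm, and the function name, rank, and embedding parameters are illustrative assumptions.

```python
# Sketch of estimating a linear (Koopman-style) model form directly from a
# time series via exact dynamic mode decomposition (DMD).  Illustrates the
# general idea only, not the paper's specific method.
import numpy as np

def dmd(snapshots, rank=None):
    """snapshots: (state_dim, n_samples) matrix of sequential observations."""
    X, Y = snapshots[:, :-1], snapshots[:, 1:]
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    if rank is not None:
        U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    # Reduced operator A_tilde approximates the Koopman operator restricted
    # to the observed subspace: Y ~ A X.
    A_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, eigvecs = np.linalg.eig(A_tilde)
    modes = Y @ Vh.conj().T @ np.diag(1.0 / s) @ eigvecs
    return A_tilde, eigvals, modes

if __name__ == "__main__":
    # Delay-embedded scalar series -> snapshot (Hankel) matrix.
    t = np.linspace(0, 20 * np.pi, 4000)
    x = np.sin(t) + 0.5 * np.sin(0.3 * t)
    delays = 10
    H = np.stack([x[i:len(x) - delays + i] for i in range(delays)])
    _, eigvals, _ = dmd(H, rank=4)
    print("Koopman eigenvalue magnitudes:", np.round(np.abs(eigvals), 3))
```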

  2. Futures Business Models for an IoT Enabled Healthcare Sector: A Causal Layered Analysis Perspective

    OpenAIRE

    Julius Francis Gomes; Sara Moqaddemerad

    2016-01-01

    Purpose: To facilitate futures business research by proposing a novel way to combine business models as a conceptual tool with futures research techniques. Design: A futures perspective is adopted to foresight business models of the Internet of Things (IoT)-enabled healthcare sector by using business models as a futures business research tool. In doing so, business models are coupled with one of the most prominent foresight methodologies, Causal Layered Analysis (CLA). Qualitative analysis...

  3. Organizational Models for Non-Core Processes Management: A Classification Framework

    Directory of Open Access Journals (Sweden)

    Alberto F. De Toni

    2012-12-01

    The framework enables the identification and explanation of the main advantages and disadvantages of each strategy and highlights how a company should coherently choose an organizational model on the basis of: (a) the specialization/complexity of the non-core processes, (b) the focus on core processes, (c) its inclination towards know-how outsourcing, and (d) the desired level of autonomy in the management of non-core processes.

  4. Modelling framework for groundwater flow at Sellafield

    International Nuclear Information System (INIS)

    Hooper, A.J.; Billington, D.E.; Herbert, A.W.

    1995-01-01

    The principal objective of Nirex is to develop a single deep geological repository for the safe disposal of low- and intermediate-level radioactive waste. In safety assessment, use is made of a variety of conceptual models that form the basis for modelling of the pathways by which radionuclides might return to the environment. In this paper, the development of a conceptual model for groundwater flow and transport through fractured rock on the various scales of interest is discussed. The approach is illustrated by considering how some aspects of the conceptual model are developed in particular numerical models. These representations of the conceptual model use fracture network geometries based on realistic rock properties. (author). refs., figs., tabs

  5. Evaluating alternate discrete outcome frameworks for modeling crash injury severity.

    Science.gov (United States)

    Yasmin, Shamsunnahar; Eluru, Naveen

    2013-10-01

    This paper focuses on the relevance of alternate discrete outcome frameworks for modeling driver injury severity. The study empirically compares the ordered response and unordered response models in the context of driver injury severity in traffic crashes. The alternative modeling approaches considered for the comparison exercise include, for the ordered response framework: ordered logit (OL), generalized ordered logit (GOL), and mixed generalized ordered logit (MGOL); and for the unordered response framework: multinomial logit (MNL), nested logit (NL), ordered generalized extreme value logit (OGEV), and mixed multinomial logit (MMNL) models. A host of comparison metrics are computed to evaluate the performance of these alternative models. The study provides a comprehensive comparison exercise of the performance of ordered and unordered response models for examining the impact of exogenous factors on driver injury severity. The research also explores the effect of potential underreporting on alternative frameworks by artificially creating an underreported data sample from the driver injury severity sample. The empirical analysis is based on the 2010 General Estimates System (GES) database, a nationally representative sample of road crashes collected and compiled from about 60 jurisdictions across the United States. The performance of the alternative frameworks is examined in the context of model estimation and validation (at the aggregate and disaggregate level). Further, the performance of the model frameworks in the presence of underreporting is explored, with and without corrections to the estimates. The results from these extensive analyses point toward the emergence of the GOL framework (MGOL) as a strong competitor to the MMNL model in modeling driver injury severity. Copyright © 2013 Elsevier Ltd. All rights reserved.
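
    As a hedged illustration of this kind of ordered-versus-unordered comparison (on simulated data, not the GES sample), the sketch below fits an ordered logit and a multinomial logit to the same synthetic three-level severity outcome and compares simple fit statistics. It assumes statsmodels version 0.12 or later (which provides OrderedModel); the covariates and severity coding are hypothetical.

```python
# Hedged sketch of an ordered-vs-unordered model comparison on simulated
# crash-severity-like data.  Not the paper's estimation or validation setup.
import numpy as np
import statsmodels.api as sm
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 2))                      # e.g., speed, restraint use (hypothetical)
latent = 0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.logistic(size=n)
y = np.digitize(latent, bins=[-1.0, 1.0])        # 3 ordered severity levels: 0, 1, 2

# Ordered response: thresholds absorb the intercept, so no constant in X.
ordered = OrderedModel(y, X, distr="logit").fit(method="bfgs", disp=False)
# Unordered response: multinomial logit with a constant.
multinomial = sm.MNLogit(y, sm.add_constant(X)).fit(disp=False)

# Compare fit with simple information criteria (the paper uses a much
# richer set of estimation and validation metrics).
print("ordered logit : llf=%.1f  AIC=%.1f" % (ordered.llf, ordered.aic))
print("multinomial   : llf=%.1f  AIC=%.1f" % (multinomial.llf, multinomial.aic))
```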

  6. TESTING BRAND VALUE MEASUREMENT METHODS IN A RANDOM COEFFICIENT MODELING FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Szõcs Attila

    2014-07-01

    Full Text Available Our objective is to provide a framework for measuring brand equity, that is, the added value to the product endowed by the brand. Based on a demand and supply model, we propose a structural model that enables testing the structural effect of brand equity (demand side effect) on brand value (supply side effect), using Monte Carlo simulation. Our main research question is which of the three brand value measurement methods (price premium, revenue premium and profit premium) is more suitable from the perspective of the structural link between brand equity and brand value. Our model is based on recent developments in random coefficients model applications.

  7. A system-level multiprocessor system-on-chip modeling framework

    DEFF Research Database (Denmark)

    Virk, Kashif Munir; Madsen, Jan

    2004-01-01

    We present a system-level modeling framework to model system-on-chips (SoC) consisting of heterogeneous multiprocessors and network-on-chip communication structures in order to enable the developers of today's SoC designs to take advantage of the flexibility and scalability of network-on-chip...... and rapidly explore high-level design alternatives to meet their system requirements. We present a modeling approach for developing high-level performance models for these SoC designs and outline how this system-level performance analysis capability can be integrated into an overall environment for efficient...

  8. A Computational Framework for Realistic Retina Modeling.

    Science.gov (United States)

    Martínez-Cañada, Pablo; Morillas, Christian; Pino, Begoña; Ros, Eduardo; Pelayo, Francisco

    2016-11-01

    Computational simulations of the retina have led to valuable insights about the biophysics of its neuronal activity and processing principles. A great number of retina models have been proposed to reproduce the behavioral diversity of the different visual processing pathways. While many of these models share common computational stages, previous efforts have been more focused on fitting specific retina functions rather than generalizing them beyond a particular model. Here, we define a set of computational retinal microcircuits that can be used as basic building blocks for the modeling of different retina mechanisms. To validate the hypothesis that similar processing structures may be repeatedly found in different retina functions, we implemented a series of retina models simply by combining these computational retinal microcircuits. Accuracy of the retina models for capturing neural behavior was assessed by fitting published electrophysiological recordings that characterize some of the best-known phenomena observed in the retina: adaptation to the mean light intensity and temporal contrast, and differential motion sensitivity. The retinal microcircuits are part of a new software platform for efficient computational retina modeling from single-cell to large-scale levels. It includes an interface with spiking neural networks that allows simulation of the spiking response of ganglion cells and integration with models of higher visual areas.

  9. A Framework for Cloudy Model Optimization and Database Storage

    Science.gov (United States)

    Calvén, Emilia; Helton, Andrew; Sankrit, Ravi

    2018-01-01

    We present a framework for producing Cloudy photoionization models of the nebular emission from novae ejecta and storing a subset of the results in SQL database format for later usage. The database can be searched for models best fitting observed spectral line ratios. Additionally, the framework includes an optimization feature that can be used in tandem with the database to search for and improve on models by creating new Cloudy models while varying the parameters. The database search and optimization can be used to explore the structures of nebulae by deriving their properties from the best-fit models. The goal is to provide the community with a large database of Cloudy photoionization models, generated from parameters reflecting conditions within novae ejecta, that can be easily fitted to observed spectral lines, either by directly accessing the database using the framework code or by using a website specifically made for this purpose.
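
    The storage-and-lookup idea can be illustrated with a few lines of SQL, assuming a small table of model line ratios queried for the entry closest to an observed value. The table layout and column names below are hypothetical, not those of the actual Cloudy model database.

```python
# Illustrative sketch only: a tiny SQLite table of photoionization-model line
# ratios, queried for the model closest to an observed ratio.  Schema and
# values are hypothetical placeholders.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE models (
                    id INTEGER PRIMARY KEY,
                    hden REAL,          -- log hydrogen density
                    logU REAL,          -- log ionization parameter
                    oiii_hbeta REAL     -- predicted [O III]/Hbeta ratio
                )""")
grid = [(1, 6.0, -2.0, 3.1), (2, 7.0, -2.5, 5.4), (3, 8.0, -3.0, 8.2)]
conn.executemany("INSERT INTO models VALUES (?, ?, ?, ?)", grid)

observed = 5.0
row = conn.execute(
    "SELECT id, hden, logU, oiii_hbeta FROM models "
    "ORDER BY ABS(oiii_hbeta - ?) LIMIT 1", (observed,)).fetchone()
print("best-fitting model:", row)
```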

  10. An Ising model for metal-organic frameworks

    Science.gov (United States)

    Höft, Nicolas; Horbach, Jürgen; Martín-Mayor, Victor; Seoane, Beatriz

    2017-08-01

    We present a three-dimensional Ising model where lines of equal spins are frozen such that they form an ordered framework structure. The frame spins impose an external field on the rest of the spins (active spins). We demonstrate that this "porous Ising model" can be seen as a minimal model for condensation transitions of gas molecules in metal-organic frameworks. Using Monte Carlo simulation techniques, we compare the phase behavior of a porous Ising model with that of a particle-based model for the condensation of methane (CH4) in the isoreticular metal-organic framework IRMOF-16. For both models, we find a line of first-order phase transitions that end in a critical point. We show that the critical behavior in both cases belongs to the 3D Ising universality class, in contrast to other phase transitions in confinement such as capillary condensation.
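
    A minimal Metropolis Monte Carlo sketch of such a "porous Ising model" is given below: spins along a cubic grid of lines are frozen at +1 to represent the framework, and the remaining active spins evolve under nearest-neighbour coupling and an external field. The lattice size, framework spacing, and parameter values are illustrative assumptions, not those used in the paper.

```python
# Minimal Metropolis sketch of a porous Ising model: spins along a cubic grid
# of lines are frozen at +1 (the framework); the remaining active spins evolve
# under nearest-neighbour coupling J and field h.  Parameters are illustrative.
import numpy as np

L, p = 12, 4            # lattice size, framework line spacing
J, h, T = 1.0, -0.5, 3.0
rng = np.random.default_rng(2)

spins = rng.choice([-1, 1], size=(L, L, L))
x, y, z = np.indices((L, L, L))
frame = ((y % p == 0) & (z % p == 0)) | ((x % p == 0) & (z % p == 0)) \
      | ((x % p == 0) & (y % p == 0))       # lines of frozen frame spins
spins[frame] = 1

def local_field(s, i, j, k):
    # Sum of the six nearest neighbours with periodic boundaries.
    return (s[(i+1) % L, j, k] + s[(i-1) % L, j, k] +
            s[i, (j+1) % L, k] + s[i, (j-1) % L, k] +
            s[i, j, (k+1) % L] + s[i, j, (k-1) % L])

active = np.argwhere(~frame)
for sweep in range(100):
    for i, j, k in active[rng.permutation(len(active))]:
        dE = 2 * spins[i, j, k] * (J * local_field(spins, i, j, k) + h)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j, k] *= -1            # Metropolis acceptance

print("active-spin magnetisation:", spins[~frame].mean())
```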

  11. Public–private partnership conceptual framework and models for the ...

    African Journals Online (AJOL)

    The framework for PPPs identified three models, viz. state, hybrid and private sector models. In the 'state model' the water services value chain is 100% government funded and owned infrastructure. Government is a key player in infrastructure investment and inefficiencies within the public expenditure management systems ...

  12. A DSM-based framework for integrated function modelling

    DEFF Research Database (Denmark)

    Eisenbart, Boris; Gericke, Kilian; Blessing, Lucienne T. M.

    2017-01-01

    an integrated function modelling framework, which specifically aims at relating the different function modelling perspectives prominently addressed in different disciplines. It uses interlinked matrices based on the concept of DSM and MDM in order to facilitate cross-disciplinary modelling and analysis...

  13. Theories and Frameworks for Online Education: Seeking an Integrated Model

    Science.gov (United States)

    Picciano, Anthony G.

    2017-01-01

    This article examines theoretical frameworks and models that focus on the pedagogical aspects of online education. After a review of learning theory as applied to online education, a proposal for an integrated "Multimodal Model for Online Education" is provided based on pedagogical purpose. The model attempts to integrate the work of…

  14. Barriers and Enablers to Implementation of Dietary Guidelines in Early Childhood Education Centers in Australia: Application of the Theoretical Domains Framework.

    Science.gov (United States)

    Grady, Alice; Seward, Kirsty; Finch, Meghan; Fielding, Alison; Stacey, Fiona; Jones, Jannah; Wolfenden, Luke; Yoong, Sze Lin

    2018-03-01

    To identify perceived barriers and enablers to implementation of dietary guidelines reported by early childhood education center cooks, and barriers and enablers associated with greater implementation based on assessment of center menu compliance. Cross-sectional telephone interview. Early childhood education centers, New South Wales, Australia. A total of 202 cooks responsible for menu planning; 70 centers provided a menu for review of compliance with dietary guidelines. Barriers and enablers to dietary guideline implementation were determined using a tool assessing constructs of the Theoretical Domains Framework (TDF). Higher scores (≥6) for each construct indicated enablers to guideline implementation; lower scores (<6) suggested barriers. Multivariable linear regression identified TDF constructs associated with greater guideline implementation. Scores were lowest for reinforcement (mean, 5.85) and goals (mean, 5.89) domains, and highest for beliefs about consequences (mean, 6.51) and social/professional role and identity (mean, 6.50). The skills domain was positively associated with greater implementation of guidelines based on menu review (P = .01). Cooks perceived social/professional role and identity, and beliefs about consequences to be enablers to dietary guideline implementation; however, only the skills domain was associated with greater implementation. There are opportunities to target the incongruence in perceptions vs reality of the barriers and enablers to implementation. Future research could examine the utility of the TDF to identify barriers and enablers to implementation to inform intervention development and for evaluating interventions to examine intervention mechanisms. Copyright © 2017 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  15. TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-01

    Existing development tools for early stage design and scoping of energy systems are often time consuming to use, proprietary, and do not contain the necessary function to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The Modelica programming language based TRANSFORM tool (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  16. Real time natural object modeling framework

    International Nuclear Information System (INIS)

    Rana, H.A.; Shamsuddin, S.M.; Sunar, M.H.

    2008-01-01

    CG (Computer Graphics) is a key technology for producing visual content. Currently, computer-generated imagery techniques are being developed and applied, particularly in the field of virtual reality applications, film production, training and flight simulators, to provide total composition of realistic computer graphic images. Natural objects like clouds are an integral feature of the sky; without them, synthetic outdoor scenes seem unrealistic. Modeling and animating such objects is a difficult task. Most systems are difficult to use, as they require adjustment of numerous, complex parameters and are non-interactive. This paper presents an intuitive, interactive system to artistically model, animate, and render visually convincing clouds using modern graphics hardware. A high-level interface models clouds through the visual use of cubes. Clouds are rendered by making use of the hardware-accelerated OpenGL API. The resulting interactive design and rendering system produces perceptually convincing cloud models that can be used in any interactive system. (author)

  17. A Community Framework for Integrative, Coupled Modeling of Human-Earth Systems

    Science.gov (United States)

    Barton, C. M.; Nelson, G. C.; Tucker, G. E.; Lee, A.; Porter, C.; Ullah, I.; Hutton, E.; Hoogenboom, G.; Rogers, K. G.; Pritchard, C.

    2017-12-01

    We live today in a humanized world, where critical zone dynamics are driven by coupled human and biophysical processes. First generation modeling platforms have been invaluable in providing insight into dynamics of biophysical systems and social systems. But to understand today's humanized planet scientifically and to manage it sustainably, we need integrative modeling of this coupled human-Earth system. To address both scientific and policy questions, we also need modeling that can represent variable combinations of human-Earth system processes at multiple scales. Simply adding more code needed to do this to large, legacy first generation models is impractical, expensive, and will make them even more difficult to evaluate or understand. We need an approach to modeling that mirrors and benefits from the architecture of the complexly coupled systems we hope to model. Building on a series of international workshops over the past two years, we present a community framework to enable and support an ecosystem of diverse models as components that can be interconnected as needed to facilitate understanding of a range of complex human-earth systems interactions. Models are containerized in Docker to make them platform independent. A Basic Modeling Interface and Standard Names ontology (developed by the Community Surface Dynamics Modeling System) is applied to make them interoperable. They are then transformed into RESTful micro-services to allow them to be connected and run in a browser environment. This enables a flexible, multi-scale modeling environment to help address diverse issues with combinations of smaller, focused, component models that are easier to understand and evaluate. We plan to develop, deploy, and maintain this framework for integrated, coupled modeling in an open-source collaborative development environment that can democratize access to advanced technology and benefit from diverse global participation in model development. We also present an initial
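
    The component-coupling idea can be sketched with two toy models exposing a minimal BMI-style interface (initialize / update / get_value / set_value / finalize) and exchanging named variables in a driver loop. This is a simplified illustration of the interface concept, not the framework's actual code; the component names and the toy rainfall-runoff logic are hypothetical.

```python
# Hedged sketch of coupling two toy components through a minimal BMI-style
# interface.  Simplified illustration only; not the framework's implementation.
class RainfallComponent:
    def initialize(self, config=None):
        self.time, self.rain = 0.0, 0.0
    def update(self):
        self.time += 1.0
        self.rain = 5.0 if int(self.time) % 3 == 0 else 1.0   # toy forcing
    def get_value(self, name):
        return {"rainfall": self.rain, "time": self.time}[name]
    def finalize(self):
        pass

class RunoffComponent:
    def initialize(self, config=None):
        self.storage, self.runoff = 0.0, 0.0
    def set_value(self, name, value):
        if name == "rainfall":
            self.storage += value
    def update(self):
        self.runoff = 0.2 * self.storage        # toy linear-reservoir response
        self.storage -= self.runoff
    def get_value(self, name):
        return {"runoff": self.runoff}[name]
    def finalize(self):
        pass

rain, runoff = RainfallComponent(), RunoffComponent()
rain.initialize(); runoff.initialize()
for _ in range(10):                             # simple coupling loop
    rain.update()
    runoff.set_value("rainfall", rain.get_value("rainfall"))
    runoff.update()
    print(f"t={rain.get_value('time'):4.1f}  runoff={runoff.get_value('runoff'):.2f}")
rain.finalize(); runoff.finalize()
```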

  18. A checklist for identifying determinants of practice: A systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice

    Science.gov (United States)

    2013-01-01

    Background Determinants of practice are factors that might prevent or enable improvements. Several checklists, frameworks, taxonomies, and classifications of determinants of healthcare professional practice have been published. In this paper, we describe the development of a comprehensive, integrated checklist of determinants of practice (the TICD checklist). Methods We performed a systematic review of frameworks of determinants of practice followed by a consensus process. We searched electronic databases and screened the reference lists of key background documents. Two authors independently assessed titles and abstracts, and potentially relevant full text articles. We compiled a list of attributes that a checklist should have: comprehensiveness, relevance, applicability, simplicity, logic, clarity, usability, suitability, and usefulness. We assessed included articles using these criteria and collected information about the theory, model, or logic underlying how the factors (determinants) were selected, described, and grouped, the strengths and weaknesses of the checklist, and the determinants and the domains in each checklist. We drafted a preliminary checklist based on an aggregated list of determinants from the included checklists, and finalized the checklist by a consensus process among implementation researchers. Results We screened 5,778 titles and abstracts and retrieved 87 potentially relevant papers in full text. Several of these papers had references to papers that we also retrieved in full text. We also checked potentially relevant papers we had on file that were not retrieved by the searches. We included 12 checklists. None of these were completely comprehensive when compared to the aggregated list of determinants and domains. We developed a checklist with 57 potential determinants of practice grouped in seven domains: guideline factors, individual health professional factors, patient factors, professional interactions, incentives and resources

  19. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns...

  20. Metal–Organic Framework Thin Films as Platforms for Atomic Layer Deposition of Cobalt Ions To Enable Electrocatalytic Water Oxidation

    Energy Technology Data Exchange (ETDEWEB)

    Kung, Chung-Wei; Mondloch, Joseph E.; Wang, Timothy C.; Bury, Wojciech; Hoffeditz, William; Klahr, Benjamin M.; Klet, Rachel C.; Pellin, Michael J.; Farha, Omar K.; Hupp, Joseph T.

    2015-12-30

    Thin films of the metal-organic framework (MOF) NU-1000 were grown on conducting glass substrates. The films uniformly cover the conducting glass substrates and are composed of free-standing sub-micrometer rods. Subsequently, atomic layer deposition (ALD) was utilized to deposit Co2+ ions throughout the entire MOF film via self-limiting surface-mediated reaction chemistry. The Co ions bind at aqua and hydroxo sites lining the channels of NU-1000, resulting in three-dimensional arrays of separated Co ions in the MOF thin film. The Co-modified MOF thin films demonstrate promising electrocatalytic activity for water oxidation.

  1. Fisher information framework for time series modeling

    Science.gov (United States)

    Venkatesan, R. C.; Plastino, A.

    2017-08-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time-independent Schrödinger-like equation in a vector setting. The inference of (i) the probability density function of the coefficients of the working hypothesis and (ii) the establishing of a constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defines the working hypothesis, solely in terms of the observed data. Prediction cases are presented for time series obtained from: (i) the Mackey-Glass delay-differential equation, (ii) one ECG signal from the MIT-Beth Israel Deaconess Hospital (MIT-BIH) cardiac arrhythmia database, and (iii) one ECG signal from the Creighton University ventricular tachyarrhythmia database. The ECG samples were obtained from the Physionet online repository. These examples demonstrate the efficiency of the prediction model. Numerical examples for exemplary cases are provided.
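
    The Takens delay-embedding step on which such models are built can be sketched as follows, here paired with a simple nearest-neighbour forecast in the embedded space. This is generic reconstruction machinery for illustration only, not the paper's Fisher-information-based inference; the embedding dimension, delay, and test signal are assumptions.

```python
# Illustrative sketch: Takens delay embedding plus a naive nearest-neighbour
# forecast in the reconstructed state space.  Not the paper's method.
import numpy as np

def delay_embed(x, dim, tau):
    # Row j is the embedded vector (x[j], x[j+tau], ..., x[j+(dim-1)*tau]).
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def nn_forecast(x, dim=4, tau=5, steps=20):
    emb = delay_embed(x, dim, tau)
    targets = x[(dim - 1) * tau + 1:]        # observed successor of emb[j]
    library, successors = emb[:len(targets)], targets
    history, preds = list(x), []
    for _ in range(steps):
        # Current embedded state built from the (possibly extended) history.
        query = np.array([history[len(history) - 1 - (dim - 1 - i) * tau]
                          for i in range(dim)])
        j = np.argmin(np.linalg.norm(library - query, axis=1))
        nxt = successors[j]                  # predict the neighbour's successor
        preds.append(nxt)
        history.append(nxt)
    return np.array(preds)

if __name__ == "__main__":
    t = np.arange(0, 60, 0.1)
    series = np.sin(t) + 0.05 * np.random.default_rng(3).normal(size=len(t))
    train, test = series[:500], series[500:520]
    preds = nn_forecast(train, dim=4, tau=5, steps=20)
    print("RMSE over 20-step forecast:", np.sqrt(np.mean((preds - test) ** 2)))
```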

  2. A Conceptual Framework of Business Model Emerging Resilience

    OpenAIRE

    Goumagias, Nikolaos; Fernandes, Kiran; Cabras, Ignazio; Li, Feng; Shao, Jianhua; Devlin, Sam; Hodge, Victoria; Cowling, Peter; Kudenko, Daniel

    2016-01-01

    In this paper we introduce an environmentally driven conceptual framework of Business Model change. Business models have acquired substantial momentum in the academic literature during the past decade. Several studies have focused on what exactly constitutes a Business Model (role model, recipe, architecture, etc.), triggering a theoretical debate about the Business Model's components and their corresponding dynamics and relationships. In this paper, we argue that for Business Models as cognitive structures,...

  3. A modelling framework for MSP-oriented cumulative effects assessment

    OpenAIRE

    Stefano Menegon; Daniel Depellegrin; Giulio Farella; Elena Gissi; Michol Ghezzo; Alessandro Sarretta; Chiara Venier; Andrea Barbanti

    2018-01-01

    This research presents a comprehensive Cumulative Effects Assessment (CEA) based on the Tools4MSP modelling framework tested for the Italian Adriatic Sea. The CEA incorporates five methodological advancements: (1) linear and non-linear ecosystem response to anthropogenic pressures/effects, (2) modelling of additive, dominant and antagonist stressor effects, (3) implementation of a convolution distance model for stressor dispersion modelling, (4) application of a CEA backsourcing (CEA-B) model to ...

  4. A general modeling framework for describing spatially structured population dynamics

    Science.gov (United States)

    Sample, Christine; Fryxell, John; Bieri, Joanna; Federico, Paula; Earl, Julia; Wiederholt, Ruscena; Mattsson, Brady; Flockhart, Tyler; Nicol, Sam; Diffendorfer, James E.; Thogmartin, Wayne E.; Erickson, Richard A.; Norris, D. Ryan

    2017-01-01

    Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework sufficiently novel in its flexibility to capture a wide variety of spatiotemporal processes including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population is modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns. By working within a common framework, there is less chance

  5. A general modeling framework for describing spatially structured population dynamics.

    Science.gov (United States)

    Sample, Christine; Fryxell, John M; Bieri, Joanna A; Federico, Paula; Earl, Julia E; Wiederholt, Ruscena; Mattsson, Brady J; Flockhart, D T Tyler; Nicol, Sam; Diffendorfer, Jay E; Thogmartin, Wayne E; Erickson, Richard A; Norris, D Ryan

    2018-01-01

    Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework sufficiently novel in its flexibility to capture a wide variety of spatiotemporal processes including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population is modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns. By working within a common framework, there is less chance
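
    The bookkeeping described in these two records can be illustrated with a minimal sketch: nodes carry abundances, directed weighted edges carry per-step movement proportions, and each discrete time step alternates local growth with redistribution along the network. The node names, growth rates, and movement weights below are hypothetical.

```python
# Minimal sketch of a network-based structured population: nodes hold
# abundances, a weighted directed matrix moves individuals each step,
# and growth is applied locally before redistribution.  Values are toy.
import numpy as np

nodes = ["breeding", "stopover", "wintering"]
growth = np.array([1.10, 1.00, 0.95])           # per-step local growth rates
# move[i, j] = proportion of node i's population moving to node j each step
move = np.array([[0.2, 0.8, 0.0],
                 [0.0, 0.1, 0.9],
                 [0.7, 0.0, 0.3]])
assert np.allclose(move.sum(axis=1), 1.0)       # rows are outflow proportions

n = np.array([1000.0, 0.0, 0.0])                # initial abundances
for step in range(6):
    n = (growth * n) @ move                     # grow locally, then redistribute
    print(f"step {step + 1}:", dict(zip(nodes, np.round(n, 1))))
```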

  6. A Framework for Hybrid Computational Models

    Czech Academy of Sciences Publication Activity Database

    Neruda, Roman

    2003-01-01

    Roč. 2, č. 4 (2003), s. 868-873 ISSN 1109-2750 R&D Projects: GA ČR(CZ) GA526/03/Z042; GA ČR(CZ) GA201/01/1192 Institutional research plan: CEZ:AV0Z1030915 Keywords : multi-agent systems * hybrid computational models Subject RIV: BA - General Mathematics

  7. Cytoview: Development of a cell modelling framework

    Indian Academy of Sciences (India)

    PRAKASH KUMAR

    The fundamental unit of living tissue, in fact of life itself, is the biological cell. Currently, there is enormous interest in in silico modelling of the cell .... classification and cell type relationships, newer vocabulary is required to describe a single cell itself with all its sub-cellular structures. Further, this vocabulary should pave the way.

  8. A Unified Bayesian Inference Framework for Generalized Linear Models

    Science.gov (United States)

    Meng, Xiangming; Wu, Sheng; Zhu, Jiang

    2018-03-01

    In this letter, we present a unified Bayesian inference framework for generalized linear models (GLM) which iteratively reduces the GLM problem to a sequence of standard linear model (SLM) problems. This framework provides new perspectives on some established GLM algorithms derived from SLM ones and also suggests novel extensions for some other SLM algorithms. Specific instances elucidated under such framework are the GLM versions of approximate message passing (AMP), vector AMP (VAMP), and sparse Bayesian learning (SBL). It is proved that the resultant GLM version of AMP is equivalent to the well-known generalized approximate message passing (GAMP). Numerical results for 1-bit quantized compressed sensing (CS) demonstrate the effectiveness of this unified framework.

  9. An Integrated Framework to Specify Domain-Specific Modeling Languages

    DEFF Research Database (Denmark)

    Zarrin, Bahram; Baumeister, Hubert

    2018-01-01

    In this paper, we propose an integrated framework that can be used by DSL designers to implement their desired graphical domain-specific languages. This framework relies on Microsoft DSL Tools, a meta-modeling framework to build graphical domain-specific languages, and an extension of ForSpec, a ...... language to define their semantics. Integrating these technologies under the umbrella of Microsoft Visual Studio IDE allows DSL designers to utilize a single development environment for developing their desired domain-specific languages....

  10. A compositional modelling framework for exploring MPSoC systems

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer; Madsen, Jan

    2009-01-01

    This paper presents a novel compositional framework for system level performance estimation and exploration of Multi-Processor System On Chip (MPSoC) based systems. The main contributions are the definition of a compositional model which allows quantitative performance estimation to be carried out......-exist and communicate. In order to illustrate the use of the framework, a mobile digital audio processing platform, supplied by the company Bang & Olufsen ICEpower a/s, is considered....

  11. A Framework for PSS Business Models: Formalization and Application

    OpenAIRE

    Adrodegari, Federico; Saccani, Nicola; Kowalkowski, Christian

    2016-01-01

    In order to successfully move "from products to solutions", companies need to redesign their business model. Nevertheless, service oriented BMs in product-centric firms are under-investigated in the literature: very few works develop a scheme of analysis of such BMs. To provide a first step into closing this gap, we propose a new framework to describe service-oriented BMs, pointing out the main BM components and related PSS characteristics. Thus, the proposed framework aims to help companies ...

  12. Multicriteria framework for selecting a process modelling language

    Science.gov (United States)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

    The choice of process modelling language can affect business process management (BPM) since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the availability of a large number of modelling languages and the lack of guidelines on evaluating and comparing languages to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selecting a modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.
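
    A hedged sketch of the combination is given below, using SEQUAL-style quality criteria scored per candidate language and aggregated with a simple additive (weighted-sum) multicriteria score. The criteria, weights, candidate languages, and scores are hypothetical placeholders; the paper itself pairs SEQUAL with an MCDA method chosen for the decision context.

```python
# Illustrative weighted-sum aggregation of language quality criteria.
# All names, weights and scores below are hypothetical placeholders.
criteria_weights = {"domain_appropriateness": 0.40,
                    "comprehensibility": 0.35,
                    "tool_support": 0.25}

scores = {          # 1 (poor) .. 5 (excellent), as elicited from evaluators
    "BPMN":   {"domain_appropriateness": 4, "comprehensibility": 4, "tool_support": 5},
    "EPC":    {"domain_appropriateness": 3, "comprehensibility": 4, "tool_support": 3},
    "UML-AD": {"domain_appropriateness": 3, "comprehensibility": 3, "tool_support": 4},
}

ranking = sorted(
    ((sum(criteria_weights[c] * s[c] for c in criteria_weights), lang)
     for lang, s in scores.items()),
    reverse=True)
for total, lang in ranking:
    print(f"{lang:8s} weighted score = {total:.2f}")
```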

  13. A Framework for Federated Two-Factor Authentication Enabling Cost-Effective Secure Access to Distributed Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Ezell, Matthew A [ORNL; Rogers, Gary L [University of Tennessee, Knoxville (UTK); Peterson, Gregory D. [University of Tennessee, Knoxville (UTK)

    2012-01-01

    As cyber attacks become increasingly sophisticated, the security measures used to mitigate the risks must also increase in sophistication. One time password (OTP) systems provide strong authentication because security credentials are not reusable, thus thwarting credential replay attacks. The credential changes regularly, making brute-force attacks significantly more difficult. In high performance computing, end users may require access to resources housed at several different service provider locations. The ability to share a strong token between multiple computing resources reduces cost and complexity. The National Science Foundation (NSF) Extreme Science and Engineering Discovery Environment (XSEDE) provides access to digital resources, including supercomputers, data resources, and software tools. XSEDE will offer centralized strong authentication for services amongst service providers that leverage their own user databases and security profiles. This work implements a scalable framework built on standards to provide federated secure access to distributed cyberinfrastructure.
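
    As an illustration of the kind of one-time-password check such a federated service performs, the sketch below implements the standard TOTP construction (RFC 6238) using only the Python standard library. Key management, clock-drift policy, and the federation protocol itself are simplified away, and the demo secret is a placeholder.

```python
# Sketch of time-based one-time-password (TOTP, RFC 6238) generation and
# verification with the standard library only.  Illustrative; not the
# framework's production code.
import base64, hashlib, hmac, struct, time

def totp(secret_b32, for_time=None, step=30, digits=6):
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((for_time if for_time is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return f"{code:0{digits}d}"

def verify(secret_b32, submitted, window=1):
    # Accept codes from adjacent time steps to tolerate small clock drift.
    now = time.time()
    return any(hmac.compare_digest(totp(secret_b32, now + i * 30), submitted)
               for i in range(-window, window + 1))

if __name__ == "__main__":
    secret = "JBSWY3DPEHPK3PXP"          # demo secret, base32-encoded placeholder
    print("current code:", totp(secret))
    print("verified:", verify(secret, totp(secret)))
```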

  14. Endosomal Escape and Delivery of CRISPR/Cas9 Genome Editing Machinery Enabled by Nanoscale Zeolitic Imidazolate Framework

    KAUST Repository

    Alsaiari, Shahad K.

    2017-12-22

    CRISPR/Cas9 is a combined protein (Cas9) and an engineered single guide RNA (sgRNA) genome editing platform that offers revolutionary solutions to genetic diseases. It has, however, a double delivery problem owing to the large protein size and the highly charged RNA component. In this work, we report the first example of CRISPR/Cas9 encapsulated by nanoscale zeolitic imidazolate frameworks (ZIFs) with a loading efficiency of 17% and enhanced endosomal escape promoted by the protonated imidazole moieties. The gene editing potential of CRISPR/Cas9 encapsulated by ZIF-8 (CC-ZIFs) is further verified by knocking down the gene expression of green fluorescent protein by 37% over 4 days employing the CRISPR/Cas9 machinery. The nanoscale CC-ZIFs are biocompatible and easily scaled up, offering excellent loading capacity and controlled co-delivery of intact Cas9 protein and sgRNA.

  15. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    Science.gov (United States)

    Rider, W. J.

    2012-12-01

    The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high consequence applications. Perhaps the first of these frameworks was CSAU (code scaling, applicability and uncertainty) [Boyack], used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice, and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science, including theory, modeling, simulation and experimentation, to replace the underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar and are applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are

  16. A general model framework for multisymbol number comparison.

    Science.gov (United States)

    Huber, Stefan; Nuerk, Hans-Christoph; Willmes, Klaus; Moeller, Korbinian

    2016-11-01

    Different models have been proposed for the processing of multisymbol numbers like two- and three-digit numbers but also for negative numbers and decimals. However, these multisymbol numbers are assembled from the same set of Arabic digits and comply with the place-value structure of the Arabic number system. Considering these shared properties, we suggest that the processing of multisymbol numbers can be described in one general model framework. Accordingly, we first developed a computational model framework realizing componential representations of multisymbol numbers and evaluated its validity by simulating standard empirical effects of number magnitude comparison. We observed that the model framework successfully accounted for most of these effects. Moreover, our simulations provided first evidence supporting the notion of a fully componential processing of multisymbol numbers for the specific case of comparing two negative numbers. Thus, our general model framework indicates that the processing of different kinds of multisymbol integer and decimal numbers shares common characteristics (e.g., componential representation). The relevance and applicability of our model goes beyond the case of basic number processing. In particular, we also successfully simulated effects from applied marketing and consumer research by accounting for the left-digit effect found in processing of prices. Finally, we provide evidence that our model framework can be integrated into the more general context of multiattribute decision making. In sum, this indicates that our model framework captures a general scheme of separate processing of different attributes weighted by their saliency for the task at hand. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  17. Compendium of Models from a Gauge U(1) Framework

    OpenAIRE

    Ma, Ernest

    2016-01-01

    A gauge U(1) framework was established in 2002 to extend the supersymmetric standard model. It has many possible realizations. Whereas all have the necessary and sufficient ingredients to explain the possible 750 GeV diphoton excess, observed recently by the ATLAS Collaboration at the Large Hadron Collider (LHC), they differ in other essential aspects. A compendium of such models is discussed.

  18. A community-based framework for aquatic ecosystem models

    NARCIS (Netherlands)

    Trolle, D.; Hamilton, D.P.; Hipsey, M.R.; Bolding, K.; Bruggeman, J.; Mooij, W.M.; Janse, J.H.; Nielsen, A.; Jeppesen, E.; Elliot, J.A.; Makler-Pick, V.; Petzoldt, T.; Rinke, K.; Flindt, M.R.; Arhonditsis, G.; Gal, G.; Bjerring, R.; Tominaga, K.; 't Hoen, J.; Downing, A.S.; Marques, D.M.; Fragoso Jr., C.R.; Søndergaard, M.; Hanson, P.C.

    2012-01-01

    Here, we communicate a point of departure in the development of aquatic ecosystem models, namely a new community-based framework, which supports an enhanced and transparent union between the collective expertise that exists in the communities of traditional ecologists and model developers. Through a

  19. Industrial Sector Energy Efficiency Modeling (ISEEM) Framework Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Karali, Nihan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-12-12

    The goal of this study is to develop a new bottom-up industry-sector energy-modeling framework aimed at least-cost regional and global carbon-reduction strategies, improving on the capabilities and addressing the limitations of existing models by allowing trading across regions and countries as an alternative.

  20. A framework for quantifying net benefits of alternative prognostic models

    NARCIS (Netherlands)

    Rapsomaniki, E.; White, I.R.; Wood, A.M.; Thompson, S.G.; Feskens, E.J.M.; Kromhout, D.

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit)

  1. A framework for development and application of hydrological models

    Directory of Open Access Journals (Sweden)

    T. Wagener

    2001-01-01

    Many existing hydrological modelling procedures do not make best use of available information, resulting in non-minimal uncertainties in model structure and parameters, and a lack of detailed information regarding model behaviour. A framework is required that balances the level of model complexity supported by the available data with the level of performance suitable for the desired application. Tools are needed that make optimal use of the information available in the data to identify model structure and parameters, and that allow a detailed analysis of model behaviour. This should result in appropriate levels of model complexity as a function of available data, hydrological system characteristics and modelling purpose. This paper introduces an analytical framework to achieve this, and tools to use within it, based on a multi-objective approach to model calibration and analysis. The utility of the framework is demonstrated with an example from the field of rainfall-runoff modelling. Keywords: hydrological modelling, multi-objective calibration, model complexity, parameter identifiability

  2. Enabling new graduate midwives to work in midwifery continuity of care models: A conceptual model for implementation.

    Science.gov (United States)

    Cummins, Allison M; Catling, Christine; Homer, Caroline S E

    2017-12-04

    High-level evidence demonstrates midwifery continuity of care is beneficial for women and babies. Women have limited access to midwifery continuity of care models in Australia. One of the factors limiting women's access is recruiting enough midwives to work in continuity. Our research found that newly graduated midwives felt well prepared to work in midwifery led continuity of care models, were well supported to work in the models and the main driver to employing them was a need to staff the models. However, limited opportunities exist for new graduate midwives to work in midwifery continuity of care. The aim of this paper therefore is to describe a conceptual model developed to enable new graduate midwives to work in midwifery continuity of care models. The findings from a qualitative study were synthesised with the existing literature to develop a conceptual model that enables new graduate midwives to work in midwifery continuity of care. The model contains the essential elements to enable new graduate midwives to work in midwifery continuity of care models. Each of the essential elements discussed is intended to assist midwifery managers, educators and new graduates to facilitate the organisational changes required to accommodate new graduates. The conceptual model is useful to show maternity services how to enable new graduate midwives to work in midwifery continuity of care models. Copyright © 2017 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  3. CARTOGRAPHY ENABLING COMMUNICATION AND DECISIONMAKING IN SUSTAINABILITY ISSUES (ECONOMIC, SOCIAL, ENVIRONMENTAL OF TRANSNATIONAL DECLARATIONS, CONVENTIONS, TREATIES, FRAMEWORKS AND DIRECTIVES

    Directory of Open Access Journals (Sweden)

    H. Kremers

    2017-01-01

    The role of cartography in the operationalization of multi-national and global programs [https://en.wikipedia.org/wiki/Category:United_Nations_treaties] is gaining importance for practical success at an increasing speed. The paradigm change consists mainly in the fact that cartographic competence no longer focuses on the final stage of visualizing facts in a sequence of digital information product generation; instead, communication issues and the corresponding decision support are inherent in the complexity of information management at all stages (Strategic Structure, Actor-Specific Requirement Analysis, Specification and System Design, Information Flow and Implementation of Active Processes, Goal Reaching Control and Recursive Guidance). Thus, cartography is seen as the key information science discipline that enables decision making and goal-reaching control at all levels and stages of the tasks mentioned. The cartography and geoinformation challenges of massive inter-organizational cooperation lie in operating highly complex information components adequately and in applying them in real life to enforce sustainability. The main strategic domains to be investigated, developed and implemented are Interoperability and Infrastructures, Analysis for Decision Support, Applied Semiotics, Situation Dynamics and Standards. It is shown that the current inherent information management deficits can be avoided to a high degree by applying and adjusting cartographic methods and technologies for use in the appropriate complexity domains of facts, actors, decisions and actions.

  4. A software engineering perspective on environmental modeling framework design: The object modeling system

    Science.gov (United States)

    The environmental modeling community has historically been concerned with the proliferation of models and the effort associated with collective model development tasks (e.g., code generation, data provisioning and transformation, etc.). Environmental modeling frameworks (EMFs) have been developed to...

  5. MALDI-TOF-MS with PLS Modeling Enables Strain Typing of the Bacterial Plant Pathogen Xanthomonas axonopodis

    Science.gov (United States)

    Sindt, Nathan M.; Robison, Faith; Brick, Mark A.; Schwartz, Howard F.; Heuberger, Adam L.; Prenni, Jessica E.

    2017-11-01

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) is a fast and effective tool for microbial species identification. However, current approaches are limited to species-level identification even when genetic differences are known. Here, we present a novel workflow that applies the statistical method of partial least squares discriminant analysis (PLS-DA) to MALDI-TOF-MS protein fingerprint data of Xanthomonas axonopodis, an important bacterial plant pathogen of fruit and vegetable crops. Mass spectra of 32 X. axonopodis strains were used to create a mass spectral library and PLS-DA was employed to model the closely related strains. A robust workflow was designed to optimize the PLS-DA model by assessing the model performance over a range of signal-to-noise ratios (s/n) and mass filter (MF) thresholds. The optimized parameters were observed to be s/n = 3 and MF = 0.7. The model correctly classified 83% of spectra withheld from the model as a test set. A new decision rule was developed, termed the rolled-up Maximum Decision Rule (ruMDR), and this method improved identification rates to 92%. These results demonstrate that MALDI-TOF-MS protein fingerprints of bacterial isolates can be utilized to enable identification at the strain level. Furthermore, the open-source framework of this workflow allows for broad implementation across various instrument platforms as well as integration with alternative modeling and classification algorithms.
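
    As an illustration of the general workflow, and not the published pipeline, the sketch below runs a PLS-DA-style classification in the common way of regressing one-hot class labels with PLS and assigning each spectrum to the class with the largest predicted response; the feature matrix, number of latent components and strain labels are placeholders, and the signal-to-noise filtering, mass-filter threshold and ruMDR decision rule from the paper are omitted.

```python
# Illustrative sketch only (not the published workflow).  PLS-DA is commonly run
# by regressing one-hot class labels with PLS and assigning each spectrum to the
# class with the largest predicted response.  X stands for aligned peak
# intensities (rows = spectra, columns = m/z bins); data, latent components and
# the train/test split are placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelBinarizer

rng = np.random.default_rng(0)
X = rng.random((96, 500))                       # placeholder spectra
y = np.repeat(np.arange(8), 12)                 # placeholder strain labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
lb = LabelBinarizer().fit(y_tr)

pls = PLSRegression(n_components=10).fit(X_tr, lb.transform(y_tr))
y_hat = lb.classes_[np.argmax(pls.predict(X_te), axis=1)]
print("strain-level accuracy:", np.mean(y_hat == y_te))
```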

  6. A Modeling & Simulation Implementation Framework for Large-Scale Simulation

    Directory of Open Access Journals (Sweden)

    Song Xiao

    2012-10-01

    Classical High Level Architecture (HLA) systems face development problems because they lack support for fine-grained component integration and interoperation in large-scale complex simulation applications. To address this issue, an extensible, reusable and composable simulation framework is proposed. To extend reusability from coarse-grained federates to fine-grained components, this paper proposes a modelling & simulation framework which consists of a component-based architecture, modelling methods, and simulation services to support and simplify the construction of complex simulation applications. Moreover, a standard process and simulation tools are developed to ensure the rapid and effective development of simulation applications.

  7. Population balance models: a useful complementary modelling framework for future WWTP modelling

    DEFF Research Database (Denmark)

    Nopens, Ingmar; Torfs, Elena; Ducoste, Joel

    2015-01-01

    efforts of several current and future unit processes in wastewater treatment plants could potentially benefit from this framework, especially when distributed dynamics have a significant impact on the overall unit process performance. In these cases, current models that rely on average properties cannot...... capability. Hence, PBMs should be regarded as a complementary modelling framework to biokinetic models. This paper provides an overview of current applications, future potential and limitations of PBMs in the field of wastewater treatment modelling, thereby looking over the fence to other scientific...

  8. Threat model framework and methodology for personal networks (PNs)

    DEFF Research Database (Denmark)

    Prasad, Neeli R.

    2007-01-01

    The aim is to give a structured, convenient approach for building threat models. A framework for the threat model is presented, together with a list of requirements for the methodology. The methodology is applied to build a threat model for Personal Networks. Practical tools like UML sequence diagrams and attack trees have been used, and risk assessment methods are also discussed. Threat profiles and vulnerability profiles are presented.

  9. Application of a conceptual framework for the modelling and execution of clinical guidelines as networks of concurrent processes

    NARCIS (Netherlands)

    Fung, L.S.N.; Fung, Nick Lik San; Widya, I.A.; Broens, T.H.F.; Larburu Rubio, Nekane; Bults, Richard G.A.; Shalom, Erez; Jones, Valerie M.; Hermens, Hermanus J.

    2014-01-01

    We present a conceptual framework for modelling clinical guidelines as networks of concurrent processes. This enables the guideline to be partitioned and distributed at run-time across a knowledge-based telemedicine system, which is distributed by definition but whose exact physical configuration

  10. Collaborative Cloud Manufacturing: Design of Business Model Innovations Enabled by Cyberphysical Systems in Distributed Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Erwin Rauch

    2016-01-01

    Collaborative cloud manufacturing, as a concept of distributed manufacturing, opens up different opportunities for changing the logic of generating and capturing value. Cyberphysical systems and the technologies behind them are the enablers for new business models which have the potential to be disruptive. This paper introduces the topics of distributed manufacturing as well as cyberphysical systems. Furthermore, the main business model clusters of distributed manufacturing systems are described, including collaborative cloud manufacturing. The paper aims to provide support for developing business model innovations based on collaborative cloud manufacturing. Therefore, three business model architecture types of a differentiated business logic are discussed, taking into consideration the influencing parameters as well as the design of the business model and its architecture. As a result, new business models can be developed systematically and new ideas can be generated to boost the concept of collaborative cloud manufacturing within all sustainable business models.

  11. Prediction of hourly solar radiation with multi-model framework

    International Nuclear Information System (INIS)

    Wu, Ji; Chan, Chee Keong

    2013-01-01

    Highlights: • A novel approach to predict solar radiation through the use of clustering paradigms. • Development of prediction models based on the intrinsic pattern observed in each cluster. • Prediction based on proper clustering and selection of the model for the current period provides better results than other methods. • Experiments were conducted on actual solar radiation data obtained from a weather station in Singapore. - Abstract: In this paper, a novel multi-model prediction framework for prediction of solar radiation is proposed. The framework starts with the assumption that there are several patterns embedded in the solar radiation series. To extract the underlying patterns, the solar radiation series is first segmented into smaller subsequences, and the subsequences are further grouped into different clusters. For each cluster, an appropriate prediction model is trained. A procedure for pattern identification is then developed to identify the proper pattern that fits the current period. Based on this pattern, the corresponding prediction model is applied to obtain the prediction value. The prediction result of the proposed framework is then compared to other techniques. It is shown that the proposed framework provides superior performance as compared to others.
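
    A minimal sketch of such a cluster-then-predict scheme is given below; the window length, number of clusters, per-cluster regressor and the synthetic radiation series are illustrative assumptions rather than the configuration used in the paper.

```python
# Minimal sketch of a cluster-then-predict scheme of the kind described above.
# Window length, cluster count, per-cluster model and the synthetic series are
# illustrative choices, not the configuration used in the paper.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
hours = np.arange(24 * 60)                                   # 60 days of hourly data
radiation = np.clip(np.sin(np.pi * (hours % 24) / 24), 0, None) + 0.1 * rng.random(hours.size)

def make_windows(series, width, horizon=1):
    idx = range(len(series) - width - horizon + 1)
    X = np.array([series[t:t + width] for t in idx])
    y = np.array([series[t + width + horizon - 1] for t in idx])
    return X, y

X, y = make_windows(radiation, width=24)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)    # group the subsequences
models = {c: Ridge().fit(X[kmeans.labels_ == c], y[kmeans.labels_ == c]) for c in range(4)}

def predict_next(window):
    c = kmeans.predict(window.reshape(1, -1))[0]             # identify the current pattern
    return models[c].predict(window.reshape(1, -1))[0]       # apply that cluster's model

print(predict_next(radiation[-24:]))
```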

  12. Annotation of rule-based models with formal semantics to enable creation, analysis, reuse and visualization.

    Science.gov (United States)

    Misirli, Goksel; Cavaliere, Matteo; Waites, William; Pocock, Matthew; Madsen, Curtis; Gilfellon, Owen; Honorato-Zimmer, Ricardo; Zuliani, Paolo; Danos, Vincent; Wipat, Anil

    2016-03-15

    Biological systems are complex and challenging to model and therefore model reuse is highly desirable. To promote model reuse, models should include both information about the specifics of simulations and the underlying biology in the form of metadata. The availability of computationally tractable metadata is especially important for the effective automated interpretation and processing of models. Metadata are typically represented as machine-readable annotations which enhance programmatic access to information about models. Rule-based languages have emerged as a modelling framework to represent the complexity of biological systems. Annotation approaches have been widely used for reaction-based formalisms such as SBML. However, rule-based languages still lack a rich annotation framework to add semantic information, such as machine-readable descriptions, to the components of a model. We present an annotation framework and guidelines for annotating rule-based models, encoded in the commonly used Kappa and BioNetGen languages. We adapt widely adopted annotation approaches to rule-based models. We initially propose a syntax to store machine-readable annotations and describe a mapping between rule-based modelling entities, such as agents and rules, and their annotations. We then describe an ontology to both annotate these models and capture the information contained therein, and demonstrate annotating these models using examples. Finally, we present a proof of concept tool for extracting annotations from a model that can be queried and analyzed in a uniform way. The uniform representation of the annotations can be used to facilitate the creation, analysis, reuse and visualization of rule-based models. Although examples are given, using specific implementations the proposed techniques can be applied to rule-based models in general. The annotation ontology for rule-based models can be found at http://purl.org/rbm/rbmo The krdf tool and associated executable examples are

  14. LAMMPS Framework for Dynamic Bonding and an Application Modeling DNA

    DEFF Research Database (Denmark)

    Svaneborg, Carsten

    2012-01-01

    We have extended the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) to support directional bonds and dynamic bonding. The framework supports stochastic formation of new bonds, breakage of existing bonds, and conversion between bond types. Bond formation can be controlled to limit the maximal functionality of a bead with respect to various bond types. Concomitant with the bond dynamics, angular and dihedral interactions are dynamically introduced between newly connected triplets and quartets of beads, where the interaction type is determined from the local pattern of bead and bond types. When breaking bonds, all angular and dihedral interactions involving broken bonds are removed. The framework allows chemical reactions to be modeled, and we use it to simulate a simplistic, coarse-grained DNA model. The resulting DNA dynamics illustrates the power of the present framework.
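
    The plain-Python toy below (not the LAMMPS implementation) illustrates the kind of bookkeeping such a framework automates: bonds form stochastically between beads within a cutoff, capped by a maximum functionality per bead, and existing bonds can break; all parameter values are made up.

```python
# Plain-Python toy (not the LAMMPS implementation) of the bookkeeping described
# above: bonds form stochastically between beads within a cutoff, capped by a
# maximum functionality per bead, and existing bonds may break.
import math
import random

def bond_step(positions, bonds, r_cut=1.0, p_form=0.5, p_break=0.05, max_bonds=2):
    n_bonds = lambda k: sum(k in b for b in bonds)
    # stochastic bond formation within the cutoff, respecting maximal functionality
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            if (i, j) in bonds:
                continue
            if math.dist(positions[i], positions[j]) < r_cut and random.random() < p_form:
                if n_bonds(i) < max_bonds and n_bonds(j) < max_bonds:
                    bonds.add((i, j))   # a full framework would also add angle/dihedral terms here
    # stochastic bond breakage (and removal of the associated angles/dihedrals)
    for b in list(bonds):
        if random.random() < p_break:
            bonds.discard(b)
    return bonds

positions = [(0.0, 0.0, 0.0), (0.5, 0.0, 0.0), (1.2, 0.0, 0.0), (0.5, 0.4, 0.0)]
print(bond_step(positions, set()))
```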

  15. A Liver-Centric Multiscale Modeling Framework for Xenobiotics.

    Directory of Open Access Journals (Sweden)

    James P Sluka

    We describe a multi-scale, liver-centric in silico modeling framework for acetaminophen pharmacology and metabolism. We focus on a computational model to characterize whole body uptake and clearance, liver transport and phase I and phase II metabolism. We do this by incorporating sub-models that span three scales: Physiologically Based Pharmacokinetic (PBPK) modeling of acetaminophen uptake and distribution at the whole body level, cell and blood flow modeling at the tissue/organ level, and metabolism at the sub-cellular level. We have used standard modeling modalities at each of the three scales. In particular, we have used the Systems Biology Markup Language (SBML) to create both the whole-body and sub-cellular scales. Our modeling approach allows us to run the individual sub-models separately and to easily exchange models at a particular scale without the need to extensively rework the sub-models at other scales. In addition, the use of SBML greatly facilitates the inclusion of biological annotations directly in the model code. The model was calibrated using human in vivo data for acetaminophen and its sulfate and glucuronate metabolites. We then carried out extensive parameter sensitivity studies, including the pairwise interaction of parameters. We also simulated population variation of exposure and sensitivity to acetaminophen. Our modeling framework can be extended to the prediction of liver toxicity following acetaminophen overdose, or used as a general purpose pharmacokinetic model for xenobiotics.
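
    For orientation only, the sketch below reduces the whole-body level of such a model to a one-compartment system with first-order absorption and two parallel first-order elimination routes standing in for sulfation and glucuronidation; it is not the paper's PBPK/SBML model, and all parameter values are placeholders.

```python
# Greatly simplified sketch, not the paper's PBPK/SBML model: a one-compartment
# system with first-order absorption and two parallel first-order elimination
# routes standing in for sulfation and glucuronidation.  All values are placeholders.
import numpy as np
from scipy.integrate import solve_ivp

ka, k_sulf, k_gluc, V = 1.2, 0.20, 0.15, 42.0    # 1/h, 1/h, 1/h, litres (placeholders)

def rhs(t, y):
    gut, central, sulf, gluc = y
    absorption = ka * gut
    return [-absorption,
            absorption - (k_sulf + k_gluc) * central,
            k_sulf * central,
            k_gluc * central]

dose_mg = 1000.0
sol = solve_ivp(rhs, (0.0, 12.0), [dose_mg, 0.0, 0.0, 0.0], dense_output=True)
t = np.linspace(0.0, 12.0, 49)
plasma_conc = sol.sol(t)[1] / V                  # central-compartment concentration, mg/L
print("peak concentration (mg/L):", plasma_conc.max().round(2))
```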

  16. A Framework for Conceptual Modeling of Geographic Data Quality

    DEFF Research Database (Denmark)

    Friis-Christensen, Anders; Christensen, J.V.; Jensen, Christian Søndergaard

    2004-01-01

    Sustained advances in wireless communications, geo-positioning, and consumer electronics pave the way to a kind of location-based service that relies on the tracking of the continuously changing positions of an entire population of service users. This type of service is characterized by large ... of geographic data and quality. The approach integrates quality information with the basic model constructs. This results in a model that enables object-oriented specification of quality requirements and of acceptable quality levels. More specifically, it extends the Unified Modeling Language with new modeling constructs based on standard classes, attributes, and associations that include quality information. A case study illustrates the utility of the quality-enabled model.

  17. Formulation, construction and analysis of kinetic models of metabolism: A review of modelling frameworks

    DEFF Research Database (Denmark)

    Saa, Pedro A.; Nielsen, Lars K.

    2017-01-01

    Kinetic models are critical to predict the dynamic behaviour of metabolic networks. Mechanistic kinetic models for large networks remain uncommon due to the difficulty of fitting their parameters. Recent modelling frameworks promise new ways to overcome this obstacle while retaining predictive capabilities. In this review, we present an overview of the relevant mathematical frameworks for kinetic formulation, construction and analysis. Starting with kinetic formalisms, we next review statistical methods for parameter inference, as well as recent computational frameworks applied to the construction...

  18. Enabling interoperability in planetary sciences and heliophysics: The case for an information model

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.

    2018-01-01

    The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability by supplying a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The information model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic level, and application level interoperability. We define these types of interoperability and focus on semantic level interoperability, the type of interoperability most directly enabled by an information model.

  19. Business Modeling Framework For Personalization In Mobile Business Services

    NARCIS (Netherlands)

    L-F. Pau (Louis-François); J. Dits (Joyce)

    2002-01-01

    The structure of a formal framework for personalization features in mobile business services is presented; it can be used to drive the business modeling of M-business services from a service provider's point of view. It also allows one to compute the revenue as linked to personalization

  20. Model-based safety architecture framework for complex systems

    NARCIS (Netherlands)

    Schuitemaker, Katja; Rajabali Nejad, Mohammadreza; Braakhuis, J.G.; Podofillini, Luca; Sudret, Bruno; Stojadinovic, Bozidar; Zio, Enrico; Kröger, Wolfgang

    2015-01-01

    The shift to transparency and rising need of the general public for safety, together with the increasing complexity and interdisciplinarity of modern safety-critical Systems of Systems (SoS) have resulted in a Model-Based Safety Architecture Framework (MBSAF) for capturing and sharing architectural

  1. Service business model framework and the service innovation scope

    NARCIS (Netherlands)

    van der Aa, W.; van der Rhee, B.; Victorino, L.

    2011-01-01

    In this paper we present a framework for service business models. We build on three streams of research. The first stream is the service management and marketing literature that focuses on the specific challenges of managing a service business. The second stream consists of research on e-business

  2. A Graph Based Framework to Model Virus Integration Sites

    Directory of Open Access Journals (Sweden)

    Raffaele Fronza

    2016-01-01

    Here, we addressed the challenge to: 1 define the notion of CIS on graph models, 2 demonstrate that the structure of CIS enters in the category of scale-free networks and 3 show that our network approach analyzes CIS dynamically in an integrated systems biology framework using the Retroviral Transposon Tagged Cancer Gene Database (RTCGD as a testing dataset.

  3. A Liver-centric Multiscale Modeling Framework for Xenobiotics

    Science.gov (United States)

    We describe a multi-scale framework for modeling acetaminophen-induced liver toxicity. Acetaminophen is a widely used analgesic. Overdose of acetaminophen can result in liver injury via its biotransformation into toxic product, which further induce massive necrosis. Our study foc...

  4. Application of a stochastic modelling framework to characterize the ...

    Indian Academy of Sciences (India)

    Application of a stochastic modelling framework to characterize the influence of different oxide scales on the solid particle erosion behaviour of boiler grade steel. S K Das. Volume 36 Issue 4 August 2011 pp 425-440 ...

  5. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  6. A Framework for Understanding Physics Students' Computational Modeling Practices

    Science.gov (United States)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by

  7. A Systematic Modelling Framework for Phase Transfer Catalyst Systems

    DEFF Research Database (Denmark)

    Anantpinijwatna, Amata; Sales-Cruz, Mauricio; Hyung Kim, Sun

    2016-01-01

    ... in an aqueous phase. These reacting systems are receiving increased attention as novel organic synthesis options due to their flexible operation, higher product yields, and ability to avoid hazardous or expensive solvents. Major considerations in the design and analysis of PTC systems are physical and chemical equilibria, as well as kinetic mechanisms and rates. This paper presents a modelling framework for design and analysis of PTC systems that requires a minimum amount of experimental data to develop and employ the necessary thermodynamic and reaction models and embeds them into a reactor model for simulation. The application of the framework is made to two cases in order to highlight the performance and issues of activity coefficient models for predicting design and operation and the effects when different organic solvents are employed.

  8. Enabling intelligent copernicus services for carbon and water balance modeling of boreal forest ecosystems - North State

    Science.gov (United States)

    Häme, Tuomas; Mutanen, Teemu; Rauste, Yrjö; Antropov, Oleg; Molinier, Matthieu; Quegan, Shaun; Kantzas, Euripides; Mäkelä, Annikki; Minunno, Francesco; Atli Benediktsson, Jon; Falco, Nicola; Arnason, Kolbeinn; Storvold, Rune; Haarpaintner, Jörg; Elsakov, Vladimir; Rasinmäki, Jussi

    2015-04-01

    The objective of project North State, funded by Framework Program 7 of the European Union, is to develop innovative data fusion methods that exploit the new generation of multi-source data from the Sentinels and other satellites in an intelligent, self-learning framework. The remote sensing outputs are interfaced with state-of-the-art carbon and water flux models for monitoring the fluxes over boreal Europe to reduce current large uncertainties. This will provide a paradigm for the development of products for future Copernicus services. The models to be interfaced are a dynamic vegetation model and a light use efficiency model. We have identified four groups of variables that will be estimated with remotely sensed data: land cover variables, forest characteristics, vegetation activity, and hydrological variables. The estimates will be used as model inputs and to validate the model outputs. The earth observation variables are computed as automatically as possible, with the objective of completely automatic estimation. North State has two sites for intensive studies, in southern and northern Finland respectively, one in Iceland and one in the Komi Republic of Russia. Additionally, the model input variables will be estimated and the models applied over the European boreal and sub-arctic region from the Ural Mountains to Iceland. The accuracy assessment of the earth observation variables will follow a statistical sampling design. Model output predictions are compared to earth observation variables, and flux tower measurements are also applied in the model assessment. In the paper, results of hyperspectral, Sentinel-1, and Landsat data and their use in the models are presented, together with an example of a completely automatic land cover class prediction.

  9. A framework for quantifying net benefits of alternative prognostic models

    OpenAIRE

    Rapsomaniki, E.; White, I.R.; Wood, A.M.; Thompson, S.G.; Ford, I.

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measure...

  10. A Practical Ontology Framework for Static Model Analysis

    Science.gov (United States)

    2011-04-26

    throughout the model. We implement our analysis framework on top of Ptolemy II [3], an extensible open source model-based design tool written in Java... While Ptolemy II makes a good testbed for implementing and experimenting with new analyses, we also feel that the techniques we present here are... broadly useful. For this reason, we aim to make our analysis framework orthogonal to the execution semantics of Ptolemy II, allowing it to be

  11. COINSTAC: A privacy enabled model and prototype for leveraging and processing decentralized brain imaging data

    Directory of Open Access Journals (Sweden)

    Sergey M Plis

    2016-08-01

    The field of neuroimaging has embraced the need for sharing and collaboration. Data sharing mandates from public funding agencies and major journal publishers have spurred the development of data repositories and neuroinformatics consortia. However, efficient and effective data sharing still faces several hurdles. For example, open data sharing is on the rise but is not suitable for sensitive data that are not easily shared, such as genetics. Current approaches can be cumbersome (such as negotiating multiple data sharing agreements). There are also significant data transfer, organization and computational challenges. Centralized repositories only partially address the issues. We propose a dynamic, decentralized platform for large scale analyses called the Collaborative Informatics and Neuroimaging Suite Toolkit for Anonymous Computation (COINSTAC). The COINSTAC solution can include data missing from central repositories, allows pooling of both open and "closed" repositories by developing privacy-preserving versions of widely-used algorithms, and incorporates the tools within an easy-to-use platform enabling distributed computation. We present an initial prototype system which we demonstrate on two multi-site data sets, without aggregating the data. In addition, by iterating across sites, the COINSTAC model enables meta-analytic solutions to converge to "pooled-data" solutions (i.e., as if the entire data were in hand). More advanced approaches such as feature generation, matrix factorization models, and preprocessing can be incorporated into such a model. In sum, COINSTAC enables access to the many currently unavailable data sets, a user friendly privacy enabled interface for decentralized analysis, and a powerful solution that complements existing data sharing solutions.
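
    The toy below illustrates the decentralized pattern described above, not COINSTAC itself: each site keeps its raw data and ships only a local summary, and the coordinator combines the summaries into a pooled estimate; the Laplace noise is an optional, purely illustrative privacy mechanism.

```python
# Toy illustration of the decentralized pattern (not COINSTAC code): each site
# keeps its raw data and ships only a local summary; the coordinator combines
# the summaries into a "pooled-data" estimate.
import numpy as np

rng = np.random.default_rng(0)
site_data = [rng.normal(loc=mu, scale=1.0, size=n)        # raw data never leaves a site
             for mu, n in [(0.2, 120), (0.5, 80), (0.35, 200)]]

def local_summary(x, noise_scale=0.1):
    noise = rng.laplace(scale=noise_scale) if noise_scale > 0 else 0.0
    return len(x), x.sum() + noise                         # only (count, noised sum) is shared

summaries = [local_summary(x) for x in site_data]
pooled_mean = sum(s for _, s in summaries) / sum(n for n, _ in summaries)

print("decentralized estimate:", round(pooled_mean, 3))
print("as if all data were in hand:", round(np.concatenate(site_data).mean(), 3))
```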

  12. Advancing Integrated Systems Modelling Framework for Life Cycle Sustainability Assessment

    Directory of Open Access Journals (Sweden)

    Anthony Halog

    2011-02-01

    The need for an integrated methodological framework for sustainability assessment has been widely discussed and is urgent due to increasingly complex environmental system problems. These problems have impacts on ecosystems and human well-being which represent a threat to the economic performance of countries and corporations. Integrated assessment crosses issues; spans spatial and temporal scales; looks forward and backward; and incorporates multi-stakeholder inputs. This study aims to develop an integrated methodology by capitalizing on the complementary strengths of different methods used by industrial ecologists and biophysical economists. The computational methodology proposed here is a systems-perspective, integrative, and holistic approach to sustainability assessment which attempts to link basic science and technology to policy formulation. The framework adopts life cycle thinking methods (LCA, LCC, and SLCA); stakeholder analysis supported by multi-criteria decision analysis (MCDA); and dynamic system modelling. Following the Pareto principle, the critical sustainability criteria, indicators and metrics (i.e., hotspots) can be identified and further modelled using system dynamics or agent based modelling and improved by data envelopment analysis (DEA) and sustainability network theory (SNT). The framework is being applied to the development of biofuel supply chain networks. The framework can provide new ways of integrating knowledge across the divides between social and natural sciences as well as between critical and problem-solving research.

  13. On New Cautious Structural Reliability Models in the Framework of imprecise Probabilities

    DEFF Research Database (Denmark)

    Utkin, Lev V.; Kozine, Igor

    2010-01-01

    Uncertainty of parameters in engineering design has been modeled in different frameworks such as interval analysis, fuzzy set and possibility theories, random set theory and imprecise probability theory. The authors of this paper have for many years been developing new imprecise reliability ...? Developing models enabling one to answer these two questions has been the focus of the new research, the results of which are described in the paper. In this paper we describe new models for computing structural reliability based on measurements of values of stress and strength, taking account of the fact that the number of observations may be rather small. The approach to developing the models is based on using the imprecise Bayesian inference models (Walley 1996). These models provide a rich supply of coherent imprecise inferences that are expressed in terms of posterior upper and lower prob...

  14. Possibilities: A framework for modeling students' deductive reasoning in physics

    Science.gov (United States)

    Gaffney, Jonathan David Housley

    Students often make errors when trying to solve qualitative or conceptual physics problems, and while many successful instructional interventions have been generated to prevent such errors, the process of deduction that students use when solving physics problems has not been thoroughly studied. In an effort to better understand that reasoning process, I have developed a new framework, which is based on the mental models framework in psychology championed by P. N. Johnson-Laird. My new framework models how students search possibility space when thinking about conceptual physics problems and suggests that errors arise from failing to flesh out all possibilities. It further suggests that instructional interventions should focus on making apparent those possibilities, as well as all physical consequences those possibilities would incur. The possibilities framework emerged from the analysis of data from a unique research project specifically invented for the purpose of understanding how students use deductive reasoning. In the selection task, participants were given a physics problem along with three written possible solutions with the goal of identifying which one of the three possible solutions was correct. Each participant was also asked to identify the errors in the incorrect solutions. For the study presented in this dissertation, participants not only performed the selection task individually on four problems, but they were also placed into groups of two or three and asked to discuss with each other the reasoning they used in making their choices and attempt to reach a consensus about which solution was correct. Finally, those groups were asked to work together to perform the selection task on three new problems. The possibilities framework appropriately models the reasoning that students use, and it makes useful predictions about potentially helpful instructional interventions. The study reported in this dissertation emphasizes the useful insight the

  15. A new fit-for-purpose model testing framework: Decision Crash Tests

    Science.gov (United States)

    Tolson, Bryan; Craig, James

    2016-04-01

    decisions. In one case, we show the set of model building decisions has a low probability to correctly support the upgrade decision. In the other case, we show evidence suggesting another set of model building decisions has a high probability to correctly support the decision. The proposed DCT framework focuses on what model users typically care about: the management decision in question. The DCT framework will often be very strict and will produce easy to interpret results enabling clear unsuitability determinations. In the past, hydrologic modelling progress has necessarily meant new models and model building methods. Continued progress in hydrologic modelling requires finding clear evidence to motivate researchers to disregard unproductive models and methods and the DCT framework is built to produce this kind of evidence. References: Andréassian, V., C. Perrin, L. Berthet, N. Le Moine, J. Lerat, C. Loumagne, L. Oudin, T. Mathevet, M.-H. Ramos, and A. Valéry (2009), Crash tests for a standardized evaluation of hydrological models. Hydrology and Earth System Sciences, 13, 1757-1764. Klemeš, V. (1986), Operational testing of hydrological simulation models. Hydrological Sciences Journal, 31 (1), 13-24.

  16. Re-orienting a remote acute care model towards a primary health care approach: key enablers.

    Science.gov (United States)

    Carroll, Vicki; Reeve, Carole A; Humphreys, John S; Wakerman, John; Carter, Maureen

    2015-01-01

    The objective of this study was to identify the key enablers of change in re-orienting a remote acute care model to comprehensive primary healthcare delivery. The setting of the study was a 12-bed hospital in Fitzroy Crossing, Western Australia. Individual key informant, in-depth interviews were completed with five of six identified senior leaders involved in the development of the Fitzroy Valley Health Partnership. Interviews were recorded and transcripts were thematically analysed by two investigators for shared views about the enabling factors strengthening primary healthcare delivery in a remote region of Australia. Participants described the establishment of a culturally relevant primary healthcare service, using a community-driven, 'bottom up' approach characterised by extensive community participation. The formal partnership across the government and community controlled health services was essential, both to enable change to occur and to provide sustainability in the longer term. A hierarchy of major themes emerged. These included community participation, community readiness and desire for self-determination; linkages in the form of a government community controlled health service partnership; leadership; adequate infrastructure; enhanced workforce supply; supportive policy; and primary healthcare funding. The strong united leadership shown by the community and the health service enabled barriers to be overcome and it maximised the opportunities provided by government policy changes. The concurrent alignment around a common vision enabled implementation of change. The key principle learnt from this study is the importance of community and health service relationships and local leadership around a shared vision for the re-orientation of community health services.

  17. The development of a sustainable development model framework

    International Nuclear Information System (INIS)

    Hannoura, Alim P.; Cothren, Gianna M.; Khairy, Wael M.

    2006-01-01

    The emergence of the 'sustainable development' concept as a response to the mining of natural resources for the benefit of multinational corporations has advanced the cause of long-term environmental management. A sustainable development model (SDM) framework that is inclusive of the 'whole' natural environment is presented to illustrate the integration of the sustainable development of the 'whole' ecosystem. The ecosystem approach is an inclusive framework that covers the natural environment relevant futures and constraints. These are dynamically interconnected and constitute the determinants of the resources development component of the SDM. The second component of the SDM framework is the resources development patterns, i.e., the use of land, water, and atmospheric resources. All of these patterns include practices that utilize environmental resources to achieve a predefined outcome, producing waste and by-products that require disposal into the environment. The water quality management practices represent the third component of the framework. These practices are governed by standards, limitations and available disposal means subject to quantity and quality permits. These interconnected standards, practices and permits shape the resulting environmental quality of the ecosystem under consideration. A fourth component, environmental indicators, of the SDM framework provides a measure of the ecosystem productivity and status that may differ based on societal values and culture. The four components of the SDM are interwoven into an outcome assessment process to form the management and feedback models. The concept of Sustainable Development is expressed in the management model as an objective function subject to desired constraints imposing the required bounds for achieving ecosystem sustainability. The development of the objective function and constraints requires monetary values for ecosystem functions, resources development activities and environmental cost. The

  18. Enablers and inhibitors of the implementation of the Casalud Model, a Mexican innovative healthcare model for non-communicable disease prevention and control.

    Science.gov (United States)

    Tapia-Conyer, Roberto; Saucedo-Martinez, Rodrigo; Mujica-Rosales, Ricardo; Gallardo-Rincon, Hector; Campos-Rivera, Paola Abril; Lee, Evan; Waugh, Craig; Guajardo, Lucia; Torres-Beltran, Braulio; Quijano-Gonzalez, Ursula; Soni-Gallardo, Lidia

    2016-07-22

    The Mexican healthcare system is under increasing strain due to the rising prevalence of non-communicable diseases (especially type 2 diabetes), mounting costs, and a reactive curative approach focused on treating existing diseases and their complications rather than preventing them. Casalud is a comprehensive primary healthcare model that enables proactive prevention and disease management throughout the continuum of care, using innovative technologies and a patient-centred approach. Data were collected over a 2-year period in eight primary health clinics (PHCs) in two states in central Mexico to identify and assess enablers and inhibitors of the implementation process of Casalud. We used mixed quantitative and qualitative data collection tools: surveys, in-depth interviews, and participant and non-participant observations. Transcripts and field notes were analyzed and coded using Framework Analysis, focusing on defining and describing enablers and inhibitors of the implementation process. We identified seven recurring topics in the analyzed textual data. Four topics were categorized as enablers: political support for the Casalud model, alignment with current healthcare trends, ongoing technical improvements (to ease adoption and support), and capacity building. Three topics were categorized as inhibitors: administrative practices, health clinic human resources, and the lack of a shared vision of the model. Enablers are located at PHCs and across all levels of government, and include political support for, and the technological validity of, the model. The main inhibitor is the persistence of obsolete administrative practices at both state and PHC levels, which puts the administrative feasibility of the model's implementation in jeopardy. Constructing a shared vision around the model could facilitate the implementation of Casalud as well as circumvent administrative inhibitors. In order to overcome PHC-level barriers, it is crucial to have an efficient and

  19. Interpretive Structural Modeling Of Implementation Enablers For Just In Time In ICPI

    Directory of Open Access Journals (Sweden)

    Nitin Upadhye

    2014-12-01

    Indian Corrugated Packaging Industries (ICPI) have built up tough competition among themselves in terms of product cost, quality, product delivery, flexibility and, finally, customer demand. As their customers, mostly OEMs, are asking for Just in Time (JIT) deliveries, ICPI must implement JIT in their systems. The term "JIT" denotes a system that utilizes less, in terms of all inputs, to create the same outputs as those created by a traditional mass production system, while contributing increased varieties for the end customer (Womack et al. 1990). JIT focuses on abolishing or reducing Muda ("Muda" is the Japanese word for waste) and on maximizing or fully utilizing activities that add value from the customer's perspective. There is a lack of awareness in identifying the right enablers of JIT implementation. Therefore, this study has tried to identify the enablers from the literature and from the opinions of experts in corrugated packaging industries, and has developed the relationship matrix to examine the driving power and dependence between them. In this study, modelling has been done to capture the interrelationships between the enablers with the help of Interpretive Structural Modeling (ISM) and Cross Impact Matrix Multiplication Applied to Classification (MICMAC) analysis, for the performance of Indian corrugated packaging industries.
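
    The sketch below shows the matrix bookkeeping behind ISM/MICMAC as commonly described: starting from a binary contextual-relationship matrix among enablers, the reachability matrix is obtained by transitive closure, and driving power and dependence are read off as row and column sums; the 4x4 example matrix is made up, not the study's enablers.

```python
# Sketch of the matrix bookkeeping behind ISM/MICMAC: from a binary relation
# matrix among enablers (1 = "enabler i influences enabler j"), the transitive
# closure gives the reachability matrix; driving power and dependence are its
# row and column sums.  The 4x4 matrix is a made-up example.
import numpy as np

A = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 1]])

R = A.copy()
for _ in range(len(A)):                       # transitive closure (reachability matrix)
    R = ((R + R @ A) > 0).astype(int)

driving_power = R.sum(axis=1)                 # how many enablers each one can reach
dependence = R.sum(axis=0)                    # by how many enablers each one is reached
for i, (dp, dep) in enumerate(zip(driving_power, dependence), start=1):
    print(f"enabler {i}: driving power = {dp}, dependence = {dep}")
```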

  20. Viewpoints: a framework for object oriented database modelling and distribution

    Directory of Open Access Journals (Sweden)

    Fouzia Benchikha

    2006-01-01

    The viewpoint concept has received widespread attention recently. Its integration into a data model improves the flexibility of the conventional object-oriented data model and allows one to improve the modelling power of objects. The viewpoint paradigm can be used as a means of providing multiple descriptions of an object and as a means of mastering the complexity of current database systems enabling them to be developed in a distributed manner. The contribution of this paper is twofold: to define an object data model integrating viewpoints in databases and to present a federated database system integrating multiple sources following a local-as-extended-view approach.

  1. Integrating predictive frameworks and cognitive models of face perception.

    Science.gov (United States)

    Trapp, Sabrina; Schweinberger, Stefan R; Hayward, William G; Kovács, Gyula

    2018-02-08

    The idea of a "predictive brain"-that is, the interpretation of internal and external information based on prior expectations-has been elaborated intensely over the past decade. Several domains in cognitive neuroscience have embraced this idea, including studies in perception, motor control, language, and affective, social, and clinical neuroscience. Despite the various studies that have used face stimuli to address questions related to predictive processing, there has been surprisingly little connection between this work and established cognitive models of face recognition. Here we suggest that the predictive framework can serve as an important complement of established cognitive face models. Conversely, the link to cognitive face models has the potential to shed light on issues that remain open in predictive frameworks.

  2. Designing the Business Models for Circular Economy—Towards the Conceptual Framework

    Directory of Open Access Journals (Sweden)

    Mateusz Lewandowski

    2016-01-01

    Switching from the current linear model of economy to a circular one has recently attracted increased attention from major global companies (e.g., Google, Unilever, Renault) and policymakers attending the World Economic Forum. The reasons for this are the huge financial, social and environmental benefits. However, the global shift from one model of economy to another also concerns smaller companies on a micro-level. Thus, comprehensive knowledge on designing circular business models is needed to stimulate and foster implementation of the circular economy. Existing business models for the circular economy have limited transferability and there is no comprehensive framework supporting every kind of company in designing a circular business model. This study employs a literature review to identify and classify the circular economy characteristics according to a business model structure. The investigation of the eight sub-domains of research on circular business models was used to redefine the components of the business model canvas in the context of the circular economy. Two new components, the take-back system and adoption factors, have been identified, thereby leading to the conceptualization of an extended framework for the circular business model canvas. Additionally, the triple fit challenge has been recognized as an enabler of the transition towards a circular business model. Some directions for further research have been outlined as well.

  3. Land surface Verification Toolkit (LVT) - a generalized framework for land surface model evaluation

    Science.gov (United States)

    Kumar, S. V.; Peters-Lidard, C. D.; Santanello, J.; Harrison, K.; Liu, Y.; Shaw, M.

    2012-06-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
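
    As a minimal illustration of the accuracy-based part of such an evaluation (not LVT code), the sketch below computes a few standard verification measures for a pair of co-located model and observation series; the arrays are placeholders.

```python
# Tiny sketch (not LVT itself) of the accuracy-based part of such an evaluation:
# standard verification measures for co-located model and observation series.
import numpy as np

obs = np.array([0.21, 0.25, 0.30, 0.28, 0.24, 0.22])   # e.g. observed soil moisture (placeholder)
sim = np.array([0.19, 0.27, 0.33, 0.27, 0.22, 0.25])   # co-located model output (placeholder)

bias = np.mean(sim - obs)
rmse = np.sqrt(np.mean((sim - obs) ** 2))
corr = np.corrcoef(sim, obs)[0, 1]
nse = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)   # Nash-Sutcliffe efficiency

print(f"bias={bias:.3f}  RMSE={rmse:.3f}  R={corr:.2f}  NSE={nse:.2f}")
```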

  4. A practical framework for the construction of a biotracing model: application to Salmonella in the pork slaughter chain.

    Science.gov (United States)

    Smid, J H; Swart, A N; Havelaar, A H; Pielaat, A

    2011-09-01

    A novel purpose of the use of mathematical models in quantitative microbial risk assessment (QMRA) is to identify the sources of microbial contamination in a food chain (i.e., biotracing). In this article we propose a framework for the construction of a biotracing model, eventually to be used in industrial food production chains where discrete numbers of products are processed that may be contaminated by a multitude of sources. The framework consists of steps in which a Monte Carlo model, simulating sequential events in the chain following a modular process risk modeling (MPRM) approach, is converted to a Bayesian belief network (BBN). The resulting model provides a probabilistic quantification of concentrations of a pathogen throughout a production chain. A BBN allows for updating the parameters of the model based on observational data, and global parameter sensitivity analysis is readily performed in a BBN. Moreover, a BBN enables "backward reasoning" when downstream data are available and is therefore a natural framework for answering biotracing questions. The proposed framework is illustrated with a biotracing model of Salmonella in the pork slaughter chain, based on a recently published Monte Carlo simulation model. This model, implemented as a BBN, describes the dynamics of Salmonella in a Dutch slaughterhouse and enables finding the source of contamination of specific carcasses at the end of the chain. © 2011 Society for Risk Analysis.
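As a rough illustration of the "backward reasoning" that a Bayesian belief network supports, the toy sketch below applies Bayes' rule to attribute an observed contamination to one of several hypothetical sources; the source names, priors and likelihoods are invented, and the published model is a full network over the slaughter chain rather than this single-step calculation.

```python
# Toy single-step source attribution via Bayes' rule (NumPy only).
# The paper's biotracing model is a Bayesian belief network over the whole
# slaughter chain; this sketch only conveys the idea of backward inference.
import numpy as np

sources = ["farm", "transport", "slaughter_line"]        # hypothetical sources
prior = np.array([0.5, 0.2, 0.3])                        # assumed prior probabilities
p_contam_given_source = np.array([0.10, 0.05, 0.20])     # assumed likelihoods

posterior = prior * p_contam_given_source
posterior /= posterior.sum()                             # normalize to P(source | contaminated)

for s, p in zip(sources, posterior):
    print(f"P({s} | carcass contaminated) = {p:.2f}")
```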

  5. A new framework for an electrophotographic printer model

    Science.gov (United States)

    Colon-Lopez, Fermin A.

    Digital halftoning is a printing technology that creates the illusion of continuous tone images for printing devices such as electrophotographic printers that can only produce a limited number of tone levels. Digital halftoning works because the human visual system has limited spatial resolution which blurs the printed dots of the halftone image, creating the gray sensation of a continuous tone image. Because the printing process is imperfect it introduces distortions to the halftone image. The quality of the printed image depends, among other factors, on the complex interactions between the halftone image, the printer characteristics, the colorant, and the printing substrate. Printer models are used to assist in the development of new types of halftone algorithms that are designed to withstand the effects of printer distortions. For example, model-based halftone algorithms optimize the halftone image through an iterative process that integrates a printer model within the algorithm. The two main goals of a printer model are to provide accurate estimates of the tone and of the spatial characteristics of the printed halftone pattern. Various classes of printer models, from simple tone calibrations to complex mechanistic models, have been reported in the literature. Existing models have one or more of the following limiting factors: they only predict tone reproduction, they depend on the halftone pattern, they require complex calibrations or complex calculations, they are printer specific, they reproduce unrealistic dot structures, and they are unable to adapt responses to new data. The two research objectives of this dissertation are (1) to introduce a new framework for printer modeling and (2) to demonstrate the feasibility of such a framework in building an electrophotographic printer model. The proposed framework introduces the concept of modeling a printer as a texture transformation machine. The basic premise is that modeling the texture differences between the

  6. Volterra representation enables modeling of complex synaptic nonlinear dynamics in large-scale simulations.

    Science.gov (United States)

    Hu, Eric Y; Bouteiller, Jean-Marie C; Song, Dong; Baudry, Michel; Berger, Theodore W

    2015-01-01

Chemical synapses comprise a wide collection of intricate signaling pathways involving complex dynamics. These mechanisms are often reduced to simple spikes or exponential representations in order to enable computer simulations at higher spatial levels of complexity. However, these representations cannot capture important nonlinear dynamics found in synaptic transmission. Here, we propose an input-output (IO) synapse model capable of generating complex nonlinear dynamics while maintaining low computational complexity. This IO synapse model is an extension of a detailed mechanistic glutamatergic synapse model, capable of capturing the input-output relationships of the mechanistic model using the Volterra functional power series. We demonstrate that the IO synapse model is able to successfully track the nonlinear dynamics of the synapse up to the third order with high accuracy. We also evaluate the accuracy of the IO synapse model at different input frequencies and compare its performance with that of kinetic models in compartmental neuron models. Our results demonstrate that the IO synapse model is capable of efficiently replicating complex nonlinear dynamics that were represented in the original mechanistic model and provide a method to replicate complex and diverse synaptic transmission within neuron network simulations.
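For reference, the Volterra functional power series underlying the IO synapse model, truncated at third order as in the record, has the general form below (x is the input, y the output, and the kernels k_i are estimated from the mechanistic model); the memory length T and the discretization used by the authors are implementation details not given here.

```latex
% General third-order Volterra expansion of an input-output system
y(t) = k_0
     + \int_{0}^{T} k_1(\tau_1)\, x(t-\tau_1)\, d\tau_1
     + \int_{0}^{T}\!\!\int_{0}^{T} k_2(\tau_1,\tau_2)\, x(t-\tau_1)\, x(t-\tau_2)\, d\tau_1\, d\tau_2
     + \int_{0}^{T}\!\!\int_{0}^{T}\!\!\int_{0}^{T} k_3(\tau_1,\tau_2,\tau_3)\, x(t-\tau_1)\, x(t-\tau_2)\, x(t-\tau_3)\, d\tau_1\, d\tau_2\, d\tau_3
```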

  7. Service and business model for technology enabled and home-based cardiac rehabilitation programs.

    Science.gov (United States)

    Sarela, Antti; Whittaker, Frank; Korhonen, Ilkka

    2009-01-01

Cardiac rehabilitation programs are comprehensive lifestyle programs aimed at preventing the recurrence of a cardiac event. However, current programs globally have significantly low levels of uptake. A home-based model can be a viable alternative to hospital-based programs. We developed and analysed a service and business model for home-based cardiac rehabilitation based on personal mentoring using mobile phones and web services. We analysed the different organizational and economic aspects of setting up and running the home-based program and propose a potential business model for a sustainable and viable service. The model can be extended to the management of other chronic conditions to enable the transition from hospital- and care-centre-based treatments to sustainable home-based care.

  8. An Observation Capability Metadata Model for EO Sensor Discovery in Sensor Web Enablement Environments

    Directory of Open Access Journals (Sweden)

    Chuli Hu

    2014-10-01

Full Text Available Accurate and fine-grained discovery of diverse Earth observation (EO) sensors ensures a comprehensive response to collaborative observation-required emergency tasks. This discovery remains a challenge in an EO sensor web environment. In this study, we propose an EO sensor observation capability metadata model that reuses and extends the existing sensor observation-related metadata standards to enable the accurate and fine-grained discovery of EO sensors. The proposed model is composed of five sub-modules, namely, ObservationBreadth, ObservationDepth, ObservationFrequency, ObservationQuality and ObservationData. The model is applied to different types of EO sensors and is formalized by the Open Geospatial Consortium Sensor Model Language 1.0. The GeosensorQuery prototype retrieves the qualified EO sensors based on the provided geo-event. An actual application to flood emergency observation in the Yangtze River Basin in China is conducted, and the results indicate that sensor inquiry can accurately achieve fine-grained discovery of qualified EO sensors and obtain enriched observation capability information. In summary, the proposed model enables an efficient encoding system that ensures minimum unification to represent the observation capabilities of EO sensors. The model functions as a foundation for the efficient discovery of EO sensors. In addition, the definition and development of this proposed EO sensor observation capability metadata model is a helpful step in extending the Sensor Model Language (SensorML) 2.0 Profile for the description of the observation capabilities of EO sensors.

  9. Analysis of diet optimization models for enabling conditions for hypertrophic muscle enlargement in athletes

    Directory of Open Access Journals (Sweden)

    L. Matijević

    2013-01-01

Full Text Available In this study, mathematical models were created and used for diet optimization for an athlete (a recreational bodybuilder) in the pre-tournament period. The main aim was to determine weekly menus that enable conditions for hypertrophic muscle enlargement and reduce body fat mass. Each daily offer was planned to contain six to seven meals while respecting several of the user's personal demands. The optimal carbohydrate, fat and protein ratio in a diet for enabling hypertrophy, as recommended in the literature, was found to be 43:30:27 and was chosen as the target in this research. The variables included in the models were the presented dishes, and the constraints were the observed values of the offers: price, mass of consumed food, energy, water and content of different nutrients. The general idea was to create the models and to compare different programs for solving the problem. LINDO and MS Excel were recognized as widely used and were chosen for model testing and examination. Both programs suggested weekly menus that were acceptable to the user and met all recommendations and demands. The weekly menus were analysed and compared. Sensitivity tests from both programs were used to detect possible critical points in the menu. The programs produced slightly different results, but with a very high correlation between the proposed weekly intakes (R2 = 0.99856, p < 0.05), so both can be successfully used in the pre-tournament period of bodybuilding and recommended for this complex task.
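A minimal sketch of the kind of linear program described here, with scipy.optimize.linprog standing in for LINDO/MS Excel: choose food amounts that hit the 43:30:27 carbohydrate:fat:protein energy split at minimum cost. The food list, compositions and prices are rough illustrative numbers, not the study's data.

```python
# Toy diet LP: minimize cost subject to macronutrient gram targets derived
# from a 43:30:27 carbohydrate:fat:protein energy split. All food data are
# illustrative placeholders.
import numpy as np
from scipy.optimize import linprog

foods = ["rice", "chicken", "olive_oil", "oats"]
carb = np.array([0.28, 0.00, 0.00, 0.66])       # g carbohydrate per g food
fat = np.array([0.003, 0.036, 1.00, 0.07])      # g fat per g food
prot = np.array([0.027, 0.31, 0.00, 0.17])      # g protein per g food
price = np.array([0.002, 0.010, 0.008, 0.003])  # cost per g food (illustrative)

energy_target = 2500.0                          # kcal per day (assumed)
shares = {"carb": 0.43, "fat": 0.30, "prot": 0.27}
# convert energy shares to gram targets (4 kcal/g for carbs and protein, 9 kcal/g for fat)
targets = np.array([shares["carb"] * energy_target / 4,
                    shares["fat"] * energy_target / 9,
                    shares["prot"] * energy_target / 4])

A_eq = np.vstack([carb, fat, prot])             # grams of each macro delivered
res = linprog(c=price, A_eq=A_eq, b_eq=targets,
              bounds=[(0, None)] * len(foods), method="highs")
assert res.success, res.message

for name, grams in zip(foods, res.x):
    print(f"{name:10s} {grams:7.1f} g")
print(f"daily cost: {res.fun:.2f}")
```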

  10. Adoption of mobile learning among 3g-enabled handheld users using extended technology acceptance model

    Directory of Open Access Journals (Sweden)

    Fadare Oluwaseun Gbenga

    2013-12-01

Full Text Available This paper examines various constructs of an extended Technology Acceptance Model (TAM) that theoretically influence the adoption and acceptability of mobile learning among 3G-enabled mobile users. The activity-based mobile learning tasks used for this study were drawn from behaviourist and “learning and teaching support” educational paradigms. Online and manual survey instruments were used to gather data. Structural equation modelling techniques were then employed to explain the adoption processes of the hypothesized research model. A theoretical model, ETAM, was developed based on TAM. Our results showed that the psychometric constructs of TAM can be extended and that ETAM is well suited, and a good pedagogical tool, for understanding mobile learning among 3G-enabled handheld devices in the southwest part of Nigeria. Cognitive constructs, attitude toward m-learning and self-efficacy play significant roles in influencing behavioural intention for mobile learning, of which self-efficacy is the most important construct. Implications of the results and directions for future research are discussed.

  11. The ACTIVE conceptual framework as a structural equation model.

    Science.gov (United States)

    Gross, Alden L; Payne, Brennan R; Casanova, Ramon; Davoudzadeh, Pega; Dzierzewski, Joseph M; Farias, Sarah; Giovannetti, Tania; Ip, Edward H; Marsiske, Michael; Rebok, George W; Schaie, K Warner; Thomas, Kelsey; Willis, Sherry; Jones, Richard N

    2018-01-01

Background/Study Context: Conceptual frameworks are analytic models at a high level of abstraction. Their operationalization can inform randomized trial design and sample size considerations. The Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) conceptual framework was empirically tested using structural equation modeling (N=2,802). ACTIVE was guided by a conceptual framework for cognitive training in which proximal cognitive abilities (memory, inductive reasoning, speed of processing) mediate treatment-related improvement in primary outcomes (everyday problem-solving, difficulty with activities of daily living, everyday speed, driving difficulty), which in turn lead to improved secondary outcomes (health-related quality of life, health service utilization, mobility). Measurement models for each proximal, primary, and secondary outcome were developed and tested using baseline data. Each construct was then combined in one model to evaluate fit (RMSEA, CFI, normalized residuals of each indicator). To expand the conceptual model and potentially inform future trials, evidence of modification of structural model parameters was evaluated by age, years of education, sex, race, and self-rated health status. Preconceived measurement models for memory, reasoning, speed of processing, everyday problem-solving, instrumental activities of daily living (IADL) difficulty, everyday speed, driving difficulty, and health-related quality of life each fit well to the data (all RMSEA .95). Fit of the full model was excellent (RMSEA = .038; CFI = .924). In contrast with previous findings from ACTIVE regarding who benefits from training, interaction testing revealed that associations between proximal abilities and primary outcomes are stronger on average for participants of nonwhite race, in worse health, of older age, and with less education (p conceptual model. Findings suggest that the types of people who show intervention effects on cognitive performance potentially may be different from
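For reference, the two global fit indices reported in this record are commonly defined as follows (χ²_M and df_M refer to the fitted model, χ²_B and df_B to the baseline model, and N to the sample size); exact conventions vary slightly across software.

```latex
% Standard definitions of the fit indices cited in the record
\mathrm{RMSEA} = \sqrt{\frac{\max\!\left(\chi^2_M - df_M,\; 0\right)}{df_M\,(N-1)}},
\qquad
\mathrm{CFI} = 1 - \frac{\max\!\left(\chi^2_M - df_M,\; 0\right)}{\max\!\left(\chi^2_B - df_B,\; 0\right)}
```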

  12. Mechanisms of Soil Aggregation: a biophysical modeling framework

    Science.gov (United States)

    Ghezzehei, T. A.; Or, D.

    2016-12-01

Soil aggregation is one of the main crosscutting concepts in all sub-disciplines and applications of soil science from agriculture to climate regulation. The concept generally refers to adhesion of primary soil particles into distinct units that remain stable when subjected to disruptive forces. It is one of the most sensitive soil qualities that readily respond to disturbances such as cultivation, fire, drought, flooding, and changes in vegetation. These changes are commonly quantified and incorporated in soil models indirectly as alterations in carbon content and type, bulk density, aeration, permeability, as well as water retention characteristics. Soil aggregation that is primarily controlled by organic matter generally exhibits hierarchical organization of soil constituents into stable units that range in size from a few microns to centimeters. However, this conceptual model of soil aggregation as the key unifying mechanism remains poorly quantified and is rarely included in predictive soil models. Here we provide a biophysical framework for quantitative and predictive modeling of soil aggregation and its attendant soil characteristics. The framework treats aggregates as hotspots of biological, chemical and physical processes centered around roots and root residue. We keep track of the life cycle of an individual aggregate from its genesis in the rhizosphere, fueled by rhizodeposition and mediated by vigorous microbial activity, until its disappearance when the root-derived resources are depleted. The framework synthesizes current understanding of microbial life in porous media; water holding and soil binding capacity of biopolymers; and environmental controls on soil organic matter dynamics. The framework paves the way for integration of processes that are presently modeled as disparate or poorly coupled processes, including storage and protection of carbon, microbial activity, greenhouse gas fluxes, movement and storage of water, resistance of soils against

  13. A Building Model Framework for a Genetic Algorithm Multi-objective Model Predictive Control

    DEFF Research Database (Denmark)

    Arendt, Krzysztof; Ionesi, Ana; Jradi, Muhyiddine

    2016-01-01

    Mock-Up Interface, which is used to link the models with the MPC system. The framework was used to develop and run initial thermal and CO2 models. Their performance and the implementation procedure are discussed in the present paper. The framework is going to be implemented in the MPC system planned...

  14. Development of an integrated modelling framework: comparing client-server and demand-driven control flow for model execution

    Science.gov (United States)

    Schmitz, Oliver; Karssenberg, Derek; de Jong, Kor; de Kok, Jean-Luc; de Jong, Steven M.

    2014-05-01

    The construction of hydrological models at the catchment or global scale depends on the integration of component models representing various environmental processes, often operating at different spatial and temporal discretisations. A flexible construction of spatio-temporal model components, a means to specify aggregation or disaggregation to bridge discretisation discrepancies, ease of coupling these into complex integrated models, and support for stochastic modelling and the assessment of model outputs are the desired functionalities for the development of integrated models. These functionalities are preferably combined into one modelling framework such that domain specialists can perform exploratory model development without the need to change their working environment. We implemented an integrated modelling framework in the Python programming language, providing support for 1) model construction and 2) model execution. The framework enables modellers to represent spatio-temporal processes or to specify spatio-temporal (dis)aggregation with map algebra operations provided by the PCRaster library. Model algebra operations can be used by the modeller to specify the exchange of data and therefore the coupling of components. The framework determines the control flow for the ordered execution based on the time steps and couplings of the model components given by the modeller. We implemented two different control flow mechanisms. First, a client-server approach is used with a central entity controlling the execution of the component models and steering the data exchange. Second, a demand-driven approach is used that triggers the execution of a component model when data is requested by a coupled component model. We show that both control flow mechanisms allow for the execution of stochastic, multi-scale integrated models. We examine the implications of each control flow mechanism on the terminology used by the modeller to specify integrated models, and illustrate the
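A schematic sketch of the two control-flow mechanisms being compared, written as plain Python classes; the component names and the toy rainfall-runoff logic are invented for illustration, and the real framework couples PCRaster-based spatio-temporal components.

```python
# Two ways to order the execution of coupled model components.

class Precipitation:
    def step(self, t):
        return 2.0 if t % 3 == 0 else 0.5          # toy rainfall per time step

class Runoff:
    def __init__(self):
        self.storage = 0.0
    def step(self, t, rain):
        self.storage += rain
        outflow = 0.2 * self.storage               # simple linear reservoir
        self.storage -= outflow
        return outflow

def run_client_server(steps=5):
    """A central driver orders execution and moves data between components."""
    precip, runoff = Precipitation(), Runoff()
    for t in range(steps):
        rain = precip.step(t)                      # the server triggers the provider
        print("server ", t, round(runoff.step(t, rain), 3))

class DemandDrivenRunoff(Runoff):
    """The downstream component pulls data from its provider when needed."""
    def __init__(self, provider):
        super().__init__()
        self.provider = provider
    def step(self, t):
        rain = self.provider.step(t)               # the data request triggers execution
        return super().step(t, rain)

def run_demand_driven(steps=5):
    runoff = DemandDrivenRunoff(Precipitation())
    for t in range(steps):
        print("demand ", t, round(runoff.step(t), 3))

if __name__ == "__main__":
    run_client_server()
    run_demand_driven()
```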

  15. Genomic-Enabled Prediction Kernel Models with Random Intercepts for Multi-environment Trials

    Science.gov (United States)

    Cuevas, Jaime; Granato, Italo; Fritsche-Neto, Roberto; Montesinos-Lopez, Osval A.; Burgueño, Juan; Bandeira e Sousa, Massaine; Crossa, José

    2018-01-01

In this study, we compared the prediction accuracy of the main genotypic effect model (MM) without G×E interactions, the multi-environment single variance G×E deviation model (MDs), and the multi-environment environment-specific variance G×E deviation model (MDe), where the random genetic effects of the lines are modeled with the markers (or pedigree). With the objective of further modeling the genetic residual of the lines, we incorporated the random intercepts of the lines (l) and generated another three models. Each of these 6 models was fitted with a linear kernel method (Genomic Best Linear Unbiased Predictor, GB) and a Gaussian kernel (GK) method. We compared these 12 model-method combinations with another two multi-environment G×E interaction models with unstructured variance-covariances (MUC) using GB and GK kernels (4 model-method combinations). Thus, we compared the genomic-enabled prediction accuracy of a total of 16 model-method combinations on two maize data sets with positive phenotypic correlations among environments, and on two wheat data sets with complex G×E that includes some negative and close-to-zero phenotypic correlations among environments. The two models (MDs and MDe with the random intercept of the lines and the GK method) were computationally efficient and gave high prediction accuracy in the two maize data sets. Regarding the more complex G×E wheat data sets, the model-method combinations with G×E (MDs and MDe) that include the random intercepts of the lines with the GK method offered important savings in computing time compared with the multi-environment G×E models with unstructured variance-covariances, but with lower genomic prediction accuracy. PMID:29476023
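A small sketch of the two kernels being compared, computed from a marker matrix X (lines × markers) with NumPy; the scaling and bandwidth conventions below are common choices and not necessarily the exact ones used by the authors.

```python
# Linear (GBLUP-style) and Gaussian kernels from a marker matrix.
import numpy as np

def linear_kernel(X):
    """GBLUP-style genomic relationship kernel, K = Xc Xc' / p (Xc: centered markers)."""
    Xc = X - X.mean(axis=0)
    return Xc @ Xc.T / X.shape[1]

def gaussian_kernel(X, h=1.0):
    """Gaussian kernel K_ij = exp(-h * d_ij^2 / median(d^2)), with squared Euclidean distances d_ij^2."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    med = np.median(sq[np.triu_indices_from(sq, k=1)])
    return np.exp(-h * sq / med)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.integers(0, 3, size=(10, 200)).astype(float)   # toy 0/1/2 marker matrix
    print(linear_kernel(X).shape, gaussian_kernel(X).shape)
```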

  16. Population balance models: a useful complementary modelling framework for future WWTP modelling.

    Science.gov (United States)

    Nopens, Ingmar; Torfs, Elena; Ducoste, Joel; Vanrolleghem, Peter A; Gernaey, Krist V

    2015-01-01

    Population balance models (PBMs) represent a powerful modelling framework for the description of the dynamics of properties that are characterised by distributions. This distribution of properties under transient conditions has been demonstrated in many chemical engineering applications. Modelling efforts of several current and future unit processes in wastewater treatment plants could potentially benefit from this framework, especially when distributed dynamics have a significant impact on the overall unit process performance. In these cases, current models that rely on average properties cannot sufficiently capture the true behaviour and even lead to completely wrong conclusions. Examples of distributed properties are bubble size, floc size, crystal size or granule size. In these cases, PBMs can be used to develop new knowledge that can be embedded in our current models to improve their predictive capability. Hence, PBMs should be regarded as a complementary modelling framework to biokinetic models. This paper provides an overview of current applications, future potential and limitations of PBMs in the field of wastewater treatment modelling, thereby looking over the fence to other scientific disciplines.
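For readers unfamiliar with the formalism, a generic one-dimensional population balance for the number density n(x,t) of entities with a distributed property x (e.g., bubble, floc, crystal or granule size) takes the form below, with growth rate G and birth/death terms B and D (e.g., aggregation and breakage); specific applications supply their own closures for these terms.

```latex
% Generic one-dimensional population balance equation
\frac{\partial n(x,t)}{\partial t}
  + \frac{\partial}{\partial x}\!\left[G(x,t)\,n(x,t)\right]
  = B(x,t) - D(x,t)
```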

  17. Modelling Framework and Assistive Device for Peripheral Intravenous Injections

    Science.gov (United States)

    Kam, Kin F.; Robinson, Martin P.; Gilbert, Mathew A.; Pelah, Adar

    2016-02-01

Intravenous access for blood sampling or drug administration that requires peripheral venepuncture is perhaps the most common invasive procedure practiced in hospitals, clinics and general practice surgeries. We describe an idealised mathematical framework for modelling the dynamics of the peripheral venepuncture process. Basic assumptions of the model are confirmed through motion analysis of needle trajectories during venepuncture, taken from video recordings of a skilled practitioner injecting into a practice kit. The framework is also applied to the design and construction of a proposed device for accurate needle guidance during venepuncture administration, which was assessed as consistent and repeatable in application and does not lead to over-puncture. The study provides insights into the ubiquitous peripheral venepuncture process and may contribute to applications in training and in the design of new devices, including for use in robotic automation.

  18. A Simulink simulation framework of a MagLev model

    Energy Technology Data Exchange (ETDEWEB)

Boudall, H.; Williams, R.D.; Giras, T.C. [University of Virginia, Charlottesville (United States). School of Engineering and Applied Science]

    2003-09-01

    This paper presents a three-degree-of-freedom model of a section of the magnetically levitated train Maglev. The Maglev system dealt with in this article utilizes electromagnetic levitation. Each MagLev vehicle section is viewed as two separate parts, namely a body and a chassis, coupled by a set of springs and dampers. The MagLev model includes the propulsion, the guidance and the levitation systems. The equations of motion are developed. A Simulink simulation framework is implemented in order to study the interaction between the different systems and the dynamics of a MagLev vehicle. The simulation framework will eventually serve as a tool to assist the design and development of the Maglev system in the United States of America. (author)

  19. Ames Culture Chamber System: Enabling Model Organism Research Aboard the international Space Station

    Science.gov (United States)

    Steele, Marianne

    2014-01-01

Understanding the genetic, physiological, and behavioral effects of spaceflight on living organisms and elucidating the molecular mechanisms that underlie these effects are high priorities for NASA. Certain organisms, known as model organisms, are widely studied to help researchers better understand how all biological systems function. Small model organisms such as nematodes, slime mold, bacteria, green algae, yeast, and moss can be used to study the effects of micro- and reduced gravity at both the cellular and systems level over multiple generations. Many model organisms have sequenced genomes and published data sets on their transcriptomes and proteomes that enable scientific investigations of the molecular mechanisms underlying the adaptations of these organisms to space flight.

  20. Test-driven modeling and development of cloud-enabled cyber-physical smart systems

    DEFF Research Database (Denmark)

    Munck, Allan; Madsen, Jan

    2017-01-01

    . Using test-driven modeling (TDM) is likely to be the best way to design smart systems such that these qualities are ensured. However, the TDM methods that are applied to development of simpler systems do not scale to smart systems because the modeling technologies cannot handle the complexity and size...... of the systems. In this paper, we present a method for test-driven modeling that scales to very large and complex systems. The method uses a combination of formal verification of basic interactions, simulations of complex scenarios, and mathematical forecasting to predict system behavior and performance. We...... utilized the method to analyze, design and develop various scenarios for a cloud-enabled medical system. Our approach provides a versatile method that may be adapted and improved for future development of very large and complex smart systems in various domains....

  1. Population Balance Models: A useful complementary modelling framework for future WWTP modelling

    DEFF Research Database (Denmark)

    Nopens, Ingmar; Torfs, Elena; Ducoste, Joel

    2014-01-01

processes in WWTPs could potentially benefit from this framework, especially when distributed dynamics have a significant impact on the overall unit process performance. In these cases, current models that rely on average properties cannot sufficiently capture the true behaviour. Examples are bubble size...

  2. Kinetic Modeling of Accelerated Stability Testing Enabled by Second Harmonic Generation Microscopy.

    Science.gov (United States)

    Song, Zhengtian; Sarkar, Sreya; Vogt, Andrew D; Danzer, Gerald D; Smith, Casey J; Gualtieri, Ellen J; Simpson, Garth J

    2018-04-03

The low limits of detection afforded by second harmonic generation (SHG) microscopy coupled with image analysis algorithms enabled quantitative modeling of the temperature-dependent crystallization of active pharmaceutical ingredients (APIs) within amorphous solid dispersions (ASDs). ASDs, in which an API is maintained in an amorphous state within a polymer matrix, are finding increasing use to address solubility limitations of small-molecule APIs. Extensive stability testing is typically performed for ASD characterization, the time frame for which is often dictated by the earliest detectable onset of crystal formation. Here, a study of accelerated stability testing of ritonavir, a human immunodeficiency virus (HIV) protease inhibitor, has been conducted. Under the conditions for accelerated stability testing (50 °C/75% RH and 40 °C/75% RH), ritonavir crystallization kinetics from amorphous solid dispersions were monitored by SHG microscopy. SHG microscopy coupled with image analysis yielded limits of detection for ritonavir crystals as low as 10 ppm, which is about 2 orders of magnitude lower than other methods currently available for crystallinity detection in ASDs. The four-decade dynamic range of SHG microscopy enabled quantitative modeling with an established (JMAK) kinetic model. From the SHG images, nucleation and crystal growth rates were independently determined.
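The established kinetic model referred to here is the Johnson-Mehl-Avrami-Kolmogorov (JMAK) form, in which the crystallized fraction X(t) evolves as below, with k a temperature-dependent rate constant and n the Avrami exponent; the fitted parameter values are reported in the paper, not here.

```latex
% JMAK crystallization kinetics
X(t) = 1 - \exp\!\left(-k\,t^{\,n}\right)
```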

  3. GEMFsim: A Stochastic Simulator for the Generalized Epidemic Modeling Framework

    OpenAIRE

    Sahneh, Faryad Darabi; Vajdi, Aram; Shakeri, Heman; Fan, Futing; Scoglio, Caterina

    2016-01-01

The recently proposed generalized epidemic modeling framework (GEMF) (Sahneh et al., 2013) lays the groundwork for systematically constructing a broad spectrum of stochastic spreading processes over complex networks. This article builds an algorithm for exact, continuous-time numerical simulation of GEMF-based processes. Moreover, the implementation of this algorithm, GEMFsim, is available in popular scientific programming platforms such as MATLAB, R, Python, and C; GEMFsim facilitates ...
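The sketch below illustrates the kind of exact, continuous-time (Gillespie-style) simulation the record describes, for a simple SIS process on a random network; it is a toy script in the spirit of GEMF-based simulation, not GEMFsim itself, and the rate parameters are arbitrary illustrative values.

```python
# Minimal continuous-time SIS simulation on a network (Gillespie algorithm).
import random
import networkx as nx

def sis_gillespie(G, beta=0.4, delta=1.0, t_max=10.0, seed=0):
    rng = random.Random(seed)
    infected = {rng.choice(list(G))}          # start from one random infected node
    t = 0.0
    while t < t_max and infected:
        # per-node transition rates in the current network state
        rates = {}
        for v in G:
            if v in infected:
                rates[v] = delta                                  # recovery
            else:
                m = sum(1 for u in G.neighbors(v) if u in infected)
                if m:
                    rates[v] = beta * m                           # infection pressure
        total = sum(rates.values())
        if total == 0:
            break
        t += rng.expovariate(total)                               # exponential waiting time
        # choose which node transitions, proportional to its rate
        x, acc = rng.random() * total, 0.0
        for v, r in rates.items():
            acc += r
            if acc >= x:
                infected.symmetric_difference_update({v})         # flip S <-> I
                break
    return t, len(infected)

if __name__ == "__main__":
    G = nx.erdos_renyi_graph(100, 0.05, seed=1)
    print(sis_gillespie(G))
```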

  4. Business Modeling Framework For Personalization In Mobile Business Services

    OpenAIRE

    Pau, L-F.; Dits, J.

    2002-01-01

The structure of a formal framework for personalization features in mobile business services is presented, which can be used to drive the business modeling of m-business services from a service provider's point of view. It also allows the revenue linked to personalization levels and features to be computed. A case study has been performed in the area of personalized location-based mobile services.

  5. Designing for Learning and Play - The Smiley Model as Framework

    DEFF Research Database (Denmark)

    Weitze, Charlotte Lærke

    2016-01-01

    This paper presents a framework for designing engaging learning experiences in games – the Smiley Model. In this Design-Based Research project, student-game-designers were learning inside a gamified learning design - while designing and implementing learning goals from curriculum into the small d...... was adult upper secondary general students as well as 7th grade primary school students. The intention with this article is to inspire future learning designers that would like to experiment with integrating learning and play....

  6. New framework for standardized notation in wastewater treatment modelling

    DEFF Research Database (Denmark)

    Corominas, L.; Rieger, L.; Takacs, I.

    2010-01-01

    Many unit process models are available in the field of wastewater treatment. All of these models use their own notation, causing problems for documentation, implementation and connection of different models (using different sets of state variables). The main goal of this paper is to propose a new...... is a framework that can be used in whole plant modelling, which consists of different fields such as activated sludge, anaerobic digestion, sidestream treatment, membrane bioreactors, metabolic approaches, fate of micropollutants and biofilm processes. The main objective of this consensus building paper...... is to establish a consistent set of rules that can be applied to existing and most importantly, future models. Applying the proposed notation should make it easier for everyone active in the wastewater treatment field to read, write and review documents describing modelling projects....

  7. Cloud-Enabled Space Weather Modeling and Data Assimilation Platform (CESWP)

    Science.gov (United States)

    Satchwill, B.; Rankin, R.; Shillington, J.; Toews, E.

    2010-12-01

    Multi-space-agency partnerships in the development and flight of space science payloads, and in sharing of complex models and data sets (including ground-based data sets), make a compelling case for providing standardized interfaces and platforms to access data and models. However, developing and executing simulations requires space physicists to either develop knowledge of specialized high-performance computing environments and environment-specific simulations, or run simulations multiple times serially in order to examine the results of different parametric inputs. Barriers also exist where data and models reside in different geographic locations, which is typically the case. The emergence of cloud computing, and its Infrastructure-as-a-Service (IaaS) variant, provides an opportunity to develop software architectures that reduce barriers to simulation development and use, while simultaneously reducing the proliferation of hardware in the research community, and all its inherent high cost. The Cloud-Enabled Space Weather Modeling and Data Assimilation Platform (CESWP) utilizes cloud technologies to dramatically improve the sustainability, flexibility and performance of research tools and services, enabling an attendant improvement in researcher productivity and research funding efficacy. CESWP integrates complex modeling and simulation functionality into the federated data capabilities of the Canadian Space Sciences Data Portal (http://cssdp.ca). The CESWP cloud is innovative in its use of a versatile IaaS approach to provision a space sciences cloud. The platform helps researchers deal with the explosion of new data sets that require international cooperation and complex modeling as part of their analysis. This paper will describe the current implementation of the CESWP private cloud, which is based on Eucalyptus, KVM, CentOS, and Amazon Web Services compatible API’s.

  8. Genomic-Enabled Prediction in Maize Using Kernel Models with Genotype × Environment Interaction.

    Science.gov (United States)

    Bandeira E Sousa, Massaine; Cuevas, Jaime; de Oliveira Couto, Evellyn Giselly; Pérez-Rodríguez, Paulino; Jarquín, Diego; Fritsche-Neto, Roberto; Burgueño, Juan; Crossa, Jose

    2017-06-07

    Multi-environment trials are routinely conducted in plant breeding to select candidates for the next selection cycle. In this study, we compare the prediction accuracy of four developed genomic-enabled prediction models: (1) single-environment, main genotypic effect model (SM); (2) multi-environment, main genotypic effects model (MM); (3) multi-environment, single variance G×E deviation model (MDs); and (4) multi-environment, environment-specific variance G×E deviation model (MDe). Each of these four models were fitted using two kernel methods: a linear kernel Genomic Best Linear Unbiased Predictor, GBLUP (GB), and a nonlinear kernel Gaussian kernel (GK). The eight model-method combinations were applied to two extensive Brazilian maize data sets (HEL and USP data sets), having different numbers of maize hybrids evaluated in different environments for grain yield (GY), plant height (PH), and ear height (EH). Results show that the MDe and the MDs models fitted with the Gaussian kernel (MDe-GK, and MDs-GK) had the highest prediction accuracy. For GY in the HEL data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 9 to 32%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 9 to 49%. For GY in the USP data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 0 to 7%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 34 to 70%. For traits PH and EH, gains in prediction accuracy of models with GK compared to models with GB were smaller than those achieved in GY. Also, these gains in prediction accuracy decreased when a more difficult prediction problem was studied. Copyright © 2017 Bandeira e Sousa et al.

  9. Genomic-Enabled Prediction in Maize Using Kernel Models with Genotype × Environment Interaction

    Directory of Open Access Journals (Sweden)

    Massaine Bandeira e Sousa

    2017-06-01

Full Text Available Multi-environment trials are routinely conducted in plant breeding to select candidates for the next selection cycle. In this study, we compare the prediction accuracy of four developed genomic-enabled prediction models: (1) single-environment, main genotypic effect model (SM); (2) multi-environment, main genotypic effects model (MM); (3) multi-environment, single variance G×E deviation model (MDs); and (4) multi-environment, environment-specific variance G×E deviation model (MDe). Each of these four models was fitted using two kernel methods: a linear kernel Genomic Best Linear Unbiased Predictor, GBLUP (GB), and a nonlinear Gaussian kernel (GK). The eight model-method combinations were applied to two extensive Brazilian maize data sets (HEL and USP data sets), having different numbers of maize hybrids evaluated in different environments for grain yield (GY), plant height (PH), and ear height (EH). Results show that the MDe and the MDs models fitted with the Gaussian kernel (MDe-GK and MDs-GK) had the highest prediction accuracy. For GY in the HEL data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 9 to 32%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 9 to 49%. For GY in the USP data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 0 to 7%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 34 to 70%. For traits PH and EH, gains in prediction accuracy of models with GK compared to models with GB were smaller than those achieved in GY. Also, these gains in prediction accuracy decreased when a more difficult prediction problem was studied.

  10. Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.

    Science.gov (United States)

    El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher

    2018-01-01

    Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.
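A minimal sketch of the "two algorithm configurations" idea: fit two topic models with different hyperparameters on the same small corpus and list their top terms for comparison. scikit-learn's LDA is used here purely as a stand-in back end; the corpus and parameter choices are illustrative and do not reproduce the paper's interactive, user-driven workflow.

```python
# Compare two LDA configurations on the same toy corpus.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "water quality model calibration for a river basin",
    "river discharge simulation with a hydrological model",
    "topic models summarize the thematic composition of text corpora",
    "interactive visual analytics for exploring document topics",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)
vocab = vec.get_feature_names_out()

for cfg in ({"n_components": 2, "doc_topic_prior": 0.1},
            {"n_components": 2, "doc_topic_prior": 0.9}):
    lda = LatentDirichletAllocation(random_state=0, **cfg).fit(X)
    for k, weights in enumerate(lda.components_):
        top_terms = [vocab[i] for i in weights.argsort()[-3:][::-1]]
        print(cfg, "topic", k, top_terms)
```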

  11. Next generation framework for aquatic modeling of the Earth System

    Science.gov (United States)

    Fekete, B. M.; Wollheim, W. M.; Wisser, D.; Vörösmarty, C. J.

    2009-03-01

Earth System model development is becoming an increasingly complex task. As scientists attempt to represent the physical and bio-geochemical processes and various feedback mechanisms in unprecedented detail, the models themselves are becoming increasingly complex. At the same time, the complexity of the surrounding IT infrastructure is growing as well. Earth System models must manage a vast amount of data in heterogeneous computing environments. Numerous development efforts are under way to ease that burden and offer model development platforms that reduce IT challenges and allow scientists to focus on their science. While these new modeling frameworks (e.g. FMS, ESMF, CCA, OpenMI) do provide solutions to many IT challenges (performing input/output, managing space and time, establishing model coupling, etc.), they are still considerably complex and often have steep learning curves. The Next generation Framework for Aquatic Modeling of the Earth System (NextFrAMES, a revised version of FrAMES) has numerous similarities to those developed by other teams, but represents a novel model development paradigm. NextFrAMES is built around a modeling XML that lets modelers express the overall model structure and provides an API for dynamically linked plugins to represent the processes. The model XML is executed by the NextFrAMES run-time engine that parses the model definition, loads the module plugins, performs the model I/O and executes the model calculations. NextFrAMES has a minimalistic view of representing spatial domains and treats every domain (regardless of its layout, such as grid, network tree, individual points, polygons, etc.) as a vector of objects. NextFrAMES performs computations on multiple domains, and interactions between different spatial domains are carried out through couplers. NextFrAMES allows processes to operate at different frequencies by providing rudimentary aggregation and disaggregation facilities. NextFrAMES was designed primarily for

  12. A framework for quantifying net benefits of alternative prognostic models

    DEFF Research Database (Denmark)

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit......) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk...... risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing...
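For context, a commonly used decision-analytic form of net benefit at a risk threshold p_t is given below (TP and FP are the true and false positives among n individuals classified as high risk); the record's framework extends this idea with explicit clinical treatment guidelines and health-economic quantities, which this simple formula does not capture.

```latex
% Decision-analytic net benefit at risk threshold p_t
\mathrm{NB}(p_t) = \frac{TP}{n} - \frac{FP}{n}\cdot\frac{p_t}{1-p_t}
```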

  13. New models for energy beam machining enable accurate generation of free forms.

    Science.gov (United States)

    Axinte, Dragos; Billingham, John; Bilbao Guillerna, Aitor

    2017-09-01

We demonstrate that, despite differences in their nature, many energy beam controlled-depth machining processes (for example, waterjet, pulsed laser, focused ion beam) can be modeled using the same mathematical framework: a partial differential evolution equation that requires only simple calibrations to capture the physics of each process. The inverse problem can be solved efficiently through the numerical solution of the adjoint problem and leads to beam paths that generate prescribed three-dimensional features with minimal error. The viability of this modeling approach has been demonstrated by generating accurate free-form surfaces using three processes that operate at very different length scales and with different physical principles for material removal: waterjet, pulsed laser, and focused ion beam machining. Our approach can be used to accurately machine materials that are hard to process by other means for scalable applications in a wide variety of industries.

  14. Towards an Ontology-driven Framework to Enable Development of Personalized mHealth Solutions for Cancer Survivors' Engagement in Healthy Living.

    Science.gov (United States)

    Myneni, Sahiti; Amith, Muhammad; Geng, Yimin; Tao, Cui

    2015-01-01

    Adolescent and Young Adult (AYA) cancer survivors manage an array of health-related issues. Survivorship Care Plans (SCPs) have the potential to empower these young survivors by providing information regarding treatment summary, late-effects of cancer therapies, healthy lifestyle guidance, coping with work-life-health balance, and follow-up care. However, current mHealth infrastructure used to deliver SCPs has been limited in terms of flexibility, engagement, and reusability. The objective of this study is to develop an ontology-driven survivor engagement framework to facilitate rapid development of mobile apps that are targeted, extensible, and engaging. The major components include ontology models, patient engagement features, and behavioral intervention technologies. We apply the proposed framework to characterize individual building blocks ("survivor digilegos"), which form the basis for mHealth tools that address user needs across the cancer care continuum. Results indicate that the framework (a) allows identification of AYA survivorship components, (b) facilitates infusion of engagement elements, and (c) integrates behavior change constructs into the design architecture of survivorship applications. Implications for design of patient-engaging chronic disease management solutions are discussed.

  15. The shared circuits model (SCM): how control, mirroring, and simulation can enable imitation, deliberation, and mindreading.

    Science.gov (United States)

    Hurley, Susan

    2008-02-01

Imitation, deliberation, and mindreading are characteristically human sociocognitive skills. Research on imitation and its role in social cognition is flourishing across various disciplines. Imitation is surveyed in this target article under headings of behavior, subpersonal mechanisms, and functions of imitation. A model is then advanced within which many of the developments surveyed can be located and explained. The shared circuits model (SCM) explains how imitation, deliberation, and mindreading can be enabled by subpersonal mechanisms of control, mirroring, and simulation. It is cast at a middle, functional level of description, that is, between the level of neural implementation and the level of conscious perceptions and intentional actions. The SCM connects shared informational dynamics for perception and action with shared informational dynamics for self and other, while also showing how the action/perception, self/other, and actual/possible distinctions can be overlaid on these shared informational dynamics. It avoids the common conception of perception and action as separate and peripheral to central cognition. Rather, it contributes to the situated cognition movement by showing how mechanisms for perceiving action can be built on those for active perception. The SCM is developed heuristically, in five layers that can be combined in various ways to frame specific ontogenetic or phylogenetic hypotheses. The starting point is dynamic online motor control, whereby an organism is closely attuned to its embedding environment through sensorimotor feedback. Onto this are layered functions of prediction and simulation of feedback, mirroring, simulation of mirroring, monitored inhibition of motor output, and monitored simulation of input. Finally, monitored simulation of input specifying possible actions plus inhibited mirroring of such possible actions can generate information about the possible as opposed to actual instrumental actions of others, and the

  16. A coupled modeling framework for sustainable watershed management in transboundary river basins

    Science.gov (United States)

    Furqan Khan, Hassaan; Yang, Y. C. Ethan; Xie, Hua; Ringler, Claudia

    2017-12-01

    There is a growing recognition among water resource managers that sustainable watershed management needs to not only account for the diverse ways humans benefit from the environment, but also incorporate the impact of human actions on the natural system. Coupled natural-human system modeling through explicit modeling of both natural and human behavior can help reveal the reciprocal interactions and co-evolution of the natural and human systems. This study develops a spatially scalable, generalized agent-based modeling (ABM) framework consisting of a process-based semi-distributed hydrologic model (SWAT) and a decentralized water system model to simulate the impacts of water resource management decisions that affect the food-water-energy-environment (FWEE) nexus at a watershed scale. Agents within a river basin are geographically delineated based on both political and watershed boundaries and represent key stakeholders of ecosystem services. Agents decide about the priority across three primary water uses: food production, hydropower generation and ecosystem health within their geographical domains. Agents interact with the environment (streamflow) through the SWAT model and interact with other agents through a parameter representing willingness to cooperate. The innovative two-way coupling between the water system model and SWAT enables this framework to fully explore the feedback of human decisions on the environmental dynamics and vice versa. To support non-technical stakeholder interactions, a web-based user interface has been developed that allows for role-play and participatory modeling. The generalized ABM framework is also tested in two key transboundary river basins, the Mekong River basin in Southeast Asia and the Niger River basin in West Africa, where water uses for ecosystem health compete with growing human demands on food and energy resources. We present modeling results for crop production, energy generation and violation of eco

  17. Modeling Geomagnetic Variations using a Machine Learning Framework

    Science.gov (United States)

    Cheung, C. M. M.; Handmer, C.; Kosar, B.; Gerules, G.; Poduval, B.; Mackintosh, G.; Munoz-Jaramillo, A.; Bobra, M.; Hernandez, T.; McGranaghan, R. M.

    2017-12-01

    We present a framework for data-driven modeling of Heliophysics time series data. The Solar Terrestrial Interaction Neural net Generator (STING) is an open source python module built on top of state-of-the-art statistical learning frameworks (traditional machine learning methods as well as deep learning). To showcase the capability of STING, we deploy it for the problem of predicting the temporal variation of geomagnetic fields. The data used includes solar wind measurements from the OMNI database and geomagnetic field data taken by magnetometers at US Geological Survey observatories. We examine the predictive capability of different machine learning techniques (recurrent neural networks, support vector machines) for a range of forecasting times (minutes to 12 hours). STING is designed to be extensible to other types of data. We show how STING can be used on large sets of data from different sensors/observatories and adapted to tackle other problems in Heliophysics.
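An illustrative lagged-feature regression in the spirit of this record: predict a geomagnetic-variation proxy from the preceding solar wind samples. The data below are synthetic and the model is scikit-learn's SVR, used only as an example of the "traditional machine learning" option; it is not the STING module itself.

```python
# Forecast a delayed, noisy response from lagged driver samples.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, lags = 2000, 12
solar_wind = np.cumsum(rng.standard_normal(n))                    # toy driver time series
target = np.roll(solar_wind, -3) + 0.5 * rng.standard_normal(n)   # delayed response + noise

# design matrix: the previous `lags` driver values for each prediction time
X = np.column_stack([solar_wind[i:n - lags + i] for i in range(lags)])
y = target[lags:]

X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False)
model = SVR(C=10.0).fit(X_train, y_train)
print("R^2 on held-out segment:", round(model.score(X_test, y_test), 3))
```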

  18. The ontology model of FrontCRM framework

    Science.gov (United States)

    Budiardjo, Eko K.; Perdana, Wira; Franshisca, Felicia

    2013-03-01

The adoption and implementation of Customer Relationship Management (CRM) is not merely a technological installation; the emphasis is rather on applying a customer-centric philosophy and culture as a whole. CRM must begin at the level of business strategy, the only level at which thorough organizational changes are possible. The change agenda can then be directed to departmental plans and supported by information technology. Work processes related to the CRM concept include marketing, sales, and services. FrontCRM is developed as a framework to guide the identification of business processes related to CRM, based on the concept of a strategic planning approach. This leads to the identification of processes and practices in every process area related to marketing, sales, and services. The ontology model presented in this paper serves as a tool to avoid misunderstanding of the framework, to define practices systematically within each process area, and to find CRM software features related to those practices.

  19. Parametric design and analysis framework with integrated dynamic models

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    2014-01-01

    control with the building designer. Consequence based design is defined by the specific use of integrated dynamic modeling, which includes the parametric capabilities of a scripting tool and building simulation features of a building performance simulation tool. The framework can lead to enhanced......In the wake of uncompromising requirements on building performance and the current emphasis on sustainability, including building energy and indoor environment, designing buildings involves elements of expertise of multiple disciplines. However, building performance analyses, including those...... of building energy and indoor environment, are generally confined to late in the design process. Consequence based design is a framework intended for the early design stage. It involves interdisciplinary expertise that secures validity and quality assurance with a simulationist while sustaining autonomous...

  20. Conceptual model and economic experiments to explain nonpersistence and enable mechanism designs fostering behavioral change.

    Science.gov (United States)

    Djawadi, Behnud Mir; Fahr, René; Turk, Florian

    2014-12-01

    Medical nonpersistence is a worldwide problem of striking magnitude. Although many fields of studies including epidemiology, sociology, and psychology try to identify determinants for medical nonpersistence, comprehensive research to explain medical nonpersistence from an economics perspective is rather scarce. The aim of the study was to develop a conceptual framework that augments standard economic choice theory with psychological concepts of behavioral economics to understand how patients' preferences for discontinuing with therapy arise over the course of the medical treatment. The availability of such a framework allows the targeted design of mechanisms for intervention strategies. Our conceptual framework models the patient as an active economic agent who evaluates the benefits and costs for continuing with therapy. We argue that a combination of loss aversion and mental accounting operations explains why patients discontinue with therapy at a specific point in time. We designed a randomized laboratory economic experiment with a student subject pool to investigate the behavioral predictions. Subjects continue with therapy as long as experienced utility losses have to be compensated. As soon as previous losses are evened out, subjects perceive the marginal benefit of persistence lower than in the beginning of the treatment. Consequently, subjects start to discontinue with therapy. Our results highlight that concepts of behavioral economics capture the dynamic structure of medical nonpersistence better than does standard economic choice theory. We recommend that behavioral economics should be a mandatory part of the development of possible intervention strategies aimed at improving patients' compliance and persistence behavior. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
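A standard formalization of the loss aversion the authors build on is the Kahneman-Tversky value function shown below, where λ > 1 makes losses loom larger than equally sized gains; this is offered as background notation, not as the specific parameterization used in the experiment.

```latex
% Prospect-theory value function (alpha, beta in (0,1]: diminishing sensitivity; lambda > 1: loss aversion)
v(x) =
\begin{cases}
x^{\alpha}, & x \ge 0 \\[4pt]
-\lambda\,(-x)^{\beta}, & x < 0
\end{cases}
```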

  1. Computer-aided modeling framework – a generic modeling template for catalytic membrane fixed bed reactors

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    2013-01-01

    This work focuses on development of computer-aided modeling framework. The framework is a knowledge-based system that is built on a generic modeling language and structured based on workflows for different general modeling tasks. The overall objective of this work is to support the model develope...... membrane fixed bed models is developed. The application of the modeling template is highlighted with a case study related to the modeling of a catalytic membrane reactor coupling dehydrogenation of ethylbenzene with hydrogenation of nitrobenzene....

  2. A Framework for Improving Project Performance of Standard Design Models in Saudi Arabia

    Directory of Open Access Journals (Sweden)

    Shabbab Al-Otaib

    2013-07-01

Full Text Available Improving project performance in the construction industry poses several challenges for stakeholders. Recently, there have been frequent calls for the importance of adopting standardisation in improving construction design as well as the process, and a focus on learning mapping from other industries. The Saudi Ministry of Interior (SMoI) has adopted a new Standard Design Model (SDM) approach for the development of its construction programme to effectively manage its complex project portfolio and improve project performance. A review of the existing literature indicates that, despite the adoption of SDM repetitive projects, which enable learning from past mistakes and improving the performance of future projects, there is a lack of learning instruments to capture, store and disseminate Lessons Learnt (LL). This research proposes a framework for improving the project performance of SDMs in the Saudi construction industry. Eight case studies related to a typical standard design project were performed, which included interviews with 24 key stakeholders who are involved in the planning and implementation of SDM projects within the SMoI. The research identified 14 critical success factors (CSFs) that have a direct impact on SDM project performance. These are classified into three main CSF-related clusters: adaptability to the context; contract management; and construction management. A framework, which comprises the identified 14 CSFs, was developed, refined and validated through a workshop with 12 key stakeholders in the SMoI construction programme. Additionally, a framework implementation process map was developed. Web-based tools and KM were identified as core factors in the framework implementation strategy. Although many past CSF-related studies were conducted to develop a range of construction project performance improvement frameworks, the paper provides the first initiative to develop a framework to improve the performance of

  3. Social networks enabled coordination model for cost management of patient hospital admissions.

    Science.gov (United States)

    Uddin, Mohammed Shahadat; Hossain, Liaquat

    2011-09-01

    In this study, we introduce a social networks enabled coordination model for exploring how the network position of "patient," "physician," and "hospital" actors in a patient-centered care network, which evolves during the patient hospitalization period, affects the total cost of coordination. An actor is a node, which represents an entity such as an individual or organization in a social network. In our analysis of actor networks and coordination in the healthcare literature, we identified a significant gap: a number of promising hospital coordination models (e.g., the Guided Care Model and the Chronic Care Model) have been developed for the current healthcare system, but they focus on quality of service and patient satisfaction. The health insurance dataset for total hip replacement (THR) from the Hospital Contribution Fund, a prominent Australian health insurance company, is analyzed to examine our proposed coordination model. We consider the network attributes of degree, connectedness, in-degree, out-degree, and tie strength to measure the network position of actors. To measure the cost of coordination for a particular hospital, the average of total hospitalization expenses for all THR hospital admissions is used. Results show that the network positions of "patient," "physician," and "hospital" actors, considered across all admissions to a particular hospital, affect that hospital's average total hospitalization expenses. These results can be used as guidelines to set up a cost-effective healthcare practice structure for patient hospitalization expenses. © 2011 National Association for Healthcare Quality.

  4. A Modelling Framework to Assess the Effect of Pressures on River Abiotic Habitat Conditions and Biota.

    Directory of Open Access Journals (Sweden)

    Jochem Kail

    Full Text Available River biota are affected by pressures acting from the global to the reach scale, but most approaches for predicting river biota focus on reach- or segment-scale processes and habitats. Moreover, these approaches do not consider long-term morphological changes that affect habitat conditions. In this study, a modelling framework was further developed and tested to assess the effect of pressures at different spatial scales on reach-scale habitat conditions and biota. Ecohydrological and 1D hydrodynamic models were used to predict discharge and water quality at the catchment scale and the resulting water level at the downstream end of a study reach. Long-term reach morphology was modelled using empirical regime equations, meander migration and 2D morphodynamic models. The respective flow and substrate conditions in the study reach were predicted using a 2D hydrodynamic model, and the suitability of these habitats was assessed with novel habitat models. In addition, dispersal models for fish and macroinvertebrates were developed to assess the re-colonization potential and, finally, to compare habitat suitability with the availability/ability of species to colonize these habitats. Applicability was tested and model performance was assessed by comparing observed and predicted conditions in the lowland Treene River in northern Germany. Technically, it was possible to link the different models, but future applications would benefit from the development of open source software for all modelling steps to enable fully automated model runs. Future research needs concern the physical modelling of long-term morphodynamics, the feedback of biota (e.g., macrophytes) on abiotic habitat conditions, species interactions, and empirical data on the hydraulic habitat suitability and dispersal abilities of macroinvertebrates. The modelling framework is flexible and allows for including additional models and investigating different research and management questions, e.g., in climate impact

  5. A computational framework for modeling targets as complex adaptive systems

    Science.gov (United States)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge, as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and, more importantly, the various emergent behaviors displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources while dealing with the inherent uncertainty, incompleteness and time criticality of real-world information. To overcome these challenges, we present a probabilistic reasoning network based framework called the complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely, information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well-defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, they provide unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets, with a focus on methodologies for quantifying NCO performance metrics.

  6. Vulnerability Assessment Models to Drought: Toward a Conceptual Framework

    Directory of Open Access Journals (Sweden)

    Kiumars Zarafshani

    2016-06-01

    Full Text Available Drought is regarded as a slow-onset natural disaster that causes inevitable damage to water resources and to farm life. Currently, crisis management is the basis of drought mitigation plans; however, studies thus far indicate that effective drought management strategies are based on risk management. As a primary tool in mitigating the impact of drought, vulnerability assessment can be used as a benchmark in drought mitigation plans and to enhance farmers’ ability to cope with drought. Moreover, the literature pertaining to drought has focused extensively on its impact, devoting only limited attention to vulnerability assessment as a tool. Therefore, the main purpose of this paper is to develop a conceptual framework for designing a vulnerability model in order to assess farmers’ level of vulnerability before, during and after the onset of drought. Use of the developed drought vulnerability model would aid disaster relief workers by enhancing the adaptive capacity of farmers facing the impacts of drought. The paper starts with the definition of vulnerability and outlines the different frameworks on vulnerability developed thus far. It then identifies various approaches to vulnerability assessment and finally offers the most appropriate model. The paper concludes that the introduced model can guide drought mitigation programs in countries that are impacted most by drought.

  7. A Formal Framework for Integrated Environment Modeling Systems

    Directory of Open Access Journals (Sweden)

    Gaofeng Zhang

    2017-02-01

    Full Text Available Integrated Environment Modeling (IEM) has become more and more important for environmental studies and applications. IEM systems have also been extended from scientific studies to much wider practical application situations. The quality and efficiency of IEM systems have therefore become increasingly critical. Although many advanced and creative technologies have been adopted to improve the quality of IEM systems, there is scarcely any formal method for evaluating and improving them. This paper proposes a formal method to improve the quality and the development efficiency of IEM systems. Two primary contributions are made. Firstly, a formal framework for IEM is proposed. The framework not only reflects the static and dynamic features of IEM but also covers the different views of the various roles throughout the IEM lifecycle. Secondly, the formal operational semantics corresponding to the aforementioned IEM model are derived in detail; they can be used as the basis for aiding automated integrated modeling and verifying the integrated model.

  8. Statistical analysis of road-vehicle-driver interaction as an enabler to designing behavioural models

    International Nuclear Information System (INIS)

    Chakravarty, T; Chowdhury, A; Ghose, A; Bhaumik, C; Balamuralidhar, P

    2014-01-01

    Telematics is an important technology enabler for intelligent transportation systems. By deploying on-board diagnostic devices, the signatures of vehicle vibration, along with location and time, are recorded. Detailed analyses of the collected signatures offer deep insights into the state of the objects under study. Towards that objective, we carried out experiments by deploying a telematics device in one of the office buses that ferries employees to the office and back. Data were collected from a 3-axis accelerometer and GPS, together with speed and time, for all journeys. In this paper, we present initial results of the above exercise by applying statistical methods to derive information through systematic analysis of the data collected over four months. It is demonstrated that the higher-order derivative of the measured Z-axis acceleration samples displays the properties of a Weibull distribution when the time axis is replaced by the amplitude of the processed acceleration data. Such an observation offers a method to predict future behaviour, where deviations from the prediction are classified as context-based aberrations or progressive degradation of the system. In addition, we capture the relationship between the speed of the vehicle and the median of the jerk energy samples using regression analysis. Such results offer an opportunity to develop a robust method to model road-vehicle interaction, thereby enabling the prediction of driving behaviour and supporting condition-based maintenance.
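
    The Weibull fit and the speed-versus-median-jerk-energy regression described above can be illustrated with a short Python sketch. The synthetic acceleration trace, the sampling rate, and the window length below are illustrative assumptions, not the authors' data or processing chain.

      # Sketch: fit a Weibull distribution to jerk amplitudes and regress vehicle
      # speed against the median jerk energy per one-second window (synthetic data).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      fs = 50                                      # assumed sampling rate (Hz)
      n_windows = 60
      acc_z = rng.normal(0.0, 0.3, n_windows * fs) # synthetic Z-axis acceleration (g)
      speed = rng.uniform(20, 80, n_windows)       # synthetic speed per window (km/h)

      jerk = np.abs(np.diff(acc_z)) * fs           # higher-order derivative (jerk amplitude)

      # Two-parameter Weibull fit (location fixed at zero) to the jerk amplitudes.
      shape, loc, scale = stats.weibull_min.fit(jerk, floc=0.0)
      print(f"Weibull shape={shape:.2f}, scale={scale:.3f}")

      # Median jerk "energy" per window, regressed against speed.
      usable = jerk[: n_windows * (fs - 1)].reshape(n_windows, fs - 1)
      median_energy = np.median(usable ** 2, axis=1)
      slope, intercept, r, p, se = stats.linregress(speed, median_energy)
      print(f"slope={slope:.4f}, r^2={r ** 2:.3f}")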

  9. A Novel Experimental and Modelling Strategy for Nanoparticle Toxicity Testing Enabling the Use of Small Quantities

    Directory of Open Access Journals (Sweden)

    Marinda van Pomeren

    2017-11-01

    Full Text Available Metallic nanoparticles (NPs) differ from other metal forms with respect to their large surface to volume ratio and subsequent inherent reactivity. Each new modification to a nanoparticle alters the surface to volume ratio, fate and subsequently the toxicity of the particle. Newly-engineered NPs are commonly available only in low quantities whereas, in general, rather large amounts are needed for fate characterizations and effect studies. This challenge is especially relevant for those NPs that have low inherent toxicity combined with low bioavailability. Therefore, within our study, we developed new testing strategies that enable working with low quantities of NPs. The experimental testing method was tailor-made for NPs, whereas we also developed translational models based on different dose-metrics allowing to determine dose-response predictions for NPs. Both the experimental method and the predictive models were verified on the basis of experimental effect data collected using zebrafish embryos exposed to metallic NPs in a range of different chemical compositions and shapes. It was found that the variance in the effect data in the dose-response predictions was best explained by the minimal diameter of the NPs, whereas the data confirmed that the predictive model is widely applicable to soluble metallic NPs. The experimental and model approach developed in our study support the development of (eco)toxicity assays tailored to nano-specific features.

  10. A Novel Experimental and Modelling Strategy for Nanoparticle Toxicity Testing Enabling the Use of Small Quantities.

    Science.gov (United States)

    van Pomeren, Marinda; Peijnenburg, Willie J G M; Brun, Nadja R; Vijver, Martina G

    2017-11-06

    Metallic nanoparticles (NPs) differ from other metal forms with respect to their large surface to volume ratio and subsequent inherent reactivity. Each new modification to a nanoparticle alters the surface to volume ratio, fate and subsequently the toxicity of the particle. Newly-engineered NPs are commonly available only in low quantities whereas, in general, rather large amounts are needed for fate characterizations and effect studies. This challenge is especially relevant for those NPs that have low inherent toxicity combined with low bioavailability. Therefore, within our study, we developed new testing strategies that enable working with low quantities of NPs. The experimental testing method was tailor-made for NPs, whereas we also developed translational models based on different dose-metrics allowing to determine dose-response predictions for NPs. Both the experimental method and the predictive models were verified on the basis of experimental effect data collected using zebrafish embryos exposed to metallic NPs in a range of different chemical compositions and shapes. It was found that the variance in the effect data in the dose-response predictions was best explained by the minimal diameter of the NPs, whereas the data confirmed that the predictive model is widely applicable to soluble metallic NPs. The experimental and model approach developed in our study support the development of (eco)toxicity assays tailored to nano-specific features.

  11. Modeling of RFID-Enabled Real-Time Manufacturing Execution System in Mixed-Model Assembly Lines

    Directory of Open Access Journals (Sweden)

    Zhixin Yang

    2015-01-01

    Full Text Available To quickly respond to diverse product demands, mixed-model assembly lines are widely adopted in discrete manufacturing industries. Besides the complexity in material distribution, mixed-model assembly involves a variety of components, different process plans and fast production changes, which greatly increase the difficulty of agile production management. Aiming at breaking through the bottlenecks in existing production management, a novel RFID-enabled manufacturing execution system (MES), featuring real-time and wireless information interaction capability, is proposed to identify various manufacturing objects, including WIPs, tools, and operators, and to trace their movements throughout the production processes. However, being subject to constraints in terms of safety stock, machine assignment, setup, and scheduling requirements, the optimization of the RFID-enabled MES model for production planning and scheduling is an NP-hard problem. A new heuristic generalized Lagrangian decomposition approach is proposed for model optimization, which decomposes the model into three subproblems: computation of the optimal configuration of RFID sensor networks, optimization of production planning subject to machine setup cost and safety stock constraints, and optimization of scheduling for minimized overtime. RFID signal processing methods that can resolve unreliable, redundant, and missing tag events are also described in detail. The model validity is discussed through algorithm analysis and verified through numerical simulation. The proposed design scheme has important reference value for applications of RFID in multiple manufacturing fields, and also lays a vital research foundation for leveraging digital and networked manufacturing systems towards intelligence.

  12. Machine learning methods enable predictive modeling of antibody feature:function relationships in RV144 vaccinees.

    Directory of Open Access Journals (Sweden)

    Ickwon Choi

    2015-04-01

    Full Text Available The adaptive immune response to vaccination or infection can lead to the production of specific antibodies to neutralize the pathogen or recruit innate immune effector cells for help. The non-neutralizing role of antibodies in stimulating effector cell responses may have been a key mechanism of the protection observed in the RV144 HIV vaccine trial. In an extensive investigation of a rich set of data collected from RV144 vaccine recipients, we here employ machine learning methods to identify and model associations between antibody features (IgG subclass and antigen specificity) and effector function activities (antibody-dependent cellular phagocytosis, cellular cytotoxicity, and cytokine release). We demonstrate via cross-validation that classification and regression approaches can effectively use the antibody features to robustly predict qualitative and quantitative functional outcomes. This integration of antibody feature and function data within a machine learning framework provides a new, objective approach to discovering and assessing multivariate immune correlates.
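
    As a hedged illustration of the cross-validated feature-to-function modeling described above, the following Python sketch regresses a synthetic functional readout on synthetic antibody-feature columns with scikit-learn; the features, the sparse linear model, and the data are placeholders, not the RV144 dataset or the authors' pipeline.

      # Sketch: cross-validated regression of an effector-function readout on
      # antibody features (synthetic stand-ins for IgG subclass/specificity data).
      import numpy as np
      from sklearn.linear_model import LassoCV
      from sklearn.model_selection import KFold, cross_val_score

      rng = np.random.default_rng(42)
      n_subjects, n_features = 80, 20
      X = rng.normal(size=(n_subjects, n_features))      # antibody feature matrix
      w = np.zeros(n_features)
      w[:3] = [1.5, -1.0, 0.8]                            # a few informative features
      y = X @ w + rng.normal(scale=0.5, size=n_subjects)  # e.g. a phagocytosis score

      model = LassoCV(cv=5, random_state=0)               # sparse linear regression
      cv = KFold(n_splits=5, shuffle=True, random_state=0)
      scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
      print("cross-validated R^2 per fold:", scores.round(2))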

  13. Exploring uncertainty and model predictive performance concepts via a modular snowmelt-runoff modeling framework

    Science.gov (United States)

    Tyler Jon Smith; Lucy Amanda. Marshall

    2010-01-01

    Model selection is an extremely important aspect of many hydrologic modeling studies because of the complexity, variability, and uncertainty that surrounds the current understanding of watershed-scale systems. However, development and implementation of a complete precipitation-runoff modeling framework, from model selection to calibration and uncertainty analysis, are...

  14. Spatial Modeling for Resources Framework (SMRF): A modular framework for developing spatial forcing data for snow modeling in mountain basins

    Science.gov (United States)

    Havens, Scott; Marks, Danny; Kormos, Patrick; Hedrick, Andrew

    2017-12-01

    In the Western US and many mountainous regions of the world, critical water resources and climate conditions are difficult to monitor because the observation network is generally very sparse. The critical resource from the mountain snowpack is water flowing into streams and reservoirs that will provide for irrigation, flood control, power generation, and ecosystem services. Water supply forecasting in a rapidly changing climate has become increasingly difficult because of non-stationary conditions. In response, operational water supply managers have begun to move from statistical techniques towards the use of physically based models. As we begin to transition physically based models from research to operational use, we must address the most difficult and time-consuming aspect of model initiation: the need for robust methods to develop and distribute the input forcing data. In this paper, we present a new open source framework, the Spatial Modeling for Resources Framework (SMRF), which automates and simplifies the common forcing data distribution methods. It is computationally efficient and can be implemented for both research and operational applications. We present an example of how SMRF is able to generate all of the forcing data required to run a physically based snow model at 50-100 m resolution over regions of 1000-7000 km2. The approach has been successfully applied in real time and historical applications for both the Boise River Basin in Idaho, USA and the Tuolumne River Basin in California, USA. These applications use meteorological station measurements and numerical weather prediction model outputs as input. SMRF has significantly streamlined the modeling workflow, decreased model set up time from weeks to days, and made near real-time application of a physically based snow model possible.
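
    SMRF is itself an open-source Python package; without relying on its specific API, the sketch below illustrates the core operation such a framework automates: distributing sparse station measurements onto a model grid, here with simple inverse-distance weighting plus an assumed elevation lapse rate. The station coordinates, values, and lapse rate are invented inputs.

      # Sketch: distribute sparse station air temperatures onto grid cells using
      # inverse-distance weighting and a fixed elevation lapse rate.
      import numpy as np

      def distribute_idw(xy_sta, t_sta, z_sta, xy_grid, z_grid,
                         lapse=-0.0065, power=2.0):
          """Return gridded temperature (deg C); lapse rate in deg C per metre."""
          t_sea = t_sta - lapse * z_sta             # reduce stations to sea level
          d = np.linalg.norm(xy_grid[:, None, :] - xy_sta[None, :, :], axis=2)
          w = 1.0 / np.maximum(d, 1.0) ** power     # avoid divide-by-zero at stations
          t_grid_sea = (w * t_sea).sum(axis=1) / w.sum(axis=1)
          return t_grid_sea + lapse * z_grid        # re-apply lapse at grid elevations

      # Tiny example: three stations, three grid cells (coordinates in metres).
      xy_sta = np.array([[0.0, 0.0], [5000.0, 2000.0], [1000.0, 8000.0]])
      t_sta = np.array([4.0, 1.5, -0.5])
      z_sta = np.array([1200.0, 1800.0, 2100.0])
      xy_grid = np.array([[2000.0, 2000.0], [3000.0, 6000.0], [500.0, 500.0]])
      z_grid = np.array([1500.0, 2000.0, 1250.0])
      print(distribute_idw(xy_sta, t_sta, z_sta, xy_grid, z_grid).round(2))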

  15. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  16. An Integrated Framework Advancing Membrane Protein Modeling and Design.

    Directory of Open Access Journals (Sweden)

    Rebecca F Alford

    2015-09-01

    Full Text Available Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design.

  17. A hybrid parallel framework for the cellular Potts model simulations

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yi [Los Alamos National Laboratory; He, Kejing [SOUTH CHINA UNIV; Dong, Shoubin [SOUTH CHINA UNIV

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximate, which makes them unsuitable for large-scale, complex 3D simulations. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on a shared-memory SMP system using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for the large-scale simulation (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).
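
    For readers unfamiliar with the Monte Carlo lattice update that the paper parallelizes, a serial toy version of a Cellular Potts step is sketched below (adhesion energy only, single-threaded NumPy). It is a didactic stand-in, not the authors' MPI/OpenMP implementation.

      # Toy serial Cellular Potts step: attempt to copy a neighbour's cell id into a
      # random site and accept with the Metropolis rule on adhesion energy only.
      import numpy as np

      rng = np.random.default_rng(1)
      L, J, T = 32, 1.0, 2.0                        # lattice size, contact energy, temperature
      lattice = rng.integers(0, 5, size=(L, L))     # five "cells" with ids 0..4
      NBRS = ((1, 0), (-1, 0), (0, 1), (0, -1))     # 4-neighbourhood (periodic)

      def adhesion(lat, i, j, sid):
          # Adhesion energy of site (i, j) if it held cell id `sid`.
          return sum(J for di, dj in NBRS if lat[(i + di) % L, (j + dj) % L] != sid)

      def mc_sweep(lat):
          for _ in range(L * L):
              i, j = rng.integers(0, L, size=2)
              di, dj = NBRS[rng.integers(0, 4)]
              src = lat[(i + di) % L, (j + dj) % L]  # neighbour's cell id
              if src == lat[i, j]:
                  continue
              dE = adhesion(lat, i, j, src) - adhesion(lat, i, j, lat[i, j])
              if dE <= 0 or rng.random() < np.exp(-dE / T):
                  lat[i, j] = src                    # accept the copy attempt
          return lat

      lattice = mc_sweep(lattice)
      print("cell ids present after one sweep:", np.unique(lattice))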

  18. A Framework for Bioacoustic Vocalization Analysis Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Ebenezer Out-Nyarko

    2009-11-01

    Full Text Available Using Hidden Markov Models (HMMs) as a recognition framework for automatic classification of animal vocalizations has a number of benefits, including the ability to handle duration variability through nonlinear time alignment, the ability to incorporate complex language or recognition constraints, and easy extendibility to continuous recognition and detection domains. In this work, we apply HMMs to several different species and bioacoustic tasks using generalized spectral features that can be easily adjusted across species and HMM network topologies suited to each task. This experimental work includes a simple call type classification task using one HMM per vocalization for repertoire analysis of Asian elephants, a language-constrained song recognition task using syllable models as base units for ortolan bunting vocalizations, and a stress stimulus differentiation task in poultry vocalizations using a non-sequential model via a one-state HMM with Gaussian mixtures. Results show strong performance across all tasks and illustrate the flexibility of the HMM framework for a variety of species, vocalization types, and analysis tasks.
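
    As a sketch of the classification setup described above, the snippet below trains one Gaussian HMM per call type on feature sequences and labels a new sequence by maximum log-likelihood. It assumes the third-party hmmlearn package is installed and uses synthetic feature vectors rather than real spectral features.

      # Sketch: one HMM per vocalization class; classify by maximum log-likelihood.
      # Requires the third-party package hmmlearn (pip install hmmlearn).
      import numpy as np
      from hmmlearn.hmm import GaussianHMM

      rng = np.random.default_rng(3)

      def synth_sequences(offset, n_seq=10, length=40, dim=6):
          """Synthetic stand-ins for per-frame spectral feature vectors."""
          return [rng.normal(loc=offset, size=(length, dim)) for _ in range(n_seq)]

      train = {"call_A": synth_sequences(0.0), "call_B": synth_sequences(2.0)}

      models = {}
      for label, seqs in train.items():
          X = np.vstack(seqs)                  # all frames stacked
          lengths = [len(s) for s in seqs]     # per-sequence frame counts
          m = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50)
          m.fit(X, lengths)
          models[label] = m

      test = rng.normal(loc=2.0, size=(40, 6)) # unknown call, resembles call_B
      scores = {label: m.score(test) for label, m in models.items()}
      print("predicted class:", max(scores, key=scores.get))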

  19. GIFMod: A Flexible Modeling Framework For Hydraulic and Water Quality Performance Assessment of Stormwater Green Infrastructure

    Science.gov (United States)

    A flexible framework has been created for modeling multi-dimensional hydrological and water quality processes within stormwater green infrastructures (GIs). The framework models a GI system using a set of blocks (spatial features) and connectors (interfaces) representing differen...

  20. Generic modelling framework for economic analysis of battery systems

    DEFF Research Database (Denmark)

    You, Shi; Rasmussen, Claus Nygaard

    2011-01-01

    Deregulated electricity markets provide opportunities for Battery Systems (BS) to participate in energy arbitrage and ancillary services (regulation, operating reserves, contingency reserves, voltage regulation, power quality etc.). To evaluate the economic viability of BS with different business...... for battery cycle life estimation, since the cycle life plays a central role in the economic analysis of BS. To illustrate the modelling framework, a case study using a Sodium Sulfur Battery (NAS) system with 5-minute regulating service is performed. The economic performances of two dispatch scenarios, a so...

  1. CIMS: A FRAMEWORK FOR INFRASTRUCTURE INTERDEPENDENCY MODELING AND ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Donald D. Dudenhoeffer; May R. Permann; Milos Manic

    2006-12-01

    Today’s society relies greatly upon an array of complex national and international infrastructure networks such as transportation, utilities, telecommunication, and even financial networks. While modeling and simulation tools have provided insight into the behavior of individual infrastructure networks, a far less understood area is that of the interrelationships among multiple infrastructure networks, including the potential cascading effects that may result from these interdependencies. This paper first describes infrastructure interdependencies and presents a formalization of interdependency types. Next, the paper describes a modeling and simulation framework called CIMS© and the work being conducted at the Idaho National Laboratory (INL) to model and simulate infrastructure interdependencies and the complex behaviors that can result.

  2. A constitutive model for magnetostriction based on thermodynamic framework

    International Nuclear Information System (INIS)

    Ho, Kwangsoo

    2016-01-01

    This work presents a general framework for the continuum-based formulation of dissipative materials with magneto-mechanical coupling from the viewpoint of irreversible thermodynamics. The thermodynamically consistent model developed for magnetic hysteresis is extended to include the magnetostrictive effect. The dissipative and hysteretic response of magnetostrictive materials is captured through the introduction of internal state variables. The evolution rate of the magnetostrictive strain as well as the magnetization is derived from thermodynamic and dissipative potentials in accordance with the general principles of thermodynamics. It is then demonstrated that the constitutive model is capable of describing the magneto-mechanical behavior by comparing simulation results with experimental data reported in the literature. - Highlights: • A thermodynamically consistent model is proposed to describe the magneto-mechanical coupling effect. • Internal state variables are introduced to capture the dissipative material response. • The evolution rate of the magnetostrictive strain is derived through thermodynamic and dissipation potentials.

  3. Enabling Data Fusion via a Common Data Model and Programming Interface

    Science.gov (United States)

    Lindholm, D. M.; Wilson, A.

    2011-12-01

    Much progress has been made in scientific data interoperability, especially in the areas of metadata and discovery. However, while a data user may have improved techniques for finding data, there is often a large chasm to span when it comes to acquiring the desired subsets of various datasets and integrating them into a data processing environment. Some tools such as OPeNDAP servers and the Unidata Common Data Model (CDM) have introduced improved abstractions for accessing data via a common interface, but they alone do not go far enough to enable fusion of data from multidisciplinary sources. Although data from various scientific disciplines may represent semantically similar concepts (e.g. time series), the user may face widely varying structural representations of the data (e.g. row versus column oriented), not to mention radically different storage formats. It is not enough to convert data to a common format. The key to fusing scientific data is to represent each dataset with consistent sampling. This can best be done by using a data model that expresses the functional relationship that each dataset represents. The domain of those functions determines how the data can be combined. The Visualization for Algorithm Development (VisAD) Java API has provided a sophisticated data model for representing the functional nature of scientific datasets for well over a decade. Because VisAD is largely designed for its visualization capabilities, the data model can be cumbersome to use for numerical computation, especially for those not comfortable with Java. Although both VisAD and the implementation of the CDM are written in Java, neither defines a pure Java interface that others could implement and program to, further limiting potential for interoperability. In this talk, we will present a solution for data integration based on a simple discipline-agnostic scientific data model and programming interface that enables a dataset to be defined in terms of three variable types
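
    The functional view of a dataset sketched in this abstract (a mapping from a domain such as time to dependent values) can be illustrated without VisAD or the CDM. The minimal Python interface below is an illustrative assumption about such a design, not code from either library.

      # Sketch: a dataset as a function from a domain (e.g. time) to sampled values.
      # Two datasets on different time grids are fused by resampling one onto the
      # other's domain, giving both consistent sampling (linear interpolation).
      from dataclasses import dataclass
      import numpy as np

      @dataclass
      class FunctionDataset:
          domain: np.ndarray          # e.g. time stamps (seconds)
          values: np.ndarray          # dependent variable sampled on `domain`
          name: str = "var"

          def __call__(self, t):
              # Evaluate the dataset at arbitrary points of its domain.
              return np.interp(t, self.domain, self.values)

      def fuse(a, b):
          """Resample b onto a's domain so both share consistent sampling."""
          t = a.domain
          return t, a(t), b(t)

      temp = FunctionDataset(np.arange(0, 100, 10.0), np.linspace(280, 290, 10), "T")
      flux = FunctionDataset(np.arange(0, 100, 7.0), np.linspace(1.0, 2.0, 15), "F")
      t, T, F = fuse(temp, flux)
      print(np.c_[t, T, F][:3])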

  4. IMPEx : enabling model/observational data comparison in planetary plasma sciences

    Science.gov (United States)

    Génot, V.; Khodachenko, M.; Kallio, E. J.; Al-Ubaidi, T.; Alexeev, I. I.; Topf, F.; Gangloff, M.; André, N.; Bourrel, N.; Modolo, R.; Hess, S.; Perez-Suarez, D.; Belenkaya, E. S.; Kalegaev, V.

    2013-09-01

    The FP7 IMPEx infrastructure, whose general goal is to encourage and facilitate inter-comparison between observational and model data in planetary plasma sciences, has now been established for two years. This presentation will focus on a tour of the different achievements of this period. Within the project, data originate from multiple sources: large observational databases (CDAWeb, AMDA at CDPP, CLWeb at IRAP), simulation databases for hybrid and MHD codes (FMI, LATMOS), and a planetary magnetic field models database and online services (SINP). Each of these databases proposes dedicated access to their models and runs (HWA@FMI, LATHYS@LATMOS, SMDC@SINP). To gather this large data ensemble, IMPEx offers a distributed framework in which these data may be visualized, analyzed, and shared thanks to interoperable tools; these comprise AMDA - an online space physics analysis tool -, 3DView - a tool for data visualization in 3D planetary context -, and CLWeb - an online space physics visualization tool. A simulation data model, based on SPASE, has been designed to ease data exchange within the infrastructure. From the communication point of view, the VO paradigm has been retained and the architecture is based on web services and the IVOA protocol SAMP. The presentation will focus on how the tools may be operated synchronously to manipulate these heterogeneous data sets. Use cases based on in-flight missions and associated model runs will be proposed for the demonstration. Finally, the motivation and functionalities of the future IMPEx portal will be exposed. As the requirements for, and the potential benefits of, joining the IMPEx infrastructure will be shown, the presentation can be seen as an invitation to other modeling teams in the community that may be interested in promoting their results via IMPEx.

  5. A MULTISCALE, CELL-BASED FRAMEWORK FOR MODELING CANCER DEVELOPMENT

    Energy Technology Data Exchange (ETDEWEB)

    JIANG, YI [Los Alamos National Laboratory

    2007-01-16

    Cancer remains one of the leading causes of death from disease. We use a systems approach that combines mathematical modeling, numerical simulation, and in vivo and in vitro experiments to develop a predictive model that medical researchers can use to study and treat cancerous tumors. The multiscale, cell-based model includes intracellular regulations, cellular level dynamics and intercellular interactions, and extracellular level chemical dynamics. The intracellular protein regulations and signaling pathways are described by Boolean networks. The cellular level growth and division dynamics, cellular adhesion and interaction with the extracellular matrix are described by a lattice Monte Carlo model (the Cellular Potts Model). The extracellular dynamics of the signaling molecules and metabolites are described by a system of reaction-diffusion equations. All three levels of the model are integrated through a hybrid parallel scheme into a high-performance simulation tool. The simulation results reproduce experimental data for both avascular tumors and tumor angiogenesis. By combining the model with experimental data to construct biologically accurate simulations of tumors and their vascular systems, this model will enable medical researchers to gain a deeper understanding of the cellular and molecular interactions associated with cancer progression and treatment.
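
    To make the intracellular layer of such a model concrete, a toy synchronous Boolean-network update of the kind used to describe protein regulation is sketched below; the three-node network and its rules are invented for illustration and do not correspond to the signaling pathways of the published model.

      # Toy synchronous Boolean network: each node's next state is a logic function
      # of the current states (a stand-in for intracellular regulation rules).
      def step(state):
          return {
              "growth_signal": state["growth_signal"],   # treated as an external input
              "inhibitor": state["proliferate"],         # negative feedback
              "proliferate": state["growth_signal"] and not state["inhibitor"],
          }

      state = {"growth_signal": True, "inhibitor": False, "proliferate": False}
      for t in range(6):
          print(t, state)
          state = step(state)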

  6. Building and analyzing timed influence net models with internet-enabled pythia

    Science.gov (United States)

    Pachowicz, Peter W.; Wagenhals, Lee W.; Pham, John; Levis, Alexander H.

    2007-04-01

    The most recent client-server version of the Pythia modeling software is presented. Pythia is a software implementation of a Bayesian Net framework and is used for course of action development, evaluation, and selection in the context of effects-based planning. A new version, Pythia 1.5, is part of a larger suite of tools for behavioral influence analysis, brought into a state-of-the-art client-server computing environment. This server application for multi-user and multiprocess computing relies on the Citrix Presentation Server for integration, security and maintenance. While Pythia's process runs on a server, the input/output services are controlled and displayed through a client PC. Example use of Pythia is illustrated through its application to the suppression of IED activity in an Iraqi province. This case study demonstrates how analysts can create executable (probabilistic) models that link potential actions to effects, based on knowledge about the cultural and social environment. Both the tool and the process for creating and analyzing the model are described, as well as the benefits of using the new server-based version of the tool.

  7. Sol-Terra - AN Operational Space Weather Forecasting Model Framework

    Science.gov (United States)

    Bisi, M. M.; Lawrence, G.; Pidgeon, A.; Reid, S.; Hapgood, M. A.; Bogdanova, Y.; Byrne, J.; Marsh, M. S.; Jackson, D.; Gibbs, M.

    2015-12-01

    The SOL-TERRA project is a collaboration between RHEA Tech, the Met Office, and RAL Space funded by the UK Space Agency. The goal of the SOL-TERRA project is to produce a Roadmap for a future coupled Sun-to-Earth operational space weather forecasting system covering domains from the Sun down to the magnetosphere-ionosphere-thermosphere and neutral atmosphere. The first stage of SOL-TERRA is underway and involves reviewing current models that could potentially contribute to such a system. Within a given domain, the various space weather models will be assessed on how they could contribute to such a coupled system. This will be done both by reviewing peer-reviewed papers and via direct input from the model developers to provide further insight. Once the models have been reviewed, the optimal set of models for use in support of forecast-based SWE modelling will be selected, and a Roadmap for the implementation of an operational forecast-based SWE modelling framework will be prepared. The Roadmap will address the current modelling capability, knowledge gaps and further work required, as well as the implementation and maintenance of the overall architecture and environment within which the models will operate. The SOL-TERRA project will engage with external stakeholders in order to ensure independently that the project remains on track to meet its original objectives. A group of key external stakeholders have been invited to provide their domain-specific expertise in reviewing the SOL-TERRA project at critical stages of Roadmap preparation, namely at the Mid-Term Review and prior to submission of the Final Report. This stakeholder input will ensure that the SOL-TERRA Roadmap is enhanced directly through the input of modellers and end-users. The overall goal of the SOL-TERRA project is to develop a Roadmap for an operational forecast-based SWE modelling framework which can be implemented within a larger subsequent activity. The SOL-TERRA project is supported within

  8. Development of a distributed air pollutant dry deposition modeling framework

    International Nuclear Information System (INIS)

    Hirabayashi, Satoshi; Kroll, Charles N.; Nowak, David J.

    2012-01-01

    A distributed air pollutant dry deposition modeling system was developed with a geographic information system (GIS) to enhance the functionality of i-Tree Eco (i-Tree, 2011). With the developed system, temperature, leaf area index (LAI) and air pollutant concentration in a spatially distributed form can be estimated, and based on these and other input variables, dry deposition of carbon monoxide (CO), nitrogen dioxide (NO2), sulfur dioxide (SO2), and particulate matter less than 10 microns (PM10) to trees can be spatially quantified. Employing nationally available road network, traffic volume, air pollutant emission/measurement and meteorological data, the developed system provides a framework for the U.S. city managers to identify spatial patterns of urban forest and locate potential areas for future urban forest planting and protection to improve air quality. To exhibit the usability of the framework, a case study was performed for July and August of 2005 in Baltimore, MD. - Highlights: ► A distributed air pollutant dry deposition modeling system was developed. ► The developed system enhances the functionality of i-Tree Eco. ► The developed system employs nationally available input datasets. ► The developed system is transferable to any U.S. city. ► Future planting and protection spots were visually identified in a case study. - Employing nationally available datasets and a GIS, this study will provide urban forest managers in U.S. cities a framework to quantify and visualize urban forest structure and its air pollution removal effect.
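
    The per-pollutant removal calculation that such a system distributes spatially can be written compactly as a flux F = Vd x C scaled by canopy area; the deposition velocities, concentrations, and canopy area below are invented numbers used only to mirror the type of computation, not i-Tree Eco's calibrated values.

      # Sketch: dry-deposition flux F = Vd * C per pollutant, scaled by canopy area
      # and time to estimate monthly removal (all numbers illustrative).
      pollutants = {           # deposition velocity Vd (m/s), concentration C (ug/m3)
          "CO":   (0.00003, 1000.0),
          "NO2":  (0.0040, 25.0),
          "SO2":  (0.0050, 8.0),
          "PM10": (0.0025, 30.0),
      }
      canopy_m2 = 2.5e6        # assumed tree-canopy area (m2)
      seconds = 31 * 24 * 3600 # one month

      for name, (vd, conc) in pollutants.items():
          flux = vd * conc                              # ug per m2 per second
          removal_kg = flux * canopy_m2 * seconds / 1e9 # ug -> kg
          print(f"{name}: {removal_kg:.1f} kg removed over the month")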

  9. A unified framework for benchmark dose estimation applied to mixed models and model averaging

    DEFF Research Database (Denmark)

    Ritz, Christian; Gerhard, Daniel; Hothorn, Ludwig A.

    2013-01-01

    This article develops a framework for benchmark dose estimation that allows intrinsically nonlinear dose-response models to be used for continuous data in much the same way as is already possible for quantal data. This means that the same dose-response model equations may be applied to both...... continuous and quantal data, facilitating benchmark dose estimation in general for a wide range of candidate models commonly used in toxicology. Moreover, the proposed framework provides a convenient means for extending benchmark dose concepts through the use of model averaging and random effects modeling...... provides slightly conservative, yet useful, estimates of benchmark dose lower limit under realistic scenarios....
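
    A hedged numerical illustration of benchmark dose estimation for continuous data: fit a nonlinear dose-response curve and invert it at a benchmark response defined relative to the background level. The four-parameter log-logistic form, the 10% benchmark response, and the synthetic data are assumptions made for the sketch, not the article's candidate models or its model-averaging and random-effects machinery.

      # Sketch: benchmark dose (BMD) from a fitted four-parameter log-logistic curve.
      import numpy as np
      from scipy.optimize import brentq, curve_fit

      def ll4(d, lower, upper, ed50, slope):
          # Four-parameter log-logistic dose-response function.
          return lower + (upper - lower) / (1.0 + (d / ed50) ** slope)

      dose = np.array([0.01, 0.1, 1.0, 3.0, 10.0, 30.0, 100.0])  # control set to 0.01
      resp = np.array([98.0, 97.0, 92.0, 80.0, 55.0, 30.0, 12.0])

      popt, _ = curve_fit(ll4, dose, resp, p0=[10.0, 100.0, 5.0, 1.0])

      # Benchmark response: a 10% change from the (near-)zero-dose response level.
      background = ll4(dose[0], *popt)
      target = 0.90 * background
      bmd = brentq(lambda d: ll4(d, *popt) - target, 1e-3, 1e3)
      print(f"BMD for a 10% relative deviation: {bmd:.2f}")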

  10. The Community Earth System Model: A Framework for Collaborative Research

    Energy Technology Data Exchange (ETDEWEB)

    Hurrell, Jim; Holland, Marika M.; Gent, Peter R.; Ghan, Steven J.; Kay, Jennifer; Kushner, P.; Lamarque, J.-F.; Large, William G.; Lawrence, David M.; Lindsay, Keith; Lipscomb, William; Long, Matthew; Mahowald, N.; Marsh, D.; Neale, Richard; Rasch, Philip J.; Vavrus, Steven J.; Vertenstein, Mariana; Bader, David C.; Collins, William D.; Hack, James; Kiehl, J. T.; Marshall, Shawn

    2013-09-30

    The Community Earth System Model (CESM) is a flexible and extensible community tool used to investigate a diverse set of earth system interactions across multiple time and space scales. This global coupled model is a natural evolution from its predecessor, the Community Climate System Model, following the incorporation of new earth system capabilities. These include the ability to simulate biogeochemical cycles, atmospheric chemistry, ice sheets, and a high-top atmosphere. These and other new model capabilities are enabling investigations into a wide range of pressing scientific questions, providing new predictive capabilities and increasing our collective knowledge about the behavior and interactions of the earth system. Simulations with numerous configurations of the CESM have been provided to the Coupled Model Intercomparison Project Phase 5 (CMIP5) and are being analyzed by the broader community of scientists. Additionally, the model source code and associated documentation are freely available to the scientific community to use for earth system studies, making it a true community tool. Here we describe this earth modeling system, its various possible configurations, and illustrate its capabilities with a few science highlights.

  11. Nonlinear Synapses for Large-Scale Models: An Efficient Representation Enables Complex Synapse Dynamics Modeling in Large-Scale Simulations

    Directory of Open Access Journals (Sweden)

    Eric eHu

    2015-09-01

    Full Text Available Chemical synapses comprise a wide collection of intricate signaling pathways involving complex dynamics. These mechanisms are often reduced to simple spikes or exponential representations in order to enable computer simulations at higher spatial levels of complexity. However, these representations cannot capture important nonlinear dynamics found in synaptic transmission. Here, we propose an input-output (IO) synapse model capable of generating complex nonlinear dynamics while maintaining low computational complexity. This IO synapse model is an extension of a detailed mechanistic glutamatergic synapse model capable of capturing the input-output relationships of the mechanistic model using the Volterra functional power series. We demonstrate that the IO synapse model is able to successfully track the nonlinear dynamics of the synapse up to the third order with high accuracy. We also evaluate the accuracy of the IO synapse model at different input frequencies and compare its performance with that of kinetic models in compartmental neuron models. Our results demonstrate that the IO synapse model is capable of efficiently replicating complex nonlinear dynamics that were represented in the original mechanistic model and provide a method to replicate complex and diverse synaptic transmission within neuron network simulations.
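
    The Volterra functional power series underlying the IO model can be illustrated with a small discrete-time example in which the output is a sum of convolutions of the input spike train with first- and second-order kernels. The exponential kernels and the random input below are invented and far simpler than the estimated glutamatergic-synapse kernels.

      # Sketch: discrete second-order Volterra model
      #   y(t) = sum_m k1(m) x(t-m) + sum_{m1,m2} k2(m1,m2) x(t-m1) x(t-m2)
      import numpy as np

      rng = np.random.default_rng(7)
      T, M = 200, 20                               # time steps, kernel memory
      x = (rng.random(T) < 0.1).astype(float)      # input spike train (10% rate)

      tau = np.arange(M)
      k1 = 1.0 * np.exp(-tau / 5.0)                                 # first-order kernel
      k2 = 0.1 * np.outer(np.exp(-tau / 3.0), np.exp(-tau / 3.0))   # second-order kernel

      y = np.zeros(T)
      for t in range(T):
          hist = np.array([x[t - m] if t - m >= 0 else 0.0 for m in range(M)])
          y[t] = k1 @ hist + hist @ k2 @ hist      # linear term + paired-pulse interaction

      print("peak model response:", round(float(y.max()), 3))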

  12. A python framework for environmental model uncertainty analysis

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
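
    The FOSM calculation at the heart of such analyses reduces to a few lines of linear algebra: the posterior parameter covariance follows from the Jacobian and the prior, and the forecast variance is the forecast sensitivity vector propagated through that covariance. The Jacobian, prior, and noise values below are invented; this is a conceptual sketch, not pyEMU's implementation.

      # Sketch of linear (FOSM) uncertainty propagation:
      #   C_post = (J^T R^-1 J + C_prior^-1)^-1,   var(forecast) = y^T C y
      import numpy as np

      rng = np.random.default_rng(11)
      n_obs, n_par = 30, 8
      J = rng.normal(size=(n_obs, n_par))      # observation sensitivities (Jacobian)
      y = rng.normal(size=n_par)               # forecast sensitivity to parameters
      C_prior = np.eye(n_par)                  # prior parameter covariance
      R = 0.25 * np.eye(n_obs)                 # observation noise covariance

      C_post = np.linalg.inv(J.T @ np.linalg.inv(R) @ J + np.linalg.inv(C_prior))
      print(f"forecast variance: prior={y @ C_prior @ y:.2f}, "
            f"posterior={y @ C_post @ y:.2f}")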

  13. A framework for quantifying net benefits of alternative prognostic models.

    Science.gov (United States)

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-30

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd.
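
    To make the net-benefit idea concrete, the sketch below computes the standard decision-analytic net benefit of two risk models at a fixed treatment threshold from their true- and false-positive counts. The counts and threshold are fabricated, and the article's extension to life years, multi-study cross-validation, and competing risks is not reproduced here.

      # Sketch: decision-analytic net benefit at treatment threshold p_t:
      #   NB = TP/n - (FP/n) * p_t / (1 - p_t)
      def net_benefit(tp, fp, n, p_t):
          return tp / n - (fp / n) * p_t / (1.0 - p_t)

      n, p_t = 10_000, 0.10                 # cohort size and 10% risk threshold (assumed)
      basic    = {"tp": 520, "fp": 1900}    # e.g. a model with age, sex and region only
      extended = {"tp": 600, "fp": 1700}    # e.g. five established risk factors

      nb_basic = net_benefit(basic["tp"], basic["fp"], n, p_t)
      nb_ext = net_benefit(extended["tp"], extended["fp"], n, p_t)
      gain = (nb_ext - nb_basic) * 1000     # net true positives per 1000 screened
      print(f"basic={nb_basic:.4f}, extended={nb_ext:.4f}, gain={gain:.1f} per 1000")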

  14. A Model-driven Framework for Educational Game Design

    Directory of Open Access Journals (Sweden)

    Bill Roungas

    2016-09-01

    Full Text Available Educational games are a class of serious games whose main purpose is to teach some subject to their players. Despite the many existing design frameworks, these games are too often created in an ad-hoc manner, and typically without the use of a game design document (GDD). We argue that a reason for this phenomenon is that current ways to structure, create and update GDDs do not increase the value of the artifact in the design and development process. As a solution, we propose a model-driven, web-based knowledge management environment that supports game designers in the creation of a GDD that accounts for and relates educational and entertainment game elements. The foundation of our approach is our devised conceptual model for educational games, which also defines the structure of the design environment. We present promising results from an evaluation of our environment with eight experts in serious games.

  15. A Categorical Framework for Model Classification in the Geosciences

    Science.gov (United States)

    Hauhs, Michael; Trancón y Widemann, Baltasar; Lange, Holger

    2016-04-01

    Models have a mixed record of success in the geosciences. In meteorology, model development and implementation has been among the first and most successful examples of triggering computer technology in science. On the other hand, notorious problems such as the 'equifinality issue' in hydrology lead to a rather mixed reputation of models in other areas. The most successful models in geosciences are applications of dynamic systems theory to non-living systems or phenomena. Thus, we start from the hypothesis that the success of model applications relates to the influence of life on the phenomenon under study. We thus focus on the (formal) representation of life in models. The aim is to investigate whether disappointment in model performance is due to system properties such as heterogeneity and historicity of ecosystems, or rather reflects an abstraction and formalisation problem at a fundamental level. As a formal framework for this investigation, we use category theory as applied in computer science to specify behaviour at an interface. Its methods have been developed for translating and comparing formal structures among different application areas and seems highly suited for a classification of the current "model zoo" in the geosciences. The approach is rather abstract, with a high degree of generality but a low level of expressibility. Here, category theory will be employed to check the consistency of assumptions about life in different models. It will be shown that it is sufficient to distinguish just four logical cases to check for consistency of model content. All four cases can be formalised as variants of coalgebra-algebra homomorphisms. It can be demonstrated that transitions between the four variants affect the relevant observations (time series or spatial maps), the formalisms used (equations, decision trees) and the test criteria of success (prediction, classification) of the resulting model types. We will present examples from hydrology and ecology in

  16. Design theoretic analysis of three system modeling frameworks.

    Energy Technology Data Exchange (ETDEWEB)

    McDonald, Michael James

    2007-05-01

    This paper analyzes three simulation architectures in the context of modeling scalability to address System of Systems (SoS) and Complex System problems. The paper first provides an overview of the SoS problem domain and reviews past work in analyzing model and general system complexity issues. It then identifies and explores the issues of vertical and horizontal integration as well as coupling and hierarchical decomposition as the system characteristics and metrics against which the tools are evaluated. In addition, it applies Nam Suh's Axiomatic Design theory as a construct for understanding coupling and its relationship to system feasibility. Next, it describes the application of MATLAB, Swarm, and Umbra (three modeling and simulation approaches) to modeling swarms of Unmanned Flying Vehicle (UAV) agents in relation to the chosen characteristics and metrics. Finally, it draws general conclusions for analyzing model architectures that go beyond those analyzed. In particular, it identifies decomposition along phenomena of interaction and modular system composition as enabling features for modeling large heterogeneous complex systems.

  17. Integrated Bayesian network framework for modeling complex ecological issues.

    Science.gov (United States)

    Johnson, Sandra; Mengersen, Kerrie

    2012-07-01

    The management of environmental problems is multifaceted, requiring varied and sometimes conflicting objectives and perspectives to be considered. Bayesian network (BN) modeling facilitates the integration of information from diverse sources and is well suited to tackling the management challenges of complex environmental problems. However, combining several perspectives in one model can lead to large, unwieldy BNs that are difficult to maintain and understand. Conversely, an oversimplified model may lead to an unrealistic representation of the environmental problem. Environmental managers require the current research and available knowledge about an environmental problem of interest to be consolidated in a meaningful way, thereby enabling the assessment of potential impacts and different courses of action. Previous investigations of the environmental problem of interest may have already resulted in the construction of several disparate ecological models. On the other hand, the opportunity may exist to initiate this modeling. In the first instance, the challenge is to integrate existing models and to merge the information and perspectives from these models. In the second instance, the challenge is to include different aspects of the environmental problem incorporating both the scientific and management requirements. Although the paths leading to the combined model may differ for these 2 situations, the common objective is to design an integrated model that captures the available information and research, yet is simple to maintain, expand, and refine. BN modeling is typically an iterative process, and we describe a heuristic method, the iterative Bayesian network development cycle (IBNDC), for the development of integrated BN models that are suitable for both situations outlined above. The IBNDC approach facilitates object-oriented BN (OOBN) modeling, arguably viewed as the next logical step in adaptive management modeling, and that embraces iterative development
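
    A minimal sketch of the kind of object the IBNDC cycle iterates on, built with the third-party pgmpy package (assumed installed); the two-node pressure-to-condition network and its probabilities are invented placeholders rather than any published ecological model.

      # Sketch: a tiny discrete Bayesian network and a query, using pgmpy.
      # Requires the third-party package pgmpy (pip install pgmpy).
      from pgmpy.models import BayesianNetwork
      from pgmpy.factors.discrete import TabularCPD
      from pgmpy.inference import VariableElimination

      model = BayesianNetwork([("Pressure", "Condition")])

      cpd_pressure = TabularCPD("Pressure", 2, [[0.7], [0.3]])   # states: 0=low, 1=high
      cpd_condition = TabularCPD(
          "Condition", 2,
          [[0.9, 0.4],    # P(Condition=good | Pressure=low, high)
           [0.1, 0.6]],   # P(Condition=poor | Pressure=low, high)
          evidence=["Pressure"], evidence_card=[2])

      model.add_cpds(cpd_pressure, cpd_condition)
      assert model.check_model()

      inference = VariableElimination(model)
      print(inference.query(["Condition"], evidence={"Pressure": 1}))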

  18. Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical Utaut Model

    Science.gov (United States)

    Sandaire, Johnny

    2009-01-01

    A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decisions in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…

  19. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

    Full Text Available KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions in both particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. Most of these experiments adopt computing models consisting of different tiers, with several computing centres providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and usually span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support these demanding computing requirements, we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  20. Genome-Enabled Modeling of Biogeochemical Processes Predicts Metabolic Dependencies that Connect the Relative Fitness of Microbial Functional Guilds

    Science.gov (United States)

    Brodie, E.; King, E.; Molins, S.; Karaoz, U.; Steefel, C. I.; Banfield, J. F.; Beller, H. R.; Anantharaman, K.; Ligocki, T. J.; Trebotich, D.

    2015-12-01

    Pore-scale processes mediated by microorganisms underlie a range of critical ecosystem services, regulating carbon stability, nutrient flux, and the purification of water. Advances in cultivation-independent approaches now provide us with the ability to reconstruct thousands of genomes from microbial populations from which functional roles may be assigned. With this capability to reveal microbial metabolic potential, the next step is to put these microbes back where they belong to interact with their natural environment, i.e., the pore scale. At this scale, microorganisms communicate, cooperate and compete across their fitness landscapes, with communities emerging that feed back on the physical and chemical properties of their environment, ultimately altering the fitness landscape and selecting for new microbial communities with new properties and so on. We have developed a trait-based model of microbial activity that simulates coupled functional guilds that are parameterized with unique combinations of traits that govern fitness under dynamic conditions. Using a reactive transport framework, we simulate the thermodynamics of coupled electron donor-acceptor reactions to predict energy available for cellular maintenance, respiration, biomass development, and enzyme production. From metagenomics, we directly estimate some trait values related to growth and identify the linkage of key traits associated with respiration and fermentation, macromolecule depolymerizing enzymes, and other key functions such as nitrogen fixation. Our simulations were carried out to explore abiotic controls on community emergence such as seasonally fluctuating water table regimes across floodplain organic matter hotspots. Simulations and metagenomic/metatranscriptomic observations highlighted the many dependencies connecting the relative fitness of functional guilds and the importance of chemolithoautotrophic lifestyles. Using an X-Ray microCT-derived soil microaggregate physical model combined

  1. Proposed framework for thermomechanical life modeling of metal matrix composites

    Science.gov (United States)

    Halford, Gary R.; Lerch, Bradley A.; Saltsman, James F.

    1993-01-01

    The framework of a mechanics of materials model is proposed for thermomechanical fatigue (TMF) life prediction of unidirectional, continuous-fiber metal matrix composites (MMC's). Axially loaded MMC test samples are analyzed as structural components whose fatigue lives are governed by local stress-strain conditions resulting from combined interactions of the matrix, interfacial layer, and fiber constituents. The metallic matrix is identified as the vehicle for tracking fatigue crack initiation and propagation. The proposed framework has three major elements. First, TMF flow and failure characteristics of in situ matrix material are approximated from tests of unreinforced matrix material, and matrix TMF life prediction equations are numerically calibrated. The macrocrack initiation fatigue life of the matrix material is divided into microcrack initiation and microcrack propagation phases. Second, the influencing factors created by the presence of fibers and interfaces are analyzed, characterized, and documented in equation form. Some of the influences act on the microcrack initiation portion of the matrix fatigue life, others on the microcrack propagation life, while some affect both. Influencing factors include coefficient of thermal expansion mismatch strains, residual (mean) stresses, multiaxial stress states, off-axis fibers, internal stress concentrations, multiple initiation sites, nonuniform fiber spacing, fiber debonding, interfacial layers and cracking, fractured fibers, fiber deflections of crack fronts, fiber bridging of matrix cracks, and internal oxidation along internal interfaces. Equations exist for some, but not all, of the currently identified influencing factors. The third element is the inclusion of overriding influences such as maximum tensile strain limits of brittle fibers that could cause local fractures and ensuing catastrophic failure of surrounding matrix material. Some experimental data exist for assessing the plausibility of the proposed

  2. A modelling framework to simulate foliar fungal epidemics using functional-structural plant models.

    Science.gov (United States)

    Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe

    2014-09-01

    Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional-structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant-environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. This study provides a framework for modelling a large number of pathosystems using FSPMs. This structure can accommodate both

  3. A multilevel framework to reconstruct anatomical 3D models of the hepatic vasculature in rat livers.

    Science.gov (United States)

    Peeters, Geert; Debbaut, Charlotte; Laleman, Wim; Monbaliu, Diethard; Vander Elst, Ingrid; Detrez, Jan R; Vandecasteele, Tim; De Schryver, Thomas; Van Hoorebeke, Luc; Favere, Kasper; Verbeke, Jonas; Segers, Patrick; Cornillie, Pieter; De Vos, Winnok H

    2017-03-01

    The intricate (micro)vascular architecture of the liver has not yet been fully unravelled. Although current models are often idealized simplifications of the complex anatomical reality, correct morphological information is instrumental for scientific and clinical purposes. Previously, both vascular corrosion casting (VCC) and immunohistochemistry (IHC) have been separately used to study the hepatic vasculature. Nevertheless, these techniques still face a number of challenges such as dual casting in VCC and limited imaging depths for IHC. We have optimized both techniques and combined their complementary strengths to develop a framework for multilevel reconstruction of the hepatic circulation in the rat. The VCC and micro-CT scanning protocol was improved by enabling dual casting, optimizing the contrast agent concentration, and adjusting the viscosity of the resin (PU4ii). IHC was improved with an optimized clearing technique (CUBIC) that extended the imaging depth for confocal microscopy more than five-fold. Using in-house developed software (DeLiver), the vascular network - in both VCC and IHC datasets - was automatically segmented and/or morphologically analysed. Our methodological framework allows 3D reconstruction and quantification of the hepatic circulation, ranging from the major blood vessels down to the intertwined and interconnected sinusoids. We believe that the presented framework will have value beyond studies of the liver, and will facilitate a better understanding of various parenchymal organs in general, in physiological and pathological circumstances. © 2016 Anatomical Society.

  4. How much cryosphere model complexity is just right? Exploration using the conceptual cryosphere hydrology framework

    Directory of Open Access Journals (Sweden)

    T. M. Mosier

    2016-09-01

    Full Text Available Making meaningful projections of the impacts that possible future climates would have on water resources in mountain regions requires understanding how cryosphere hydrology model performance changes under altered climate conditions and when the model is applied to ungaged catchments. Further, if we are to develop better models, we must understand which specific process representations limit model performance. This article presents a modeling tool, named the Conceptual Cryosphere Hydrology Framework (CCHF), that enables implementing and evaluating a wide range of cryosphere modeling hypotheses. The CCHF represents cryosphere hydrology systems using a set of coupled process modules that allows easily interchanging individual module representations and includes analysis tools to evaluate model outputs. CCHF version 1 (Mosier, 2016) implements model formulations that require only precipitation and temperature as climate inputs – for example, variations on simple degree-index (SDI) or enhanced temperature index (ETI) formulations – because these model structures are often applied in data-sparse mountain regions, and perform relatively well over short periods, but their calibration is known to change based on climate and geography. Using CCHF, we implement seven existing and novel models, including one existing SDI model, two existing ETI models, and four novel models that utilize a combination of existing and novel module representations. The novel module representations include a heat transfer formulation with net longwave radiation and a snowpack internal energy formulation that uses an approximation of the cold content. We assess the models for the Gulkana and Wolverine glaciated watersheds in Alaska, which have markedly different climates and contain long-term US Geological Survey benchmark glaciers. Overall we find that the best performing models are those that are more physically consistent and representative, but no single model performs
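
    For readers unfamiliar with the degree-index formulations referenced above, a minimal sketch of a classical simple degree-index melt calculation follows; the degree-day factor and threshold are illustrative values, not CCHF parameters, and the snippet is not CCHF code.

        import numpy as np

        def degree_index_melt(temp_c, ddf=4.0, t_thresh=0.0):
            """Classical simple degree-index (SDI) melt estimate.

            temp_c   : daily mean air temperature series (deg C)
            ddf      : degree-day factor (mm w.e. per deg C per day) -- illustrative value
            t_thresh : temperature threshold above which melt occurs (deg C)
            Returns daily melt in mm water equivalent.
            """
            temp_c = np.asarray(temp_c, dtype=float)
            return ddf * np.maximum(temp_c - t_thresh, 0.0)

        # Example: a week of daily mean temperatures
        print(degree_index_melt([-3.0, -1.0, 0.5, 2.0, 4.5, 1.0, -0.5]))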

  5. The use of cloud enabled building information models – an expert analysis

    Directory of Open Access Journals (Sweden)

    Alan Redmond

    2015-10-01

    Full Text Available The dependency of today's construction professionals on singular commercial applications for design creates the risk of being dictated to by the language-tools they use. This unknowing conformance to the constraints of a particular computer application's style reduces one's association with cutting-edge design, as no single computer application can support all of the tasks associated with building design and production. Interoperability reflects the need to pass data between applications, allowing multiple types of experts and applications to contribute to the work at hand. Cloud computing is a centralized heterogeneous platform that enables different applications to be connected to each other through remote data servers. However, the possibility of providing an interoperable process by binding several construction applications through a single repository platform, 'cloud computing', required further analysis. The Delphi questionnaires reported here analysed the information-exchange opportunities of Building Information Modelling (BIM) as a possible solution for the integration of applications on a cloud platform. The survey structure is modelled to: (i) identify the most appropriate applications for advancing interoperability at the early design stage, (ii) detect the most severe barriers to BIM implementation from a business and legal viewpoint, (iii) examine the need for standards to address information exchange between design teams, and (iv) explore the use of the most common interfaces for exchanging information. The anticipated findings will assist in identifying a model that enhances the standardized passing of information between systems at the feasibility design stage of a construction project.

  7. Concepts as Semantic Pointers: A Framework and Computational Model.

    Science.gov (United States)

    Blouw, Peter; Solodkin, Eugene; Thagard, Paul; Eliasmith, Chris

    2016-07-01

    The reconciliation of theories of concepts based on prototypes, exemplars, and theory-like structures is a longstanding problem in cognitive science. In response to this problem, researchers have recently tended to adopt either hybrid theories that combine various kinds of representational structure, or eliminative theories that replace concepts with a more finely grained taxonomy of mental representations. In this paper, we describe an alternative approach involving a single class of mental representations called "semantic pointers." Semantic pointers are symbol-like representations that result from the compression and recursive binding of perceptual, lexical, and motor representations, effectively integrating traditional connectionist and symbolic approaches. We present a computational model using semantic pointers that replicates experimental data from categorization studies involving each prior paradigm. We argue that a framework involving semantic pointers can provide a unified account of conceptual phenomena, and we compare our framework to existing alternatives in accounting for the scope, content, recursive combination, and neural implementation of concepts. Copyright © 2015 Cognitive Science Society, Inc.
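
    The record does not spell out the compression and binding operations; the sketch below illustrates circular-convolution binding of the kind commonly used for semantic pointers and holographic reduced representations. It is a minimal, assumed illustration, not the authors' implementation.

        import numpy as np

        def bind(a, b):
            """Circular convolution: binds two vectors into one of the same dimension."""
            return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

        def unbind(c, a):
            """Approximate unbinding: circular convolution with the involution of a."""
            a_inv = np.concatenate(([a[0]], a[:0:-1]))
            return bind(c, a_inv)

        d = 512
        rng = np.random.default_rng(0)
        role = rng.normal(0, 1 / np.sqrt(d), d)
        filler = rng.normal(0, 1 / np.sqrt(d), d)
        trace = bind(role, filler)
        recovered = unbind(trace, role)
        # recovered is a noisy copy of filler; cosine similarity should be well above chance
        cos = recovered @ filler / (np.linalg.norm(recovered) * np.linalg.norm(filler))
        print(round(float(cos), 2))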

  8. An Efficient Framework Model for Optimizing Routing Performance in VANETs

    Science.gov (United States)

    Zulkarnain, Zuriati Ahmad; Subramaniam, Shamala

    2018-01-01

    Routing in Vehicular Ad hoc Networks (VANETs) is challenging because of the highly dynamic mobility of vehicles. The efficiency of a routing protocol is influenced by a number of factors such as network density, bandwidth constraints, traffic load, and mobility patterns, which result in frequent changes in network topology. Therefore, Quality of Service (QoS) support is strongly needed to enhance the capability of the routing protocol and improve overall network performance. In this paper, we introduce a statistical framework model to address the problem of optimizing routing configuration parameters in Vehicle-to-Vehicle (V2V) communication. Our framework solution is based on the utilization of network resources to reflect the current state of the network and to balance the trade-off between frequent changes in network topology and the QoS requirements. It consists of three stages: a network simulation stage used to execute different urban scenarios, a function stage used as a competitive approach to aggregate the weighted costs of the factors into a single value, and an optimization stage used to evaluate the communication cost and to obtain the optimal configuration based on the competitive cost. The simulation results show significant performance improvement in terms of Packet Delivery Ratio (PDR), Normalized Routing Load (NRL), Packet Loss (PL), and End-to-End Delay (E2ED). PMID:29462884
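
    As a minimal illustration of the function and optimization stages described above, the sketch below aggregates weighted factor costs into a single competitive cost and selects the configuration that minimizes it. The configuration names, metric values, and weights are hypothetical, not the authors' parameters.

        # Hypothetical routing configurations with normalized metrics
        # (lower is better for NRL, PL, E2ED; higher is better for PDR).
        candidates = {
            "config_A": {"PDR": 0.91, "NRL": 0.22, "PL": 0.09, "E2ED": 0.35},
            "config_B": {"PDR": 0.87, "NRL": 0.15, "PL": 0.13, "E2ED": 0.28},
            "config_C": {"PDR": 0.94, "NRL": 0.30, "PL": 0.06, "E2ED": 0.41},
        }
        weights = {"PDR": 0.4, "NRL": 0.2, "PL": 0.2, "E2ED": 0.2}  # assumed weighting

        def competitive_cost(metrics):
            # Convert PDR (a benefit) into a cost and aggregate the weighted terms.
            return (weights["PDR"] * (1.0 - metrics["PDR"])
                    + weights["NRL"] * metrics["NRL"]
                    + weights["PL"] * metrics["PL"]
                    + weights["E2ED"] * metrics["E2ED"])

        best = min(candidates, key=lambda name: competitive_cost(candidates[name]))
        print(best, round(competitive_cost(candidates[best]), 3))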

  9. Toward the quantification of a conceptual framework for movement ecology using circular statistical modeling.

    Science.gov (United States)

    Shimatani, Ichiro Ken; Yoda, Ken; Katsumata, Nobuhiro; Sato, Katsufumi

    2012-01-01

    To analyze an animal's movement trajectory, a basic model is required that satisfies the following conditions: the model must have an ecological basis and the parameters used in the model must have ecological interpretations, a broad range of movement patterns can be explained by that model, and equations and probability distributions in the model should be mathematically tractable. Random walk models used in previous studies do not necessarily satisfy these requirements, partly because movement trajectories are often more oriented or tortuous than expected from the models. By improving the modeling for turning angles, this study aims to propose a basic movement model. On the basis of the recently developed circular auto-regressive model, we introduced a new movement model and extended its applicability to capture the asymmetric effects of external factors such as wind. The model was applied to GPS trajectories of a seabird (Calonectris leucomelas) to demonstrate its applicability to various movement patterns and to explain how the model parameters are ecologically interpreted under a general conceptual framework for movement ecology. Although it is based on a simple extension of a generalized linear model to circular variables, the proposed model enables us to evaluate the effects of external factors on movement separately from the animal's internal state. For example, maximum likelihood estimates and model selection suggested that in one homing flight section, the seabird intended to fly toward the island, but misjudged its navigation and was driven off-course by strong winds, while in the subsequent flight section, the seabird reset the focal direction, navigated the flight under strong wind conditions, and succeeded in approaching the island.
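
    A rough sketch of how a circular turning-angle model of this kind can generate trajectories is given below. The persistence, concentration, and wind parameters are illustrative, and the formulation is a simplification for orientation only, not the authors' circular auto-regressive model.

        import numpy as np

        def simulate_track(n=200, focal=0.0, persistence=0.7, kappa=8.0,
                           wind_dir=np.pi / 3, wind_strength=0.15, step=1.0, seed=1):
            """Toy circular movement model: each heading is drawn from a von Mises
            distribution whose mean mixes the previous heading, a focal direction,
            and an external (wind) bias; positions accumulate unit steps."""
            rng = np.random.default_rng(seed)
            xy = np.zeros((n + 1, 2))
            heading = focal
            for t in range(n):
                # Mean direction as a weighted sum of unit vectors (keeps angles on the circle).
                mx = (persistence * np.cos(heading) + (1 - persistence) * np.cos(focal)
                      + wind_strength * np.cos(wind_dir))
                my = (persistence * np.sin(heading) + (1 - persistence) * np.sin(focal)
                      + wind_strength * np.sin(wind_dir))
                heading = rng.vonmises(np.arctan2(my, mx), kappa)
                xy[t + 1] = xy[t] + step * np.array([np.cos(heading), np.sin(heading)])
            return xy

        track = simulate_track()
        print(track[-1])  # end point of the simulated trajectory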

  10. Expert judgment based multi-criteria decision model to address uncertainties in risk assessment of nanotechnology-enabled food products

    International Nuclear Information System (INIS)

    Flari, Villie; Chaudhry, Qasim; Neslo, Rabin; Cooke, Roger

    2011-01-01

    Currently, risk assessment of nanotechnology-enabled food products is considered difficult due to the large number of uncertainties involved. We developed an approach which could address some of the main uncertainties through the use of expert judgment. Our approach employs a multi-criteria decision model, based on probabilistic inversion that enables capturing experts’ preferences in regard to safety of nanotechnology-enabled food products, and identifying their opinions in regard to the significance of key criteria that are important in determining the safety of such products. An advantage of these sample-based techniques is that they provide out-of-sample validation and therefore a robust scientific basis. This validation in turn adds predictive power to the model developed. We achieved out-of-sample validation in two ways: (1) a portion of the expert preference data was excluded from the model’s fitting and was then predicted by the model fitted on the remaining rankings and (2) a (partially) different set of experts generated new scenarios, using the same criteria employed in the model, and ranked them; their ranks were compared with ranks predicted by the model. The degree of validation in each method was less than perfect but reasonably substantial. The validated model we applied captured and modelled experts’ preferences regarding safety of hypothetical nanotechnology-enabled food products. It appears therefore that such an approach can provide a promising route to explore further for assessing the risk of nanotechnology-enabled food products.
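
    The second validation route described above reduces to comparing ranks predicted by the fitted model with ranks produced by a new set of experts. A minimal sketch of that comparison using a rank correlation is shown below; the ranks are hypothetical, not the study's data.

        import numpy as np
        from scipy.stats import spearmanr

        # Hypothetical ranks of six held-out product scenarios.
        model_predicted_rank = np.array([1, 2, 3, 4, 5, 6])
        expert_median_rank   = np.array([2, 1, 3, 5, 4, 6])

        rho, p_value = spearmanr(model_predicted_rank, expert_median_rank)
        print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")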

  11. A grey DEMATEL-based approach for modeling enablers of green innovation in manufacturing organizations.

    Science.gov (United States)

    Gupta, Himanshu; Barua, Mukesh Kumar

    2018-04-01

    Incorporating green practices into the manufacturing process has gained momentum over the past few years and is a matter of great concern for manufacturers and researchers alike. Regulatory pressures in developed countries have forced organizations to adopt green practices; however, this issue still lacks attention in developing economies like India. There is an urgent need to identify enablers of green innovation for manufacturing organizations and also to identify the prominent enablers among them. This study is an attempt to first identify enablers of green innovation and then establish a causal relationship among them to identify the enablers that can drive others. Grey DEMATEL (Decision Making Trial and Evaluation Laboratory) methodology is used for establishing the causal relationship among enablers. The novelty of this study lies in the fact that no previous study has identified the enablers of green innovation and then established the causal relationships among them. A total of 21 enablers of green innovation have been identified; the research indicates that developing green manufacturing capabilities, resources for green innovation, ease of obtaining loans from financial institutions, and environmental regulations are the most influential enablers of green innovation. Managerial and practical implications of the research are also presented to assist managers of the case company in adopting green innovation practices.

  12. Improving NASA's Multiscale Modeling Framework for Tropical Cyclone Climate Study

    Science.gov (United States)

    Shen, Bo-Wen; Nelson, Bron; Cheung, Samson; Tao, Wei-Kuo

    2013-01-01

    One of the current challenges in tropical cyclone (TC) research is how to improve our understanding of TC interannual variability and the impact of climate change on TCs. Recent advances in global modeling, visualization, and supercomputing technologies at NASA show potential for such studies. In this article, the authors discuss recent scalability improvement to the multiscale modeling framework (MMF) that makes it feasible to perform long-term TC-resolving simulations. The MMF consists of the finite-volume general circulation model (fvGCM), supplemented by a copy of the Goddard cumulus ensemble model (GCE) at each of the fvGCM grid points, giving 13,104 GCE copies. The original fvGCM implementation has a 1D data decomposition; the revised MMF implementation retains the 1D decomposition for most of the code, but uses a 2D decomposition for the massive copies of GCEs. Because the vast majority of computation time in the MMF is spent computing the GCEs, this approach can achieve excellent speedup without incurring the cost of modifying the entire code. Intelligent process mapping allows differing numbers of processes to be assigned to each domain for load balancing. The revised parallel implementation shows highly promising scalability, obtaining a nearly 80-fold speedup by increasing the number of cores from 30 to 3,335.
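
    The bookkeeping behind the 2D decomposition of the embedded GCE copies can be illustrated with a simple load-balanced mapping of copies to processes. The 91 x 144 grid below is only an assumed factorization of the 13,104 copies, and the snippet is not the fvGCM/GCE code.

        from collections import Counter

        def map_gce_copies(n_lat, n_lon, n_procs):
            """Assign each (lat, lon) GCE copy to a process id, balancing counts."""
            total = n_lat * n_lon
            base, extra = divmod(total, n_procs)        # copies per process
            assignment, proc, filled = {}, 0, 0
            quota = base + (1 if proc < extra else 0)
            for i in range(n_lat):
                for j in range(n_lon):
                    if filled == quota:                  # move on to the next process
                        proc, filled = proc + 1, 0
                        quota = base + (1 if proc < extra else 0)
                    assignment[(i, j)] = proc
                    filled += 1
            return assignment

        # Example: 13,104 GCE copies (e.g., a 91 x 144 grid) spread over 3,335 cores.
        mapping = map_gce_copies(91, 144, 3335)
        counts = Counter(mapping.values())
        print(min(counts.values()), max(counts.values()))  # per-process load range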

  13. A Learning Framework for Control-Oriented Modeling of Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Rubio-Herrero, Javier; Chandan, Vikas; Siegel, Charles M.; Vishnu, Abhinav; Vrabie, Draguna L.

    2018-01-18

    Buildings consume a significant amount of energy worldwide. Several building optimization and control use cases require models of energy consumption which are control oriented, have high predictive capability, impose minimal data pre-processing requirements, and can be adapted continuously to account for changing conditions as new data becomes available. Data-driven modeling techniques investigated so far, while promising in the context of buildings, have been unable to simultaneously satisfy all of these requirements. In this context, deep learning techniques such as Recurrent Neural Networks (RNNs) hold promise, empowered by advanced computational capabilities and big data opportunities. In this paper, we propose a deep-learning-based methodology for the development of control-oriented models for building energy management and test it on data from a real building. Results show that the proposed methodology significantly outperforms other data-driven modeling techniques. We perform a detailed analysis of the proposed methodology along dimensions such as topology, sensitivity, and downsampling. Lastly, we conclude by envisioning a building analytics suite, empowered by the proposed deep framework, that can drive several use cases related to building energy management.
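
    The record does not specify the network topology; as an assumed, generic example of a control-oriented recurrent model of the kind described (not the authors' architecture), a small LSTM can map a window of building inputs to the next consumption value:

        import torch
        import torch.nn as nn

        class EnergyRNN(nn.Module):
            """Toy recurrent model: a sequence of building inputs -> next consumption."""
            def __init__(self, n_features=4, hidden_size=32):
                super().__init__()
                self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
                self.head = nn.Linear(hidden_size, 1)

            def forward(self, x):                 # x: (batch, time, n_features)
                out, _ = self.lstm(x)
                return self.head(out[:, -1, :])   # prediction from the last hidden state

        model = EnergyRNN()
        x = torch.randn(16, 24, 4)   # 16 samples, 24 hourly steps, 4 inputs (e.g., weather, setpoints)
        y_hat = model(x)             # (16, 1) predicted consumption for the next step
        print(y_hat.shape)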

  14. TP-model transformation-based-control design frameworks

    CERN Document Server

    Baranyi, Péter

    2016-01-01

    This book covers new aspects and frameworks of control, design, and optimization based on the TP model transformation and its various extensions. The author outlines the three main steps of polytopic and LMI-based control design: 1) development of the qLPV state-space model, 2) generation of the polytopic model, and 3) application of LMI to derive the controller and observer. He goes on to describe why the literature has extensively studied LMI design but has not focused much on the second step, in part because the generation and manipulation of the polytopic form was not tractable in many cases. The author then shows how the TP model transformation facilitates this second step and hence reveals new directions, leading to powerful design procedures and the formulation of new questions. The chapters of this book, and the complex dynamical control tasks which they cover, are organized so as to present and analyze the beneficial aspects of the family of approaches (control, design, and optimization). Additionally, the b...

  15. A Multiple Reaction Modelling Framework for Microbial Electrochemical Technologies

    Directory of Open Access Journals (Sweden)

    Tolutola Oyetunde

    2017-01-01

    Full Text Available A mathematical model for the theoretical evaluation of microbial electrochemical technologies (METs) is presented that incorporates a detailed physico-chemical framework, includes multiple reactions (both at the electrodes and in the bulk phase), and involves a variety of microbial functional groups. The model is applied to two theoretical case studies: (i) a microbial electrolysis cell (MEC) for continuous anodic volatile fatty acid (VFA) oxidation and cathodic VFA reduction to alcohols, for which the theoretical system response to changes in applied voltage and VFA feed ratio (anode-to-cathode) as well as membrane type is investigated. This case involves multiple parallel electrode reactions in both anode and cathode compartments; (ii) a microbial fuel cell (MFC) for cathodic perchlorate reduction, in which the theoretical impact of feed flow rates and concentrations on the overall system performance is investigated. This case involves multiple electrode reactions in series in the cathode compartment. The model structure captures interactions between important system variables based on first principles and provides a platform for the dynamic description of METs involving electrode reactions both in parallel and in series and in both MFC and MEC configurations. Such a theoretical modelling approach, largely based on first principles, appears promising in the development and testing of MET control and optimization strategies.

  16. Testing a Conceptual Change Model Framework for Visual Data

    Science.gov (United States)

    Finson, Kevin D.; Pedersen, Jon E.

    2015-01-01

    An emergent data analysis technique was employed to test the veracity of a conceptual framework constructed around visual data use and instruction in science classrooms. The framework incorporated all five key components Vosniadou (2007a, 2007b) described as existing in a learner's schema: framework theory, presuppositions, conceptual domains,…

  17. The Cancer Cell Line Encyclopedia enables predictive modeling of anticancer drug sensitivity

    Science.gov (United States)

    Barretina, Jordi; Caponigro, Giordano; Stransky, Nicolas; Venkatesan, Kavitha; Margolin, Adam A.; Kim, Sungjoon; Wilson, Christopher J.; Lehár, Joseph; Kryukov, Gregory V.; Sonkin, Dmitriy; Reddy, Anupama; Liu, Manway; Murray, Lauren; Berger, Michael F.; Monahan, John E.; Morais, Paula; Meltzer, Jodi; Korejwa, Adam; Jané-Valbuena, Judit; Mapa, Felipa A.; Thibault, Joseph; Bric-Furlong, Eva; Raman, Pichai; Shipway, Aaron; Engels, Ingo H.; Cheng, Jill; Yu, Guoying K.; Yu, Jianjun; Aspesi, Peter; de Silva, Melanie; Jagtap, Kalpana; Jones, Michael D.; Wang, Li; Hatton, Charles; Palescandolo, Emanuele; Gupta, Supriya; Mahan, Scott; Sougnez, Carrie; Onofrio, Robert C.; Liefeld, Ted; MacConaill, Laura; Winckler, Wendy; Reich, Michael; Li, Nanxin; Mesirov, Jill P.; Gabriel, Stacey B.; Getz, Gad; Ardlie, Kristin; Chan, Vivien; Myer, Vic E.; Weber, Barbara L.; Porter, Jeff; Warmuth, Markus; Finan, Peter; Harris, Jennifer L.; Meyerson, Matthew; Golub, Todd R.; Morrissey, Michael P.; Sellers, William R.; Schlegel, Robert; Garraway, Levi A.

    2012-01-01

    The systematic translation of cancer genomic data into knowledge of tumor biology and therapeutic avenues remains challenging. Such efforts should be greatly aided by robust preclinical model systems that reflect the genomic diversity of human cancers and for which detailed genetic and pharmacologic annotation is available1. Here we describe the Cancer Cell Line Encyclopedia (CCLE): a compilation of gene expression, chromosomal copy number, and massively parallel sequencing data from 947 human cancer cell lines. When coupled with pharmacologic profiles for 24 anticancer drugs across 479 of the lines, this collection allowed identification of genetic, lineage, and gene expression-based predictors of drug sensitivity. In addition to known predictors, we found that plasma cell lineage correlated with sensitivity to IGF1 receptor inhibitors; AHR expression was associated with MEK inhibitor efficacy in NRAS-mutant lines; and SLFN11 expression predicted sensitivity to topoisomerase inhibitors. Altogether, our results suggest that large, annotated cell line collections may help to enable preclinical stratification schemata for anticancer agents. The generation of genetic predictions of drug response in the preclinical setting and their incorporation into cancer clinical trial design could speed the emergence of “personalized” therapeutic regimens2. PMID:22460905

  18. The Cancer Cell Line Encyclopedia enables predictive modelling of anticancer drug sensitivity.

    Science.gov (United States)

    Barretina, Jordi; Caponigro, Giordano; Stransky, Nicolas; Venkatesan, Kavitha; Margolin, Adam A; Kim, Sungjoon; Wilson, Christopher J; Lehár, Joseph; Kryukov, Gregory V; Sonkin, Dmitriy; Reddy, Anupama; Liu, Manway; Murray, Lauren; Berger, Michael F; Monahan, John E; Morais, Paula; Meltzer, Jodi; Korejwa, Adam; Jané-Valbuena, Judit; Mapa, Felipa A; Thibault, Joseph; Bric-Furlong, Eva; Raman, Pichai; Shipway, Aaron; Engels, Ingo H; Cheng, Jill; Yu, Guoying K; Yu, Jianjun; Aspesi, Peter; de Silva, Melanie; Jagtap, Kalpana; Jones, Michael D; Wang, Li; Hatton, Charles; Palescandolo, Emanuele; Gupta, Supriya; Mahan, Scott; Sougnez, Carrie; Onofrio, Robert C; Liefeld, Ted; MacConaill, Laura; Winckler, Wendy; Reich, Michael; Li, Nanxin; Mesirov, Jill P; Gabriel, Stacey B; Getz, Gad; Ardlie, Kristin; Chan, Vivien; Myer, Vic E; Weber, Barbara L; Porter, Jeff; Warmuth, Markus; Finan, Peter; Harris, Jennifer L; Meyerson, Matthew; Golub, Todd R; Morrissey, Michael P; Sellers, William R; Schlegel, Robert; Garraway, Levi A

    2012-03-28

    The systematic translation of cancer genomic data into knowledge of tumour biology and therapeutic possibilities remains challenging. Such efforts should be greatly aided by robust preclinical model systems that reflect the genomic diversity of human cancers and for which detailed genetic and pharmacological annotation is available. Here we describe the Cancer Cell Line Encyclopedia (CCLE): a compilation of gene expression, chromosomal copy number and massively parallel sequencing data from 947 human cancer cell lines. When coupled with pharmacological profiles for 24 anticancer drugs across 479 of the cell lines, this collection allowed identification of genetic, lineage, and gene-expression-based predictors of drug sensitivity. In addition to known predictors, we found that plasma cell lineage correlated with sensitivity to IGF1 receptor inhibitors; AHR expression was associated with MEK inhibitor efficacy in NRAS-mutant lines; and SLFN11 expression predicted sensitivity to topoisomerase inhibitors. Together, our results indicate that large, annotated cell-line collections may help to enable preclinical stratification schemata for anticancer agents. The generation of genetic predictions of drug response in the preclinical setting and their incorporation into cancer clinical trial design could speed the emergence of 'personalized' therapeutic regimens.
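
    Predictors of this kind are commonly built by regressing a drug-sensitivity measure on genomic features with regularized linear models. The sketch below fits an elastic net on synthetic data as an illustration of that general approach only; it is not the CCLE pipeline and uses no CCLE data.

        import numpy as np
        from sklearn.linear_model import ElasticNetCV

        rng = np.random.default_rng(0)
        n_lines, n_features = 400, 1000          # cell lines x genomic features (synthetic)
        X = rng.normal(size=(n_lines, n_features))
        true_coef = np.zeros(n_features)
        true_coef[:10] = rng.normal(size=10)     # a few features truly drive sensitivity
        y = X @ true_coef + 0.5 * rng.normal(size=n_lines)   # stand-in sensitivity score

        model = ElasticNetCV(l1_ratio=0.5, cv=5).fit(X, y)
        selected = np.flatnonzero(model.coef_)
        print(len(selected), "features retained; R^2 =", round(model.score(X, y), 2))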

  19. Fullrmc, a rigid body Reverse Monte Carlo modeling package enabled with machine learning and artificial intelligence.

    Science.gov (United States)

    Aoun, Bachir

    2016-05-05

    A new Reverse Monte Carlo (RMC) package, "fullrmc", for atomic or rigid-body and molecular, amorphous, or crystalline materials is presented. fullrmc's main purpose is to provide fully modular, fast, and flexible software that is thoroughly documented, supports complex molecules, is written in modern programming languages (Python, Cython, C, and C++ where performance is needed), and complies with modern programming practices. fullrmc's approach to solving an atomic or molecular structure differs from existing RMC algorithms and software. In a nutshell, traditional RMC methods and software randomly adjust atom positions until the whole system reaches the greatest consistency with a set of experimental data. In contrast, fullrmc applies smart moves, guided by reinforcement machine learning, to groups of atoms. While fullrmc allows running traditional RMC modeling, the uniqueness of this approach resides in its ability to customize the grouping of atoms in any convenient way with no additional programming effort and to apply smart, more physically meaningful moves to the defined groups of atoms. In addition, fullrmc provides a way, at almost no additional computational cost, to recur a group's selection, allowing the system to escape local minima by refining a group's position or by exploring, through and beyond disallowed positions and energy barriers, the unrestricted three-dimensional space around a group. © 2016 Wiley Periodicals, Inc.

  20. Understanding Global Change: Frameworks and Models for Teaching Systems Thinking

    Science.gov (United States)

    Bean, J. R.; Mitchell, K.; Zoehfeld, K.; Oshry, A.; Menicucci, A. J.; White, L. D.; Marshall, C. R.

    2017-12-01

    The scientific and education communities must impart to teachers, students, and the public an understanding of how the various factors that drive climate and global change operate, and why the rates and magnitudes of these changes related to human perturbation of Earth system processes today are cause for deep concern. Even though effective educational modules explaining components of the Earth and climate system exist, interdisciplinary learning tools are necessary to conceptually link the causes and consequences of global changes. To address this issue, the Understanding Global Change Project at the University of California Museum of Paleontology (UCMP) at UC Berkeley developed an interdisciplinary framework that organizes global change topics into three categories: (1) causes of climate change, both human and non-human (e.g., burning of fossil fuels, deforestation, Earth's tilt and orbit), (2) Earth system processes that shape the way the Earth works (e.g., Earth's energy budget, water cycle), and (3) the measurable changes in the Earth system (e.g., temperature, precipitation, ocean acidification). To facilitate student learning about the Earth as a dynamic, interacting system, a website will provide visualizations of Earth system models and written descriptions of how each framework topic is conceptually linked to other components of the framework. These visualizations and textual summarizations of relationships and feedbacks in the Earth system are a unique and crucial contribution to science communication and education, informed by a team of interdisciplinary scientists and educators. The system models are also mechanisms by which scientists can communicate how their own work informs our understanding of the Earth system. Educators can provide context and relevancy for authentic datasets and concurrently can assess student understanding of the interconnectedness of global change phenomena. The UGC resources will be available through a web-based platform and

  1. Enabling Parametric Optimal Ascent Trajectory Modeling During Early Phases of Design

    Science.gov (United States)

    Holt, James B.; Dees, Patrick D.; Diaz, Manuel J.

    2015-01-01

    -modal due to the interaction of various constraints. Additionally, when these obstacles are coupled with the Program to Optimize Simulated Trajectories (POST) [1], an industry-standard but difficult-to-use program for optimizing ascent trajectories, expert trajectory analysts are required to optimize a vehicle's ascent trajectory effectively. As has been pointed out, the paradigm of trajectory optimization is still a very manual one, because applying modern computational resources to POST remains a challenging problem. The nuances and difficulties involved in correctly utilizing, and therefore automating, the program present a large obstacle. In order to address these issues, the authors discuss a methodology that has been developed. The methodology is twofold: first, a set of heuristics, captured while working with expert analysts to replicate the current state of the art, is introduced and discussed; second, the power of modern computing is leveraged to evaluate multiple trajectories simultaneously and thereby enable exploration of the trajectory design space early during the pre-conceptual and conceptual phases of design. When this methodology is coupled with design of experiments to train surrogate models, the authors were able to visualize the trajectory design space, enabling parametric optimal ascent trajectory information to be introduced alongside other pre-conceptual and conceptual design tools. The potential impact of this methodology's success would be a fully automated POST evaluation suite for conceptual and preliminary design trade studies. This would enable engineers to characterize the ascent trajectory's sensitivity to design changes in an arbitrary number of dimensions and to find settings for trajectory-specific variables that result in optimal performance for a "dialed-in" launch vehicle design. The effort described in this paper was developed for the Advanced Concepts Office [2] at NASA Marshall
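
    The coupling of a design of experiments with surrogate models described above can be illustrated with a toy workflow. The objective function below is only a stand-in for a POST evaluation, and the variables, ranges, and sample sizes are hypothetical.

        import numpy as np
        from scipy.stats import qmc
        from sklearn.gaussian_process import GaussianProcessRegressor

        def stand_in_objective(x):
            """Placeholder for a POST-evaluated figure of merit (e.g., payload mass)."""
            return -np.sum((x - 0.3) ** 2, axis=1)

        # Design of experiments: Latin hypercube over two normalized trajectory variables.
        sampler = qmc.LatinHypercube(d=2, seed=0)
        X_train = sampler.random(n=40)
        y_train = stand_in_objective(X_train)

        # Surrogate model trained on the DOE results.
        surrogate = GaussianProcessRegressor().fit(X_train, y_train)

        # Dense grid search on the cheap surrogate to locate a promising design point.
        grid = np.array([[a, b] for a in np.linspace(0, 1, 50) for b in np.linspace(0, 1, 50)])
        best = grid[np.argmax(surrogate.predict(grid))]
        print("surrogate optimum near", np.round(best, 2))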

  2. a Framework for AN Open Source Geospatial Certification Model

    Science.gov (United States)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce. Hence, ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arenas as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea of developing a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, and ASPRS, and by software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and methods of examination, which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, i.e., the NCGIA Core Curriculum, the URISA Body of Knowledge, the USGIF Essential Body of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and evaluated with 105

  3. A FRAMEWORK FOR AN OPEN SOURCE GEOSPATIAL CERTIFICATION MODEL

    Directory of Open Access Journals (Sweden)

    T. U. R. Khan

    2016-06-01

    Full Text Available The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce. Hence, ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arenas as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission “Making geospatial education and opportunities accessible to all”. Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea of developing a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, and ASPRS, and by software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and methods of examination, which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, i.e., the NCGIA Core Curriculum, the URISA Body of Knowledge, the USGIF Essential Body of Knowledge, the “Geographic Information: Need to Know” (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and

  4. Conceptual Modeling Framework for E-Area PA HELP Infiltration Model Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Dyer, J. A. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-11-30

    A conceptual modeling framework based on the proposed E-Area Low-Level Waste Facility (LLWF) closure cap design is presented for conducting Hydrologic Evaluation of Landfill Performance (HELP) model simulations of intact and subsided cap infiltration scenarios for the next E-Area Performance Assessment (PA).

  5. Structural Equation Models in a Redundancy Analysis Framework With Covariates.

    Science.gov (United States)

    Lovaglio, Pietro Giorgio; Vittadini, Giorgio

    2014-01-01

    A recent method to specify and fit structural equation modeling in the Redundancy Analysis framework based on so-called Extended Redundancy Analysis (ERA) has been proposed in the literature. In this approach, the relationships between the observed exogenous variables and the observed endogenous variables are moderated by the presence of unobservable composites, estimated as linear combinations of exogenous variables. However, in the presence of direct effects linking exogenous and endogenous variables, or concomitant indicators, the composite scores are estimated by ignoring the presence of the specified direct effects. To fit structural equation models, we propose a new specification and estimation method, called Generalized Redundancy Analysis (GRA), allowing us to specify and fit a variety of relationships among composites, endogenous variables, and external covariates. The proposed methodology extends the ERA method, using a more suitable specification and estimation algorithm, by allowing for covariates that affect endogenous indicators indirectly through the composites and/or directly. To illustrate the advantages of GRA over ERA we propose a simulation study of small samples. Moreover, we propose an application aimed at estimating the impact of formal human capital on the initial earnings of graduates of an Italian university, utilizing a structural model consistent with well-established economic theory.
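
    For orientation, the sketch below estimates composites as linear combinations of exogenous variables using classical redundancy analysis (principal components of the fitted values) on synthetic data; it illustrates the general idea of composite estimation only and is not the GRA estimator proposed in the record.

        import numpy as np

        rng = np.random.default_rng(0)
        n, p, q, r = 300, 6, 4, 1             # observations, exogenous, endogenous, composites
        X = rng.normal(size=(n, p))
        Y = X @ rng.normal(size=(p, r)) @ rng.normal(size=(r, q)) + 0.3 * rng.normal(size=(n, q))
        X, Y = X - X.mean(0), Y - Y.mean(0)   # centre both blocks

        # Classical redundancy analysis: principal components of the OLS-fitted values.
        B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
        Y_hat = X @ B_ols
        _, _, Vt = np.linalg.svd(Y_hat, full_matrices=False)
        composites = Y_hat @ Vt[:r].T         # n x r composite scores, linear in X
        weights = B_ols @ Vt[:r].T            # p x r weights defining the composites
        print(composites.shape, weights.shape)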

  6. A Production Model for Construction: A Theoretical Framework

    Directory of Open Access Journals (Sweden)

    Ricardo Antunes

    2015-03-01

    Full Text Available The building construction industry faces challenges such as increasing project complexity and scope requirements but shorter deadlines. Additionally, economic uncertainty and rising business competition, with a subsequent decrease in profit margins for the industry, demand the development of new approaches to construction management. However, the building construction sector relies on practices based on intuition and experience, overlooking the dynamics of its production system. Furthermore, researchers maintain that the construction industry has no history of the application of mathematical approaches to model and manage production. Much work has been carried out on how manufacturing practices apply to construction projects, mostly lean principles. Nevertheless, there has been little research to understand the fundamental mechanisms of production in construction. This study develops an in-depth literature review to examine the existing knowledge about production models and their characteristics in order to establish a foundation for dynamic production systems management in construction. As a result, a theoretical framework is proposed, which will be instrumental in the future development of mathematical production models aimed at predicting the performance and behaviour of dynamic project-based systems in construction.

  7. A modeling framework for the design of collector wells.

    Science.gov (United States)

    Moore, Rhett; Kelson, Vic; Wittman, Jack; Rash, Vern

    2012-01-01

    We present results of a design study performed for the Saylorville Wellfield in Iowa, which is owned and operated by the Des Moines Water Works. The purpose of this study was to estimate wellfield capacity and provide a preliminary design for two radial collector wells to be constructed in the outwash aquifer along the Des Moines River near Saylorville, Iowa. After a field investigation to characterize the aquifer, regional two-dimensional and local three-dimensional, steady-state groundwater flow modeling was performed to locate and design the wells. This modeling was the foundation for design recommendations based on the relative performance of 12 collector well designs with varying lateral numbers, elevations, screen lengths, and orientations. For each site, alternate designs were evaluated based on model estimates of the capacity, the percent of surface water captured, and the production per unit length of screen. Many of our results are consistent with current design practices based on experience and intuition, but our methods allow for a quantitative approach for comparing alternate designs. Although the results are site-specific, the framework for evaluating the hydraulic design of the Saylorville radial collector wells is broadly applicable and could be used at other riverbank filtration sites. In addition, many of the conclusions from this design study may apply at other sites where construction of radial collector wells is being considered. © 2011, The Author(s). Ground Water © 2011, National Ground Water Association.

  8. Internal modelling under Risk-Based Capital (RBC) framework

    Science.gov (United States)

    Ling, Ang Siew; Hin, Pooi Ah

    2015-12-01

    Very often the methods for internal modelling under the Risk-Based Capital framework make use of data in the form of a run-off triangle. The present research instead extracts, from a group of n customers, the historical data for the sum insured s_i of the i-th customer together with the amount paid y_ij and the amount a_ij reported but not yet paid in the j-th development year, for j = 1, 2, 3, 4, 5, 6. We model the future value (y_i,j+1, a_i,j+1) as dependent on the present-year value (y_ij, a_ij) and the sum insured s_i via a conditional distribution derived from a multivariate power-normal mixture distribution. For a given group of customers with different original purchase dates, the distribution of the aggregate claims liabilities may be obtained from the proposed model. The prediction interval based on this distribution is found to cover the observed aggregate claims liabilities well.
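
    The simulation pattern implied by the record can be sketched as follows: propagate each customer's (paid, reported-but-not-paid) pair across development years under a conditional model and read a prediction interval off the simulated aggregate liability. The stand-in dynamics and lognormal noise below are assumptions for illustration, not the multivariate power-normal mixture used in the study.

        import numpy as np

        rng = np.random.default_rng(0)
        n_customers, n_dev_years, n_sims = 50, 6, 5000
        sum_insured = rng.uniform(50_000, 200_000, size=n_customers)   # s_i (synthetic)

        def simulate_aggregate_liability():
            """One simulated path of the aggregate outstanding liability (toy dynamics)."""
            paid = 0.02 * sum_insured          # y_i1: amounts paid in development year 1
            reported = 0.05 * sum_insured      # a_i1: reported but not yet paid in year 1
            for _ in range(2, n_dev_years + 1):
                # Stand-in conditional model: next-year (paid, reported) depends on the
                # current pair and on s_i, with lognormal noise.
                paid = 0.6 * reported * rng.lognormal(0.0, 0.2, size=n_customers)
                reported = (0.5 * reported + 0.001 * sum_insured) * rng.lognormal(0.0, 0.3, size=n_customers)
            return reported.sum()              # liability still outstanding after the last year

        sims = np.array([simulate_aggregate_liability() for _ in range(n_sims)])
        lo, hi = np.percentile(sims, [2.5, 97.5])
        print(f"95% prediction interval for the aggregate claims liability: [{lo:,.0f}, {hi:,.0f}]")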

  9. Enabling immersive simulation.

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Josh (University of California Santa Cruz, Santa Cruz, CA); Mateas, Michael (University of California Santa Cruz, Santa Cruz, CA); Hart, Derek H.; Whetzel, Jonathan; Basilico, Justin Derrick; Glickman, Matthew R.; Abbott, Robert G.

    2009-02-01

    The object of the 'Enabling Immersive Simulation for Complex Systems Analysis and Training' LDRD has been to research, design, and engineer a capability to develop simulations which (1) provide a rich, immersive interface for participation by real humans (exploiting existing high-performance game-engine technology wherever possible), and (2) can leverage Sandia's substantial investment in high-fidelity physical and cognitive models implemented in the Umbra simulation framework. We report here on these efforts. First, we describe the integration of Sandia's Umbra modular simulation framework with the open-source Delta3D game engine. Next, we report on Umbra's integration with Sandia's Cognitive Foundry, specifically to provide for learning behaviors for 'virtual teammates' directly from observed human behavior. Finally, we describe the integration of Delta3D with the ABL behavior engine, and report on research into establishing the theoretical framework that will be required to make use of tools like ABL to scale up to increasingly rich and realistic virtual characters.

  10. A Diaminopropane-Appended Metal–Organic Framework Enabling Efficient CO2 Capture from Coal Flue Gas via a Mixed Adsorption Mechanism

    Energy Technology Data Exchange (ETDEWEB)

    Milner, Phillip J.; Siegelman, Rebecca L.; Forse, Alexander C.; Gonzalez, Miguel I.; Runčevski, Tomče; Martell, Jeffrey D.; Reimer, Jeffrey A.; Long, Jeffrey R.

    2017-09-14

    A new diamine-functionalized metal–organic framework comprised of 2,2-dimethyl-1,3-diaminopropane (dmpn) appended to the Mg2+ sites lining the channels of Mg2(dobpdc) (dobpdc4– = 4,4'-dioxidobiphenyl-3,3'-dicarboxylate) is characterized for the removal of CO2 from the flue gas emissions of coal-fired power plants. Unique to members of this promising class of adsorbents, dmpn–Mg2(dobpdc) displays facile step-shaped adsorption of CO2 from coal flue gas at 40 °C and near complete CO2 desorption upon heating to 100 °C, enabling a high CO2 working capacity (2.42 mmol/g, 9.1 wt %) with a modest 60 °C temperature swing. Evaluation of the thermodynamic parameters of adsorption for dmpn–Mg2(dobpdc) suggests that the narrow temperature swing of its CO2 adsorption steps is due to the high magnitude of its differential enthalpy of adsorption (Δhads = -73 ± 1 kJ/mol), with a larger than expected entropic penalty for CO2 adsorption (Δsads = -204 ± 4 J/mol·K) positioning the step in the optimal range for carbon capture from coal flue gas. In addition, thermogravimetric analysis and breakthrough experiments indicate that, in contrast to many adsorbents, dmpn–Mg2(dobpdc) captures CO2 effectively in the presence of water and can be subjected to 1000 humid adsorption/desorption cycles with minimal degradation. Solid-state 13C NMR spectra and single-crystal X-ray diffraction structures of the Zn analogue reveal that this material adsorbs CO2 via formation of both ammonium carbamates and carbamic acid pairs, the latter of which are crystallographically verified for the first time in a porous material. Taken together, these properties render dmpn–Mg2(dobpdc) one of the most promising adsorbents for carbon capture applications.

  11. Coding conventions and principles for a National Land-Change Modeling Framework

    Science.gov (United States)

    Donato, David I.

    2017-07-14

    This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.

  12. Physical microscopic free-choice model in the framework of a Darwinian approach to quantum mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Baladron, Carlos [Departamento de Fisica Teorica, Atomica y Optica, Universidad de Valladolid, E-47011, Valladolid (Spain)

    2017-06-15

    A compatibilistic model of free choice for a fundamental particle is built within a general framework that explores the possibility that quantum mechanics is the emergent result of generalised Darwinian evolution acting on the abstract landscape of possible physical theories. The central element in this approach is a probabilistic classical Turing machine (basically an information processor plus a randomiser) methodologically associated with every fundamental particle. In this scheme every system acts not under a general law, but as a consequence of the command of a particular, evolved algorithm. This evolved programme enables the particle to algorithmically anticipate possible future world configurations in information space and, as a consequence, without altering the natural forward causal order in physical space, to incorporate into the decision-making procedure elements that are neither purely random nor strictly in the past, but in a possible future. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  13. Enabling School Structure, Collective Responsibility, and a Culture of Academic Optimism: Toward a Robust Model of School Performance in Taiwan

    Science.gov (United States)

    Wu, Jason H.; Hoy, Wayne K.; Tarter, C. John

    2013-01-01

    Purpose: The purpose of this research is twofold: to test a theory of academic optimism in Taiwan elementary schools and to expand the theory by adding new variables, collective responsibility and enabling school structure, to the model. Design/methodology/approach: Structural equation modeling was used to test, refine, and expand an…

  14. Energy Consumption Model and Measurement Results for Network Coding-enabled IEEE 802.11 Meshed Wireless Networks

    DEFF Research Database (Denmark)

    Paramanathan, Achuthan; Rasmussen, Ulrik Wilken; Hundebøll, Martin

    2012-01-01

    This paper presents an energy model and energy measurements for network-coding-enabled wireless meshed networks based on IEEE 802.11 technology. The energy model and the energy measurement testbed are limited to a simple Alice and Bob scenario. For this toy scenario we compare the energy usages...

  15. AN INTEGRATED MODELING FRAMEWORK FOR CARBON MANAGEMENT TECHNOLOGIES

    Energy Technology Data Exchange (ETDEWEB)

    Anand B. Rao; Edward S. Rubin; Michael B. Berkenpas

    2004-03-01

    CO2 capture and storage (CCS) is gaining widespread interest as a potential method to control greenhouse gas emissions from fossil fuel sources, especially electric power plants. Commercial applications of CO2 separation and capture technologies are found in a number of industrial process operations worldwide. Many of these capture technologies also are applicable to fossil fuel power plants, although applications to large-scale power generation remain to be demonstrated. This report describes the development of a generalized modeling framework to assess alternative CO2 capture and storage options in the context of multi-pollutant control requirements for fossil fuel power plants. The focus of the report is on post-combustion CO2 capture using amine-based absorption systems at pulverized coal-fired plants, which are the most prevalent technology used for power generation today. The modeling framework builds on the previously developed Integrated Environmental Control Model (IECM). The expanded version with carbon sequestration is designated as IECM-cs. The expanded modeling capability also includes natural gas combined cycle (NGCC) power plants and integrated coal gasification combined cycle (IGCC) systems as well as pulverized coal (PC) plants. This report presents details of the performance and cost models developed for an amine-based CO2 capture system, representing the baseline of current commercial technology. The key uncertainties and variability in process design, performance and cost parameters which influence the overall cost of carbon mitigation also are characterized. The new performance and cost models for CO2 capture systems have been integrated into the IECM-cs, along with models to estimate CO2 transport and storage costs. The CO2 control system also interacts with other emission control technologies such as flue gas desulfurization (FGD) systems for SO2 control. The integrated model is applied to
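
    The report's performance and cost models are far richer than anything that fits here, but the customary summary metric such frameworks report, the cost of CO2 avoided, is simple enough to sketch. The plant numbers below are made up for illustration and are not IECM-cs outputs.

```python
def cost_of_co2_avoided(coe_ref, coe_capture, er_ref, er_capture):
    """Standard summary metric comparing a plant with CO2 capture
    to a reference plant without it.

    coe_* : cost of electricity, $/MWh
    er_*  : CO2 emission rate, tonne CO2 / MWh
    returns $ per tonne of CO2 avoided
    """
    return (coe_capture - coe_ref) / (er_ref - er_capture)

# Illustrative (made-up) numbers for a pulverized coal plant with and
# without an amine-based capture system:
print(cost_of_co2_avoided(coe_ref=60.0, coe_capture=95.0,
                          er_ref=0.80, er_capture=0.10))   # ~50 $/tonne avoided
```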

  16. An integrated end-to-end modeling framework for testing ecosystem-wide effects of human-induced pressures in the Baltic Sea

    DEFF Research Database (Denmark)

    Palacz, Artur; Nielsen, J. Rasmus; Christensen, Asbjørn

    We present an integrated end-to-end modeling framework that enables whole-of ecosystem climate, eutrophication, and spatial management scenario exploration in the Baltic Sea. The framework is built around the Baltic implementation of the spatially-explicit end-to-end ATLANTIS model, linked...... with respect to the underlying assumptions, strengths and weaknesses of individual models. Furthermore, we describe how to possibly expand the framework to account for spatial impacts and economic consequences, for instance by linking to the individual-vessel based DISPLACE modeling approach. We conclude...... that the proposed model integration and management scenario evaluation scheme lays the foundations for developing a robust framework for management strategy evaluation that is of strategic importance to stakeholders from around the Baltic Sea....

  17. SCaLeM: A Framework for Characterizing and Analyzing Execution Models

    Energy Technology Data Exchange (ETDEWEB)

    Chavarría-Miranda, Daniel; Manzano Franco, Joseph B.; Krishnamoorthy, Sriram; Vishnu, Abhinav; Barker, Kevin J.; Hoisie, Adolfy

    2014-10-13

    As scalable parallel systems evolve towards more complex nodes with many-core architectures and larger trans-petascale & upcoming exascale deployments, there is a need to understand, characterize and quantify the underlying execution models being used on such systems. Execution models are a conceptual layer between applications & algorithms and the underlying parallel hardware and systems software on which those applications run. This paper presents the SCaLeM (Synchronization, Concurrency, Locality, Memory) framework for characterizing and analyzing execution models. SCaLeM consists of three basic elements: attributes, compositions and mapping of these compositions to abstract parallel systems. The fundamental Synchronization, Concurrency, Locality and Memory attributes are used to characterize each execution model, while the combinations of those attributes in the form of compositions are used to describe the primitive operations of the execution model. The mapping of the execution model’s primitive operations, described by compositions, to an underlying abstract parallel system can be evaluated quantitatively to determine its effectiveness. Finally, SCaLeM also enables the representation and analysis of applications in terms of execution models, for the purpose of evaluating the effectiveness of such mapping.
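
    A hypothetical rendering of the framework's vocabulary in code (not taken from the paper) can make the attribute/composition distinction concrete; the example execution model and its primitive operations below are invented for illustration.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Attribute(Enum):
    SYNCHRONIZATION = auto()
    CONCURRENCY = auto()
    LOCALITY = auto()
    MEMORY = auto()

@dataclass(frozen=True)
class Composition:
    """A primitive operation of an execution model, described as a
    combination of the fundamental SCaLeM attributes."""
    name: str
    attributes: frozenset

# A hypothetical characterization of an MPI-like execution model:
mpi_like = [
    Composition("two_sided_message", frozenset({Attribute.SYNCHRONIZATION,
                                                Attribute.CONCURRENCY,
                                                Attribute.LOCALITY})),
    Composition("collective_reduce", frozenset({Attribute.SYNCHRONIZATION,
                                                Attribute.CONCURRENCY,
                                                Attribute.MEMORY})),
]

def attributes_used(model):
    """Which fundamental attributes a model's primitive operations exercise."""
    return frozenset().union(*(c.attributes for c in model))

print([a.name for a in attributes_used(mpi_like)])
```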

  18. A Global Modeling Framework for Plasma Kinetics: Development and Applications

    Science.gov (United States)

    Parsey, Guy Morland

    The modern study of plasmas, and applications thereof, has developed synchronously with computer capabilities since the mid-1950s. Complexities inherent to these charged-particle, many-body systems have resulted in the development of multiple simulation methods (particle-in-cell, fluid, global modeling, etc.) in order to both explain observed phenomena and predict outcomes of plasma applications. Recognizing that different algorithms are chosen to best address specific topics of interest, this thesis centers around the development of an open-source global model framework for the focused study of non-equilibrium plasma kinetics. After verification and validation of the framework, it was used to study two physical phenomena: plasma-assisted combustion and the recently proposed optically-pumped rare gas metastable laser. Global models permeate chemistry and plasma science, relying on spatial averaging to focus attention on the dynamics of reaction networks. Defined by a set of species continuity and energy conservation equations, the required data and constructed systems are conceptually similar across most applications, providing a light platform for exploratory and result-search parameter scanning. Unfortunately, it is common practice for custom code to be developed for each application, an enormous duplication of effort which negatively affects the quality of the software produced. Presented herein, the Python-based Kinetic Global Modeling framework (KGMf) was designed to support all modeling phases: collection and analysis of reaction data, construction of an exportable system of model ODEs, and a platform for interactive evaluation and post-processing analysis. A symbolic ODE system is constructed for interactive manipulation and generation of a Jacobian, both of which are compiled as operation-optimized C-code. Plasma-assisted combustion and ignition (PAC/PAI) embody the modernization of burning fuel by opening up new avenues of control and optimization
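
    The KGMf itself is not reproduced here, but the general pattern the abstract describes, building a symbolic ODE system for a reaction network and deriving its Jacobian for stiff integration, can be sketched with SymPy. The two-reaction toy network below is an assumption for illustration, and lambdify stands in for the operation-optimized C code generation mentioned in the abstract.

```python
import sympy as sp

# Toy reaction set for a two-species global model: ionization (k1) and
# recombination (k2); densities are spatially averaged, as in a global model.
n_e, n_i, k1, k2, n_g = sp.symbols("n_e n_i k1 k2 n_g", positive=True)

rates = {
    "ionization":    k1 * n_e * n_g,   # e + gas -> 2e + ion
    "recombination": k2 * n_e * n_i,   # e + ion -> gas
}

# Species continuity equations (particle balance only; no energy equation here).
dn_e = rates["ionization"] - rates["recombination"]
dn_i = rates["ionization"] - rates["recombination"]

state = sp.Matrix([n_e, n_i])
rhs = sp.Matrix([dn_e, dn_i])
jac = rhs.jacobian(state)          # symbolic Jacobian, useful for stiff integrators

# Callable numeric forms (the KGMf instead emits operation-optimized C code):
f = sp.lambdify((n_e, n_i, k1, k2, n_g), rhs, "numpy")
J = sp.lambdify((n_e, n_i, k1, k2, n_g), jac, "numpy")
print(jac)
```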

  19. Python framework for kinetic modeling of electronically excited reaction pathways

    Science.gov (United States)

    Verboncoeur, John; Parsey, Guy; Guclu, Yaman; Christlieb, Andrew

    2012-10-01

    The use of plasma energy to enhance and control the chemical reactions during combustion, a technology referred to as "plasma assisted combustion" (PAC), can result in a variety of beneficial effects: e.g. stable lean operation, pollution reduction, and a wider range of p-T operating conditions. While experimental evidence abounds, theoretical understanding of PAC is at best incomplete, and numerical tools still lack reliable predictive capabilities. In the context of a joint experimental-numerical effort at Michigan State University, we present here an open-source modular Python framework dedicated to the dynamic optimization of non-equilibrium PAC systems. Multiple sources of experimental reaction data, e.g. reaction rates, cross-sections and oscillator strengths, are used in order to quantify the effect of data uncertainty and limiting assumptions. A collisional-radiative model (CRM) is implemented to organize reactions by importance and as a potential means of measuring a non-Maxwellian electron energy distribution function (EEDF) when coupled to optical emission spectroscopy data. Finally, we explore scaling laws in PAC parameter space using a kinetic global model (KGM) accelerated with CRM-optimized reaction sequences and sparse stiff integrators.

  20. Models of Recognition, Repetition Priming, and Fluency: Exploring a New Framework

    Science.gov (United States)

    Berry, Christopher J.; Shanks, David R.; Speekenbrink, Maarten; Henson, Richard N. A.

    2012-01-01

    We present a new modeling framework for recognition memory and repetition priming based on signal detection theory. We use this framework to specify and test the predictions of 4 models: (a) a single-system (SS) model, in which one continuous memory signal drives recognition and priming; (b) a multiple-systems-1 (MS1) model, in which completely…
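
    As a rough illustration of the single-system idea, the sketch below lets one latent memory-strength variable drive both a recognition decision and a priming measure. The parameter values, the linear RT mapping, and the function name are assumptions made for illustration rather than the authors' fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_single_system(n=10_000, mu=1.0, sigma_f=1.0,
                           sigma_r=1.0, sigma_p=1.0, b=800.0, s=50.0, c=0.5):
    """Toy sketch of a single-system (SS) signal-detection account:
    one latent memory strength f drives both the recognition judgement
    and the priming measure (here, an identification RT in ms)."""
    f_old = rng.normal(mu, sigma_f, n)       # studied items
    f_new = rng.normal(0.0, sigma_f, n)      # unstudied items

    def responses(f):
        recognition = (f + rng.normal(0, sigma_r, f.size)) > c   # "old" calls
        rt = b - s * f + rng.normal(0, sigma_p * s, f.size)      # faster if stronger
        return recognition, rt

    rec_old, rt_old = responses(f_old)
    rec_new, rt_new = responses(f_new)
    print(f"hit rate {rec_old.mean():.2f}, false-alarm rate {rec_new.mean():.2f}")
    print(f"priming effect (RT_new - RT_old) {rt_new.mean() - rt_old.mean():.1f} ms")

simulate_single_system()
```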

  1. Computer-aided modeling framework for efficient model development, analysis and identification

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Sales Cruz, Mauricio

    2011-01-01

    methods introduce. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms, and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task......Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy, and water. This trend is set to continue due to the substantial benefits computer-aided....... The methodology has been implemented into a computer-aided modeling framework, which combines expert skills, tools, and database connections that are required for the different steps of the model development work-flow with the goal to increase the efficiency of the modeling process. The framework has two main...

  2. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  3. A new climate modeling framework for convection-resolving simulation at continental scale

    Science.gov (United States)

    Charpilloz, Christophe; di Girolamo, Salvatore; Arteaga, Andrea; Fuhrer, Oliver; Hoefler, Torsten; Schulthess, Thomas; Schär, Christoph

    2017-04-01

    Major uncertainties remain in our understanding of the processes that govern the water cycle in a changing climate and their representation in weather and climate models. Of particular concern are heavy precipitation events of convective origin (thunderstorms and rain showers). The aim of the crCLIM project [1] is to propose a new climate modeling framework that alleviates the I/O bottleneck in large-scale, convection-resolving climate simulations and thus to enable new analysis techniques for climate scientists. Due to the large computational costs, convection-resolving simulations are currently restricted to small computational domains or very short time scales, unless the largest available supercomputer systems, such as hybrid CPU-GPU architectures, are used [3]. Hence, the COSMO model has been adapted to run on these architectures for research and production purposes [2]. However, the amount of generated data also increases, and storing these data becomes infeasible, making the analysis of simulation results impractical. To circumvent this problem and enable high-resolution models in climate science, we propose a data-virtualization layer (DVL) that re-runs simulations on demand and transparently manages the data for the analysis; that is, we trade off computational effort (time) for storage (space). This approach also requires a bit-reproducible version of the COSMO model that produces identical results on different architectures (CPUs and GPUs) [4], which will be coupled with a performance model in order to enable optimal re-runs depending on the requirements of the re-run and the available resources. In this contribution, we discuss the strategy to develop the DVL, a first performance model, the challenge of bit-reproducibility and the first results of the crCLIM project. [1] http://www.c2sm.ethz.ch/research/crCLIM.html [2] O. Fuhrer, C. Osuna, X. Lapillonne, T. Gysi, M. Bianco, and T. Schulthess. "Towards gpu-accelerated operational weather forecasting." In The GPU Technology

  4. Modeling the marketing strategy-performance relationship : towards an hierarchical marketing performance framework

    OpenAIRE

    Huizingh, Eelko K.R.E.; Zengerink, Evelien

    2001-01-01

    Accurate measurement of marketing performance is an important topic for both marketing academics and marketing managers. Many researchers have recognized that marketing performance measurement should go beyond financial measurement. In this paper we propose a conceptual framework that models marketing performance as a sequence of intermediate performance measures ultimately leading to financial performance. This framework, called the Hierarchical Marketing Performance (HMP) framework, starts ...

  5. Adaptive invasive species distribution models: A framework for modeling incipient invasions

    Science.gov (United States)

    Uden, Daniel R.; Allen, Craig R.; Angeler, David G.; Corral, Lucia; Fricke, Kent A.

    2015-01-01

    The utilization of species distribution model(s) (SDM) for approximating, explaining, and predicting changes in species’ geographic locations is increasingly promoted for proactive ecological management. Although frameworks for modeling non-invasive species distributions are relatively well developed, their counterparts for invasive species—which may not be at equilibrium within recipient environments and often exhibit rapid transformations—are lacking. Additionally, adaptive ecological management strategies address the causes and effects of biological invasions and other complex issues in social-ecological systems. We conducted a review of biological invasions, species distribution models, and adaptive practices in ecological management, and developed a framework for adaptive, niche-based, invasive species distribution model (iSDM) development and utilization. This iterative, 10-step framework promotes consistency and transparency in iSDM development, allows for changes in invasive drivers and filters, integrates mechanistic and correlative modeling techniques, balances the avoidance of type 1 and type 2 errors in predictions, encourages the linking of monitoring and management actions, and facilitates incremental improvements in models and management across space, time, and institutional boundaries. These improvements are useful for advancing coordinated invasive species modeling, management and monitoring from local scales to the regional, continental and global scales at which biological invasions occur and harm native ecosystems and economies, as well as for anticipating and responding to biological invasions under continuing global change.

  6. Exploring Higher Education Governance: Analytical Models and Heuristic Frameworks

    Directory of Open Access Journals (Sweden)

    Burhan FINDIKLI

    2017-08-01

    Governance in higher education, both at institutional and systemic levels, has experienced substantial changes within recent decades because of a range of world-historical processes such as massification, growth, globalization, marketization, public sector reforms, and the emergence of the knowledge economy and society. These developments have made governance arrangements and decision-making processes in higher education more complex and multidimensional than ever and have forced scholars to build new analytical and heuristic tools and strategies to grasp the intricacy and diversity of higher education governance dynamics. This article provides a systematic discussion of how, and through which tools, prominent scholars of higher education have analyzed governance in this sector by examining certain heuristic frameworks and analytical models. Additionally, the article shows how social scientific analysis of governance in higher education has proceeded in a cumulative way, with certain revisions and syntheses rather than radical conceptual and theoretical ruptures, from Burton R. Clark’s seminal work to the present, revealing conceptual and empirical junctures between them.

  7. A Hybrid Programming Framework for Modeling and Solving Constraint Satisfaction and Optimization Problems

    OpenAIRE

    Paweł Sitek; Jarosław Wikarek

    2016-01-01

    This paper proposes a hybrid programming framework for modeling and solving of constraint satisfaction problems (CSPs) and constraint optimization problems (COPs). Two paradigms, CLP (constraint logic programming) and MP (mathematical programming), are integrated in the framework. The integration is supplemented with the original method of problem transformation, used in the framework as a presolving method. The transformation substantially reduces the feasible solution space. The framework a...

  8. Modeling Framework and Results to Inform Charging Infrastructure Investments

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc W [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wood, Eric W [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-01

    The plug-in electric vehicle (PEV) market is experiencing rapid growth with dozens of battery electric (BEV) and plug-in hybrid electric (PHEV) models already available and billions of dollars being invested by automotive manufacturers in the PEV space. Electric range is increasing thanks to larger and more advanced batteries, and significant infrastructure investments are being made to enable higher-power fast charging. Costs are falling and PEVs are becoming more competitive with conventional vehicles. Moreover, new technologies such as connectivity and automation hold the promise of enhancing the value proposition of PEVs. This presentation outlines a suite of projects funded by the U.S. Department of Energy's Vehicle Technology Office to conduct assessments of the economic value and charging infrastructure requirements of the evolving PEV market. Individual assessments include national evaluations of PEV economic value (assuming 73M PEVs on the road in 2035), national analysis of charging infrastructure requirements (with community and corridor level resolution), and case studies of PEV ownership in Columbus, OH and Massachusetts.

  9. Modelling Supported Driving as an Optimal Control Cycle : Framework and Model Characteristics

    NARCIS (Netherlands)

    Wang, M.; Treiber, M.; Daamen, W.; Hoogendoorn, S.P.; Van Arem, B.

    2013-01-01

    Driver assistance systems support drivers in operating vehicles in a safe, comfortable and efficient way, and thus may induce changes in traffic flow characteristics. This paper puts forward a receding horizon control framework to model driver assistance and cooperative systems. The accelerations of

  10. A device model framework for magnetoresistive sensors based on the Stoner–Wohlfarth model

    International Nuclear Information System (INIS)

    Bruckner, Florian; Bergmair, Bernhard; Brueckl, Hubert; Palmesi, Pietro; Buder, Anton; Satz, Armin; Suess, Dieter

    2015-01-01

    The Stoner–Wohlfarth (SW) model provides an efficient analytical model to describe the behavior of magnetic layers within magnetoresistive sensors. Combined with a proper description of magneto-resistivity, an efficient device model can be derived, which is necessary for an optimal electric circuit design. Parameters of the model are determined by global optimization of an application-specific cost function which contains measured resistances for different applied fields. Several application cases are examined and used for validation of the device model. - Highlights: • An efficient device model framework for various types of magnetoresistive sensors is presented. • The model is based on the analytical solution of the Stoner–Wohlfarth model. • Numerical optimization methods provide optimal model parameters for different application cases. • The model is applied to several application cases and is able to reproduce measured hysteresis and switching behavior

  11. An Airpower Application Framework: Modeling Coercive Airpower Strategies

    National Research Council Canada - National Science Library

    Weigand, Anthony

    1998-01-01

    This study focuses on the development of a theoretical framework for the application of coercive airpower strategies that can be used in the construction of a decision aid for use by airpower strategists...

  12. How to make more out of community data? A conceptual framework and its implementation as models and software.

    Science.gov (United States)

    Ovaskainen, Otso; Tikhonov, Gleb; Norberg, Anna; Guillaume Blanchet, F; Duan, Leo; Dunson, David; Roslin, Tomas; Abrego, Nerea

    2017-05-01

    Community ecology aims to understand what factors determine the assembly and dynamics of species assemblages at different spatiotemporal scales. To facilitate the integration between conceptual and statistical approaches in community ecology, we propose Hierarchical Modelling of Species Communities (HMSC) as a general, flexible framework for modern analysis of community data. While non-manipulative data allow for only correlative and not causal inference, this framework facilitates the formulation of data-driven hypotheses regarding the processes that structure communities. We model environmental filtering by variation and covariation in the responses of individual species to the characteristics of their environment, with potential contingencies on species traits and phylogenetic relationships. We capture biotic assembly rules by species-to-species association matrices, which may be estimated at multiple spatial or temporal scales. We operationalise the HMSC framework as a hierarchical Bayesian joint species distribution model, and implement it as R- and Matlab-packages which enable computationally efficient analyses of large data sets. Armed with this tool, community ecologists can make sense of many types of data, including spatially explicit data and time-series data. We illustrate the use of this framework through a series of diverse ecological examples. © 2017 The Authors. Ecology Letters published by CNRS and John Wiley & Sons Ltd.
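
    A toy generative sketch (not the HMSC R/Matlab packages) of the kind of joint species distribution model the framework operationalises: species respond to environmental covariates, while shared latent factors induce residual species-to-species associations. All dimensions and parameter values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sites, n_species, n_covariates = 200, 8, 3

X = rng.normal(size=(n_sites, n_covariates))        # environmental covariates
beta = rng.normal(size=(n_covariates, n_species))   # species-specific responses
lam = rng.normal(size=(2, n_species))               # latent-factor loadings
eta = rng.normal(size=(n_sites, 2))                 # site-level latent factors

# Linear predictor: environmental filtering plus residual species associations
# induced by the shared latent factors (residual covariance = lam' lam).
z = X @ beta + eta @ lam
y = (z + rng.normal(size=z.shape)) > 0               # probit presence/absence data

residual_assoc = lam.T @ lam                          # species-to-species association matrix
print(y.mean(axis=0))                                 # species prevalences across sites
print(np.round(residual_assoc, 2))
```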

  13. Airline Sustainability Modeling: A New Framework with Application of Bayesian Structural Equation Modeling

    Directory of Open Access Journals (Sweden)

    Hashem Salarzadeh Jenatabadi

    2016-11-01

    There are many factors which could influence the sustainability of airlines. The main purpose of this study is to introduce a framework for a financial sustainability index and to model it based on structural equation modeling (SEM) with maximum likelihood and Bayesian predictors. The introduced framework includes economic performance, operational performance, cost performance, and financial performance. Based on both Bayesian SEM (Bayesian-SEM) and Classical SEM (Classical-SEM), it was found that economic performance, operational performance, and cost performance are significantly related to the financial performance index. Four mathematical indices, root mean square error, coefficient of determination, mean absolute error, and mean absolute percentage error, are employed to compare the efficiency of Bayesian-SEM and Classical-SEM in predicting airline financial performance. The outputs confirmed that the framework with Bayesian prediction delivered a good fit with the data, although the framework predicted with the Classical-SEM approach did not produce a well-fitting model. The reasons for this discrepancy between Classical and Bayesian predictions, as well as the potential advantages and caveats of applying the Bayesian approach in airline sustainability studies, are debated.
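
    The four indices named in the abstract are standard and straightforward to compute; the sketch below evaluates them for a handful of made-up observed versus predicted performance scores (the data are not from the study).

```python
import numpy as np

def fit_indices(y_true, y_pred):
    """The four comparison indices named in the abstract, computed for any
    set of observed vs. model-predicted financial performance scores."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    err = y_true - y_pred
    rmse = np.sqrt(np.mean(err ** 2))                       # root mean square error
    mae = np.mean(np.abs(err))                              # mean absolute error
    mape = np.mean(np.abs(err / y_true)) * 100.0            # mean absolute percentage error
    r2 = 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2)  # coeff. of determination
    return {"RMSE": rmse, "R2": r2, "MAE": mae, "MAPE_%": mape}

# Illustrative (made-up) values:
print(fit_indices([3.2, 2.8, 4.1, 3.9], [3.0, 3.1, 4.0, 3.6]))
```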

  14. Enabling Wireless Power Transfer in Cellular Networks: Architecture, Modeling and Deployment

    OpenAIRE

    Huang, Kaibin; Lau, Vincent K. N.

    2012-01-01

    Microwave power transfer (MPT) delivers energy wirelessly from stations called power beacons (PBs) to mobile devices by microwave radiation. This provides mobiles practically infinite battery lives and eliminates the need of power cords and chargers. To enable MPT for mobile charging, this paper proposes a new network architecture that overlays an uplink cellular network with randomly deployed PBs for powering mobiles, called a hybrid network. The deployment of the hybrid network under an out...

  15. Enabling Integrated Decision Making for Electronic-Commerce by Modelling an Enterprise's Sharable Knowledge.

    Science.gov (United States)

    Kim, Henry M.

    2000-01-01

    An enterprise model, a computational model of knowledge about an enterprise, is a useful tool for integrated decision-making by e-commerce suppliers and customers. Sharable knowledge, once represented in an enterprise model, can be integrated by the modeled enterprise's e-commerce partners. Presents background on enterprise modeling, followed by…

  16. A Physics-Based Modeling Framework for Prognostic Studies

    Science.gov (United States)

    Kulkarni, Chetan S.

    2014-01-01

    Prognostics and Health Management (PHM) methodologies have emerged as one of the key enablers for achieving efficient system level maintenance as part of a busy operations schedule, and lowering overall life cycle costs. PHM is also emerging as a high-priority issue in critical applications, where the focus is on conducting fundamental research in the field of integrated systems health management. The term diagnostics relates to the ability to detect and isolate faults or failures in a system. Prognostics on the other hand is the process of predicting health condition and remaining useful life based on current state, previous conditions and future operating conditions. PHM methods combine sensing, data collection, interpretation of environmental, operational, and performance related parameters to indicate systems health under its actual application conditions. The development of prognostics methodologies for the electronics field has become more important as more electrical systems are being used to replace traditional systems in several applications in the aeronautics, maritime, and automotive fields. The development of prognostics methods for electronics presents several challenges due to the great variety of components used in a system, a continuous development of new electronics technologies, and a general lack of understanding of how electronics fail. Similarly, with electric unmanned aerial vehicles, electric-hybrid cars, and commercial passenger aircraft, we are witnessing a drastic increase in the usage of batteries to power vehicles. However, for battery-powered vehicles to operate at maximum efficiency and reliability, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. We develop an electrochemistry-based model of Li-ion batteries that captures the significant electrochemical processes, is computationally efficient, captures the effects of aging, and is of suitable

  17. On joint deterministic grid modeling and sub-grid variability conceptual framework for model evaluation

    Science.gov (United States)

    Ching, Jason; Herwehe, Jerold; Swall, Jenise

    The general situation in which a significant degree of sub-grid variability (SGV) exists in grid models, exemplified here by urban areas, poses problems when comparing grid-based air-quality modeling results with observations. Typically, grid models ignore or parameterize processes and features that are at their sub-grid scale. Also, observations may be obtained in an area where significant spatial variability in the concentration fields exists. Consequently, model results and observations cannot be expected to be equal. To address this issue, we suggest a framework that can provide for qualitative judgments on model performance based on comparing observations to the grid prediction and its SGV distribution. Further, we (a) explore some characteristics of SGV, (b) comment on the contributions to SGV and (c) examine the implications for the modeling results at coarse grid resolution using examples from fine scale grid modeling of the Community Multi-scale Air Quality (CMAQ) modeling system.
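
    One simple way to act on the suggested framework is to ask where a point observation falls within the sub-grid distribution of its grid cell rather than comparing it only to the cell mean. The sketch below does exactly that with a hypothetical sub-grid ozone distribution; all values are invented.

```python
import numpy as np

def sgv_consistency(observation, subgrid_values):
    """Where does a point observation fall within the sub-grid variability
    (SGV) distribution of the grid cell that contains it? Returns the cell
    mean and the empirical percentile; percentiles near 0 or 100 flag a
    likely mismatch that a plain obs-vs-cell-mean comparison could misread."""
    subgrid_values = np.asarray(subgrid_values, float)
    percentile = 100.0 * np.mean(subgrid_values <= observation)
    return subgrid_values.mean(), percentile

# Hypothetical ozone concentrations (ppb) resolved within one coarse grid cell:
rng = np.random.default_rng(7)
subgrid = rng.lognormal(mean=np.log(45.0), sigma=0.25, size=500)
cell_mean, pct = sgv_consistency(observation=62.0, subgrid_values=subgrid)
print(f"cell mean {cell_mean:.1f} ppb, observation at the {pct:.0f}th percentile")
```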

  18. A Modeling Framework for the Evolution and Spread of Antibiotic Resistance: Literature Review and Model Categorization

    Science.gov (United States)

    Spicknall, Ian H.; Foxman, Betsy; Marrs, Carl F.; Eisenberg, Joseph N. S.

    2013-01-01

    Antibiotic-resistant infections complicate treatment and increase morbidity and mortality. Mathematical modeling has played an integral role in improving our understanding of antibiotic resistance. In these models, parameter sensitivity is often assessed, while model structure sensitivity is not. To examine the implications of this, we first reviewed the literature on antibiotic-resistance modeling published between 1993 and 2011. We then classified each article's model structure into one or more of 6 categories based on the assumptions made in those articles regarding within-host and population-level competition between antibiotic-sensitive and antibiotic-resistant strains. Each model category has different dynamic implications with respect to how antibiotic use affects resistance prevalence, and therefore each may produce different conclusions about optimal treatment protocols that minimize resistance. Thus, even if all parameter values are correctly estimated, inferences may be incorrect because of the incorrect selection of model structure. Our framework provides insight into model selection. PMID:23660797
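
    As a minimal example of the kind of structural choice the review categorizes, the sketch below encodes one possible population-level structure: two strains competing for susceptibles, with antibiotic use adding extra clearance of the sensitive strain. Other categories in the review make different within-host and population-level competition assumptions, and all parameter values here are illustrative.

```python
from scipy.integrate import solve_ivp

def two_strain_sis(t, y, beta_s, beta_r, gamma, tau):
    """Minimal population-level model with a sensitive (Is) and a resistant
    (Ir) infection competing for susceptibles X. tau is the extra clearance
    of the sensitive strain due to antibiotic treatment."""
    X, Is, Ir = y
    new_s = beta_s * X * Is
    new_r = beta_r * X * Ir
    dX = -new_s - new_r + (gamma + tau) * Is + gamma * Ir
    dIs = new_s - (gamma + tau) * Is
    dIr = new_r - gamma * Ir
    return [dX, dIs, dIr]

# Illustrative parameters: resistance carries a small transmission cost
# (beta_r < beta_s) but escapes the treatment-driven clearance tau.
sol = solve_ivp(two_strain_sis, (0, 365), [0.98, 0.01, 0.01],
                args=(0.30, 0.25, 0.1, 0.05))
X, Is, Ir = sol.y[:, -1]
print(f"resistant fraction of infections after one year: {Ir / (Is + Ir):.2f}")
```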

  19. A modeling framework for the evolution and spread of antibiotic resistance: literature review and model categorization.

    Science.gov (United States)

    Spicknall, Ian H; Foxman, Betsy; Marrs, Carl F; Eisenberg, Joseph N S

    2013-08-15

    Antibiotic-resistant infections complicate treatment and increase morbidity and mortality. Mathematical modeling has played an integral role in improving our understanding of antibiotic resistance. In these models, parameter sensitivity is often assessed, while model structure sensitivity is not. To examine the implications of this, we first reviewed the literature on antibiotic-resistance modeling published between 1993 and 2011. We then classified each article's model structure into one or more of 6 categories based on the assumptions made in those articles regarding within-host and population-level competition between antibiotic-sensitive and antibiotic-resistant strains. Each model category has different dynamic implications with respect to how antibiotic use affects resistance prevalence, and therefore each may produce different conclusions about optimal treatment protocols that minimize resistance. Thus, even if all parameter values are correctly estimated, inferences may be incorrect because of the incorrect selection of model structure. Our framework provides insight into model selection.

  20. Addressing Energy System Modelling Challenges: The Contribution of the Open Energy Modelling Framework (oemof)

    DEFF Research Database (Denmark)

    Hilpert, Simon; Günther, Stephan; Kaldemeyer, Cord

    2017-01-01

    complexity of energy systems and high uncertainties on different levels. In addition, interdisciplinary modelling is necessary for getting insight in mechanisms of an integrated world. At the same time models need to meet scientific standards as public acceptance becomes increasingly important......The process of modelling energy systems is accompanied by challenges inherently connected with mathematical modelling. However, due to modern realities in the 21st century, existing challenges are gaining in magnitude and are supplemented with new ones. Modellers are confronted with a rising....... In this intricate environment model application as well as result communication and interpretation is also getting more difficult. In this paper we present the open energy modelling framework (oemof) as a novel approach for energy system modelling and derive its contribution to existing challenges. Therefore, based...

  1. Theories, models and frameworks used in capacity building interventions relevant to public health: a systematic review.

    Science.gov (United States)

    Bergeron, Kim; Abdi, Samiya; DeCorby, Kara; Mensah, Gloria; Rempel, Benjamin; Manson, Heather

    2017-11-28

    There is limited research on capacity building interventions that include theoretical foundations. The purpose of this systematic review is to identify underlying theories, models and frameworks used to support capacity building interventions relevant to public health practice. The aim is to inform and improve capacity building practices and services offered by public health organizations. Four search strategies were used: 1) electronic database searching; 2) reference lists of included papers; 3) key informant consultation; and 4) grey literature searching. Inclusion and exclusion criteria are outlined; included papers focus on capacity building, learning plans, or professional development plans in combination with tools, resources, processes, procedures, steps, models, frameworks, or guidelines; are described in a public health or healthcare setting, or in non-government, government, or community organizations as they relate to healthcare; and explicitly or implicitly mention a theory, model and/or framework that grounds the type of capacity building approach developed. Quality assessment was performed on all included articles. Data analysis included a process for synthesizing, analyzing and presenting descriptive summaries, categorizing theoretical foundations according to which theory, model and/or framework was used and whether or not the theory, model or framework was implied or explicitly identified. Nineteen articles were included in this review. A total of 28 theories, models and frameworks were identified. Of this number, two theories (Diffusion of Innovations and Transformational Learning), two models (Ecological and Interactive Systems Framework for Dissemination and Implementation) and one framework (Bloom's Taxonomy of Learning) were identified as the most frequently cited. This review identifies specific theories, models and frameworks to support capacity building interventions relevant to public health organizations. It provides public health practitioners

  2. A generic framework for individual-based modelling and physical-biological interaction

    DEFF Research Database (Denmark)

    Christensen, Asbjørn; Mariani, Patrizio; Payne, Mark R.

    2018-01-01

    , comparison of physical circulation models, model ensemble runs and recently posterior Eulerian simulations using the IBMlib framework. We present the code design ideas behind the longevity of the code, our implementation experiences, as well as code performance benchmarking. The framework may contribute...

  3. Strategic assessment of capacity consumption in railway networks: Framework and model

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex; Nielsen, Otto Anker

    2017-01-01

    In this paper, we develop a new framework for strategic planning purposes to calculate railway infrastructure occupation and capacity consumption in networks, independent of a timetable. Furthermore, a model implementing the framework is presented. In this model different train sequences are gene...

  4. Designing a framework to design a business model for the 'bottom of the pyramid' population

    NARCIS (Netherlands)

    Ver loren van Themaat, Tanye; Schutte, Cornelius S.L.; Lutters, Diederick

    2013-01-01

    This article presents a framework for developing and designing a business model to target the bottom of the pyramid (BoP) population. Using blue ocean strategy and business model literature, integrated with research on the BoP, the framework offers a systematic approach for organisations to analyse

  5. Real-time models for wheels and tyres in an object-oriented modelling framework

    Science.gov (United States)

    Zimmer, Dirk; Otter, Martin

    2010-02-01

    This article presents models for wheels and tyres in the application field of real-time multi-body systems. For this rather broad class of applications it is difficult to foresee the right level of model complexity that is affordable in a specific simulation. Therefore we developed a tyre model that is adjustable in its degree of complexity. It consists of a list of stepwise developed sub-models, each at a higher level of complexity. These models include semi-empirical equations. The stepwise development process is also reflected in the corresponding implementation with the modelling language Modelica. The final wheel model represents a supermodel and enables users to select the right level of complexity in an unambiguous way.

  6. The Foundations Framework for Developing and Reporting New Models of Care for Multimorbidity.

    Science.gov (United States)

    Stokes, Jonathan; Man, Mei-See; Guthrie, Bruce; Mercer, Stewart W; Salisbury, Chris; Bower, Peter

    2017-11-01

    Multimorbidity challenges health systems globally. New models of care are urgently needed to better manage patients with multimorbidity; however, there is no agreed framework for designing and reporting models of care for multimorbidity and their evaluation. Based on findings from a literature search to identify models of care for multimorbidity, we developed a framework to describe these models. We illustrate the application of the framework by identifying the focus and gaps in current models of care, and by describing the evolution of models over time. Our framework describes each model in terms of its theoretical basis and target population (the foundations of the model) and of the elements of care implemented to deliver the model. We categorized elements of care into 3 types: (1) clinical focus, (2) organization of care, (3) support for model delivery. Application of the framework identified a limited use of theory in model design and a strong focus on some patient groups (elderly, high users) more than others (younger patients, deprived populations). We found changes in elements with time, with a decrease in models implementing home care and an increase in models offering extended appointments. By encouraging greater clarity about the underpinning theory and target population, and by categorizing the wide range of potentially important elements of an intervention to improve care for patients with multimorbidity, the framework may be useful in designing and reporting models of care and help advance the currently limited evidence base. © 2017 Annals of Family Medicine, Inc.

  7. A social-ecological framework: A model for addressing ethical practice in nursing.

    Science.gov (United States)

    Davidson, Patricia; Rushton, Cynda Hylton; Kurtz, Melissa; Wise, Brian; Jackson, Debra; Beaman, Adam; Broome, Marion

    2018-03-01

    To develop a framework to enable discussion, debate and the formulation of interventions to address ethical issues in nursing practice. Social, cultural, political and economic drivers are rapidly changing the landscape of health care in our local environments but also in a global context. Increasingly, nurses are faced with a range of ethical dilemmas in their work. This requires investigation into the culture of healthcare systems and organisations to identify the root causes and address the barriers and enablers of ethical practice. The increased medicalisation of health care; pressures for systemisation; efficiency and cost reduction; and an ageing population contribute to this complexity. Often, ethical issues in nursing are considered within the abstract and philosophical realm until a dilemma is encountered. Such an approach limits the capacity to tangibly embrace ethical values and frameworks as pathways to equitable, accessible, safe and quality health care and as a foundation for strengthening a supportive and enabling workplace for nurses and other healthcare workers. Conceptual framework development. A comprehensive literature review was undertaken using the social-ecological framework as an organising construct. This framework views ethical practice as the outcome of interaction among a range of factors at eight levels: individual factors (patients and families); individual factors (nurses); relationships between healthcare professionals; relationships between patients and nurses; organisational healthcare context; professional and education regulation and standards; community; and social, political and economic. Considering these elements as discrete, yet interactive and intertwined forces can be useful in developing interventions to promote ethical practice. We consider this framework to have utility in policy, practice, education and research. Nurses face ethical challenges on a daily basis, considering these within a social-ecological framework can

  8. Professional Development Recognizing Technology Integration Modeled after the TPACK Framework

    Science.gov (United States)

    McCusker, Laura

    2017-01-01

    Public school teachers within a Pennsylvania intermediate unit are receiving inadequate job-embedded professional development that recognizes knowledge of content, pedagogy, and technology integration, as outlined by Mishra and Koehler's Technological Pedagogical Content Knowledge (TPACK) framework (2006). A school environment where teachers are…

  9. Framework for Modelling Multiple Input Complex Aggregations for Interactive Installations

    DEFF Research Database (Denmark)

    Padfield, Nicolas; Andreasen, Troels

    2012-01-01

    We describe a generalized framework as a method and design tool for creating interactive installations with a demand for exploratory meaning creation, not limited to the design stage, but extending into the stage where the installation meets participants and audience. The proposed solution is bas...

  10. Enabling full-field physics-based optical proximity correction via dynamic model generation

    Science.gov (United States)

    Lam, Michael; Clifford, Chris; Raghunathan, Ananthan; Fenger, Germain; Adam, Kostas

    2017-07-01

    As extreme ultraviolet lithography comes closer to reality for high volume production, its peculiar modeling challenges related to both inter- and intrafield effects have necessitated building an optical proximity correction (OPC) infrastructure that operates with field position dependency. Previous state-of-the-art approaches to modeling field dependency used piecewise-constant models, where static input models are assigned to specific x/y-positions within the field. OPC and simulation could assign the proper static model based on simulation-level placement. However, in the realm of 7 and 5 nm feature sizes, small discontinuities in OPC from piecewise-constant model changes can cause unacceptable levels of edge placement errors. The introduction of dynamic model generation (DMG) can be shown to effectively avoid these dislocations by providing unique mask and optical models per simulation region, allowing a near continuum of models through the field. DMG allows unique models for electromagnetic field, apodization, aberrations, etc. to vary through the entire field and provides a capability to precisely and accurately model systematic field signatures.
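
    The contrast between piecewise-constant model assignment and a per-region, near-continuous model can be caricatured with a single scalar model term. The sketch below uses hypothetical anchor values, and plain interpolation merely stands in for DMG's model generation; it only illustrates why the former produces jumps and the latter does not.

```python
import numpy as np

# Calibrated model anchors at a few field positions (hypothetical numbers):
anchor_x = np.array([0.0, 6.5, 13.0, 19.5, 26.0])            # mm across the slit
anchor_dose = np.array([1.000, 0.994, 0.990, 0.995, 1.002])  # relative model term

def piecewise_constant(x):
    """Previous approach: snap each simulation region to the nearest static model."""
    return anchor_dose[np.argmin(np.abs(anchor_x[:, None] - x), axis=0)]

def dynamic(x):
    """DMG-style idea: generate a per-region model value (here by simple
    interpolation) so the model varies smoothly through the field."""
    return np.interp(x, anchor_x, anchor_dose)

x = np.linspace(0.0, 26.0, 9)
print(np.round(piecewise_constant(x) - dynamic(x), 4))  # jumps vs. near-continuum
```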

  11. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to develop high quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model so that we can confirm the validity of input/output data for each page and of page transitions on the system by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve the traceability from the valid requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  12. Modeling Extreme Precipitation over East China with a Global Variable-Resolution Modeling Framework (MPAS)

    Science.gov (United States)

    Zhao, C.; Xu, M.; Wang, Y.; Guo, J.; Hu, Z.; Ruby, L.; Duda, M.; Skamarock, W. C.

    2017-12-01

    Modeling extreme precipitation requires high spatial resolution. The traditional regional downscaling modeling framework has issues such as ill-posed boundary conditions, mismatches between the driving global and regional dynamics and physics, and the lack of regional feedback to global scales. The non-hydrostatic Model for Prediction Across Scales (MPAS), a global variable-resolution modeling framework, offers an opportunity to obtain regional features at high resolution using regional mesh refinement without the limitations of lateral boundaries. In this study, the MPAS model is applied for the first time with refined meshes over East China at high resolutions (16 km and 4 km) to simulate an extreme precipitation event during 26-27 June 2012. The simulations are evaluated with ground observations from the Chinese Meteorological Administration (CMA) network and with reanalysis data. Sensitivity experiments with different physics and forecast lead times are conducted to understand the uncertainties in simulating the spatial and temporal variation of precipitation. The variable-resolution simulations are also compared with traditional global uniform-resolution simulations at a relatively low resolution (~30 km) and a relatively high resolution (~16 km). The analysis shows that the variable-resolution simulation captures the fine-scale features of precipitation over East China as well as the uniform-resolution simulation at the relatively high resolution does. It also indicates that high resolution significantly improves the capability of simulating extreme precipitation. The MPAS simulations are also compared with traditional limited-area simulations at similar scales using the Weather Research and Forecasting Model (WRF). The difference between the simulations using these two different modeling frameworks is also discussed.

  13. Autogenerator-based modelling framework for development of strategic games simulations: rational pigs game extended.

    Science.gov (United States)

    Fabac, Robert; Radošević, Danijel; Magdalenić, Ivan

    2014-01-01

    When considering strategic games from the conceptual perspective that focuses on the questions of participants' decision-making rationality, the very issues of modelling and simulation are rarely discussed. The well-known Rational Pigs matrix game has been analyzed relatively intensively in terms of reassessing the logic of two players involved in asymmetric situations as gluttons that differ significantly in their attributes. This paper presents a successful attempt at using an autogenerator to create the framework of the game, including the predefined scenarios and corresponding payoffs. The autogenerator offers flexibility in the specification of game parameters, which cover variations in the number of simultaneous players and their features, game objects and their attributes, as well as some general game characteristics. In the proposed approach the autogenerator model was upgraded so as to enable program specification updates. To treat more complex strategic scenarios, we created the Rational Pigs Game Extended (RPGE), in which the introduction of a third glutton entails significant structural changes. In addition, due to particular attributes of the new player, "the tramp," one equilibrium point from the original game is destabilized, which influences the decision-making of the rational players.
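
    For orientation, the sketch below checks pure-strategy equilibria for the two-player Rational Pigs game by exhaustive best-response testing. The payoff numbers are the commonly used textbook values, not the autogenerated RPGE scenarios; the three-glutton payoffs discussed in the paper would simply enlarge the table.

```python
import itertools

# Textbook payoffs for the two-player Rational Pigs game, indexed by
# (big-pig action, little-pig action) -> (big payoff, little payoff).
ACTIONS = ("press", "wait")
PAYOFF = {
    ("press", "press"): (5, 1),
    ("press", "wait"):  (4, 4),
    ("wait",  "press"): (9, -1),
    ("wait",  "wait"):  (0, 0),
}

def is_nash(profile):
    """A profile is a pure Nash equilibrium if no player gains by deviating."""
    for player in (0, 1):
        for alt in ACTIONS:
            deviated = list(profile)
            deviated[player] = alt
            if PAYOFF[tuple(deviated)][player] > PAYOFF[profile][player]:
                return False
    return True

print([p for p in itertools.product(ACTIONS, repeat=2) if is_nash(p)])
# -> [('press', 'wait')]: the big glutton presses the lever, the little one waits.
```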

  14. The unified model of vegetarian identity: A conceptual framework for understanding plant-based food choices.

    Science.gov (United States)

    Rosenfeld, Daniel L; Burrow, Anthony L

    2017-05-01

    By departing from social norms regarding food behaviors, vegetarians acquire membership in a distinct social group and can develop a salient vegetarian identity. However, vegetarian identities are diverse, multidimensional, and unique to each individual. Much research has identified fundamental psychological aspects of vegetarianism, and an identity framework that unifies these findings into common constructs and conceptually defines variables is needed. Integrating psychological theories of identity with research on food choices and vegetarianism, this paper proposes a conceptual model for studying vegetarianism: The Unified Model of Vegetarian Identity (UMVI). The UMVI encompasses ten dimensions-organized into three levels (contextual, internalized, and externalized)-that capture the role of vegetarianism in an individual's self-concept. Contextual dimensions situate vegetarianism within contexts; internalized dimensions outline self-evaluations; and externalized dimensions describe enactments of identity through behavior. Together, these dimensions form a coherent vegetarian identity, characterizing one's thoughts, feelings, and behaviors regarding being vegetarian. By unifying dimensions that capture psychological constructs universally, the UMVI can prevent discrepancies in operationalization, capture the inherent diversity of vegetarian identities, and enable future research to generate greater insight into how people understand themselves and their food choices. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Systemic therapy and the social relational model of disability: enabling practices with people with intellectual disability

    OpenAIRE

    Haydon-Laurelut, Mark

    2009-01-01

    Therapy has been critiqued for personalizing the political (Kitzinger, 1993). The social-relational model (Thomas, 1999) is one theoretical resource for understanding the practices of therapy through a political lens. The social model(s) have viewed therapy with suspicion. This paper highlights – using composite case examples and the authors primary therapeutic modality, systemic therapy – some systemic practices with adults with Intellectual Disability (ID) that enact a position that it is s...

  16. A conceptual modeling framework for discrete event simulation using hierarchical control structures.

    Science.gov (United States)

    Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D

    2015-08-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model's system behavior, control policies and dispatching routines, and to their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.

  17. From Principles to Details: Integrated Framework for Architecture Modelling of Large Scale Software Systems

    Directory of Open Access Journals (Sweden)

    Andrzej Zalewski

    2013-06-01

    There exist numerous models of software architecture (box models, ADLs, UML, architectural decisions), architecture modelling frameworks (views, enterprise architecture frameworks), and even standards recommending practice for the architectural description. We show in this paper that there is still a gap between these rather abstract frameworks/standards and existing architecture models. Frameworks and standards define what should be modelled rather than which models should be used and how these models are related to each other. We intend to prove that a less abstract modelling framework is needed for the effective modelling of large-scale software-intensive systems. It should provide more precise guidance on the kinds of models to be employed and how they should relate to each other. The paper defines principles that can serve as the basis for an integrated model. Finally, the structure of such a model is proposed. It comprises three layers: the upper one, architectural policy, reflects corporate policy and strategies in architectural terms; the middle one, system organisation pattern, represents the core structural concepts and their rationale at a given level of scope; the lower one contains detailed architecture models. Architectural decisions play an important role here: they model the core architectural concepts explaining the detailed models, and they organise the entire integrated model and the relations between its submodels.

  18. Investigating dye performance and crosstalk in fluorescence enabled bioimaging using a model system

    DEFF Research Database (Denmark)

    Arppe, Riikka; R. Carro-Temboury, Miguel; Hempel, Casper

    2017-01-01

    studies and between research groups very difficult. Therefore, we suggest a model system to benchmark instrumentation, methods and staining procedures. The system we introduce is based on doped zeolites in stained polyvinyl alcohol (PVA) films: a highly accessible model system which has the properties......-talk of fluorophores on the detected fluorescence signal. The described model system comprises of lanthanide (III) ion doped Linde Type A zeolites dispersed in a PVA film stained with fluorophores. We tested: F18, MitoTracker Red and ATTO647N. This model system allowed comparing performance of the fluorophores...

  19. Framework for product knowledge and product related knowledge which supports product modelling for mass customization

    DEFF Research Database (Denmark)

    Riis, Jesper; Hansen, Benjamin Loer; Hvam, Lars

    2003-01-01

    and personalization. The framework for product knowledge and product related knowledge is based on the following theories: axiomatic design, technical systems, theory of domains, theory of structuring, theory of properties and the framework for the content of product and product related models. The framework is built...... and product related knowledge which should be or should not be included in the model. This demarcation will have a large influence on the structure of the IT systems (for example the configurator system, the CAD system or the PDM system). • The use of the framework can help achieve more structured models......The article presents a framework for product knowledge and product related knowledge which can be used to support the product modelling process which is needed for developing IT systems. These IT systems are important tools for many companies when they aim at achieving mass customization...

  20. A prototype framework for models of socio-hydrology: identification of key feedback loops and parameterisation approach

    Science.gov (United States)

    Elshafei, Y.; Sivapalan, M.; Tonts, M.; Hipsey, M. R.

    2014-06-01

    It is increasingly acknowledged that, in order to sustainably manage global freshwater resources, it is critical that we better understand the nature of human-hydrology interactions at the broader catchment system scale. Yet to date, a generic conceptual framework for building models of catchment systems that include adequate representation of socioeconomic systems - and the dynamic feedbacks between human and natural systems - has remained elusive. In an attempt to work towards such a model, this paper outlines a generic framework for models of socio-hydrology applicable to agricultural catchments, made up of six key components that combine to form the coupled system dynamics: namely, catchment hydrology, population, economics, environment, socioeconomic sensitivity and collective response. The conceptual framework posits two novel constructs: (i) a composite socioeconomic driving variable, termed the Community Sensitivity state variable, which seeks to capture the perceived level of threat to a community's quality of life, and acts as a key link tying together one of the fundamental feedback loops of the coupled system, and (ii) a Behavioural Response variable as the observable feedback mechanism, which reflects land and water management decisions relevant to the hydrological context. The framework makes a further contribution through the introduction of three macro-scale parameters that enable it to normalise for differences in climate, socioeconomic and political gradients across study sites. In this way, the framework provides for both macro-scale contextual parameters, which allow for comparative studies to be undertaken, and catchment-specific conditions, by way of tailored "closure relationships", in order to ensure that site-specific and application-specific contexts of socio-hydrologic problems can be accommodated. To demonstrate how such a framework would be applied, two socio-hydrological case studies, taken from the Australian experience, are presented

  1. Identifying the barriers and enablers for a triage, treatment, and transfer clinical intervention to manage acute stroke patients in the emergency department: a systematic review using the theoretical domains framework (TDF).

    Science.gov (United States)

    Craig, Louise E; McInnes, Elizabeth; Taylor, Natalie; Grimley, Rohan; Cadilhac, Dominique A; Considine, Julie; Middleton, Sandy

    2016-11-28

    Clinical guidelines recommend that assessment and management of patients with stroke commences early including in emergency departments (ED). To inform the development of an implementation intervention targeted in ED, we conducted a systematic review of qualitative and quantitative studies to identify relevant barriers and enablers to six key clinical behaviours in acute stroke care: appropriate triage, thrombolysis administration, monitoring and management of temperature, blood glucose levels, and of swallowing difficulties and transfer of stroke patients in ED. Studies of any design, conducted in ED, where barriers or enablers based on primary data were identified for one or more of these six clinical behaviours. Major biomedical databases (CINAHL, OVID SP EMBASE, OVID SP MEDLINE) were searched using comprehensive search strategies. The barriers and enablers were categorised using the theoretical domains framework (TDF). The behaviour change technique (BCT) that best aligned to the strategy each enabler represented was selected for each of the reported enablers using a standard taxonomy. Five qualitative studies and four surveys out of the 44 studies identified met the selection criteria. The majority of barriers reported corresponded with the TDF domains of "environmental, context and resources" (such as stressful working conditions or lack of resources) and "knowledge" (such as lack of guideline awareness or familiarity). The majority of enablers corresponded with the domains of "knowledge" (such as education for physicians on the calculated risk of haemorrhage following intravenous thrombolysis [tPA]) and "skills" (such as providing opportunity to treat stroke cases of varying complexity). The total number of BCTs assigned was 18. The BCTs most frequently assigned to the reported enablers were "focus on past success" and "information about health consequences." Barriers and enablers for the delivery of key evidence-based protocols in an emergency setting have

  2. An Adaptive Temporal-Causal Network Model for Enabling Learning of Social Interaction

    NARCIS (Netherlands)

    Commu, Charlotte; Theelen, Mathilde; Treur, J.

    2017-01-01

    In this study, an adaptive temporal-causal network model is presented for learning of basic skills for social interaction. It focuses on greeting a known person and how that relates to learning how to recognize a person from seeing his or her face. The model involves a Hebbian learning process. The
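
    A commonly used Hebbian adaptation rule in temporal-causal network models takes the generic form dω/dt = η·X·Y·(1−ω) − ζ·ω; the sketch below illustrates only that generic form and is not necessarily the exact equation used in this paper (parameter values are illustrative):

        # Generic Hebbian weight adaptation for a temporal-causal network connection.
        # eta is the learning rate, zeta a persistence/forgetting factor (illustrative values).
        def hebbian_step(w, x, y, eta=0.4, zeta=0.05, dt=0.1):
            """One Euler step of dw/dt = eta*x*y*(1 - w) - zeta*w, keeping w in [0, 1]."""
            dw = eta * x * y * (1.0 - w) - zeta * w
            return min(1.0, max(0.0, w + dw * dt))

        w = 0.2
        for _ in range(50):          # repeated co-activation strengthens the connection
            w = hebbian_step(w, x=0.9, y=0.8)
        print(round(w, 3))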

  3. Open Knee: Open Source Modeling & Simulation to Enable Scientific Discovery and Clinical Care in Knee Biomechanics

    Science.gov (United States)

    Erdemir, Ahmet

    2016-01-01

    Virtual representations of the knee joint can provide clinicians, scientists, and engineers the tools to explore mechanical function of the knee and its tissue structures in health and disease. Modeling and simulation approaches such as finite element analysis also provide the possibility to understand the influence of surgical procedures and implants on joint stresses and tissue deformations. A large number of knee joint models are described in the biomechanics literature. However, freely accessible, customizable, and easy-to-use models are scarce. Availability of such models can accelerate clinical translation of simulations, where labor intensive reproduction of model development steps can be avoided. The interested parties can immediately utilize readily available models for scientific discovery and for clinical care. Motivated by this gap, this study aims to describe an open source and freely available finite element representation of the tibiofemoral joint, namely Open Knee, which includes detailed anatomical representation of the joint's major tissue structures, their nonlinear mechanical properties and interactions. Three use cases illustrate customization potential of the model, its predictive capacity, and its scientific and clinical utility: prediction of joint movements during passive flexion, examining the role of meniscectomy on contact mechanics and joint movements, and understanding anterior cruciate ligament mechanics. A summary of scientific and clinically directed studies conducted by other investigators are also provided. The utilization of this open source model by groups other than its developers emphasizes the premise of model sharing as an accelerator of simulation-based medicine. Finally, the imminent need to develop next generation knee models are noted. These are anticipated to incorporate individualized anatomy and tissue properties supported by specimen-specific joint mechanics data for evaluation, all acquired in vitro from varying age

  4. Globally COnstrained Local Function Approximation via Hierarchical Modelling, a Framework for System Modelling under Partial Information

    DEFF Research Database (Denmark)

    Øjelund, Henrik; Sadegh, Payman

    2000-01-01

    be obtained. This paper presents a new approach for system modelling under partial (global) information (or the so-called Gray-box modelling) that seeks to preserve the benefits of the global as well as local methodologies within a unified framework. While the proposed technique relies on local approximations......Local function approximations concern fitting low order models to weighted data in neighbourhoods of the points where the approximations are desired. Despite their generality and convenience of use, local models typically suffer, among others, from difficulties arising in physical interpretation...... simultaneously with the (local estimates of) function values. The approach is applied to modelling of a linear time variant dynamic system under prior linear time invariant structure where local regression fails as a result of high dimensionality.
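
    For orientation, the kind of plain local (weighted least-squares) approximation the abstract refers to, before any global constraints are imposed, can be sketched as follows; the kernel, bandwidth, and data are illustrative assumptions:

        # Local linear regression: fit y ~ a + b*(x - x0) with Gaussian weights centred at x0.
        import numpy as np

        def local_linear_fit(x0, x, y, bandwidth=0.3):
            """Return the local estimate a, i.e. the fitted function value at x0."""
            w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)
            X = np.column_stack([np.ones_like(x), x - x0])
            W = np.diag(w)
            beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
            return beta[0]

        rng = np.random.default_rng(0)
        x = np.linspace(0, 4, 60)
        y = np.sin(x) + 0.1 * rng.standard_normal(60)
        print(round(local_linear_fit(2.0, x, y), 3))   # local estimate of the function at x0 = 2.0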

  5. Checking Architectural and Implementation Constraints for Domain-Specific Component Frameworks using Models

    OpenAIRE

    Noguera, Carlos; Loiret, Frédéric

    2009-01-01

    Software components are used in various application domains, and many component models and frameworks have been proposed to fulfill domain-specific requirements. The ad-hoc development of these component frameworks hampers the reuse of tools and abstractions across different frameworks. We believe that in order to promote the reuse of components within various domain contexts a homogeneous design approach is needed. A key requirement of such an a...

  6. Towards a framework for improving goal-oriented requirement models quality

    OpenAIRE

    Cares, Carlos; Franch Gutiérrez, Javier

    2009-01-01

    Goal-orientation is a widespread and useful approach to Requirements Engineering. However, quality assessment frameworks focused on goal-oriented processes are either limited or remain on the theoretical side. Requirements quality initiatives range from simple metrics applicable to requirements documents, to general-purpose quality frameworks that include syntactic, semantic and pragmatic concerns. In some recent works, we have proposed a metrics framework for goal-oriented models, b...

  7. Collaborative Project. A Flexible Atmospheric Modeling Framework for the Community Earth System Model (CESM)

    Energy Technology Data Exchange (ETDEWEB)

    Gettelman, Andrew [University Corporation For Atmospheric Research (UCAR), Boulder, CO (United States)

    2015-10-01

    In this project we have been upgrading the Multiscale Modeling Framework (MMF) in the Community Atmosphere Model (CAM), also known as Super-Parameterized CAM (SP-CAM). This has included a major effort to update the coding standards and interface with CAM so that it can be placed on the main development trunk. It has also included development of a new software structure for CAM to be able to handle sub-grid column information. These efforts have formed the major thrust of the work.

  8. An Access Control Model for the Uniframe Framework

    Science.gov (United States)

    2005-05-01

    is called active software capabilities framework (ASCap) that centers around the idea of a policy object, which instead of being embedded into the...access control component, is delivered by the client. The client first requests an ASCap (policy object) from the security server. Then the client...and the object server both must instantiate proxies. The client ASCap proxy may then request additional credentials from other servers which may

  9. A Framework for Modelling Trojans and Computer Virus Infection

    OpenAIRE

    Thimbleby, H.; Anderson, S.; Cairns, P.

    1998-01-01

    It is not possible to view a computer operating in the real world, including the possibility of Trojan horse programs and computer viruses, as simply a finite realisation of a Turing machine. We consider the actions of Trojan horses and viruses in real computer systems and suggest a minimal framework for an adequate formal understanding of the phenomena. Some conventional approaches, including biological metaphors, are shown to be inadequate; some suggestions are made towards constructing vir...

  10. A Variational Bayes Genomic-Enabled Prediction Model with Genotype × Environment Interaction

    Directory of Open Access Journals (Sweden)

    Osval A. Montesinos-López

    2017-06-01

    Full Text Available There are Bayesian and non-Bayesian genomic models that take into account G×E interactions. However, the computational cost of implementing Bayesian models is high, and becomes almost impossible when the number of genotypes, environments, and traits is very large, while, in non-Bayesian models, there are often important and unsolved convergence problems. The variational Bayes method is popular in machine learning, and, by approximating the probability distributions through optimization, it tends to be faster than Markov Chain Monte Carlo methods. For this reason, in this paper, we propose a new genomic variational Bayes version of the Bayesian genomic model with G×E using half-t priors on each standard deviation (SD) term to guarantee highly noninformative and posterior inferences that are not sensitive to the choice of hyper-parameters. We show the complete theoretical derivation of the full conditional and the variational posterior distributions, and their implementations. We used eight experimental genomic maize and wheat data sets to illustrate the new proposed variational Bayes approximation, and compared its predictions and implementation time with a standard Bayesian genomic model with G×E. Results indicated that prediction accuracies are slightly higher in the standard Bayesian model with G×E than in its variational counterpart, but, in terms of computation time, the variational Bayes genomic model with G×E is, in general, 10 times faster than the conventional Bayesian genomic model with G×E. For this reason, the proposed model may be a useful tool for researchers who need to predict and select genotypes in several environments.

  11. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  12. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  13. A standardised graphic method for describing data privacy frameworks in primary care research using a flexible zone model.

    Science.gov (United States)

    Kuchinke, Wolfgang; Ohmann, Christian; Verheij, Robert A; van Veen, Evert-Ben; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C

    2014-12-01

    To develop a model describing core concepts and principles of data flow, data privacy and confidentiality, in a simple and flexible way, using concise process descriptions and a diagrammatic notation applied to research workflow processes. The model should help to generate robust data privacy frameworks for research done with patient data. Based on an exploration of EU legal requirements for data protection and privacy, data access policies, and existing privacy frameworks of research projects, basic concepts and common processes were extracted, described and incorporated into a model with a formal graphical representation and a standardised notation. The Unified Modelling Language (UML) notation was enriched with workflow and custom symbols to enable the representation of extended data flow requirements, data privacy and data security requirements, privacy enhancing techniques (PET) and to allow privacy threat analysis for research scenarios. Our model is built upon the concept of three privacy zones (Care Zone, Non-care Zone and Research Zone) containing databases, data transformation operators, such as data linkers and privacy filters. Using these model components, a risk gradient for moving data from a zone of high risk for patient identification to a zone of low risk can be described. The model was applied to the analysis of data flows in several general clinical research use cases and two research scenarios from the TRANSFoRm project (e.g., finding patients for clinical research and linkage of databases). The model was validated by representing research done with the NIVEL Primary Care Database in the Netherlands. The model allows analysis of data privacy and confidentiality issues for research with patient data in a structured way and provides a framework to specify a privacy compliant data flow, to communicate privacy requirements and to identify weak points for an adequate implementation of data privacy. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
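
    As a toy illustration of the zone-and-operator idea (not the UML-based notation of the paper), a privacy filter can be written as an operator that must be applied before a record crosses from the Care Zone into the Research Zone; the field names and the pseudonymisation scheme below are invented:

        import hashlib

        # Hypothetical record living in the Care Zone (direct identifiers present).
        care_zone_record = {"patient_id": "943-476-5919", "dob": "1956-03-02", "diagnosis": "E11"}

        def privacy_filter(record, secret="local-pseudonymisation-key"):
            """Operator on the zone boundary: replace direct identifiers with a pseudonym
            and coarsen quasi-identifiers before data may enter the Research Zone."""
            pseudonym = hashlib.sha256((secret + record["patient_id"]).encode()).hexdigest()[:12]
            return {"pseudonym": pseudonym,
                    "birth_year": record["dob"][:4],   # coarsened quasi-identifier
                    "diagnosis": record["diagnosis"]}  # research payload retained

        research_zone_record = privacy_filter(care_zone_record)
        print(research_zone_record)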

  14. Psycho-Motor and Error Enabled Simulations: Modeling Vulnerable Skills in the Pre-Mastery Phase

    Science.gov (United States)

    2016-04-01

    associated with movement initiation, ballistic action, and stabilization of movement. For all participants that attempted the laparoscopic ventral hernia...equipment placement. First, the orientation of the pelvic model was placed on the table in a fashion to mimic the positioning of a patient lying on a bed...

  15. Compact Ocean Models Enable Onboard AUV Autonomy and Decentralized Adaptive Sampling

    Science.gov (United States)

    2013-09-30

    ocean modeling and assimilation system that can be deployed on board an underwater vehicle. The developed system estimates a synoptic picture of... The long-term goals are to improve synoptic observations and enhance ocean prediction through development of new...ability of mobile agents to respond adaptively by providing them with a synoptic realization of the environment in the form of compact models of the

  16. Parametric Generation of Polygonal Tree Models for Rendering on Tessellation-Enabled Hardware

    OpenAIRE

    Nystad, Jørgen

    2010-01-01

    The main contribution of this thesis is a parametric method for generation of single-mesh polygonal tree models that follow natural rules as indicated by da Vinci in his notebooks. Following these rules allows for a relatively simple scheme of connecting branches to parent branches. Proper branch connection is a requirement for gaining the benefits of subdivision. Techniques for proper texture coordinate generation and subdivision are also explored. The result is a tree model generation scheme ...

  17. Investigating dye performance and crosstalk in fluorescence enabled bioimaging using a model system.

    Directory of Open Access Journals (Sweden)

    Riikka Arppe

    Full Text Available Detailed imaging of biological structures, often smaller than the diffraction limit, is possible in fluorescence microscopy due to the molecular size and photophysical properties of fluorescent probes. Advances in hardware and multiple providers of high-end bioimaging makes comparing images between studies and between research groups very difficult. Therefore, we suggest a model system to benchmark instrumentation, methods and staining procedures. The system we introduce is based on doped zeolites in stained polyvinyl alcohol (PVA) films: a highly accessible model system which has the properties needed to act as a benchmark in bioimaging experiments. Rather than comparing molecular probes and imaging methods in complicated biological systems, we demonstrate that the model system can emulate this complexity and can be used to probe the effect of concentration, brightness, and cross-talk of fluorophores on the detected fluorescence signal. The described model system comprises lanthanide(III) ion doped Linde Type A zeolites dispersed in a PVA film stained with fluorophores. We tested: F18, MitoTracker Red and ATTO647N. This model system allowed comparing performance of the fluorophores in experimental conditions. Importantly, we here report considerable cross-talk of the dyes when exchanging excitation and emission settings. Additionally, bleaching was quantified. The proposed model makes it possible to test and benchmark staining procedures before these dyes are applied to more complex biological systems.

  18. A computer-aided framework for development, identification and management of physiologically-based pharmacokinetic models

    DEFF Research Database (Denmark)

    Heitzig, Martina; Linninger, Andreas; Sin, Gürkan

    2014-01-01

    The objective of this work is the development of a generic computer-aided modelling framework to support the development of physiologically-based pharmacokinetic models thereby increasing the efficiency and quality of the modelling process. In particular, the framework systematizes the modelling......-based pharmacokinetic modelling of the distribution of the drug cyclosporin A in rats and humans. Four alternative candidate models for rats are derived and discriminated based on experimental data. The model candidate that is best represented by the experimental data is scaled-up to a human being applying...

  19. A "Rule of Five" Framework for Models and Modeling to Unify Mathematicians and Biologists and Improve Student Learning

    OpenAIRE

    Eaton, Carrie Diaz; Highlander, Hannah C.; Dahlquist, Kam D.; LaMar, M. Drew; Ledder, Glenn; Schugart, Richard C.

    2016-01-01

    Despite widespread calls for the incorporation of mathematical modeling into the undergraduate biology curriculum, there is a lack of a common understanding around the definition of modeling, which inhibits progress. In this paper, we extend the "Rule of Four," initially used in calculus reform efforts, to a framework for models and modeling that is inclusive of varying disciplinary definitions of each. This unifying framework allows us to both build on strengths that each discipline and its st...

  20. Hydrogeologic Framework Model for the Saturated Zone Site Scale flow and Transport Model

    International Nuclear Information System (INIS)

    Miller, T.

    2004-01-01

    The purpose of this report is to document the 19-unit, hydrogeologic framework model (19-layer version, output of this report) (HFM-19) with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results in accordance with AP-SIII.10Q, Models. The HFM-19 is developed as a conceptual model of the geometric extent of the hydrogeologic units at Yucca Mountain and is intended specifically for use in the development of the ''Saturated Zone Site-Scale Flow Model'' (BSC 2004 [DIRS 170037]). Primary inputs to this model report include the GFM 3.1 (DTN: MO9901MWDGFM31.000 [DIRS 103769]), borehole lithologic logs, geologic maps, geologic cross sections, water level data, topographic information, and geophysical data as discussed in Section 4.1. Figure 1-1 shows the information flow among all of the saturated zone (SZ) reports and the relationship of this conceptual model in that flow. The HFM-19 is a three-dimensional (3-D) representation of the hydrogeologic units surrounding the location of the Yucca Mountain geologic repository for spent nuclear fuel and high-level radioactive waste. The HFM-19 represents the hydrogeologic setting for the Yucca Mountain area that covers about 1,350 km2 and includes a saturated thickness of about 2.75 km. The boundaries of the conceptual model were primarily chosen to be coincident with grid cells in the Death Valley regional groundwater flow model (DTN: GS960808312144.003 [DIRS 105121]) such that the base of the site-scale SZ flow model is consistent with the base of the regional model (2,750 meters below a smoothed version of the potentiometric surface), encompasses the exploratory boreholes, and provides a framework over the area of interest for groundwater flow and radionuclide transport modeling. In depth, the model domain extends from land surface to the base of the regional groundwater flow model (D'Agnese et al. 1997 [DIRS 100131], p 2). For the site-scale SZ flow model, the HFM

  1. Hydrogeologic Framework Model for the Saturated Zone Site Scale flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    T. Miller

    2004-11-15

    The purpose of this report is to document the 19-unit, hydrogeologic framework model (19-layer version, output of this report) (HFM-19) with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results in accordance with AP-SIII.10Q, Models. The HFM-19 is developed as a conceptual model of the geometric extent of the hydrogeologic units at Yucca Mountain and is intended specifically for use in the development of the ''Saturated Zone Site-Scale Flow Model'' (BSC 2004 [DIRS 170037]). Primary inputs to this model report include the GFM 3.1 (DTN: MO9901MWDGFM31.000 [DIRS 103769]), borehole lithologic logs, geologic maps, geologic cross sections, water level data, topographic information, and geophysical data as discussed in Section 4.1. Figure 1-1 shows the information flow among all of the saturated zone (SZ) reports and the relationship of this conceptual model in that flow. The HFM-19 is a three-dimensional (3-D) representation of the hydrogeologic units surrounding the location of the Yucca Mountain geologic repository for spent nuclear fuel and high-level radioactive waste. The HFM-19 represents the hydrogeologic setting for the Yucca Mountain area that covers about 1,350 km2 and includes a saturated thickness of about 2.75 km. The boundaries of the conceptual model were primarily chosen to be coincident with grid cells in the Death Valley regional groundwater flow model (DTN: GS960808312144.003 [DIRS 105121]) such that the base of the site-scale SZ flow model is consistent with the base of the regional model (2,750 meters below a smoothed version of the potentiometric surface), encompasses the exploratory boreholes, and provides a framework over the area of interest for groundwater flow and radionuclide transport modeling. In depth, the model domain extends from land surface to the base of the regional groundwater flow model (D'Agnese et al. 1997 [DIRS 100131], p 2). For the site

  2. Subject-enabled analytics model on measurement statistics in health risk expert system for public health informatics.

    Science.gov (United States)

    Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun

    2017-11-01

    This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies with the public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) for the case study repository, 2) the self-developed modules of online health risk information statistics (HRIStat) for cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules enforce the typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports both modes, step-by-step analysis and auto-computing process, respectively for preliminary evaluation and real time computation. The proposed model was evaluated by computing prior research in relation to the epidemiological measurement of diseases that were caused by either heavy metal exposures in the environment or clinical complications in hospital. The simulation validity was confirmed with commercial statistics software. The model was installed in a stand-alone computer and in a cloud-server workstation to verify computing performance for a data amount of more than 230K sets. Both setups reached an efficiency of about 10⁵ sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2011-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor...

  4. Assessing Students' Understandings of Biological Models and Their Use in Science to Evaluate a Theoretical Framework

    Science.gov (United States)

    Grünkorn, Juliane; Upmeier zu Belzen, Annette; Krüger, Dirk

    2014-01-01

    Research in the field of students' understandings of models and their use in science describes different frameworks concerning these understandings. Currently, there is no conjoint framework that combines these structures and so far, no investigation has focused on whether it reflects students' understandings sufficiently (empirical evaluation).…

  5. ENABLING “ENERGY-AWARENESS” IN THE SEMANTIC 3D CITY MODEL OF VIENNA

    Directory of Open Access Journals (Sweden)

    G. Agugiaro

    2016-09-01

    Full Text Available This paper presents and discusses the first results regarding selection, analysis, preparation and eventual integration of a number of energy-related datasets, chosen in order to enrich a CityGML-based semantic 3D city model of Vienna. CityGML is an international standard conceived specifically as an information and data model for semantic city models at urban and territorial scale. The still-in-development Energy Application Domain Extension (ADE) is a CityGML extension conceived to specifically model, manage and store energy-related features and attributes for buildings. The work presented in this paper is embedded within the European Marie-Curie ITN project “CINERGY, Smart cities with sustainable energy systems”, which aims, among other things, at developing urban decision making and operational optimisation software tools to minimise non-renewable energy use in cities. Given the scope and scale of the project, it is therefore vital to set up a common, unique and spatio-semantically coherent urban data model to be used as information hub for all applications being developed. This paper reports on the experience gained so far: it describes the test area in Vienna, Austria, and the available data sources, and it shows and exemplifies the main data integration issues and the strategies developed to solve them in order to obtain the enriched 3D city model. The first results as well as some comments about their quality and limitations are presented, together with the discussion regarding the next steps and some planned improvements.

  6. A Hybrid Programming Framework for Modeling and Solving Constraint Satisfaction and Optimization Problems

    Directory of Open Access Journals (Sweden)

    Paweł Sitek

    2016-01-01

    Full Text Available This paper proposes a hybrid programming framework for modeling and solving constraint satisfaction problems (CSPs) and constraint optimization problems (COPs). Two paradigms, CLP (constraint logic programming) and MP (mathematical programming), are integrated in the framework. The integration is supplemented with the original method of problem transformation, used in the framework as a presolving method. The transformation substantially reduces the feasible solution space. The framework automatically generates CSP and COP models based on current values of data instances, questions asked by a user, and the set of predicates and facts of the problem being modeled, which altogether constitute a knowledge database for the given problem. This dynamic generation of dedicated models, based on the knowledge base, together with the parameters changing externally, for example, the user’s questions, is the implementation of the autonomous search concept. The models are solved using the internal or external solvers integrated with the framework. The architecture of the framework as well as its implementation outline is also included in the paper. The effectiveness of the framework regarding the modeling and solution search is assessed through the illustrative examples relating to scheduling problems with additional constrained resources.
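
    For readers unfamiliar with the two paradigms, the difference can be seen on a toy constraint optimization problem: the disequality constraint below is natural for a CLP solver, while the linear objective and bounds suit an MP solver. The brute-force search only illustrates the model itself, not the framework's solvers, and all numbers are invented:

        from itertools import product

        # Toy COP: maximise 3x + 2y subject to x + y <= 4 and x != y, with x, y in {0..4}.
        best, best_val = None, float("-inf")
        for x, y in product(range(5), repeat=2):
            if x + y <= 4 and x != y:          # both constraint types checked exhaustively here
                val = 3 * x + 2 * y
                if val > best_val:
                    best, best_val = (x, y), val
        print(best, best_val)                  # (4, 0) with objective value 12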

  7. An integrated end-to-end modeling framework for testing ecosystem-wide effects of human-induced pressures in the Baltic Sea

    DEFF Research Database (Denmark)

    Palacz, Artur; Nielsen, J. Rasmus; Christensen, Asbjørn

    to the high-resolution coupled physical-biological model HBM-ERGOM and the fisheries bio-economic FishRent model. We investigate ecosystem-wide responses to changes in human-induced pressures by simulating several eutrophication scenarios that are relevant to existing Baltic Sea management plans (e.g. EU BSAP......, EU CFP). We further present the structure and calibration of the Baltic ATLANTIS model and the operational linkage to the other models. Using the results of eutrophication scenarios, and focusing on the relative changes in fish and fishery production, we discuss the robustness of the model linking......We present an integrated end-to-end modeling framework that enables whole-of ecosystem climate, eutrophication, and spatial management scenario exploration in the Baltic Sea. The framework is built around the Baltic implementation of the spatially-explicit end-to-end ATLANTIS model, linked...

  8. Model-based visual tracking the OpenTL framework

    CERN Document Server

    Panin, Giorgio

    2011-01-01

    This book has two main goals: to provide a unified and structured overview of this growing field, as well as to propose a corresponding software framework, the OpenTL library, developed by the author and his working group at TUM-Informatik. The main objective of this work is to show how most real-world application scenarios can be naturally cast into a common description vocabulary, and therefore implemented and tested in a fully modular and scalable way, through the definition of a layered, object-oriented software architecture. The resulting architecture covers in a seamless way all processin

  9. Modeling, Simulation, and Analysis of a Decoy State Enabled Quantum Key Distribution System

    Science.gov (United States)

    2015-03-26

    configurable to interfere with Bob’s ability to detect a weak coherent pulse. The QKD model shall be accurate, flexible, usable, and extensible

  10. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    Science.gov (United States)

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose a graph-theoretic approach for quantification of quality-enabled…

  11. Neonatal tolerance induction enables accurate evaluation of gene therapy for MPS I in a canine model.

    Science.gov (United States)

    Hinderer, Christian; Bell, Peter; Louboutin, Jean-Pierre; Katz, Nathan; Zhu, Yanqing; Lin, Gloria; Choa, Ruth; Bagel, Jessica; O'Donnell, Patricia; Fitzgerald, Caitlin A; Langan, Therese; Wang, Ping; Casal, Margret L; Haskins, Mark E; Wilson, James M

    2016-09-01

    High fidelity animal models of human disease are essential for preclinical evaluation of novel gene and protein therapeutics. However, these studies can be complicated by exaggerated immune responses against the human transgene. Here we demonstrate that dogs with a genetic deficiency of the enzyme α-l-iduronidase (IDUA), a model of the lysosomal storage disease mucopolysaccharidosis type I (MPS I), can be rendered immunologically tolerant to human IDUA through neonatal exposure to the enzyme. Using MPS I dogs tolerized to human IDUA as neonates, we evaluated intrathecal delivery of an adeno-associated virus serotype 9 vector expressing human IDUA as a therapy for the central nervous system manifestations of MPS I. These studies established the efficacy of the human vector in the canine model, and allowed for estimation of the minimum effective dose, providing key information for the design of first-in-human trials. This approach can facilitate evaluation of human therapeutics in relevant animal models, and may also have clinical applications for the prevention of immune responses to gene and protein replacement therapies. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Developmental Impact Analysis of an ICT-Enabled Scalable Healthcare Model in BRICS Economies

    Directory of Open Access Journals (Sweden)

    Dhrubes Biswas

    2012-06-01

    Full Text Available This article highlights the need for initiating a healthcare business model in a grassroots, emerging-nation context. This article’s backdrop is a history of chronic anomalies afflicting the healthcare sector in India and similarly placed BRICS nations. In these countries, a significant percentage of populations remain deprived of basic healthcare facilities and emergency services. Community (primary care) services are being offered by public and private stakeholders as a panacea to the problem. Yet, there is an urgent need for specialized (tertiary care) services at all levels. As a response to this challenge, an all-inclusive health-exchange system (HES) model, which utilizes information communication technology (ICT) to provide solutions in rural India, has been developed. The uniqueness of the model lies in its innovative hub-and-spoke architecture and its emphasis on affordability, accessibility, and availability to the masses. This article describes a developmental impact analysis (DIA) that was used to assess the impact of this model. The article contributes to the knowledge base of readers by making them aware of the healthcare challenges emerging nations are facing and ways to mitigate those challenges using entrepreneurial solutions.

  13. Integrating semantics and procedural generation: key enabling factors for declarative modeling of virtual worlds

    NARCIS (Netherlands)

    Bidarra, R.; Kraker, K.J. de; Smelik, R.M.; Tutenel, T.

    2010-01-01

    Manual content creation for virtual worlds can no longer satisfy the increasing demand arising from areas as entertainment and serious games, simulations, movies, etc. Furthermore, currently deployed modeling tools basically do not scale up: while they become more and more specialized and complex,

  14. Modeling orbital relative motion to enable formation design from application requirements

    Science.gov (United States)

    Fasano, Giancarmine; D'Errico, Marco

    2009-11-01

    While trajectory design for single satellite Earth observation missions is usually performed by means of analytical and relatively simple models of orbital dynamics including the main perturbations for the considered cases, most literature on formation flying dynamics is devoted to control issues rather than mission design. This work aims at bridging the gap between mission requirements and relative dynamics in multi-platform missions by means of an analytical model that describes relative motion for satellites moving on near circular low Earth orbits. The development is based on the orbital parameters approach and both the cases of close and large formations are taken into account. Secular Earth oblateness effects are included in the derivation. Modeling accuracy, when compared to a nonlinear model with two body and J2 forces, is shown to be of the order of 0.1% of relative coordinates for timescales of hundreds of orbits. An example of formation design is briefly described shaping a two-satellite formation on the basis of geometric requirements for synthetic aperture radar interferometry.
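
    The secular J2 effects mentioned above follow standard first-order expressions; the helper below gives those textbook drift rates (it is not the authors' full relative-motion model), which a formation design must match between the spacecraft:

        import math

        MU = 3.986004418e14      # Earth's gravitational parameter [m^3/s^2]
        RE = 6378137.0           # Earth equatorial radius [m]
        J2 = 1.08262668e-3

        def j2_secular_rates(a, e, i):
            """Return (dRAAN/dt, d(arg. of perigee)/dt, dM/dt beyond the mean motion) in rad/s,
            using classical first-order J2 theory for a near-circular low Earth orbit."""
            n = math.sqrt(MU / a ** 3)
            p = a * (1.0 - e ** 2)
            k = 1.5 * J2 * (RE / p) ** 2 * n
            draan = -k * math.cos(i)
            dargp = 0.5 * k * (5.0 * math.cos(i) ** 2 - 1.0)
            dmean = 0.5 * k * math.sqrt(1.0 - e ** 2) * (3.0 * math.cos(i) ** 2 - 1.0)
            return draan, dargp, dmean

        # Example: a roughly sun-synchronous 700 km orbit (illustrative values).
        print(j2_secular_rates(a=RE + 700e3, e=0.001, i=math.radians(98.2)))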

  15. eTOXlab, an open source modeling framework for implementing predictive models in production environments.

    Science.gov (United States)

    Carrió, Pau; López, Oriol; Sanz, Ferran; Pastor, Manuel

    2015-01-01

    Computational models based on Quantitative-Structure Activity Relationship (QSAR) methodologies are widely used tools for predicting the biological properties of new compounds. In many instances, such models are used as a routine in the industry (e.g. food, cosmetic or pharmaceutical industry) for the early assessment of the biological properties of new compounds. However, most of the tools currently available for developing QSAR models are not well suited for supporting the whole QSAR model life cycle in production environments. We have developed eTOXlab, an open source modeling framework designed to be used at the core of a self-contained virtual machine that can be easily deployed in production environments, providing predictions as web services. eTOXlab consists of a collection of object-oriented Python modules with methods mapping common tasks of standard modeling workflows. This framework allows building and validating QSAR models as well as predicting the properties of new compounds using either a command line interface or a graphic user interface (GUI). Simple models can be easily generated by setting a few parameters, while more complex models can be implemented by overriding pieces of the original source code. eTOXlab benefits from the object-oriented capabilities of Python for providing high flexibility: any model implemented using eTOXlab inherits the features implemented in the parent model, like common tools and services or the automatic exposure of the models as prediction web services. The particular eTOXlab architecture as a self-contained, portable prediction engine allows building models with confidential information within corporate facilities, which can be safely exported and used for prediction without disclosing the structures of the training series. The software presented here provides full support to the specific needs of users that want to develop, use and maintain predictive models in corporate environments. The technologies used by e
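
    Although the actual eTOXlab API is not reproduced here, the life cycle it supports (build a model, validate it internally, then reuse the object for prediction) can be pictured with a generic, hypothetical QSAR-style model class:

        # Generic QSAR-style workflow sketch (hypothetical class, not the eTOXlab API):
        # fit a regression model on molecular descriptors, cross-validate it, then reuse it.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        class ToyQsarModel:
            def __init__(self, n_components=2):
                self.estimator = PLSRegression(n_components=n_components)

            def build(self, descriptors, activities):
                self.estimator.fit(descriptors, activities)
                # Internal validation via cross-validated R^2, a common QSAR quality metric.
                return cross_val_score(self.estimator, descriptors, activities,
                                       cv=5, scoring="r2").mean()

            def predict(self, descriptors):
                return self.estimator.predict(descriptors).ravel()

        rng = np.random.default_rng(1)
        X = rng.normal(size=(40, 6))                 # stand-in molecular descriptors
        y = X[:, 0] * 1.5 - X[:, 2] + rng.normal(scale=0.2, size=40)
        model = ToyQsarModel()
        print(round(model.build(X, y), 2), model.predict(X[:3]).round(2))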

  16. A controlled human malaria infection model enabling evaluation of transmission-blocking interventions.

    Science.gov (United States)

    Collins, Katharine A; Wang, Claire Yt; Adams, Matthew; Mitchell, Hayley; Rampton, Melanie; Elliott, Suzanne; Reuling, Isaie J; Bousema, Teun; Sauerwein, Robert; Chalon, Stephan; Möhrle, Jörg J; McCarthy, James S

    2018-03-12

    Drugs and vaccines that can interrupt the transmission of Plasmodium falciparum will be important for malaria control and elimination. However, models for early clinical evaluation of candidate transmission-blocking interventions are currently unavailable. Here, we describe a new model for evaluating malaria transmission from humans to Anopheles mosquitoes using controlled human malaria infection (CHMI). Seventeen healthy malaria-naive volunteers underwent CHMI by intravenous inoculation of P. falciparum-infected erythrocytes to initiate blood-stage infection. Seven to eight days after inoculation, participants received piperaquine (480 mg) to attenuate asexual parasite replication while allowing gametocytes to develop and mature. Primary end points were development of gametocytemia, the transmissibility of gametocytes from humans to mosquitoes, and the safety and tolerability of the CHMI transmission model. To investigate in vivo gametocytocidal drug activity in this model, participants were either given an experimental antimalarial, artefenomel (500 mg), or a known gametocytocidal drug, primaquine (15 mg), or remained untreated during the period of gametocyte carriage. Male and female gametocytes were detected in all participants, and transmission to mosquitoes was achieved from 8 of 11 (73%) participants evaluated. Compared with results in untreated controls (n = 7), primaquine (15 mg, n = 5) significantly reduced gametocyte burden (P = 0.01), while artefenomel (500 mg, n = 4) had no effect. Adverse events (AEs) were mostly mild or moderate. Three AEs were assessed as severe - fatigue, elevated alanine aminotransferase, and elevated aspartate aminotransferase - and were attributed to malaria infection. Transaminase elevations were transient, asymptomatic, and resolved without intervention. We report the safe and reproducible induction of P. falciparum gametocytes in healthy malaria-naive volunteers at densities infectious to mosquitoes, thereby demonstrating the

  17. An Agent-Based Modeling Framework and Application for the Generic Nuclear Fuel Cycle

    Science.gov (United States)

    Gidden, Matthew J.

    Key components of a novel methodology and implementation of an agent-based, dynamic nuclear fuel cycle simulator, Cyclus, are presented. The nuclear fuel cycle is a complex, physics-dependent supply chain. To date, existing dynamic simulators have not treated constrained fuel supply, time-dependent, isotopic-quality based demand, or fuel fungibility particularly well. Utilizing an agent-based methodology that incorporates sophisticated graph theory and operations research techniques can overcome these deficiencies. This work describes a simulation kernel and agents that interact with it, highlighting the Dynamic Resource Exchange (DRE), the supply-demand framework at the heart of the kernel. The key agent-DRE interaction mechanisms are described, which enable complex entity interaction through the use of physics and socio-economic models. The translation of an exchange instance to a variant of the Multicommodity Transportation Problem, which can be solved feasibly or optimally, follows. An extensive investigation of solution performance and fidelity is then presented. Finally, recommendations for future users of Cyclus and the DRE are provided.
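
    The translation of a resource exchange into a transportation problem can be pictured with a stripped-down, single-commodity instance solved as a linear program; this is purely illustrative, since the DRE handles multiple commodities, preferences, and facility constraints:

        # Tiny transportation problem: two suppliers and two consumers of one fuel commodity.
        # Decision variables x[s][c] = quantity shipped; costs stand in for (inverse) preferences.
        import numpy as np
        from scipy.optimize import linprog

        supply = [10.0, 15.0]            # e.g. two fuel fabrication facilities
        demand = [12.0, 13.0]            # e.g. two reactors requesting fuel
        cost = np.array([[1.0, 4.0],     # cost[s, c]: lower cost = more preferred pairing
                         [3.0, 1.5]])

        c = cost.ravel()                                   # variables ordered x00, x01, x10, x11
        A_eq = np.array([[1, 1, 0, 0],                     # supplier 0 ships all of its supply
                         [0, 0, 1, 1],                     # supplier 1 ships all of its supply
                         [1, 0, 1, 0],                     # consumer 0 demand met
                         [0, 1, 0, 1]])                    # consumer 1 demand met
        b_eq = supply + demand
        res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 4)
        print(res.x.reshape(2, 2), res.fun)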

  18. Modelling the dynamics of an experimental host-pathogen microcosm within a hierarchical Bayesian framework.

    Directory of Open Access Journals (Sweden)

    David Lunn

    Full Text Available The advantages of Bayesian statistical approaches, such as flexibility and the ability to acknowledge uncertainty in all parameters, have made them the prevailing method for analysing the spread of infectious diseases in human or animal populations. We introduce a Bayesian approach to experimental host-pathogen systems that shares these attractive features. Since uncertainty in all parameters is acknowledged, existing information can be accounted for through prior distributions, rather than through fixing some parameter values. The non-linear dynamics, multi-factorial design, multiple measurements of responses over time and sampling error that are typical features of experimental host-pathogen systems can also be naturally incorporated. We analyse the dynamics of the free-living protozoan Paramecium caudatum and its specialist bacterial parasite Holospora undulata. Our analysis provides strong evidence for a saturable infection function, and we were able to reproduce the two waves of infection apparent in the data by separating the initial inoculum from the parasites released after the first cycle of infection. In addition, the parameter estimates from the hierarchical model can be combined to infer variations in the parasite's basic reproductive ratio across experimental groups, enabling us to make predictions about the effect of resources and host genotype on the ability of the parasite to spread. Even though the high level of variability between replicates limited the resolution of the results, this Bayesian framework has strong potential to be used more widely in experimental ecology.
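
    The saturable infection function referred to above is, in general terms, one in which the per-host infection rate levels off at high parasite densities; a minimal discrete-time sketch with a Michaelis-Menten-type term (generic functional form and invented parameter values, not the fitted model of the paper) is:

        # Discrete-time host-parasite update with a saturating infection term:
        # new infections per step = beta*S*P / (h + P), which levels off as P grows.
        def step(S, I, P, beta=0.6, h=50.0, release=20.0, clear=0.1):
            new_inf = beta * S * P / (h + P)
            S_next = S - new_inf
            I_next = I + new_inf
            P_next = P * (1 - clear) + release * I    # infected hosts release new parasites
            return S_next, I_next, P_next

        S, I, P = 1000.0, 0.0, 10.0
        for t in range(5):
            S, I, P = step(S, I, P)
            print(t, round(S, 1), round(I, 1), round(P, 1))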

  19. A User-Centric Knowledge Creation Model in a Web of Object-Enabled Internet of Things Environment.

    Science.gov (United States)

    Kibria, Muhammad Golam; Fattah, Sheik Mohammad Mostakim; Jeong, Kwanghyeon; Chong, Ilyoung; Jeong, Youn-Kwae

    2015-09-18

    User-centric service features in a Web of Object-enabled Internet of Things environment can be provided by using a semantic ontology that classifies and integrates objects on the World Wide Web as well as shares and merges context-aware information and accumulated knowledge. The semantic ontology is applied on a Web of Object platform to virtualize the real world physical devices and information to form virtual objects that represent the features and capabilities of devices in the virtual world. Detailed information and functionalities of multiple virtual objects are combined with service rules to form composite virtual objects that offer context-aware knowledge-based services, where context awareness plays an important role in enabling automatic modification of the system to reconfigure the services based on the context. Converting the raw data into meaningful information and connecting the information to form the knowledge and storing and reusing the objects in the knowledge base can both be expressed by semantic ontology. In this paper, a knowledge creation model that synchronizes a service logistic model and a virtual world knowledge model on a Web of Object platform has been proposed. To realize the context-aware knowledge-based service creation and execution, a conceptual semantic ontology model has been developed and a prototype has been implemented for a use case scenario of emergency service.

  20. A User-Centric Knowledge Creation Model in a Web of Object-Enabled Internet of Things Environment

    Science.gov (United States)

    Kibria, Muhammad Golam; Fattah, Sheik Mohammad Mostakim; Jeong, Kwanghyeon; Chong, Ilyoung; Jeong, Youn-Kwae

    2015-01-01

    User-centric service features in a Web of Object-enabled Internet of Things environment can be provided by using a semantic ontology that classifies and integrates objects on the World Wide Web as well as shares and merges context-aware information and accumulated knowledge. The semantic ontology is applied on a Web of Object platform to virtualize the real world physical devices and information to form virtual objects that represent the features and capabilities of devices in the virtual world. Detailed information and functionalities of multiple virtual objects are combined with service rules to form composite virtual objects that offer context-aware knowledge-based services, where context awareness plays an important role in enabling automatic modification of the system to reconfigure the services based on the context. Converting the raw data into meaningful information and connecting the information to form the knowledge and storing and reusing the objects in the knowledge base can both be expressed by semantic ontology. In this paper, a knowledge creation model that synchronizes a service logistic model and a virtual world knowledge model on a Web of Object platform has been proposed. To realize the context-aware knowledge-based service creation and execution, a conceptual semantic ontology model has been developed and a prototype has been implemented for a use case scenario of emergency service. PMID:26393609

  1. A model independent S/W framework for search-based software testing.

    Science.gov (United States)

    Oh, Jungsup; Baik, Jongmoon; Lim, Sung-Hwa

    2014-01-01

    In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of a model has changed from one to another, all functions of a search technique must be reimplemented because the types of models are different even if the same search technique has been applied. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when changing the type of a model.
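
    The reuse argument can be made concrete with a small sketch in which the search algorithm only talks to an abstract model interface, so that changing the model type does not force the search to be reimplemented; the interfaces and the toy model below are hypothetical, not the framework's actual API:

        import random

        # Search-based test generation where the search only sees an abstract model adapter.
        class ModelAdapter:
            """Anything that can propose candidate test cases and score them."""
            def random_candidate(self): ...
            def neighbour(self, candidate): ...
            def fitness(self, candidate): ...      # higher = closer to covering the test goal

        def hill_climb(model: ModelAdapter, iterations=2000):
            best = model.random_candidate()
            best_fit = model.fitness(best)
            for _ in range(iterations):
                cand = model.neighbour(best)
                fit = model.fitness(cand)
                if fit > best_fit:
                    best, best_fit = cand, fit
            return best, best_fit

        class ToyInputModel(ModelAdapter):
            """Trivial stand-in model: a test case is an input value; the goal is to hit x == 42."""
            def random_candidate(self): return random.randint(0, 1000)
            def neighbour(self, c): return c + random.choice([-3, -1, 1, 3])
            def fitness(self, c): return -abs(c - 42)

        print(hill_climb(ToyInputModel()))          # typically finds the target input 42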

  2. A probabilistic generative model for quantification of DNA modifications enables analysis of demethylation pathways.

    Science.gov (United States)

    Äijö, Tarmo; Huang, Yun; Mannerström, Henrik; Chavez, Lukas; Tsagaratou, Ageliki; Rao, Anjana; Lähdesmäki, Harri

    2016-03-14

    We present a generative model, Lux, to quantify DNA methylation modifications from any combination of bisulfite sequencing approaches, including reduced, oxidative, TET-assisted, chemical-modification assisted, and methylase-assisted bisulfite sequencing data. Lux models all cytosine modifications (C, 5mC, 5hmC, 5fC, and 5caC) simultaneously together with experimental parameters, including bisulfite conversion and oxidation efficiencies, as well as various chemical labeling and protection steps. We show that Lux improves the quantification and comparison of cytosine modification levels and that Lux can process any oxidized methylcytosine sequencing data sets to quantify all cytosine modifications. Analysis of targeted data from Tet2-knockdown embryonic stem cells and T cells during development demonstrates DNA modification quantification at unprecedented detail, quantifies active demethylation pathways and reveals 5hmC localization in putative regulatory regions.
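
    The role of the experimental parameters can be seen in the simplest building block of such models: the probability of reading a 'C' at a cytosine position depends jointly on the methylation level and the bisulfite conversion efficiency. A stripped-down two-state likelihood (far simpler than Lux, with illustrative numbers) is:

        from math import comb, log

        def log_likelihood(c_reads, total_reads, meth_level, bs_eff):
            """Binomial log-likelihood of observing `c_reads` unconverted cytosines.
            Methylated cytosines stay 'C'; unmethylated ones read 'C' only if bisulfite
            conversion failed (probability 1 - bs_eff)."""
            p_c = meth_level + (1.0 - meth_level) * (1.0 - bs_eff)
            return (log(comb(total_reads, c_reads))
                    + c_reads * log(p_c) + (total_reads - c_reads) * log(1.0 - p_c))

        # Poorer conversion efficiency makes the same read counts compatible with lower methylation.
        for eff in (0.99, 0.90):
            best = max((m / 100 for m in range(1, 100)),
                       key=lambda m: log_likelihood(35, 50, m, eff))
            print(eff, round(best, 2))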

  3. Efficiency-centered, innovation-enabling business models of high tech SMEs: evidence from Hong Kong

    OpenAIRE

    Loon, M; Chik, R

    2017-01-01

    High technology small and medium-sized enterprises are compelled to innovate to differentiate themselves from their competitors but at the same time be efficient, as they do not have the economies of scale enjoyed by larger organizations. This qualitative study explores this paradoxical challenge faced by Hong Kong SMEs in designing their business model to strike such a balance. In doing so, it investigates the competencies of these firms in technology management and their innovation practices. I...

  4. Predicting lymphatic filariasis transmission and elimination dynamics using a multi-model ensemble framework

    NARCIS (Netherlands)

    Smith, M.E. (Morgan E.); B.K. Singh (Brajendra K.); M.A. Irvine (Michael A.); W.A. Stolk (Wilma); S.V. Subramanian; T.D. Hollingsworth (T. Déirdre); Michael, E. (Edwin)

    2017-01-01

    textabstractMathematical models of parasite transmission provide powerful tools for assessing the impacts of interventions. Owing to complexity and uncertainty, no single model may capture all features of transmission and elimination dynamics. Multi-model ensemble modelling offers a framework to

  5. Spatiotemporal Stochastic Modeling of IoT Enabled Cellular Networks: Scalability and Stability Analysis

    KAUST Repository

    Gharbieh, Mohammad

    2017-05-02

    The Internet of Things (IoT) is large-scale by nature, which is manifested by the massive number of connected devices as well as their vast spatial existence. Cellular networks, which provide ubiquitous, reliable, and efficient wireless access, will play a fundamental role in delivering the first-mile access for the data tsunami to be generated by the IoT. However, cellular networks may have scalability problems in providing uplink connectivity to massive numbers of connected things. To characterize the scalability of the cellular uplink in the context of IoT networks, this paper develops a traffic-aware spatiotemporal mathematical model for IoT devices supported by cellular uplink connectivity. The developed model is based on stochastic geometry and queueing theory to account for the traffic requirement per IoT device, the different transmission strategies, and the mutual interference between the IoT devices. To this end, the developed model is utilized to characterize the extent to which cellular networks can accommodate IoT traffic as well as to assess and compare three different transmission strategies that incorporate a combination of transmission persistency, backoff, and power-ramping. The analysis and the results clearly illustrate the scalability problem imposed by the IoT on cellular networks and offer insights into effective scenarios for each transmission strategy.
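
    A toy, hedged illustration of the kind of question such a model answers: a discrete-time simulation of queued IoT devices contending for uplink slots under a persistent versus a backoff strategy. The success-probability rule and all parameters are made up and stand in for the stochastic-geometry interference analysis; this is not the paper's model.

```python
import random

def simulate(n_devices=2000, slots=5000, arrival_p=0.01,
             strategy="backoff", base_success=0.6, max_backoff=16):
    """Toy slotted-uplink model: each device queues packets and transmits with either
    a persistent or an exponential-backoff strategy; the per-transmission success
    probability shrinks as more devices contend in the same slot."""
    queues = [0] * n_devices
    backoff = [0] * n_devices
    delivered = 0
    for _ in range(slots):
        # packet arrivals
        for i in range(n_devices):
            if random.random() < arrival_p:
                queues[i] += 1
        # decide who transmits in this slot
        tx = []
        for i in range(n_devices):
            if queues[i] == 0:
                continue
            if strategy == "persistent" or backoff[i] == 0:
                tx.append(i)
            else:
                backoff[i] -= 1
        # contention: success probability decays with the number of transmitters
        p_success = base_success / max(1, len(tx)) ** 0.5
        for i in tx:
            if random.random() < p_success:
                queues[i] -= 1
                delivered += 1
                backoff[i] = 0
            elif strategy == "backoff":
                backoff[i] = random.randint(1, max_backoff)
    return delivered, sum(queues)   # throughput and leftover backlog

for s in ("persistent", "backoff"):
    print(s, simulate(strategy=s))
```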

  6. An integrated framework for modeling freight mode and route choice.

    Science.gov (United States)

    2013-10-01

    A number of statewide travel demand models have included freight as a separate component in analysis. Unlike passenger travel, freight has not gained equivalent attention because of a lack of data and difficulties in modeling. In the current state ...

  7. A model-based framework for the analysis of team communication in nuclear power plants

    International Nuclear Information System (INIS)

    Chung, Yun Hyung; Yoon, Wan Chul; Min, Daihwan

    2009-01-01

    Advanced human-machine interfaces are rapidly changing the interaction between humans and systems, with the level of abstraction of the presented information, the human task characteristics, and the modes of communication all affected. To accommodate the changes in the human/system co-working environment, an extended communication analysis framework is needed that can describe and relate the tasks, verbal exchanges, and information interface. This paper proposes an extended analytic framework, referred to as the H-H-S (human-human-system) communication analysis framework, which can model the changes in team communication that are emerging in these new working environments. The stage-specific decision-making model and analysis tool of the proposed framework make the analysis of team communication easier by providing visual clues. The usefulness of the proposed framework is demonstrated with an in-depth comparison of the characteristics of communication in the conventional and advanced main control rooms of nuclear power plants

  8. SciDAC-Data, A Project to Enabling Data Driven Modeling of Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mubarak, M.; Ding, P.; Aliaga, L.; Tsaris, A.; Norman, A.; Lyon, A.; Ross, R.

    2016-10-10

    The SciDAC-Data project is a DOE funded initiative to analyze and exploit two decades of information and analytics that have been collected by the Fermilab Data Center on the organization, movement, and consumption of High Energy Physics data. The project will analyze the analysis patterns and data organization that have been used by the NOvA, MicroBooNE, MINERvA and other experiments to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization and workflow structures that are the prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership-class exascale computing facilities. We address the use of the SciDAC-Data distributions, acquired from the Fermilab Data Center's analysis workflows and corresponding to around 71,000 HEP jobs, as the input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in HPC environments. In particular, we describe in detail how the Sequential Access via Metadata (SAM) data handling system, in combination with the dCache/Enstore based data archive facilities, has been analyzed to develop radically different models of the analysis of HEP data. We present how the simulation may be used to analyze the impact of design choices in archive facilities.

  9. UAV-enabled reconnaissance and trajectory modeling of a co-seismic rockfall in Lefkada

    OpenAIRE

    Saroglou, Charalampos; Asteriou, Pavlos; Zekkos, Dimitris; Tsiambaos, George; Clark, Marin; Manousakis, John

    2017-01-01

    The paper presents the field evidence and the kinematical study of the motion of a rock block mobilised by an earthquake-induced rockfall in Ponti area in the island of Lefkada during a Mw 6.5 earthquake on 17th November 2015. A detailed field survey was deployed using an Unmanned Aerial Vehicle (UAV) with an ultra-high definition (UHD) camera, which produced a high-resolution orthophoto and a Digital Surface Model (DSM) of the terrain. The sequence of impact marks from the rock trajectory on...

  10. A framework for sharing and integrating remote sensing and GIS models based on Web service.

    Science.gov (United States)

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on Web services for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models as standard Web services for sharing and interoperation and then to integrate the RS and GIS models using those Web services. For the former, a "black box" approach and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on a geospatial workflow and a semantics-supported matching method is introduced. Under this framework, model sharing and integration is applied to developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users.

  11. Lazy Updating of hubs can enable more realistic models by speeding up stochastic simulations

    Science.gov (United States)

    Ehlert, Kurt; Loewe, Laurence

    2014-11-01

    To respect the nature of discrete parts in a system, stochastic simulation algorithms (SSAs) must update for each action (i) all part counts and (ii) each action's probability of occurring next and its timing. This makes it expensive to simulate biological networks with well-connected "hubs" such as ATP that affect many actions. Temperature and volume also affect many actions and may be changed significantly in small steps by the network itself during fever and cell growth, respectively. Such trends matter for evolutionary questions, as cell volume determines doubling times and fever may affect survival, both key traits for biological evolution. Yet simulations often ignore such trends and assume constant environments to avoid many costly probability updates. Such computational convenience precludes analyses of important aspects of evolution. Here we present "Lazy Updating," an add-on for SSAs designed to reduce the cost of simulating hubs. When a hub changes, Lazy Updating postpones all probability updates for reactions depending on this hub, until a threshold is crossed. Speedup is substantial if most computing time is spent on such updates. We implemented Lazy Updating for the Sorting Direct Method and it is easily integrated into other SSAs such as Gillespie's Direct Method or the Next Reaction Method. Testing on several toy models and a cellular metabolism model showed >10× faster simulations for its use-cases—with a small loss of accuracy. Thus we see Lazy Updating as a valuable tool for some special but important simulation problems that are difficult to address efficiently otherwise.
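
    A hedged sketch of the Lazy Updating idea grafted onto Gillespie's direct method: the propensity of the hub-dependent reaction is computed from a cached copy of the hub count, which is refreshed only once the hub has drifted past a relative threshold. The reaction network, rate constants, and threshold are toy values, not the paper's implementation, and a real SSA would recompute only the affected propensities rather than the whole list.

```python
import random

# Toy network with a "hub" species H that appears in many reactions (here just one,
# to keep the sketch short). Mass-action kinetics throughout.
species = {"H": 10000, "A": 100, "B": 0}
K_HUB, K_DECAY, K_PROD = 0.001, 0.05, 1.0   # rate constants

THRESHOLD = 0.02                 # refresh hub-dependent propensities after 2% drift
hub_cached = species["H"]        # stale copy of the hub count used in propensities

def propensities():
    # Lazy Updating: the hub-dependent reaction uses the cached hub count, so the
    # hub-triggered refresh is skipped until the hub drifts past the threshold.
    return [
        K_HUB * hub_cached * species["A"],   # H + A -> B   (depends on hub H)
        K_DECAY * species["B"],              # B -> A
        K_PROD,                              # 0 -> A
    ]

random.seed(7)
t, t_end = 0.0, 10.0
while t < t_end:
    a = propensities()
    a0 = sum(a)
    if a0 <= 0.0:
        break
    t += random.expovariate(a0)              # time to next reaction (direct method)
    pick, acc, j = random.uniform(0.0, a0), 0.0, 0
    for j, aj in enumerate(a):
        acc += aj
        if pick <= acc:
            break
    if j == 0:
        species["H"] -= 1; species["A"] -= 1; species["B"] += 1
    elif j == 1:
        species["B"] -= 1; species["A"] += 1
    else:
        species["A"] += 1
    # postpone the hub refresh until the accumulated change crosses the threshold
    if abs(species["H"] - hub_cached) > THRESHOLD * max(1, hub_cached):
        hub_cached = species["H"]

print(round(t, 2), species)
```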

  12. A reference model and technical framework for mobile social software for learning

    NARCIS (Netherlands)

    De Jong, Tim; Specht, Marcus; Koper, Rob

    2008-01-01

    De Jong, T., Specht, M., & Koper, R. (2008). A reference model and technical framework for mobile social software for learning. Presented at the IADIS m-learning 2008 Conference. April, 11-13, 2008, Carvoeiro, Portugal.

  13. Modeling the marketing strategy-performance relationship : towards an hierarchical marketing performance framework

    NARCIS (Netherlands)

    Huizingh, Eelko K.R.E.; Zengerink, Evelien

    2001-01-01

    Accurate measurement of marketing performance is an important topic for both marketing academics and marketing managers. Many researchers have recognized that marketing performance measurement should go beyond financial measurement. In this paper we propose a conceptual framework that models

  14. Comparing droplet activation parameterisations against adiabatic parcel models using a novel inverse modelling framework

    Science.gov (United States)

    Partridge, Daniel; Morales, Ricardo; Stier, Philip

    2015-04-01

    Many previous studies have compared droplet activation parameterisations against adiabatic parcel models (e.g. Ghan et al., 2001). However, these have often involved comparisons for a limited number of parameter combinations based upon certain aerosol regimes. Recent studies (Morales et al., 2014) have used wider ranges when evaluating their parameterisations; however, no study has explored the full multi-dimensional parameter space that would be experienced by droplet activation within a global climate model (GCM). It is important to be able to efficiently highlight regions of the entire multi-dimensional parameter space in which we can expect the largest discrepancy between parameterisation and cloud parcel models, in order to ascertain which regions simulated by a GCM can be expected to be a less accurate representation of the process of cloud droplet activation. This study provides a new, efficient, inverse modelling framework for comparing droplet activation parameterisations to more complex cloud parcel models. To achieve this we couple a Markov Chain Monte Carlo algorithm (Partridge et al., 2012) to two independent adiabatic cloud parcel models and four droplet activation parameterisations. This framework is computationally faster than employing a brute-force Monte Carlo simulation, and allows us to transparently highlight which parameterisation provides the closest representation across all aerosol physiochemical and meteorological environments. The parameterisations are demonstrated to perform well for a large proportion of possible parameter combinations; however, for certain key parameters, most notably the vertical velocity and accumulation-mode aerosol concentration, large discrepancies are highlighted. These discrepancies correspond to parameter combinations that result in very high/low simulated values of maximum supersaturation. By identifying parameter interactions or regimes within the multi-dimensional parameter space we hope to guide
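
    A hedged sketch of the inverse-modelling idea: a Metropolis-Hastings random walk that treats the parameterisation-versus-parcel-model discrepancy as an unnormalised target density, so sampling concentrates in the regions of parameter space where the two disagree most. Both physics functions below are trivial stand-ins and the parameter bounds are illustrative, not the study's setup.

```python
import math, random

def parcel_model(w, n_accum):
    """Stand-in for an adiabatic parcel model: activated fraction as a function of
    updraught velocity w (m/s) and accumulation-mode number n_accum (cm^-3)."""
    return 1.0 - math.exp(-2.5 * w / (1.0 + 0.002 * n_accum))

def parameterisation(w, n_accum):
    """Stand-in for a droplet-activation parameterisation of the same quantity."""
    return min(1.0, 0.9 * w ** 0.7 / math.sqrt(1.0 + 0.0015 * n_accum))

def discrepancy(w, n_accum):
    return abs(parcel_model(w, n_accum) - parameterisation(w, n_accum))

W_LO, W_HI, N_LO, N_HI = 0.01, 10.0, 10.0, 5000.0
w, n = 1.0, 500.0
worst = (0.0, w, n)
random.seed(1)
for _ in range(20000):
    # random-walk proposal, clipped to the parameter bounds
    w_new = min(max(w + random.gauss(0.0, 0.3), W_LO), W_HI)
    n_new = min(max(n + random.gauss(0.0, 150.0), N_LO), N_HI)
    # Metropolis acceptance with the discrepancy acting as an (unnormalised) density
    if random.random() < min(1.0, (discrepancy(w_new, n_new) + 1e-9) /
                                   (discrepancy(w, n) + 1e-9)):
        w, n = w_new, n_new
    if discrepancy(w, n) > worst[0]:
        worst = (discrepancy(w, n), w, n)

print("largest discrepancy %.3f at w=%.2f m/s, N=%.0f cm^-3" % worst)
```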

  15. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    Science.gov (United States)

    Hagos, Samson; Feng, Zhe; Plant, Robert S.; Houze, Robert A.; Xiao, Heng

    2018-02-01

    A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach of constructing a master equation for the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as the STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection-permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also reproduces the observed behavior of convective cell populations and the CPM-simulated cloud-base mass flux variability under diurnally varying forcing. In addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.
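
    The three ingredients named above (growth probability, decay probability, cloud-base mass flux) fit naturally into a birth-death master equation. A generic, illustrative form is sketched below; the notation is this document's own and not STOMP's actual equations.

```latex
% Generic birth-death master equation for P(N,t), the probability of finding N
% convective cells of a given size class at time t, with growth (birth) rate b_N
% and decay (death) rate d_N set by the large-scale forcing:
\begin{align*}
\frac{\partial P(N,t)}{\partial t}
  &= b_{N-1}\,P(N-1,t) \;+\; d_{N+1}\,P(N+1,t) \;-\; \bigl(b_N + d_N\bigr)\,P(N,t), \\
\overline{M}_b(t) &= \sum_{N \ge 0} N \, m_c \, P(N,t),
\end{align*}
% where m_c is the cloud-base mass flux per cell, possibly a nonlinear function of
% cell area as in the aggregation-preferring model variant described in the abstract.
```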

  16. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    Science.gov (United States)

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework can learn the relationships between biochemical reactants qualitatively and can make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned with the proposed framework, biologists can then design further experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
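
    A hedged sketch of the quantitative stage only: simulated annealing over the kinetic rates of a fixed candidate structure so that its simulated behaviour matches a target trajectory. The two-reaction toy model, cooling schedule, and cost function are illustrative, not the paper's setup.

```python
import math, random

random.seed(0)

def simulate(k, y0=(1.0, 0.0), dt=0.01, steps=500):
    """Toy candidate model A --k1--> B --k2--> (degraded), integrated with explicit
    Euler; returns the trajectory of B, the observable species."""
    a, b = y0
    traj = []
    for _ in range(steps):
        da = -k[0] * a
        db = k[0] * a - k[1] * b
        a, b = a + dt * da, b + dt * db
        traj.append(b)
    return traj

target = simulate((0.8, 0.3))                 # pretend these are the observed behaviours

def cost(k):
    return sum((x - y) ** 2 for x, y in zip(simulate(k), target))

# Simulated annealing over the kinetic rates of the (fixed) qualitative structure
k = [random.uniform(0.01, 2.0), random.uniform(0.01, 2.0)]
c = cost(k)
temperature = 1.0
for _ in range(5000):
    cand = [max(1e-3, ki + random.gauss(0.0, 0.05)) for ki in k]
    cc = cost(cand)
    # always accept improvements; accept worse moves with temperature-dependent probability
    if cc < c or random.random() < math.exp(-(cc - c) / temperature):
        k, c = cand, cc
    temperature *= 0.999                       # geometric cooling schedule

print("recovered rates:", [round(ki, 3) for ki in k], "cost:", round(c, 5))
```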

  17. Modular degradable dendrimers enable small RNAs to extend survival in an aggressive liver cancer model.

    Science.gov (United States)

    Zhou, Kejin; Nguyen, Liem H; Miller, Jason B; Yan, Yunfeng; Kos, Petra; Xiong, Hu; Li, Lin; Hao, Jing; Minnig, Jonathan T; Zhu, Hao; Siegwart, Daniel J

    2016-01-19

    RNA-based cancer therapies are hindered by the lack of delivery vehicles that avoid cancer-induced organ dysfunction, which exacerbates carrier toxicity. We address this issue by reporting modular degradable dendrimers that achieve the required combination of high potency to tumors and low hepatotoxicity to provide a pronounced survival benefit in an aggressive genetic cancer model. More than 1,500 dendrimers were synthesized using sequential, orthogonal reactions where ester degradability was systematically integrated with chemically diversified cores, peripheries, and generations. A lead dendrimer, 5A2-SC8, provided a broad therapeutic window: identified as potent [EC50 75 mg/kg dendrimer repeated dosing). Delivery of let-7 g microRNA (miRNA) mimic inhibited tumor growth and dramatically extended survival. Efficacy stemmed from a combination of a small RNA with the dendrimer's own negligible toxicity, therefore illuminating an underappreciated complication in treating cancer with RNA-based drugs.

  18. Porcine familial adenomatous polyposis model enables systematic analysis of early events in adenoma progression.

    Science.gov (United States)

    Flisikowska, Tatiana; Stachowiak, Monika; Xu, Hongen; Wagner, Alexandra; Hernandez-Caceres, Alejandra; Wurmser, Christine; Perleberg, Carolin; Pausch, Hubert; Perkowska, Anna; Fischer, Konrad; Frishman, Dmitrij; Fries, Ruedi; Switonski, Marek; Kind, Alexander; Saur, Dieter; Schnieke, Angelika; Flisikowski, Krzysztof

    2017-07-26

    We compared gene expression in low and high-grade intraepithelial dysplastic polyps from pigs carrying an APC 1311 truncating mutation orthologous to human APC 1309 , analysing whole samples and microdissected dysplastic epithelium. Gene set enrichment analysis revealed differential expression of gene sets similar to human normal mucosa versus T1 stage polyps. Transcriptome analysis of whole samples revealed many differentially-expressed genes reflecting immune infiltration. Analysis of microdissected dysplastic epithelium was markedly different and showed increased expression in high-grade intraepithelial neoplasia of several genes known to be involved in human CRC; and revealed possible new roles for GBP6 and PLXND1. The pig model thus facilitates analysis of CRC pathogenesis.

  19. Enabling Persistent Autonomy for Underwater Gliders with Ocean Model Predictions and Terrain Based Navigation

    Directory of Open Access Journals (Sweden)

    Andrew eStuntz

    2016-04-01

    Full Text Available Effective study of ocean processes requires sampling over the duration of long (weeks to months) oscillation patterns. Such sampling requires persistent, autonomous underwater vehicles that have a similarly long deployment duration. The spatiotemporal dynamics of the ocean environment, coupled with limited communication capabilities, make navigation and localization difficult, especially in coastal regions where the majority of interesting phenomena occur. In this paper, we consider the combination of two methods for reducing navigation and localization error: a predictive approach based on ocean model predictions and a prior-information approach derived from terrain-based navigation. The motivation for this work is not only real-time state estimation, but also accurately reconstructing the actual path that the vehicle traversed to contextualize the gathered data with respect to the science question at hand. We present an application for the practical use of priors and predictions for large-scale ocean sampling. This combined approach builds upon previous works by the authors and accurately localizes the traversed path of an underwater glider over long-duration ocean deployments. The proposed method takes advantage of the reliable short-term predictions of an ocean model and the utility of priors used in terrain-based navigation over areas of significant bathymetric relief to bound the uncertainty error in dead-reckoning navigation. This method improves upon our previously published works by (1) demonstrating the utility of our terrain-based navigation method with multiple field trials, and (2) presenting a hybrid algorithm that combines both approaches to bound navigational error and uncertainty for long-term deployments of underwater vehicles. We demonstrate the approach by examining data from actual field trials with autonomous underwater gliders, and demonstrate an ability to estimate the geographical location of an underwater glider to 2

  20. National culture and business model change: a framework for successful expansions

    DEFF Research Database (Denmark)

    Dalby, J.; Nielsen, L.S.; Lueg, Rainer

    2014-01-01

    Dalby, J., Nielsen, L. S., Lueg, R., Pedersen, L., & Tomoni, A. C. (2014). National culture and business model change: A framework for successful expansions. Journal of Enterprising Culture, 22(4): 379-498.

  1. Pilot project as enabler?

    DEFF Research Database (Denmark)

    Neisig, Margit; Glimø, Helle; Holm, Catrine Granzow

    This article deals with a systemic perspective on transition. The field of study addressed is a pilot project as enabler of transition in a highly complex polycentric context. From a Luhmannian systemic approach, a framework is created to understand and address barriers of change occurred using...... pilot projects as enabler of transition. Aspects of how to create trust and deal with distrust during a transition are addressed. The transition in focus is the concept of New Public Management and how it is applied in the management of the Employment Service in Denmark. The transition regards...

  2. Poly(ethylene glycol) (PEG) in a Polyethylene (PE) Framework: A Simple Model for Simulation Studies of a Soluble Polymer in an Open Framework.

    Science.gov (United States)

    Xie, Liangxu; Chan, Kwong-Yu; Quirke, Nick

    2017-10-24

    Canonical molecular dynamics simulations are performed to investigate the behavior of single-chain and multiple-chain poly(ethylene glycol) (PEG) contained within a cubic framework spanned by polyethylene (PE) chains. This simple model is the first of its kind to study the chemical physics of polymer-threaded organic frameworks, which are materials with potential applications in catalysis and separation processes. For a single-chain 9-mer, 14-mer, and 18-mer in a small framework, the PEG will interact strongly with the framework and assume a more linear chain geometry with an increased radius of gyration (Rg) compared to that in a large framework. The interaction between PEG and the framework decreases with increasing mesh size in both vacuum and water. In the limit of a framework with an infinitely large cavity (infinitely long linkers), PEG behavior approaches simulation results without a framework. The solvation of PEG is simulated by adding explicit TIP3P water molecules to a 6-chain PEG 14-mer aggregate confined in a framework. The 14-mer chains are readily solvated and leach out of a large 2.6 nm mesh framework. There are fewer water-PEG interactions in a small 1.0 nm mesh framework, as indicated by a smaller number of hydrogen bonds. The PEG aggregate, however, still partially dissolves but is retained within the 1.0 nm framework. The preliminary results illustrate the effectiveness of the simple model in studying polymer-threaded framework materials and in optimizing polymer or framework parameters for high performance.

  3. Topological models and frameworks for 3D spatial objects

    Science.gov (United States)

    Zlatanova, Siyka; Rahman, Alias Abdul; Shi, Wenzhong

    2004-05-01

    Topology is one of the mechanisms to describe relationships between spatial objects. Thus, it is the basis for many spatial operations. Models utilizing the topological properties of spatial objects are usually called topological models, and are considered by many researchers as the best suited for complex spatial analysis (i.e., the shortest path search). A number of topological models for two-dimensional and 2.5D spatial objects have been implemented (or are under consideration) by GIS and DBMS vendors. However, when we move to one more dimension (i.e., three-dimensions), the complexity of the relationships increases, and this requires new approaches, rules and representations. This paper aims to give an overview of the 3D topological models presented in the literature, and to discuss generic issues related to 3D modeling. The paper also considers models in object-oriented (OO) environments. Finally, future trends for research and development in this area are highlighted.

  4. Scoping review identifies significant number of knowledge translation theories, models and frameworks with limited use.

    Science.gov (United States)

    Strifler, Lisa; Cardoso, Roberta; McGowan, Jessie; Cogo, Elise; Nincic, Vera; Khan, Paul A; Scott, Alistair; Ghassemi, Marco; MacDonald, Heather; Lai, Yonda; Treister, Victoria; Tricco, Andrea C; Straus, Sharon E

    2018-04-13

    To conduct a scoping review of knowledge translation (KT) theories, models and frameworks that have been used to guide dissemination or implementation of evidence-based interventions targeted to prevention and/or management of cancer or other chronic diseases. We used a comprehensive multistage search process from 2000-2016, which included traditional bibliographic database searching, searching using names of theories, models and frameworks, and cited reference searching. Two reviewers independently screened the literature and abstracted data. We found 596 studies reporting on the use of 159 KT theories, models or frameworks. A majority (87%) of the identified theories, models or frameworks were used in five or fewer studies, with 60% used once. The theories, models and frameworks were most commonly used to inform planning/design, implementation and evaluation activities, and least commonly used to inform dissemination and sustainability/scalability activities. Twenty-six were used across the full implementation spectrum (from planning/design to sustainability/scalability) either within or across studies. All were used for at least individual-level behavior change, while 48% were used for organization-level, 33% for community-level and 17% for system-level change. We found a significant number of KT theories, models and frameworks with a limited evidence base describing their use. Copyright © 2018. Published by Elsevier Inc.

  5. Introducing MERGANSER: A Flexible Framework for Ecological Niche Modeling

    Science.gov (United States)

    Klawonn, M.; Dow, E. M.

    2015-12-01

    Ecological Niche Modeling (ENM) is a collection of techniques to find a "fundamental niche", the range of environmental conditions suitable for a species' survival in the absence of inter-species interactions, given a set of environmental parameters. Traditional approaches to ENM face a number of obstacles including limited data accessibility, data management problems, computational costs, interface usability, and model validation. The MERGANSER system, which stands for Modeling Ecological Residency Given A Normalized Set of Environmental Records, addresses these issues through powerful data persistence and flexible data access, coupled with a clear presentation of results and fine-tuned control over model parameters. MERGANSER leverages data measuring 72 weather related phenomena, land cover, soil type, population, species occurrence, general species information, and elevation, totaling over 1.5 TB of data. To the best of the authors' knowledge, MERGANSER uses higher-resolution spatial data sets than previously published models. Since MERGANSER stores data in an instance of Apache SOLR, layers generated in support of niche models are accessible to users via simplified Apache Lucene queries. This is made even simpler via an HTTP front end that generates Lucene queries automatically. Specifically, a user need only enter the name of a place and a species to run a model. Using this approach to synthesizing model layers, the MERGANSER system has successfully reproduced previously published niche model results with a simplified user experience. Input layers for the model are generated dynamically using OpenStreetMap and SOLR's spatial search functionality. Models are then run using either user-specified or automatically determined parameters after normalizing them into a common grid. Finally, results are visualized in the web interface, which allows for quick validation. Model results and all surrounding metadata are also accessible to the user for further study.
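
    An illustrative example of the simplified query style described above, assembled as a Solr select request with a geofilt spatial filter; the core name, field names, and species below are hypothetical, not MERGANSER's actual schema.

```python
import urllib.parse

# Hypothetical request: occurrence records for a species within 50 km of a point
# whose coordinates were resolved from a place name via OpenStreetMap.
params = {
    "q": 'species:"Haliaeetus leucocephalus"',
    # Solr's geofilt spatial filter: centre point, spatial field, distance in km
    "fq": "{!geofilt pt=44.97,-93.26 sfield=location d=50}",
    "rows": "1000",
    "wt": "json",
}
url = "http://localhost:8983/solr/occurrences/select?" + urllib.parse.urlencode(params)
print(url)   # pass this URL to any HTTP client to fetch an input layer for the model
```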

  6. Threat model framework and methodology for personal networks (PNs)

    DEFF Research Database (Denmark)

    Prasad, Neeli R.

    2007-01-01

    To be able to build a secure network, it is essential to model the threats to the network. A methodology for building a threat model has been proposed in the paper. Several existing threat models and methodologies will be compared to the proposed methodology. The aim of the proposed methodology i...... been used. Also risk assessment methods will be discussed. Threat profiles and vulnerability profiles have been presented....

  7. Cross-Layer Modeling Framework for Energy-Efficient Resilience

    Science.gov (United States)

    2014-04-01

    Contributors are affiliated with IBM T. J. Watson Research Center (Yorktown Heights, NY), IBM Austin Research Laboratory (Austin, TX), and university departments (Kevin Skadron; Gu-Yeon Wei). The recoverable text fragments discuss several modeling approaches: two developed around basic analytical formalisms based on Amdahl's Law, and the Qute model developed at IBM Research [3]. A figure is referenced that depicts the integrated, cross-layer system modeling concept pursued in the IBM-led project titled "Efficient ...".

  8. Modeling ductal carcinoma in situ: a HER2-Notch3 collaboration enables luminal filling.

    LENUS (Irish Health Repository)

    Pradeep, C-R

    2012-02-16

    A large fraction of ductal carcinoma in situ (DCIS), a non-invasive precursor lesion of invasive breast cancer, overexpresses the HER2/neu oncogene. The ducts of DCIS are abnormally filled with cells that evade apoptosis, but the underlying mechanisms remain incompletely understood. We overexpressed HER2 in mammary epithelial cells and observed growth factor-independent proliferation. When grown in extracellular matrix as three-dimensional spheroids, control cells developed a hollow lumen, but HER2-overexpressing cells populated the lumen by evading apoptosis. We demonstrate that HER2 overexpression in this cellular model of DCIS drives transcriptional upregulation of multiple components of the Notch survival pathway. Importantly, luminal filling required upregulation of a signaling pathway comprising Notch3, its cleaved intracellular domain and the transcriptional regulator HES1, resulting in elevated levels of c-MYC and cyclin D1. In line with HER2-Notch3 collaboration, drugs intercepting either arm reverted the DCIS-like phenotype. In addition, we report upregulation of Notch3 in hyperplastic lesions of HER2 transgenic animals, as well as an association between HER2 levels and expression levels of components of the Notch pathway in tumor specimens of breast cancer patients. Therefore, it is conceivable that the integration of the Notch and HER2 signaling pathways contributes to the pathophysiology of DCIS.

  9. A suite of R packages for web-enabled modeling and analysis of surface waters

    Science.gov (United States)

    Read, J. S.; Winslow, L. A.; Nüst, D.; De Cicco, L.; Walker, J. I.

    2014-12-01

    Researchers often create redundant methods for downloading, manipulating, and analyzing data from online resources. Moreover, the reproducibility of science can be hampered by complicated and voluminous data, lack of time for documentation and long-term maintenance of software, and fear of exposing programming skills. The combination of these factors can encourage unshared one-off programmatic solutions instead of openly provided reusable methods. Federal and academic researchers in the water resources and informatics domains have collaborated to address these issues. The result of this collaboration is a suite of modular R packages that can be used independently or as elements in reproducible analytical workflows. These documented and freely available R packages were designed to fill basic needs for the effective use of water data: the retrieval of time-series and spatial data from web resources (dataRetrieval, geoknife), performing quality assurance and quality control checks of these data with robust statistical methods (sensorQC), the creation of useful data derivatives (including physically- and biologically-relevant indices; GDopp, LakeMetabolizer), and the execution and evaluation of models (glmtools, rLakeAnalyzer). Here, we share details and recommendations for the collaborative coding process, and highlight the benefits of an open-source tool development pattern with a popular programming language in the water resources discipline (such as R). We provide examples of reproducible science driven by large volumes of web-available data using these tools, explore benefits of accessing packages as standardized web processing services (WPS) and present a working platform that allows domain experts to publish scientific algorithms in a service-oriented architecture (WPS4R). We assert that in the era of open data, tools that leverage these data should also be freely shared, transparent, and developed in an open innovation environment.

  10. Deep Modeling: Circuit Characterization Using Theory Based Models in a Data Driven Framework

    Energy Technology Data Exchange (ETDEWEB)

    Bolme, David S [ORNL; Mikkilineni, Aravind K [ORNL; Rose, Derek C [ORNL; Yoginath, Srikanth B [ORNL; Holleman, Jeremy [University of Tennessee, Knoxville (UTK); Judy, Mohsen [University of Tennessee, Knoxville (UTK), Department of Electrical Engineering and Computer Science

    2017-01-01

    Analog computational circuits have been demonstrated to provide substantial improvements in power and speed relative to digital circuits, especially for applications requiring extreme parallelism but only modest precision. Deep machine learning is one such area and stands to benefit greatly from analog and mixed-signal implementations. However, even at modest precisions, offsets and non-linearity can degrade system performance. Furthermore, in all but the simplest systems, it is impossible to directly measure the intermediate outputs of all sub-circuits. The result is that circuit designers are unable to accurately evaluate the non-idealities of computational circuits in-situ and are therefore unable to fully utilize measurement results to improve future designs. In this paper we present a technique to use deep learning frameworks to model physical systems. Recently developed libraries like TensorFlow make it possible to use back propagation to learn parameters in the context of modeling circuit behavior. Offsets and scaling errors can be discovered even for sub-circuits that are deeply embedded in a computational system and not directly observable. The learned parameters can be used to refine simulation methods or to identify appropriate compensation strategies. We demonstrate the framework using a mixed-signal convolution operator as an example circuit.
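
    A hedged TensorFlow sketch of the idea: treat the non-idealities of embedded sub-circuits (an offset and a gain error here) as trainable variables in a differentiable model of the signal chain and recover them by backpropagation from end-to-end measurements. The two-stage chain and data are synthetic, and this is not the paper's circuit or code; whether individual sub-circuit parameters are identifiable depends on the model structure and available probes.

```python
import numpy as np
import tensorflow as tf

# Synthetic "measurements": a two-stage analog chain y = g2*(g1*x + o1) + o2,
# where only the end-to-end output is observable. True non-idealities:
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=(4096, 1)).astype("float32")
true_g1, true_o1, true_g2, true_o2 = 0.93, 0.05, 1.07, -0.02
y = (true_g2 * (true_g1 * x + true_o1) + true_o2
     + rng.normal(0, 0.01, x.shape).astype("float32"))

# Trainable parameters for each embedded sub-circuit, initialised to the ideal values
g1 = tf.Variable(1.0); o1 = tf.Variable(0.0)
g2 = tf.Variable(1.0); o2 = tf.Variable(0.0)
opt = tf.keras.optimizers.Adam(learning_rate=0.01)

def model(x):
    # differentiable "deep model" of the physical signal chain
    return g2 * (g1 * x + o1) + o2

for step in range(2000):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(model(x) - y))
    grads = tape.gradient(loss, [g1, o1, g2, o2])
    opt.apply_gradients(zip(grads, [g1, o1, g2, o2]))

# Note: with only end-to-end data, the split of gain/offset between the two stages
# may not be unique; structural constraints or intermediate probes pin it down.
print([float(v.numpy()) for v in (g1, o1, g2, o2)])
```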

  11. A New Perspective for Modeling Power Electronics Converters : Complementarity Framework

    NARCIS (Netherlands)

    Vasca, Francesco; Iannelli, Luigi; Camlibel, M. Kanat; Frasca, Roberto

    2009-01-01

    The switching behavior of power converters with "ideal" electronic devices (EDs) makes it difficult to define a switched model that describes the dynamics of the converter in all possible operating conditions, i.e., a "complete" model. Indeed, simplifying assumptions on the sequences of modes are

  12. Abdominal surgery process modeling framework for simulation using spreadsheets.

    Science.gov (United States)

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike existing methods, the proposed solution employs a modular approach to modeling the activities, which ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connection with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain duration of the activities is modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
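
    A hedged Python translation of the spreadsheet logic described above: first-in-first-served flow of patients through a sequence of activities with uncertain durations (the role played by rand() and the nested IF() functions in the spreadsheet). Activity names, durations, and counts are illustrative, not the 28-element clinical model.

```python
import random

random.seed(42)
ACTIVITIES = [("admission", 0.5, 1.5), ("surgery", 1.0, 4.0), ("recovery", 24.0, 72.0)]
N_PATIENTS = 20

# First-in-first-served: each activity is a single resource; a patient starts an
# activity at max(own arrival, time the resource frees up), which is what the
# nested IF()s in the spreadsheet encode, written here as explicit bookkeeping.
free_at = {name: 0.0 for name, _, _ in ACTIVITIES}
arrivals = sorted(random.uniform(0, 48) for _ in range(N_PATIENTS))

for pid, t in enumerate(arrivals):
    for name, lo, hi in ACTIVITIES:
        start = max(t, free_at[name])
        duration = random.uniform(lo, hi)       # the spreadsheet's rand()-based duration
        t = start + duration
        free_at[name] = t
    print(f"patient {pid:2d} discharged at t = {t:6.1f} h")
```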

  13. Toward the Establishment of a Common Framework for Model Evaluation

    DEFF Research Database (Denmark)

    Olesen, H. R.

    1996-01-01

    Proceedings of the Twenty-first NATO/CCMS International Technical Meeting on Air Pollution Modeling and Its Application, held November 6-10, 1995, in Baltimore, Maryland.

  14. Biochemical Space: A Framework for Systemic Annotation of Biological Models

    Czech Academy of Sciences Publication Activity Database

    Klement, M.; Děd, T.; Šafránek, D.; Červený, Jan; Müller, Stefan; Steuer, Ralf

    2014-01-01

    Vol. 306, July 2014, pp. 31-44. ISSN 1571-0661. R&D Projects: GA MŠk(CZ) EE2.3.20.0256. Institutional support: RVO:67179843. Keywords: biological models * model annotation * systems biology * cyanobacteria. Subject RIV: EH - Ecology, Behaviour

  15. A business model for IPTV service: A dynamic framework

    NARCIS (Netherlands)

    Bouwman, H.; Zhengjia, M.; Duin, P. van der; Limonard, S.

    2008-01-01

    Purpose - The purpose of this paper is to investigate a possible business model for telecom operators for entering the IPTV (digital television) market. Design/methodology/approach - The approach takes the form of a case study, literature search and interviews. Findings - The IPTV business model

  16. Integrating environmental component models. Development of a software framework

    NARCIS (Netherlands)

    Schmitz, O.

    2014-01-01

    Integrated models consist of interacting component models that represent various natural and social systems. They are important tools to improve our understanding of environmental systems, to evaluate cause–effect relationships of human–natural interactions, and to forecast the behaviour of

  17. Public–private partnership conceptual framework and models for the ...

    African Journals Online (AJOL)

    (2012c) Project to Revise the Pricing Strategy for Water Use Charges and Develop a Funding Model for Water Infrastructure Development and Use and a Model for the Establishment of an Economic Regulator (Contract No. WP10465). Review of Principles and Experience for Infrastructure Finance. Department of Water.

  18. Extending MAM5 Meta-Model and JaCalIVE Framework to Integrate Smart Devices from Real Environments

    Science.gov (United States)

    2016-01-01

    This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a real and a virtual world coupled, so that the virtual world not only reflects the real one but also complements it. A new component called a smart resource artifact, which enables modelling and developing devices to access the real physical world, and a human-in-the-loop agent, which places a human in the system, have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system where agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations makes it possible to minimize the error between the simulated and the real system. PMID:26926691

  19. Interaction between GIS and hydrologic model: A preliminary approach using ArcHydro Framework Data Model

    Directory of Open Access Journals (Sweden)

    Silvio Jorge C. Simões

    2013-08-01

    Full Text Available In different regions of Brazil, population growth and economic development can degrade water quality, compromising watershed health and human supply. Because of its ability to combine spatial and temporal data in the same environment and to create water resources management (WRM) models, the Geographical Information System (GIS) is a powerful tool for managing water resources, preventing floods and estimating water supply. This paper discusses the integration between GIS and hydrological models and presents a case study relating to the upper section of the Paraíba do Sul Basin (Sao Paulo State portion), situated in the southeast of Brazil. The case study presented in this paper has a database suitable for the basin's dimensions, including digitized topographic maps at the 1:50,000 scale. From an ArcGIS®/ArcHydro Framework Data Model, a geometric network was created to produce different raster products. The first grid derived from the digital elevation model (DEM) is the flow direction map, followed by the flow accumulation, stream and catchment maps. The next steps in this research are to include the different multipurpose reservoirs situated along the Paraíba do Sul River and to incorporate rainfall time series data in ArcHydro to build a hydrologic data model within a GIS environment in order to produce a comprehensive spatiotemporal model.

  20. A Framework for Modeling Human-Machine Interactions

    Science.gov (United States)

    Shafto, Michael G.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    Modern automated flight-control systems employ a variety of different behaviors, or modes, for managing the flight. While developments in cockpit automation have resulted in workload reduction and economical advantages, they have also given rise to an ill-defined class of human-machine problems, sometimes referred to as 'automation surprises'. Our interest in applying formal methods for describing human-computer interaction stems from our ongoing research on cockpit automation. In this area of aeronautical human factors, there is much concern about how flight crews interact with automated flight-control systems, so that the likelihood of making errors, in particular mode-errors, is minimized and the consequences of such errors are contained. The goal of the ongoing research on formal methods in this context is: (1) to develop a framework for describing human interaction with control systems; (2) to formally categorize such automation surprises; and (3) to develop tests for identification of these categories early in the specification phase of a new human-machine system.

  1. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NARCIS (Netherlands)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D A; Brogaard, Sara; Van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-01-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS

  2. A Generalized Framework for Modeling Next Generation 911 Implementations.

    Energy Technology Data Exchange (ETDEWEB)

    Kelic, Andjelka; Aamir, Munaf Syed; Kelic, Andjelka; Jrad, Ahmad M.; Mitchell, Roger

    2018-02-01

    This document summarizes the current state of Sandia 911 modeling capabilities and then addresses key aspects of Next Generation 911 (NG911) architectures for expansion of existing models. Analysis of three NG911 implementations was used to inform heuristics, associated key data requirements, and assumptions needed to capture NG911 architectures in the existing models. Modeling of NG911 necessitates careful consideration of its complexity and the diversity of implementations. Draft heuristics for constructing NG911 models are presented based on the analysis, along with a summary of current challenges and ways to improve future NG911 modeling efforts. We found that NG911 relies on Enhanced 911 (E911) assets such as 911 selective routers to route calls originating from traditional telephony service, which are a majority of 911 calls. We also found that the diversity and transitional nature of NG911 implementations necessitate significant and frequent data collection to ensure that adequate models are available for crisis action support.

  3. A General Framework for Incorporating Stochastic Recovery in Structural Models of Credit Risk

    Directory of Open Access Journals (Sweden)

    Albert Cohen

    2017-12-01

    Full Text Available In this work, we introduce a general framework for incorporating stochastic recovery into structural models. The framework extends the approach to recovery modeling developed in Cohen and Costanzino (2015, 2017) and provides a systematic way to include different recovery processes in a structural credit model. The key observation is a connection to the partial information gap between the firm's manager and the market, which is captured via a distortion of the probability of default. This last feature is computed by what is essentially a Girsanov transformation and reflects the untangling of the recovery process from the default probability. Our framework can be thought of as an extension of Ishizaka and Takaoka (2003) and, in the same spirit as their work, we provide several examples of the framework, including bounded recovery and a jump-to-zero model. One of the nice features of our framework is that, given prices from any one-factor structural model, we provide a systematic way to compute corresponding prices with stochastic recovery. The framework also provides a way to analyze the correlation between Probability of Default (PD) and Loss Given Default (LGD), and the term structure of recovery rates.
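
    For orientation, the textbook structural setup that such frameworks extend can be written in one line. The sketch below is the generic Merton-type pricing identity with a recovery term, under standard assumptions, not the paper's specific distortion or Girsanov construction.

```latex
% Merton-type structural setup: the firm's asset value V_t follows a geometric
% Brownian motion under the risk-neutral measure Q, and default occurs at maturity
% T if V_T < D, the face value of debt. With a recovery payoff R (deterministic or
% stochastic), the time-0 price of the defaultable zero-coupon bond is
\[
  B(0,T) \;=\; e^{-rT}\,\mathbb{E}^{\mathbb{Q}}\!\left[
      D\,\mathbf{1}_{\{V_T \ge D\}} \;+\; R\,\mathbf{1}_{\{V_T < D\}}\right],
  \qquad
  \mathrm{PD} \;=\; \mathbb{Q}\left(V_T < D\right).
\]
% Letting R be stochastic, and possibly correlated with V_T, is what couples the
% loss given default to the probability of default in frameworks of this kind.
```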

  4. Extending the Modelling Framework for Gas-Particle Systems

    DEFF Research Database (Denmark)

    Rosendahl, Lasse Aistrup

    , with very good results. Single particle combustion has been tested using a number of different particle combustion models applied to coal and straw particles. Comparing the results of these calculations to measurements on straw burnout, the results indicate that for straw, existing heterogeneous combustion...... models perform well, and may be used in high temperature ranges. Finally, the particle tracking and combustion model is applied to an existing coal and straw co-fuelled burner. The results indicate that again, the straw follows very different trajectories than the coal particles, and also that burnout

  5. Open Models of Decision Support Towards a Framework

    OpenAIRE

    Diasio, Stephen Ray

    2012-01-01

    This thesis presents a framework for open models of decision support in organizations. The work is structured as a compendium of articles analysing the inbound and outbound knowledge flows in organizations, as well as the existing decision-support technologies. The underlying factors driving new models for open forms of decision support are presented. The thesis presents a study of the different types of decision-support models...

  6. A parametric framework for modelling of bioelectrical signals

    CERN Document Server

    Mughal, Yar Muhammad

    2016-01-01

    This book examines non-invasive, electrical-based methods for disease diagnosis and assessment of heart function. In particular, a formalized signal model is proposed since this offers several advantages over methods that rely on measured data alone. By using a formalized representation, the parameters of the signal model can be easily manipulated and/or modified, thus providing mechanisms that allow researchers to reproduce and control such signals. In addition, having such a formalized signal model makes it possible to develop computer tools that can be used for manipulating and understanding how signal changes result from various heart conditions, as well as for generating input signals for experimenting with and evaluating the performance of e.g. signal extraction methods. The work focuses on bioelectrical information, particularly electrical bio-impedance (EBI). Once the EBI has been measured, the corresponding signals have to be modelled for analysis. This requires a structured approach in order to move...

  7. The Model Vision Project: A Conceptual Framework for Service Delivery

    Science.gov (United States)

    Bourgeault, Stanley E.; And Others

    1977-01-01

    Described are the conceptualization, implementation, and results to date of the George Peabody College for Teachers Model Center for Severely Handicapped Multi-impaired Children with Visual Impairment as a Primary Handicapping Condition. (Author/IM)

  8. Model Adaptation for Prognostics in a Particle Filtering Framework

    Data.gov (United States)

    National Aeronautics and Space Administration — One of the key motivating factors for using particle filters for prognostics is the ability to include model parameters as part of the state vector to be estimated....

  9. A Flexible Framework Hydrological Informatic Modeling System - HIMS

    Science.gov (United States)

    WANG, L.; Wang, Z.; Changming, L.; Li, J.; Bai, P.

    2017-12-01

    Simulating the water cycle at temporal and spatial scales that fit the characteristics of the study area is important for flood prediction and for accurate streamflow simulation, because soil properties, landscape, climate, and land management are the critical factors shaping the non-linear rainfall-runoff relationship at watershed scales. Most existing hydrological models, with a fixed single structure and mode, cannot simulate the water cycle at different places with customized mechanisms. To address this problem, this study develops the Hydro-Informatic Modeling System (HIMS), in which each critical hydrological process is a module with multiple options for various scenarios. HIMS accounts for the two runoff generation mechanisms of infiltration excess and saturation excess and estimates runoff with different methods, including the Time Variance Gain Model (TVGM), LCM, which performs well in ungauged areas, and the widely used Soil Conservation Service Curve Number (SCS-CN) method. The channel routing module offers the widely used Muskingum method and a kinematic wave equation with a new solution method. HIMS performance, with its symbolic runoff generation model LCM, was evaluated against observed streamflow data of the Lhasa River watershed at hourly, daily, and monthly time steps. Simulated and observed streamflows agreed well, with NSE higher than 0.87 and water balance error (WE) within ±20%. A water balance analysis of precipitation, streamflow, actual evapotranspiration (ET), and soil moisture change was conducted at the annual time step, further confirming that HIMS performs reliably when compared with literature results for the Lhasa River watershed.
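
    Of the runoff options listed above, the SCS-CN relation is simple enough to sketch. The following is the standard textbook form in millimetres with the usual Ia = 0.2 S initial abstraction, shown only to illustrate the kind of module HIMS can select; it is not HIMS code.

```python
def scs_cn_runoff(p_mm: float, cn: float, ia_ratio: float = 0.2) -> float:
    """Event runoff depth Q (mm) from rainfall P (mm) with the SCS Curve Number method:
    S = 25400/CN - 254 (mm), Ia = ia_ratio * S,
    Q = (P - Ia)^2 / (P - Ia + S) when P > Ia, else 0."""
    s = 25400.0 / cn - 254.0          # maximum potential retention
    ia = ia_ratio * s                 # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# e.g. a 60 mm storm on a moderately permeable catchment (CN = 75)
print(round(scs_cn_runoff(60.0, 75.0), 1), "mm of runoff")
```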

  10. An architectural decision modeling framework for service oriented architecture design

    OpenAIRE

    Zimmermann, Olaf

    2009-01-01

    In this thesis, we investigate whether reusable architectural decision models can support Service-Oriented Architecture (SOA) design. In the current state of the art, architectural decisions are captured ad hoc and retrospectively on projects; this is a labor-intensive undertaking without immediate benefits. On the contrary, we investigate the role reusable architectural decision models can play during SOA design: We treat recurring architectural decisions as first-class method elements and p...

  11. Model Adaptation for Prognostics in a Particle Filtering Framework

    Science.gov (United States)

    Saha, Bhaskar; Goebel, Kai Frank

    2011-01-01

    One of the key motivating factors for using particle filters for prognostics is the ability to include model parameters as part of the state vector to be estimated. This performs model adaptation in conjunction with state tracking, and thus produces a tuned model that can be used for long-term predictions. This feature of particle filters works in large part because they are not subject to the "curse of dimensionality", i.e. the exponential growth of computational complexity with state dimension. However, in practice, this property holds only for "well-designed" particle filters as dimensionality increases. This paper explores the notion of wellness of design in the context of predicting remaining useful life for individual discharge cycles of Li-ion batteries. Prognostic metrics are used to analyze the tradeoff between different model designs and prediction performance. Results demonstrate how sensitivity analysis may be used to arrive at a well-designed prognostic model that can take advantage of the model adaptation properties of a particle filter.
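
    A hedged sketch of the augmented-state idea: a bootstrap particle filter whose particles carry a degradation-model parameter alongside the health state, so the model adapts while the state is tracked. The exponential decay model, noise levels, and artificial parameter dynamics are illustrative, not the battery model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "capacity" measurements from an exponential degradation with unknown rate
true_lam, steps, dt = 0.05, 60, 1.0
truth = np.exp(-true_lam * dt * np.arange(steps))
obs = truth + rng.normal(0, 0.02, steps)

# Particles carry an augmented state: [health x, model parameter lambda]
N = 2000
x = np.ones(N)
lam = rng.uniform(0.0, 0.2, N)
obs_sigma, lam_jitter = 0.02, 0.002

for k in range(steps):
    # propagate: the health state evolves under each particle's own parameter,
    # while the parameter performs a small random walk (artificial dynamics)
    lam = np.abs(lam + rng.normal(0, lam_jitter, N))
    x = x * np.exp(-lam * dt) + rng.normal(0, 0.005, N)
    # weight by the measurement likelihood and resample (bootstrap filter)
    w = np.exp(-0.5 * ((obs[k] - x) / obs_sigma) ** 2)
    w /= w.sum()
    idx = rng.choice(N, size=N, p=w)
    x, lam = x[idx], lam[idx]

print("estimated decay rate:", round(float(lam.mean()), 3), "true:", true_lam)
```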

  12. Model Adaptation for Prognostics in a Particle Filtering Framework

    Directory of Open Access Journals (Sweden)

    Bhaskar Saha

    2011-01-01

    Full Text Available One of the key motivating factors for using particle filters for prognostics is the ability to include model parameters as part of the state vector to be estimated. This performs model adaptation in conjunction with state tracking, and thus produces a tuned model that can be used for long-term predictions. This feature of particle filters works in large part because they are not subject to the “curse of dimensionality”, i.e. the exponential growth of computational complexity with state dimension. However, in practice, this property holds only for “well-designed” particle filters as dimensionality increases. This paper explores the notion of wellness of design in the context of predicting remaining useful life for individual discharge cycles of Li-ion and Li-Polymer batteries. Prognostic metrics are used to analyze the tradeoff between different model designs and prediction performance. Results demonstrate how sensitivity analysis may be used to arrive at a well-designed prognostic model that can take advantage of the model adaptation properties of a particle filter.

  13. Introducing a boreal wetland model within the Earth System model framework

    Science.gov (United States)

    Getzieh, R. J.; Brovkin, V.; Reick, C.; Kleinen, T.; Raddatz, T.; Raivonen, M.; Sevanto, S.

    2009-04-01

    Wetlands of the northern high latitudes, with their low temperatures and waterlogged conditions, are a prerequisite for peat accumulation. They store at least 25% of the global soil organic carbon and currently constitute the largest natural source of methane. These boreal and subarctic peat carbon pools are sensitive to climate change, since the ratio of carbon sequestration to emission is closely dependent on hydrology and temperature. Global biogeochemistry models used for simulations of CO2 dynamics in past and future climates usually ignore changes in peat storage. Our approach aims at evaluating the boreal wetland feedback to climate through CO2 and CH4 fluxes on decadal to millennial time scales. A generic model of organic matter accumulation and decay in boreal wetlands is under development at the MPI for Meteorology in cooperation with the University of Helsinki. Our approach is to develop a wetland model which is consistent with the physical and biogeochemical components of the land surface module JSBACH as part of the Earth System model framework ECHAM5-MPIOM-JSBACH. As prototypes, we use the modelling approach of Frolking et al. (2001) for the peat dynamics and the wetland model of Wania (2007) for vegetation cover and plant productivity. An initial distribution of wetlands follows the GLWD-3 map of Lehner and Döll (2004). First results of the modelling approach will be presented. References: Frolking, S. E., N. T. Roulet, T. R. Moore, P. J. H. Richard, M. Lavoie and S. D. Muller (2001): Modeling Northern Peatland Decomposition and Peat Accumulation, Ecosystems, 4, 479-498. Lehner, B., Döll, P. (2004): Development and validation of a global database of lakes, reservoirs and wetlands. Journal of Hydrology, 296 (1-4), 1-22. Wania, R. (2007): Modelling northern peatland land surface processes, vegetation dynamics and methane emissions. PhD thesis, University of Bristol, 122 pp.

  14. MoVES - A Framework for Modelling and Verifying Embedded Systems

    DEFF Research Database (Denmark)

    Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2009-01-01

    The MoVES framework is being developed to assist in the early phases of embedded systems design. A system is modelled as an application running on an execution platform. The application is modelled through the individual tasks, and the execution platform is modelled through the processing elements...... consumption. A simple specification language for embedded systems and a verification backend are presented. The framework has a modular, parameterized structure supporting easy extension and adaptation of the specification language as well as of the verification backend. We show, using a number of small...... examples, how MoVES can be used to model and analyze embedded systems....

  15. Generalized Gramian Framework for Model/Controller Order Reduction of Switched Systems

    DEFF Research Database (Denmark)

    Shaker, Hamid Reza; Wisniewski, Rafal

    2011-01-01

    In this article, a general method for model/controller order reduction of switched linear dynamical systems is presented. The proposed technique is based on the generalised gramian framework for model reduction. It is shown that different classical reduction methods can be developed into a generalised...
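
    For readers unfamiliar with gramian-based reduction, the sketch below shows classical balanced truncation of a single stable LTI system; the record's contribution is the generalised-gramian extension to switched systems, which this toy example does not attempt. The system matrices and reduced order are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Classical balanced truncation of a stable LTI system (A, B, C) to order r."""
    # Controllability/observability gramians: A P + P A^T + B B^T = 0, A^T Q + Q A + C^T C = 0
    P = solve_continuous_lyapunov(A, -B @ B.T)
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)
    # Balancing transformation from Cholesky factors and an SVD (Hankel singular values in s)
    Lp, Lq = cholesky(P, lower=True), cholesky(Q, lower=True)
    U, s, Vt = svd(Lq.T @ Lp)
    S = np.diag(s ** -0.5)
    T, Tinv = Lp @ Vt.T @ S, S @ U.T @ Lq.T
    Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
    return Ab[:r, :r], Bb[:r], Cb[:, :r], s

# Toy stable system reduced from order 4 to order 2.
A = np.diag([-1.0, -2.0, -5.0, -10.0])
B = np.ones((4, 1))
C = np.ones((1, 4))
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
print("Hankel singular values:", hsv)
```

    In the generalised-gramian setting of the record, P and Q would typically be common gramians satisfying Lyapunov inequalities for every subsystem of the switched system rather than the equalities used above.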

  16. Next Generation Framework for Aquatic Modeling of the Earth System (NextFrAMES)

    Science.gov (United States)

    Fekete, B. M.; Wollheim, W. M.; Lakhankar, T.; Vorosmarty, C. J.

    2008-12-01

    Earth System model development is becoming an increasingly complex task. As scientists attempt to represent the physical and bio-geochemical processes and various feedback mechanisms in unprecedented detail, the models themselves are becoming increasingly complex. At the same time, the surrounding IT infrastructure needed to carry out these detailed model computations is growing increasingly complex as well. To be accurate and useful, Earth System models must manage a vast amount of data in heterogeneous computing environments ranging from single-CPU systems to Beowulf-type computer clusters. Scientists developing Earth System models increasingly confront obstacles associated with IT infrastructure. Numerous development efforts are under way to ease that burden and offer model development platforms that reduce IT challenges and allow scientists to focus on their science. While these new modeling frameworks (e.g. FMS, ESMF, CCA, OpenMI) do provide solutions to many IT challenges (performing input/output, managing space and time, establishing model coupling, etc.), they are still considerably complex and often have steep learning curves. Over the course of the last fifteen years, the University of New Hampshire developed several modeling frameworks independently from the above-mentioned efforts (Data Assembler, Frameworks for Aquatic Modeling of the Earth System and NextFrAMES, which is continued at CCNY). While the UNH modeling frameworks have numerous similarities to those developed by other teams, these frameworks, in particular the latest NextFrAMES, represent a novel model development paradigm. While other modeling frameworks focus on providing services to modelers to perform various tasks, NextFrAMES strives to hide all of those services and provide a new approach for modelers to express their scientific thoughts. From a scientific perspective, most models have two core elements: the overall model structure (defining the linkages between the simulated processes

  17. A framework for modelling the complexities of food and water security under globalisation

    Science.gov (United States)

    Dermody, Brian J.; Sivapalan, Murugesu; Stehfest, Elke; van Vuuren, Detlef P.; Wassen, Martin J.; Bierkens, Marc F. P.; Dekker, Stefan C.

    2018-01-01

    We present a new framework for modelling the complexities of food and water security under globalisation. The framework sets out a method to capture regional and sectoral interdependencies and cross-scale feedbacks within the global food system that contribute to emergent water use patterns. The framework integrates aspects of existing models and approaches in the fields of hydrology and integrated assessment modelling. The core of the framework is a multi-agent network of city agents connected by infrastructural trade networks. Agents receive socio-economic and environmental constraint information from integrated assessment models and hydrological models respectively and simulate complex, socio-environmental dynamics that operate within those constraints. The emergent changes in food and water resources are aggregated and fed back to the original models with minimal modification of the structure of those models. It is our conviction that the framework presented can form the basis for a new wave of decision tools that capture complex socio-environmental change within our globalised world. In doing so they will contribute to illuminating pathways towards a sustainable future for humans, ecosystems and the water they share.

  18. A framework for modelling the complexities of food and water security under globalisation

    Directory of Open Access Journals (Sweden)

    B. J. Dermody

    2018-01-01

    Full Text Available We present a new framework for modelling the complexities of food and water security under globalisation. The framework sets out a method to capture regional and sectoral interdependencies and cross-scale feedbacks within the global food system that contribute to emergent water use patterns. The framework integrates aspects of existing models and approaches in the fields of hydrology and integrated assessment modelling. The core of the framework is a multi-agent network of city agents connected by infrastructural trade networks. Agents receive socio-economic and environmental constraint information from integrated assessment models and hydrological models respectively and simulate complex, socio-environmental dynamics that operate within those constraints. The emergent changes in food and water resources are aggregated and fed back to the original models with minimal modification of the structure of those models. It is our conviction that the framework presented can form the basis for a new wave of decision tools that capture complex socio-environmental change within our globalised world. In doing so they will contribute to illuminating pathways towards a sustainable future for humans, ecosystems and the water they share.
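
    To make the core architecture of this framework concrete, the following minimal sketch represents city agents on a directed trade network and lets each agent cover its food deficit from incoming links under a water-constrained local production. The city names, numbers and the naive allocation rule are illustrative assumptions, not part of the published framework.

```python
import networkx as nx

# Hypothetical city agents connected by trade links; each agent holds a food demand
# and a local production scaled by an exogenous water-availability factor.
G = nx.DiGraph()
G.add_node("CityA", demand=80.0, production=120.0, water_factor=0.9)
G.add_node("CityB", demand=100.0, production=60.0, water_factor=0.7)
G.add_node("CityC", demand=50.0, production=40.0, water_factor=1.0)
G.add_edge("CityA", "CityB", capacity=40.0)   # exportable flow from A to B
G.add_edge("CityA", "CityC", capacity=20.0)
G.add_edge("CityC", "CityB", capacity=10.0)

def step(graph):
    """One naive trading step: importers pull food over links, limited by capacity and surplus."""
    for city, data in graph.nodes(data=True):
        supply = data["production"] * data["water_factor"]          # water-constrained output
        deficit = data["demand"] - supply
        for exporter in graph.predecessors(city):
            if deficit <= 0:
                break
            cap = graph[exporter][city]["capacity"]
            exp_data = graph.nodes[exporter]
            surplus = exp_data["production"] * exp_data["water_factor"] - exp_data["demand"]
            flow = max(0.0, min(cap, surplus, deficit))
            deficit -= flow
        graph.nodes[city]["unmet_demand"] = max(0.0, deficit)

step(G)
print({c: round(G.nodes[c]["unmet_demand"], 1) for c in G})
```

    In the full framework the node attributes would be supplied by integrated assessment and hydrological models, and the emergent flows would be aggregated and fed back to those models, as described in the abstract.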

  19. gamboostLSS: An R Package for Model Building and Variable Selection in the GAMLSS Framework

    OpenAIRE

    Hofner, Benjamin; Mayr, Andreas; Schmid, Matthias

    2014-01-01

    Generalized additive models for location, scale and shape (GAMLSS) are a flexible class of regression models that allow multiple parameters of a distribution function, such as the mean and the standard deviation, to be modelled simultaneously. With the R package gamboostLSS, we provide a boosting method to fit these models. Variable selection and model choice are naturally available within this regularized regression framework. To introduce and illustrate the R package gamboostLSS and its infrastructure, we...

  20. Development of a practical modeling framework for estimating the impact of wind technology on bird populations

    Energy Technology Data Exchange (ETDEWEB)

    Morrison, M.L. [California State Univ., Sacramento, CA (United States); Pollock, K.H. [North Carolina State Univ., Raleigh, NC (United States)

    1997-11-01

    One of the most pressing environmental concerns related to wind project development is the potential for avian fatalities caused by the turbines. The goal of this project is to develop a useful, practical modeling framework for evaluating potential wind power plant impacts that can be generalized to most bird species. This modeling framework could be used to get a preliminary understanding of the likelihood of significant impacts to birds, in a cost-effective way. The authors accomplish this by (1) reviewing the major factors that can influence the persistence of a wild population; (2) briefly reviewing various models that can aid in estimating population status and trend, including methods of evaluating model structure and performance; (3) reviewing survivorship and population projections; and (4) developing a framework for using models to evaluate the potential impacts of wind development on birds.

  1. Development of a practical modeling framework for estimating the impact of wind technology on bird populations

    International Nuclear Information System (INIS)

    Morrison, M.L.; Pollock, K.H.

    1997-11-01

    One of the most pressing environmental concerns related to wind project development is the potential for avian fatalities caused by the turbines. The goal of this project is to develop a useful, practical modeling framework for evaluating potential wind power plant impacts that can be generalized to most bird species. This modeling framework could be used to get a preliminary understanding of the likelihood of significant impacts to birds, in a cost-effective way. The authors accomplish this by (1) reviewing the major factors that can influence the persistence of a wild population; (2) briefly reviewing various models that can aid in estimating population status and trend, including methods of evaluating model structure and performance; (3) reviewing survivorship and population projections; and (4) developing a framework for using models to evaluate the potential impacts of wind development on birds.

  2. Towards a framework for deriving platform-independent model-driven software product lines

    Directory of Open Access Journals (Sweden)

    Andrés Paz

    2013-05-01

    Full Text Available Model-driven software product lines (MD-SPLs) are created from domain models which are transformed, merged and composed with reusable core assets until software products are produced. Model transformation chains (MTCs) must be specified to generate such MD-SPLs. This paper presents a framework for creating platform-independent MD-SPLs; the framework includes a domain-specific language (DSL) for platform-independent MTC specification and facilitates platform-specific MTC generation for several of the most widely used model transformation frameworks. The DSL also allows product line architects to compose the generation process while taking into account the need for interoperability between model transformation strategies and technologies and while specifying the several types of variability involved in such generation.

  3. Model continuity in discrete event simulation: A framework for model-driven development of simulation models

    NARCIS (Netherlands)

    Cetinkaya, D; Verbraeck, A.; Seck, MD

    2015-01-01

    Most of the well-known modeling and simulation (M&S) methodologies state the importance of conceptual modeling in simulation studies, and they suggest the use of conceptual models during the simulation model development process. However, only a limited number of methodologies refer to how to

  4. Surgical model-view-controller simulation software framework for local and collaborative applications.

    Science.gov (United States)

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
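
    The decoupled, multi-rate idea behind the framework can be sketched in a few lines. The example below is a minimal illustration, not the authors' MVC framework: two controllers run in their own threads at different rates (a fast haptics-like loop and a slow rendering-like loop) and share state through a lock; rates, state contents and the toy update rule are assumptions.

```python
import threading
import time

# Minimal sketch of decoupled simulation: each process runs at its own rate and
# exchanges data through shared, lock-protected state.
class SharedState:
    def __init__(self):
        self.lock = threading.Lock()
        self.tool_position = (0.0, 0.0, 0.0)

def haptics_loop(state, rate_hz=1000):
    dt = 1.0 / rate_hz
    for _ in range(rate_hz):                       # run roughly one second for the demo
        with state.lock:
            x, y, z = state.tool_position
            state.tool_position = (x + dt, y, z)   # stand-in for a force/position update
        time.sleep(dt)

def render_loop(state, rate_hz=60):
    dt = 1.0 / rate_hz
    for _ in range(rate_hz):
        with state.lock:
            pos = state.tool_position              # a real viewer would redraw the scene here
        time.sleep(dt)

state = SharedState()
threads = [threading.Thread(target=haptics_loop, args=(state,)),
           threading.Thread(target=render_loop, args=(state,))]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

    The design point is that the high-rate loop never waits on the low-rate one, which is what allows haptic updates above 1,000 Hz to coexist with slower visualization and networked collaboration.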

  5. Intrinsic flexibility of porous materials; theory, modelling and the flexibility window of the EMT zeolite framework

    International Nuclear Information System (INIS)

    Fletcher, Rachel E.; Wells, Stephen A.; Leung, Ka Ming; Edwards, Peter P.; Sartbaeva, Asel

    2015-01-01

    Framework materials possess intrinsic flexibility which can be investigated using geometric simulation. We review framework flexibility properties in energy materials and present novel results on the flexibility window of the EMT zeolite framework containing 18-crown-6 ether as a structure directing agent (SDA). Framework materials have structures containing strongly bonded polyhedral groups of atoms connected through their vertices. Typically the energy cost for variations of the inter-polyhedral geometry is much less than the cost of distortions of the polyhedra themselves – as in the case of silicates, where the geometry of the SiO4 tetrahedral group is much more strongly constrained than the Si—O—Si bridging angle. As a result, framework materials frequently display intrinsic flexibility, and their dynamic and static properties are strongly influenced by low-energy collective motions of the polyhedra. Insight into these motions can be obtained in reciprocal space through the ‘rigid unit mode’ (RUM) model, and in real space through template-based geometric simulations. We briefly review the framework flexibility phenomena in energy-relevant materials, including ionic conductors, perovskites and zeolites. In particular we examine the ‘flexibility window’ phenomenon in zeolites and present novel results on the flexibility window of the EMT framework, which shed light on the role of structure-directing agents. Our key finding is that the crown ether, despite its steric bulk, does not limit the geometric flexibility of the framework.

  6. A CONCEPTUAL FRAMEWORK FOR SUSTAINABLE POULTRY SUPPLY CHAIN MODEL

    Directory of Open Access Journals (Sweden)

    Mohammad SHAMSUDDOHA

    2013-12-01

    Full Text Available Sustainable supply chains are now a crucial consideration for future-focused industries. Attention to supply chain management has increased steadily since the 1980s, when firms discovered the benefits of mutual relationships within and beyond their own organizations. Researchers are therefore working to develop new theories and models that can help the corporate sector achieve sustainability in its supply chains, a trend reflected in the number of papers published, particularly in journals, since 1980. The objectives of this paper are twofold. First, it offers a literature review of sustainable supply chain management based on papers published over the last three decades. Second, it offers a conceptual sustainable supply chain process model in light of triple bottom line theory. The model was developed from in-depth interviews with an entrepreneur from a poultry case industry in Bangladesh.

  7. Magnetically charged black hole in framework of nonlinear electrodynamics model

    Science.gov (United States)

    Kruglov, S. I.

    2018-01-01

    A model of nonlinear electrodynamics is proposed and investigated in general relativity. We consider the magnetic black hole and find a regular solution which gives corrections to the Reissner-Nordström solution. As r → ∞ the asymptotic space-time becomes flat. The magnetic mass of the black hole is calculated and the metric function is obtained. For some values of the model parameter there can be one, two or no horizons. The thermodynamics of the black holes is studied and we calculate their Hawking temperature and heat capacity. It is demonstrated that there is a second-order phase transition. For some parameters of the model the black holes are thermodynamically stable.
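
    For orientation, the quantities the abstract refers to can be written in their standard textbook form; these are the Reissner-Nordström limit and the surface-gravity expressions that the nonlinear-electrodynamics corrections modify, not the specific metric function of this paper's model.

```latex
\[
  f_{\mathrm{RN}}(r) = 1 - \frac{2M}{r} + \frac{Q^{2}}{r^{2}}, \qquad
  T_{H} = \frac{\kappa}{2\pi} = \frac{f'(r_{+})}{4\pi}, \qquad
  C_{Q} = T_{H}\left(\frac{\partial S}{\partial T_{H}}\right)_{Q},
\]
% with r_+ the outer horizon radius and S = \pi r_+^2 the Bekenstein-Hawking
% entropy, in geometric units G = c = \hbar = k_B = 1.
```

    A second-order phase transition of the kind mentioned in the abstract shows up as a divergence and sign change of the heat capacity C_Q at fixed magnetic charge.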

  8. Stochastic programming framework for Lithuanian pension payout modelling

    Directory of Open Access Journals (Sweden)

    Audrius Kabašinskas

    2014-12-01

    Full Text Available The paper provides a scientific approach to the problem of selecting a pension fund by taking into account some specific characteristics of the Lithuanian Republic (LR) pension accumulation system. A decision-making model that can be used to plan the long-term pension accrual of LR citizens in an optimal way is presented. This model focuses on factors that influence the sustainability of the pension system selection under macroeconomic, social and demographic uncertainty. The model is formalized as a single-stage stochastic optimization problem in which the long-term optimal strategy can be obtained based on the possible scenarios generated for a particular participant. Stochastic programming methods make it possible to include the pension fund rebalancing moment and direction of investment, and to take into account possible changes in personal income, in society and in the global financial market. The collection of methods used to generate scenario trees was found useful for solving strategic planning problems.
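
    The flavour of a single-stage, scenario-based decision model can be illustrated with a small toy problem. The sketch below allocates a contribution between three hypothetical funds by maximizing expected log wealth over generated scenarios; the mean returns, volatilities and the log-utility objective are assumptions for illustration, not parameters of the LR pension system or of the paper's model.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical single-stage, scenario-based allocation between three pension funds.
rng = np.random.default_rng(0)
n_scenarios, n_funds = 500, 3
gross_returns = 1.0 + rng.normal([0.02, 0.04, 0.06], [0.01, 0.05, 0.10],
                                 size=(n_scenarios, n_funds))

def neg_expected_log_wealth(w):
    # Expected log of terminal wealth over the generated scenarios (guarded against log(0)).
    return -np.mean(np.log(np.maximum(gross_returns @ w, 1e-9)))

result = minimize(neg_expected_log_wealth,
                  x0=np.full(n_funds, 1.0 / n_funds),
                  bounds=[(0.0, 1.0)] * n_funds,
                  constraints=({'type': 'eq', 'fun': lambda w: np.sum(w) - 1.0},))
print("optimal fund weights:", np.round(result.x, 3))
```

    A full model of the kind described in the record would additionally encode rebalancing moments and personal-income scenarios in the scenario tree rather than in a single static return matrix.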

  9. A Flexible Atmospheric Modeling Framework for the CESM

    Energy Technology Data Exchange (ETDEWEB)

    Randall, David [Colorado State University; Heikes, Ross [Colorado State University; Konor, Celal [Colorado State University

    2014-11-12

    We have created two global dynamical cores based on the unified system of equations and Z-grid staggering on an icosahedral grid, which are collectively called UZIM (Unified Z-grid Icosahedral Model). The z-coordinate version (UZIM-height) can be run in hydrostatic and nonhydrostatic modes. The sigma-coordinate version (UZIM-sigma) runs in only hydrostatic mode. The super-parameterization has been included as a physics option in both models. The UZIM versions with the super-parameterization are called SUZI. With SUZI-height, we have completed aquaplanet runs. With SUZI-sigma, we are making aquaplanet runs and realistic climate simulations. SUZI-sigma includes realistic topography and a SiB3 model to parameterize the land-surface processes.

  10. Development of an integrated risk assessment framework for internal/external events and all power models

    International Nuclear Information System (INIS)

    Yang, Joon Eon

    2012-01-01

    From the PSA point of view, the Fukushima accident in Japan in 2011 revealed some issues to be re-considered and/or improved in PSA, such as the limited scope of the PSA, site risk, etc. KAERI (Korea Atomic Energy Research Institute) has performed research on the development of an integrated risk assessment framework related to some of the issues that arose after the Fukushima accident. This framework can cover the internal PSA model and external PSA models (fire, flooding, and seismic PSA models) in the full power and the low power-shutdown modes. This framework also integrates level 1, 2 and 3 PSA to quantify the risk of nuclear facilities more efficiently and consistently. We expect that this framework will be helpful in resolving the issue regarding the limited scope of PSA and in reducing some inconsistencies that might exist between (1) the internal and external PSA, and (2) full power mode PSA and low power-shutdown PSA models. In addition, KAERI is starting research related to extreme external events, the risk assessment of spent fuel pools, and site risk. These emerging issues will be incorporated into the integrated risk assessment framework. In this paper the integrated risk assessment framework and the research activities on the emerging issues are outlined.

  11. A GBT-framework towards modal modelling of steel structures

    DEFF Research Database (Denmark)

    Hansen, Anders Bau; Jönsson, Jeppe

    2017-01-01

    the rotational stiffness of a connection. Based on a modelling of any beam-to-column joint using finite shell elements and springs for single components such as bolts, it is the primary hypothesis that it is possible to formulate a generalized connection model with few degrees of freedom related to a relevant...... set of deformation modes. This hypothesis is based on the idea of modal decomposition performed in generalized beam theories (GBT). The question is – is it possible to formulate an eigenvalue problem with a solution corresponding to mode shapes for the deformation of the joint by using the finite...

  12. Implementation of a PETN failure model using ARIA's general chemistry framework

    Energy Technology Data Exchange (ETDEWEB)

    Hobbs, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-01-01

    We previously developed a PETN thermal decomposition model that accurately predicts thermal ignition and detonator failure [1]. This model was originally developed for CALORE [2] and required several complex user subroutines. Recently, a simplified version of the PETN decomposition model was implemented into ARIA [3] using a general chemistry framework without need for user subroutines. Detonator failure was also predicted with this new model using ENCORE. The model was simplified by 1) basing the model on moles rather than mass, 2) simplifying the thermal conductivity model, and 3) implementing ARIA’s new phase change model. This memo briefly describes the model, implementation, and validation.
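
    To illustrate the kind of expression a general chemistry framework integrates, the sketch below solves a single-step, mole-based Arrhenius decomposition at constant temperature. The rate constants and temperature are placeholders chosen for a visible result, not PETN parameters, and the example makes no attempt to reproduce the thermal-conductivity or phase-change submodels mentioned in the record.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Single-step, mole-based Arrhenius decomposition of the kind a general chemistry
# framework integrates alongside the energy equation. Constants are illustrative only.
R = 8.314            # J/(mol K)
A_pre = 1.0e12       # pre-exponential factor, 1/s (assumed)
Ea = 1.5e5           # activation energy, J/mol (assumed)
T = 550.0            # constant temperature for this sketch, K

def dn_dt(t, n):
    # dn/dt = -k(T) * n with k(T) = A * exp(-Ea / (R T))
    k = A_pre * np.exp(-Ea / (R * T))
    return [-k * n[0]]

sol = solve_ivp(dn_dt, (0.0, 60.0), [1.0])     # one mole of reactant over 60 s
print("moles remaining:", sol.y[0, -1])
```

    Basing the bookkeeping on moles rather than mass, as the record describes, keeps the reaction stoichiometry explicit when several species and steps are chained together.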

  13. A modeling framework for life history-based conservation planning

    Science.gov (United States)

    Eileen S. Burns; Sandor F. Toth; Robert G. Haight

    2013-01-01

    Reserve site selection models can be enhanced by including habitat conditions that populations need for food, shelter, and reproduction. We present a new population protection function that determines whether minimum areas of land with desired habitat features are present within the desired spatial conditions in the protected sites. Embedding the protection function as...

  14. A conceptual framework for a mentoring model for nurse educators ...

    African Journals Online (AJOL)

    Transformation in South Africa resulted in changes in the mandate of Higher Education Institutions (HEIs). Therefore, the need to design a mentoring model for recruiting and retaining nurse educators to meet the demands of teaching and learning became evident. The aim of the study was to develop a conceptual ...

  15. Development of a distributed air pollutant dry deposition modeling framework

    Science.gov (United States)

    Satoshi Hirabayashi; Charles N. Kroll; David J. Nowak

    2012-01-01

    A distributed air pollutant dry deposition modeling system was developed with a geographic information system (GIS) to enhance the functionality of i-Tree Eco (i-Tree, 2011). With the developed system, temperature, leaf area index (LAI) and air pollutant concentration in a spatially distributed form can be estimated, and based on these and other input variables, dry...

  16. A Framework for Modelling Connective Tissue Changes in VIIP Syndrome

    Science.gov (United States)

    Ethier, C. R.; Best, L.; Gleason, R.; Mulugeta, L.; Myers, J. G.; Nelson, E. S.; Samuels, B. C.

    2014-01-01

    Insertion of astronauts into microgravity induces a cascade of physiological adaptations, notably including a cephalad fluid shift. Longer-duration flights carry an increased risk of developing Visual Impairment and Intracranial Pressure (VIIP) syndrome, a spectrum of ophthalmic changes including posterior globe flattening, choroidal folds, distension of the optic nerve sheath, kinking of the optic nerve and potentially permanent degradation of visual function. The slow onset of changes in VIIP, their chronic nature, and the similarity of certain clinical features of VIIP to ophthalmic findings in patients with raised intracranial pressure strongly suggest that: (i) biomechanical factors play a role in VIIP, and (ii) connective tissue remodeling must be accounted for if we wish to understand the pathology of VIIP. Our goal is to elucidate the pathophysiology of VIIP and suggest countermeasures based on biomechanical modeling of ocular tissues, suitably informed by experimental data, and followed by validation and verification. We specifically seek to understand the quasi-homeostatic state that evolves over weeks to months in space, during which ocular tissue remodeling occurs. This effort is informed by three bodies of work: (i) modeling of cephalad fluid shifts; (ii) modeling of ophthalmic tissue biomechanics in glaucoma; and (iii) modeling of connective tissue changes in response to biomechanical loading.

  17. A Framework for the Modelling of Biphasic Reacting Systems

    DEFF Research Database (Denmark)

    Anantpinijwatna, Amata; Sin, Gürkan; O’Connell, John P.

    2014-01-01

    Biphasic reacting systems have a broad application range from organic reactions in pharmaceutical and agro-bio industries to CO2 capture. However, mathematical modelling of biphasic reacting systems is a formidable challenge due to many phenomena underlying the process such as chemical equilibrium...

  18. Multi-Fidelity Framework for Modeling Combustion Instability

    Science.gov (United States)

    2016-07-27

    Cheng Huang, William E. Anderson and Charles L. Merkle, Purdue University, West Lafayette, IN 47907. [Only title-page and reference-list fragments were recoverable from the source; no abstract text is available.]

  19. Technical note: River modelling to infer flood management framework

    African Journals Online (AJOL)

    River hydraulic models have successfully identified the weaknesses and areas for improvement with respect to flooding in the Sarawak River system, and can also be used to support decisions on flood management measures. Often, the big question is 'how'. This paper demonstrates a theoretical flood management ...

  20. A GBT-framework towards modal modelling of steel structures

    DEFF Research Database (Denmark)

    Hansen, Anders Bau; Jönsson, Jeppe

    2017-01-01

    In modern structural steel frame design, the modelling of joints between beams and columns is based on very simple assumptions. The joints are most often assumed to behave as a perfect hinge or as a rigid joint. This means that in the overall static analysis relative rotations and changes